Data Science Day 2020
Monday, September 14, 2020
10:00 am - 12:00 pm
On Monday, September 14, 2020, the Data Science Institute hosted Ethics & Privacy: Terms of Usage to bring together thought leaders driving the discussion around privacy and emerging technologies. As part of Data Science Day, the Institute’s flagship annual event, the event provided a forum for innovators in academia, industry, and government to connect.
All speakers and their respective roles/titles are accurate as of the time of the event (2020).
In conversation with Jeannette M. Wing, Avanessians Director of the Data Science Institute and Professor of Computer Science at Columbia University
Kelvin J. Lancaster Professor of Economic Theory, Department of Economics, Columbia University
Talk Title: The Effect of Privacy Regulation on the Data Industry: Empirical Evidence from GDPR
Abstract: Utilizing a novel dataset from an online travel intermediary, we study the effects of the EU's General Data Protection Regulation (GDPR). The opt-in requirement of GDPR resulted in a 12.5% drop in consumers observed by the intermediary, but the remaining consumers are trackable for a longer period of time. These findings are consistent with privacy-conscious consumers substituting away from less efficient privacy protection (e.g., cookie deletion) to explicit opt-out—a process that would make opt-in consumers more predictable. Consistent with this hypothesis, the average value of the remaining consumers to advertisers has increased, offsetting some of the losses from consumer opt-outs. Joint research with Guy Aridor (PhD Candidate, Economics, Columbia University) and Tobias Salz (Assistant Professor of Economics, MIT).
Associate Professor, Department of Computer Science, Columbia Engineering
Talk Title: Security and Privacy Guarantees in Machine Learning with Differential Privacy
Abstract: Machine learning (ML) is driving many of our applications and life-changing decisions. Yet, it is often brittle and unstable, making decisions that are hard to understand or can be exploited. Tiny changes to an input can cause dramatic changes in predictions; this results in decisions that surprise, appear unfair, or enable attack vectors such as adversarial examples. Moreover, models trained on users’ data can encode not only general trends from large datasets but also very specific, personal information from these datasets; this threatens to expose users’ secrets through ML models or predictions. This talk positions differential privacy—a rigorous privacy theory—as a powerful, common foundation for building into ML much-needed guarantees of security, stability, privacy, and fairness alike. We draw upon our recent results using differential privacy to secure ML models against adversarial example attacks and to build a privacy-preserving TensorFlow-based platform that stops the leakage of training data through the models it pushes into production.
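For readers unfamiliar with differential privacy, the classic building block is the Laplace mechanism: a query answer is released with noise scaled to the query's sensitivity divided by the privacy parameter epsilon. The sketch below is an illustrative aside, not code from the talk; the dataset, query, and epsilon value are hypothetical.

```python
import math
import random

def laplace_mechanism(true_value, sensitivity, epsilon):
    """Release true_value with Laplace noise of scale sensitivity/epsilon,
    the standard way to achieve epsilon-differential privacy for a numeric query."""
    scale = sensitivity / epsilon
    # Inverse-CDF sampling from the Laplace(0, scale) distribution
    u = random.random() - 0.5
    noise = -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))
    return true_value + noise

# Hypothetical example: privately release a count query over a toy dataset.
ages = [23, 35, 41, 29, 52, 47]
true_count = sum(1 for a in ages if a > 30)  # a counting query has sensitivity 1
private_count = laplace_mechanism(true_count, sensitivity=1, epsilon=0.5)
```

Smaller values of epsilon add more noise and give stronger privacy; adding or removing any one person changes the count by at most 1, which is why the sensitivity is 1 here.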
Associate Professor, Department of Biostatistics, Columbia University Mailman School of Public Health
Talk Title: Building a More Ethical Data Science: Lessons From Public Health
Abstract: Goldsmith will discuss a public health perspective on ethical questions related to data science, which is shaped by a number of factors. In particular, research on human subjects imposes clear responsibilities, including respect for persons, beneficence, and justice, among others. More practically, public health researchers learn from data that can be messy and imperfect, and question whether an observation is a valid measure for a construct of interest, or if selection biases create a mismatch between the sample and target population. A shift towards complex data and analytic approaches makes these more critical than ever.
Professor, Department of Biological Sciences, Faculty of Arts and Sciences, Columbia University
Talk Title: The NeuroRights Initiative: Human Rights Guidelines for Neurotechnology and AI in a Post-COVID World
Abstract: In my talk I will review the proposal made by the Morningside Group to introduce five new Human Rights into the Universal Declaration of Human Rights (1). These rights (“NeuroRights”) will protect mental privacy, personal identity, personal agency, equal access to cognitive augmentation, and protection from algorithmic biases. Recently, protection of mental privacy has become particularly urgent because of the fast development of brain-computer interfaces (BCIs) and the increased attacks on privacy due to COVID-related governmental measures. I will also review our proposal to follow a medical model, introducing a “Technocratic Oath” as a deontology for the Neurotech and data industry and using existing societal mechanisms, similar to those already implemented in the medical industry, to regulate future development of Neurotech and AI (2). Finally, I will discuss current advocacy efforts for NeuroRights in the US and other countries.
1. Yuste, R., Goering, S. and the Morningside Alliance Group (2017). Four ethical priorities for neurotechnologies and artificial intelligence. Nature 551, 159–163.
2. Goering, S. and Yuste, R. (2016). On the Necessity of Ethical Guidelines for Novel Neurotechnologies. Cell 167, 882–885.
Assistant Professor, School of International and Public Affairs, Columbia University
On Tuesday, March 31, 2020, the Data Science Institute hosted a virtual sneak preview of Data Science Day. This interactive gathering showcased 27 data science research posters and videos created by Columbia University faculty and students.
The Data Science Day 2020 virtual sneak preview brought together hundreds of remote attendees from around the world.
The event resulted in new partnership introductions and industry opportunities for the participating research teams. Further, these activities brought more than 400 new people into the extended Data Science Institute community; they have signed up to learn more about admissions, student services, and upcoming events.
DSI Industry Affiliates have access to Data Science Day recordings after the event. If you are a current DSI Industry Affiliate, please contact us at email@example.com for a link to the videos.