Augustin Chaintreau thinks discussions of ethics should be woven into every aspect of Columbia’s computer science curriculum.

As it is now, computer science majors at Columbia take a stand-alone ethics course. But since computer scientists build platforms that affect people’s privacy and security, every computer science class should have an ethical component, says Chaintreau, an associate professor of computer science. And this summer he is doing just that: leading a team of professors and students who are creating an ethics curriculum for Columbia’s computer science classes.

The team is creating an “ethical companion,” a chapter on ethics intended to complement the widely used textbook “Networks, Crowds, and Markets: Reasoning About a Highly Connected World.” They are also building a website with slides, lectures, and case studies that probe the ethical dilemmas in areas such as computer vision, social networks, and algorithm design. Though most computer science textbooks are excellent, they tend to lack discussions of ethics, says Chaintreau, who is a member of the Data Science Institute. “So we are writing a chapter on ethics,” he adds, “which we hope professors will use to complement the textbooks and fill the gap on ethics.”

Last semester, the team began a major initiative: modifying computer science classes to emphasize ethics and fairness. Lydia Chilton, an assistant professor of computer science, piloted a modified version of her User Interface Design course. In the class, students discussed how applications are engineered to drive up users’ clicks and engagement, and the ethical boundaries of designing with that intent. Chilton says the students grappled with questions such as, “Do we really want to build apps that people use purely out of habit, and not because the app delivers value?” And since clicks are easy to define and measure but true value is not, the students also probed, “How do we design and measure systems that deliver real value to users?” Chilton says the class discussed positive and negative examples of such design and proposed solutions for some of the unethical applications.

“With so many ethical issues in the news about the ways apps and social networks may be manipulating users,” adds Chilton, “the students were very engaged in these tough issues. These students will be the people who build the next generation of applications, and they don’t want to make the same mistakes.”

Nakul Verma, a lecturer in the discipline of computer science, piloted a modified machine learning course that examined how bias can creep into automated systems. He led discussions on how social bias can inadvertently be learned when training automated systems. The class also focused on how to design algorithms that circumvent bias and promote fairness in automated decision-making.
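The fairness questions Verma’s class raised can be made concrete in a few lines of code. Below is a minimal sketch, not drawn from the course materials, of one common check such a class might cover: measuring a model’s demographic parity gap, that is, how much its positive-prediction rate differs across groups. The function name, the data, and the loan-approval framing are all hypothetical.

    # A minimal sketch of a demographic parity check; the model's
    # predictions and group labels below are hypothetical placeholders.
    from collections import defaultdict

    def demographic_parity_gap(predictions, groups):
        """Return the largest difference in positive-prediction rates
        between any two groups; 0.0 means perfectly equal rates."""
        positives = defaultdict(int)
        totals = defaultdict(int)
        for pred, group in zip(predictions, groups):
            totals[group] += 1
            positives[group] += int(pred)
        rates = {g: positives[g] / totals[g] for g in totals}
        return max(rates.values()) - min(rates.values())

    # Hypothetical example: a loan-approval model's decisions for two groups.
    preds = [1, 1, 0, 1, 0, 0, 1, 0]  # 1 = approved, 0 = denied
    groups = ["A", "A", "A", "A", "B", "B", "B", "B"]
    print(f"Demographic parity gap: {demographic_parity_gap(preds, groups):.2f}")

In this toy example the model approves 75 percent of group A but only 25 percent of group B, so the gap is 0.50; a class discussion would then turn to which fairness metric is appropriate for a given system and how large a gap, if any, is acceptable.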

“This pilot class provided an excellent opportunity to distill advances in ethical computing and seamlessly integrate them into the current teaching curriculum,” says Verma. “It will enable students to understand and address societal consequences of the technology they would help develop in the future.”

Chaintreau, who himself taught a customized class on the ethics of social networks and crowds, says the development of an ethics-focused computer science curriculum has wide support from faculty throughout Columbia, and he hopes all computer science courses at the university will soon be infused with a consideration of ethics. He is also the principal investigator on a grant the team received from the Responsible Computer Science Challenge, an initiative by Omidyar Network, Mozilla, Schmidt Futures, and Craig Newmark Philanthropies that aims to integrate ethics into undergraduate computer science curricula. The grant is supporting the team’s effort to design the ethical companion for computer science textbooks.

Kathy Pham, a Mozilla Fellow who co-leads the Responsible Computer Science Challenge program, says today’s advanced technology can influence what journalism we read, what political discussions we have, and whether we qualify for a mortgage or an insurance policy.

“The grant winners are taking crucial steps to integrate ethics and responsibility into core courses like algorithms, neural networks, and data structures,” says Pham. “And they will release their materials in the open, allowing professors at other universities to use them. By deeply integrating ethics into computer science curricula and sharing the content openly, we can create more responsible technology from the start.”

Chaintreau entered computer science in 1998, when the Internet was beginning to reshape the world. He and his colleagues talked at the time about the ethics of computing, he recalls, but the talk centered on technical problems such as how to share applications equitably. When he realized a few years later that personal data, and how we collect and use them, would create tectonic changes in the culture, he decided to become a university professor. “That’s the only position from which I can help students understand what’s at stake ethically in contemporary computing,” he says, “and how to frame those moral questions.”

This summer, Columbia is also helping younger students understand the ethics of computing. The university is hosting a summer program for 20 underrepresented high school students, aimed at teaching them to navigate the ethical questions surrounding artificial intelligence. The students will work on developing a “data for good” curriculum. Professor Desmond Patton, who is co-directing the program with Chaintreau, is a leading expert on how social media affects the well-being of young people of color. The two are working with AI4ALL, a nonprofit that funds programs to increase gender and racial diversity in high-tech fields.

In 2017, Patton founded a similar summer program at Columbia, the Digital Scholars Lab, which served as a pipeline into the technology field for high school and college-aged students from marginalized communities in New York City. The lab was a success, and Patton wanted to build on it with a like-minded program; his research led him to AI4ALL, which he arranged to host at Columbia this summer.

Patton, associate dean for curriculum innovation and academic affairs at Columbia’s School of Social Work and a member of the Data Science Institute, says developing equitable technology depends on the diversity of the people who build it.

“We need diverse opinions, perspectives, and ideas in order to develop equitable technology,” says Patton. “Individuals of color are dramatically missing from AI development, training, and jobs. And I believe AI4ALL is instrumental in changing that, which is why we are delighted to have the program hosted at Columbia this summer. It’s part of the overall effort at Columbia and at the Data Science Institute to ensure that technology is created equitably and used ethically for the good of society.”

— Robert Florida