“The question for me is what is the role of the social worker in the technological era and how can I use data techniques and AI to enhance and modernize the profession for the 21st century.” – Aviv Landau

These days, many adolescents lead two parallel and interchangeable lives: one in the real world, and another online. That duality makes it doubly hard for them to cope with rejection, especially when social media amplifies the real-world phenomenon of interpersonal rejection, a harsh experience that undermines their sense of belonging, according to Aviv Landau, a postdoctoral research fellow at the Data Science Institute (DSI) at Columbia University.

Landau became a social worker to work with marginalized youth and advance equity; he studied data science, he says, to mine the ocean of online data that offers deep insight into the adolescent mind. Today he draws on both skill sets to produce research he hopes will help young people overcome some of the obstacles in their lives. “My fervent hope is that this research contributes to the development of intervention and detection methods to solve some of the problems faced by society’s most vulnerable citizens,” he says.

Before coming to DSI, Landau worked as a social worker in Israel and focused on adolescents from marginalized groups. He completed his doctoral dissertation at the University of Haifa on young people who suffered social rejection on social media. At the time, social workers were starting to study how the internet and social media exacerbate social problems, and Landau distinguished himself as a researcher in the subfield. Today, he continues to focus on young people from marginalized groups, particularly how they cope with the compounded stress of growing up in the age of social media, where issues of identity, social status, and rejection play out hourly and sometimes savagely.

Landau is also associate director of Columbia’s SAFElab, where he supervises and collaborates with a team of graduate students, staff, and affiliates who use AI to study interactions on social media. Their current projects include discerning how friendship, grief, and social status play out online in Black youth culture, and how online rejection by peers affects young people’s self-esteem. The SAFElab was founded by Landau’s advisor, Desmond Patton, an associate professor of social work and DSI member who is nationally known for studying how marginalized young people express themselves online. Patton says he is delighted to have Landau as both a research partner and lab manager.

“Dr. Landau is the future of social work science, pairing data science with years of experience as a social work professional to answer new questions at the intersections of abuse and neglect, health, and computational methods,” Patton says. “He’s an innovative and promising scholar and I’m excited to work with him in the SAFElab.”

Landau has a talent for partnering with experts from disparate fields, including computer scientists, nurses, and social workers, who are adept at using data science. In addition to Patton, who is a social worker and data scientist, Landau partners with Max Topaz, a nursing professor and DSI member who uses data science to enhance nursing. The trio is currently collaborating on a project that uses online datasets to identify the urgent needs of marginalized groups affected by COVID-19. Black and Latinx communities have suffered disproportionately high rates of COVID-19-related infections and deaths, evidence of a long history of racial disparities in the U.S., Landau says. “Our ability to identify the immediate needs of these communities can help provide community leaders with valuable information that can have an impact and even save lives in times of uncertainty,” he adds.

Landau, Patton, and Topaz are also developing an AI system that will help substantiate reports of child abuse and neglect in hospitals and clinics. Such reports have reached epidemic proportions, says Landau, with Black and Latinx caregivers often unfairly accused of abusing children. The team is designing an algorithm that can scan electronic health records for keywords and phrases that flag child abuse. They are also creating a taxonomy of risk factors to help officials detect instances of abuse and neglect without bias. The project recently received a DSI seed fund grant.
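The kind of first-pass keyword screen the team describes can be sketched in a few lines of Python. The phrase list, the flag_note helper, and the sample note below are hypothetical illustrations, not the SAFElab team’s actual system; the real project pairs such flags with the taxonomy of risk factors described above.

    import re

    # Hypothetical phrase list; a real screen would be built with clinical
    # and social work expertise, not hard-coded examples.
    FLAG_PHRASES = [
        "unexplained bruising",
        "inconsistent with stated history",
        "delay in seeking care",
    ]

    def flag_note(note_text):
        """Return the flag phrases found in a clinical note (case-insensitive).

        Matches are cues for human review, not determinations of abuse.
        """
        lowered = note_text.lower()
        return [
            phrase
            for phrase in FLAG_PHRASES
            # Word boundaries keep phrases from matching inside longer words.
            if re.search(r"\b" + re.escape(phrase) + r"\b", lowered)
        ]

    # Hypothetical example note.
    note = ("Patient presents with unexplained bruising on both arms; "
            "caregiver account is inconsistent with stated history.")
    print(flag_note(note))
    # prints: ['unexplained bruising', 'inconsistent with stated history']

In practice, a clinical decision support tool would go well beyond lexical matching of this kind, weighing any flags against the team’s risk-factor taxonomy before surfacing a case for human review.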

“We are using AI to create a clinical-decision support algorithm that will foster objectivity in the detection of child abuse and neglect,” Landau says. “Our intent is to use the tool to reduce racial bias in making determinations of child abuse. It’s a painful process to go through and every parent or guardian, no matter their race, should be treated fairly.”

— Robert Florida