The Intelligent and Connected Systems Lab (ICSL) at Columbia University leverages insights from a variety of networked sensing technologies to improve people’s lives.
The group’s most recent development, SIFTER, is a low-cost imaging system that enables continuous fever screening, in real time, at a distance, of multiple people at once, without the need for human intervention or checkpoints.
We spoke with ICSL leader Xiaofan (Fred) Jiang, an associate professor of electrical engineering at Columbia Engineering and co-chair of the Data Science Institute’s Smart Cities research center, about the lab’s work and its recent efforts to ensure more robust temperature measurements.
What kind of work does the ICSL do?
We develop new theories and technologies that solve real problems in the world. We’re always looking for new ways we can contribute. Our work cuts across the emerging fields of IoT, AI at the edge, and cyber-physical systems to explore topics such as smart and sustainable buildings, mobile and wearable systems, environmental monitoring and control, connected health and fitness, and urban safety. We work on intelligent systems with humans in the loop. This can be interpreted in a range of ways, from having humans as end users, as in the case of urban safety, to having them be active participants in helping reduce energy consumption, to having them be the subjects of sensing, for example, sensing fevers.
How did the SIFTER system originate?
We realized that fever screening is important, so we challenged ourselves to come up with algorithms or models that could help make imaging accurate at a distance—and affordable. Our system is innovative in how it combines thermal imaging, RGB imaging, machine learning, and 3D modeling to detect and track individual heads in the images, and then classify people as having or not having a fever. Essentially, the system continuously measures the temperature of people in the camera’s field of view. It works by obtaining the key temperature features of people’s heads to produce fever screening predictions in real time, significantly improving screening throughput while minimizing disruption to normal activities.
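The pipeline described above can be sketched in a few lines: head regions found in the RGB image are used to sample an aligned thermal image, a key temperature feature is extracted per head, and each head is classified against a fever threshold. This is a minimal illustration under assumed names and values, not SIFTER’s actual code; in particular, `detect_heads` stands in for a trained detector, and the 38.0 °C threshold is an assumption.

```python
import numpy as np

FEVER_THRESHOLD_C = 38.0  # illustrative threshold, not a value from the paper


def detect_heads(rgb_frame):
    """Stand-in for the ML head detector: returns (x, y, w, h) boxes.

    A real system would run a trained model on the RGB frame here.
    """
    return [(10, 10, 20, 20), (60, 30, 20, 20)]


def head_temperature(thermal, box):
    """Key temperature feature: the hottest pixel within the head region."""
    x, y, w, h = box
    return float(thermal[y:y + h, x:x + w].max())


def screen_frame(rgb_frame, thermal):
    """Classify each detected head as fever / no fever."""
    results = []
    for box in detect_heads(rgb_frame):
        temp = head_temperature(thermal, box)
        results.append((box, temp, temp >= FEVER_THRESHOLD_C))
    return results


# Synthetic thermal frame: ~25 C background, one normal and one feverish head.
thermal = np.full((100, 100), 25.0)
thermal[10:30, 10:30] = 36.8   # first head region: normal
thermal[30:50, 60:80] = 38.5   # second head region: fever
for box, temp, fever in screen_frame(None, thermal):
    print(box, round(temp, 1), fever)  # second head is flagged
```

In a real deployment the thermal and RGB images must be spatially registered so a box from the RGB detector indexes the correct thermal pixels; that alignment step is omitted here.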
What are the applications?
Our system can help track and prevent the spread of disease, with applications at the entrances of public spaces such as hospitals, schools, businesses, cafes, restaurants, mass transit stations, and toll booths. Widespread, continuous fever screening can also provide an early warning system for future pandemics that have fever as a symptom, and for monitoring and responding to seasonal flu.
Has the system been tested?
Yes, we tested SIFTER in two locations—at a local restaurant and a medical practice—with excellent results. We screened over 4,000 people, with a measurement error of 0.4 to 0.6 degrees at distances of up to 3.5 meters. In comparison, regular infrared scanners, which can cost thousands of dollars, have a measurement error of about 1 degree at 0.5 meters.
What are other important features?
The system is designed so that the threshold value for fever can be tailored to the specific use case. It can be set lower to act as an initial screening step, to be followed by manual screening of positive cases. Or it can be set higher, in extremely busy environments, for instance, so that only high fevers are flagged.
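The two use cases above amount to choosing a different cutoff per deployment. A tiny sketch, where both the preset names and the temperature values are assumptions for illustration rather than values from the SIFTER paper:

```python
# Illustrative threshold presets for the two use cases described above.
THRESHOLDS_C = {
    "initial_screen": 37.5,  # lower: catch borderline cases for manual follow-up
    "busy_entrance": 38.5,   # higher: flag only pronounced fevers
}


def is_flagged(temp_c: float, mode: str) -> bool:
    """Return True if a head temperature exceeds the preset for this mode."""
    return temp_c >= THRESHOLDS_C[mode]


print(is_flagged(37.8, "initial_screen"))  # True
print(is_flagged(37.8, "busy_entrance"))   # False
```

The same 37.8 °C reading is flagged for manual follow-up in one mode and passed through in the other, which is the trade-off between sensitivity and throughput the answer describes.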
How will the system be deployed—and can others access it?
We are collaborating with epidemiologist Andrew Rundle from the Mailman School of Public Health to apply this technology toward a worldwide flu-tracking system. We have also made the system available to the public. We want as many people as possible to be able to use it. We’ve created a website that teaches others how to use it, and have made the sensor code open source.
The SIFTER system consists of three parts—the hardware [camera and embedded edge device], the cloud server, where the data is processed, and a dashboard for monitoring results in real time. The hardware costs only around $300 to set up and operate. The front-end software—the code that runs on the camera—is open source. The server, hosted in our lab, performs the back-end calculations and then makes the results available.
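In an edge/cloud split like the one described, the embedded device typically packages each reading into a small message for the server, which the dashboard then displays. The field names and values below are purely hypothetical, to illustrate the shape of such a message; SIFTER’s actual protocol may differ.

```python
import json

# Hypothetical payload an edge device might send to the back-end server.
reading = {
    "device_id": "entrance-cam-1",     # assumed identifier, not from the paper
    "timestamp": "2022-01-15T09:30:00Z",
    "head_id": 42,                     # ID from the head tracker
    "max_temp_c": 37.1,                # key temperature feature
    "fever": False,                    # result of the threshold check
}
payload = json.dumps(reading)
print(payload)
```

Keeping the message this small is what makes a ~$300 edge device workable: detection runs locally, and only compact per-head summaries, rather than raw video, travel to the server.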
A paper on this project has been accepted to the ACM/IEEE Conference on Information Processing in Sensor Networks (IPSN 2022).
— Karina Alexanyan, Ph.D.