Part of Climate Week at Columbia Engineering

Hosted by: Data Science Institute, Columbia Engineering, and IBM

Cloud datacenters already consume ~2% of the world’s electricity. With the rapid growth of compute and data, especially given the recent explosion of generative AI, the world’s computing infrastructure has become a significant source of carbon emissions. To avoid an exponential rise in computing-related emissions, we must rethink and retool cloud hardware, system software, and web applications to become energy and carbon aware. This Sustainable Cloud Computing and AI session at NYC Climate Week will feature talks and a panel with researchers from industry and academia who are actively working to solve these hard and pressing problems.

Register


Details & Agenda

Monday, September 23, 2024 (2:00 PM – 5:00 PM ET)

Location: Davis Auditorium (CEPSR) – 4th Floor (Campus Level) – 530 W 120th St, New York, NY 10027

AGENDA (Tentative)

Doors will open by 1:45 PM. Please arrive early to check in and find your seat.

2:00 PM: Welcome

2:10 PM: Opening Remarks

  • Vishal Misra, Professor of Computer Science and Vice Dean of Computing and Artificial Intelligence, Columbia Engineering
  • Asaf Cidon, Associate Professor of Electrical Engineering, Columbia Engineering

2:20 PM: Presentation: Tamar Eilam, IBM Research (25 min)

  • Talk: The Challenge of AI Sustainability

2:45 PM: Presentation: Mosharaf Chowdhury, University of Michigan (25 min)

  • Talk: Toward Energy-Optimal AI Systems

3:10 PM: Presentation: Benjamin Lee, University of Pennsylvania (25 min)

  • Talk: Towards Sustainable Artificial Intelligence and Datacenters

3:35 PM: Break (10 min)

3:45 PM: Panel on Sustainable AI (35 min panel, 10 min Q&A)

  • Ramya Raghavendra, Meta Fundamental AI Research (FAIR)
  • Udit Gupta, Cornell Tech
  • Mosharaf Chowdhury, University of Michigan
  • Benjamin Lee, University of Pennsylvania
  • Moderator: Asaf Cidon

4:30 PM: Open Networking

5:00 PM: End


Speaker & Talk Information

Tamar Eilam – Talk & Panelist
IBM Fellow and Chief Scientist for Sustainable Computing, IBM Research

Talk Title: The Challenge of AI Sustainability

Abstract: Artificial intelligence (AI) offers immense potential to accelerate scientific discoveries crucial for combating climate change. However, this powerful tool comes with a significant environmental cost due to its substantial energy consumption and carbon emissions. This talk explores the critical challenge of harnessing AI’s capabilities while minimizing its ecological footprint. We will examine the current landscape, discuss emerging strategies for sustainable AI, and consider the delicate balance between technological advancement and environmental responsibility. By addressing this issue, we aim to pave the way for more eco-friendly AI applications in climate science and beyond.

Mosharaf Chowdhury – Talk & Panelist
Associate Professor, Computer Science and Engineering, University of Michigan, Ann Arbor

Talk Title: Toward Energy-Optimal AI Systems

Abstract: Generative AI adoption and its energy consumption are skyrocketing. For instance, training GPT-3, a precursor to ChatGPT, consumed an estimated 1.3 GWh of electricity in 2020. By 2022, Amazon trained a large language model (LLM) that consumed 11.9 GWh, enough to power over a thousand U.S. households for a year. AI inference consumes even more energy, because a model trained once serves millions of users. This surge has broad implications. First, energy-intensive AI workloads inflate carbon-offsetting costs for entities with Net Zero commitments. Second, power delivery is now the gating factor in building new AI supercomputers. Finally, it hinders deploying AI services in regions without high-capacity electricity grids, leading to inequitable access.
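As a rough sanity check on the household comparison above (an illustrative back-of-the-envelope calculation, assuming an average U.S. household uses about 10,800 kWh of electricity per year, in line with published EIA averages):

```python
# Back-of-the-envelope check: how many U.S. households could 11.9 GWh power for a year?
LLM_TRAINING_ENERGY_WH = 11.9e9          # 11.9 GWh expressed in watt-hours
HOUSEHOLD_ANNUAL_WH = 10_800 * 1_000     # assumed ~10,800 kWh/year average U.S. household

households = LLM_TRAINING_ENERGY_WH / HOUSEHOLD_ANNUAL_WH
print(f"~{households:,.0f} households")  # ≈ 1,100, consistent with "over a thousand"
```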

In this talk, I will introduce the ML Energy Initiative, our effort to understand AI’s energy consumption and build a sustainable future by curtailing AI’s runaway energy demands. I will introduce tools to precisely measure AI’s energy consumption and findings from using them on open-weights models, algorithms to find and navigate the Pareto frontier of AI’s energy consumption, and the tradeoff between performance and energy consumption during model training. I will also touch upon our solutions to make AI systems failure-resilient to reduce energy waste from idling. This talk is a call to arms to collaboratively build energy-optimal AI systems for a sustainable and equitable future.
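For a flavor of what such measurement tooling involves, here is a minimal sketch (not the ML Energy Initiative’s actual tools) that polls GPU power draw through NVIDIA’s NVML bindings and integrates it over a workload; the polling interval and the `train_step` callable are illustrative assumptions:

```python
import threading
import time

import pynvml  # NVIDIA Management Library bindings (pip install nvidia-ml-py)


def measure_gpu_energy_joules(work_fn, gpu_index=0, poll_interval_s=0.1):
    """Estimate GPU energy used by work_fn() by polling power draw and integrating over time."""
    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(gpu_index)
    samples = []  # (timestamp, watts)
    done = threading.Event()

    def sampler():
        while not done.is_set():
            watts = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0  # NVML reports milliwatts
            samples.append((time.monotonic(), watts))
            time.sleep(poll_interval_s)

    t = threading.Thread(target=sampler, daemon=True)
    t.start()
    work_fn()   # run the workload (e.g., one training step) in the foreground
    done.set()
    t.join()
    pynvml.nvmlShutdown()

    # Trapezoidal integration of power over time gives energy in joules.
    return sum(
        0.5 * (p0 + p1) * (t1 - t0)
        for (t0, p0), (t1, p1) in zip(samples, samples[1:])
    )


# Illustrative usage: `train_step` is any callable that runs work on the GPU.
# joules = measure_gpu_energy_joules(train_step)
# print(f"{joules:.0f} J  ({joules / 3.6e6:.4f} kWh)")
```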

Benjamin Lee – Talk & Panelist
Professor of Electrical and Systems Engineering and Computer and Information Science, University of Pennsylvania

Talk Title: Towards Sustainable Artificial Intelligence and Datacenters

Abstract: As the impact of artificial intelligence (AI) continues to proliferate, computer architects must assess and mitigate its environmental impact. This talk will survey strategies for reducing the carbon footprint of AI computation and datacenter infrastructure, drawing on data and experiences from industrial, hyperscale systems. First, we analyze the embodied and operational carbon implications of super-linear AI growth. Second, we re-think datacenter infrastructure and define a solution space for carbon-free computation with renewable energy, utility-scale batteries, and job scheduling. Finally, we develop strategies for datacenter demand response, incentivizing both batch and real-time workloads to modulate power usage in ways that reflect their performance costs. In summary, the talk provides a broad perspective on sustainable computing and outlines the many remaining directions for future work. 
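To make the job-scheduling idea concrete, here is a minimal sketch (an illustrative assumption, not the approach presented in the talk) of carbon-aware scheduling: given an hourly forecast of grid carbon intensity, a deferrable batch job is shifted to the contiguous window with the lowest total emissions.

```python
def lowest_carbon_window(forecast_g_per_kwh, job_hours):
    """Return (start_hour, total forecast intensity) of the cheapest contiguous window.

    forecast_g_per_kwh: hourly grid carbon-intensity forecast (gCO2/kWh), e.g. 24 values.
    job_hours: number of contiguous hours the deferrable batch job needs.
    """
    best_start, best_cost = None, float("inf")
    for start in range(len(forecast_g_per_kwh) - job_hours + 1):
        cost = sum(forecast_g_per_kwh[start:start + job_hours])
        if cost < best_cost:
            best_start, best_cost = start, cost
    return best_start, best_cost


# Illustrative usage with a made-up 24-hour forecast (midday solar lowers intensity):
forecast = [450, 460, 470, 480, 470, 430, 380, 320, 260, 210, 180, 170,
            165, 175, 200, 250, 320, 390, 440, 470, 480, 485, 480, 465]
start, _ = lowest_carbon_window(forecast, job_hours=4)
print(f"Run the 4-hour batch job starting at hour {start}")  # hour 10 in this example
```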

Ramya Raghavendra – Panelist
Director, Meta Fundamental AI Research (FAIR)

Udit Gupta – Panelist
Assistant Professor, Department of Electrical and Computer Engineering, Cornell Tech