The Environmental Footprint of Generative AI
Interview with Vijay Gadepally of MIT Lincoln Laboratory:
Vijay Gadepally leads a number of projects at the Lincoln Laboratory Supercomputing Center (LLSC), where he works to make computing systems, and the AI workloads that run on them, more efficient. Here, he discusses the rapid growth of generative AI, its often-overlooked carbon footprint, and ways to reduce its environmental impact.
Q: What's the latest in the realm of generative AI?
A: Generative AI uses machine learning (ML) to learn patterns from data and create new content such as images and text. The LLSC operates some of the largest academic computing platforms in the world, and we have seen a surge in projects that need high-performance computing for generative AI. The technology is also transforming industries: ChatGPT, for example, is making its way into classrooms and workplaces faster than guardrails can keep up.
Q: What is the LLSC doing to make computing more sustainable?
A: We are always looking for ways to make computing more efficient; doing so helps our data center make the most of its resources and lets our scientific colleagues push their fields forward as effectively as possible.
One approach is reducing the hardware's power draw, much like dimming the lights or switching off the television when you leave a room. In one experiment, we cut the power consumption of a group of graphics processing units by 20 to 30 percent by enforcing a power cap, with only a minor impact on performance.
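As a rough illustration of what a power cap looks like in practice, the sketch below computes reduced per-GPU limits and the corresponding `nvidia-smi` commands that would apply them (the helper names and the 75 percent cap are assumptions for illustration, not the LLSC's actual tooling; `nvidia-smi -i <idx> -pl <watts>` is NVIDIA's standard way to set a per-GPU power limit and typically requires administrator privileges):

```python
# Hypothetical helper: compute capped power limits for a set of GPUs and the
# nvidia-smi invocations that would apply them. A cap of roughly 70-80 percent
# of the default limit is the kind of intervention described above.

def capped_limit(default_watts: int, cap_fraction: float) -> int:
    """Return the power cap in watts as a fraction of the default limit."""
    return int(default_watts * cap_fraction)

def power_cap_commands(gpu_defaults: dict[int, int],
                       cap_fraction: float = 0.75) -> list[str]:
    """Build the nvidia-smi commands for each GPU.

    gpu_defaults maps GPU index -> default power limit in watts.
    """
    return [
        f"nvidia-smi -i {idx} -pl {capped_limit(watts, cap_fraction)}"
        for idx, watts in sorted(gpu_defaults.items())
    ]

# Two 300 W GPUs capped at 75 percent -> 225 W each, a 25 percent reduction.
commands = power_cap_commands({0: 300, 1: 300}, cap_fraction=0.75)
```

The commands are generated rather than executed so the capping policy can be reviewed before it touches hardware.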
Another idea is to change how we use the systems. At home, some of us might choose renewable energy sources or smart scheduling; we apply similar tactics at the LLSC, such as training AI models when temperatures are cooler or when demand on the local grid is low.
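A minimal sketch of that scheduling idea, assuming an hourly carbon-intensity forecast is available (this is illustrative logic, not the LLSC's actual scheduler): pick the start hour that minimizes the average grid carbon intensity over the job's duration.

```python
# Demand-aware scheduling sketch: given a per-hour forecast of grid carbon
# intensity (gCO2/kWh), choose the start hour whose window has the lowest mean.

def best_start_hour(forecast: list[float], job_hours: int) -> int:
    """Return the start index of the cleanest window of length `job_hours`."""
    windows = [
        sum(forecast[h:h + job_hours]) / job_hours
        for h in range(len(forecast) - job_hours + 1)
    ]
    return min(range(len(windows)), key=windows.__getitem__)

# Example: intensity dips overnight (indices are hours 0-23).
forecast = [300, 280, 250, 220, 210, 230, 300, 400, 450, 470, 460, 440,
            430, 420, 410, 400, 390, 420, 460, 480, 450, 400, 350, 320]
start = best_start_hour(forecast, job_hours=4)  # -> 2 (hours 2-5 are cleanest)
```

Real grid data of this kind is published by several regional operators and aggregators; the forecast values above are made up.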
We have also noticed that a lot of energy is wasted on computations that never pay off. We developed new techniques to monitor workloads as they run and terminate the unpromising ones early. Surprisingly, in many cases we found that a majority of the computations could be discarded without affecting the final result.
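One simple way to realize the "stop unpromising runs early" idea is plateau detection on a run's validation loss. The sketch below is illustrative only (the thresholds are assumptions, and the interview does not describe the LLSC's actual criteria): a run is flagged for termination when its recent losses stop improving meaningfully.

```python
# Flag a training run for early termination when its validation loss has not
# improved by at least `min_delta` over the last `patience` evaluations.

def should_terminate(loss_history: list[float], patience: int = 3,
                     min_delta: float = 0.01) -> bool:
    """True when the best recent loss is not meaningfully better than the
    best loss seen before the recent window."""
    if len(loss_history) <= patience:
        return False  # too early to judge
    best_before = min(loss_history[:-patience])
    best_recent = min(loss_history[-patience:])
    return best_recent > best_before - min_delta

should_terminate([1.0, 0.8, 0.6, 0.5, 0.4])        # -> False: still improving
should_terminate([1.0, 0.9, 0.895, 0.894, 0.893])  # -> True: plateaued
```

A scheduler could poll each run with a check like this and reclaim the freed hardware for other jobs.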
Q: Can you share an example of a project that reduces the energy demands of generative AI?
A: We recently built an energy-aware computer vision tool. Computer vision applies AI to images, for example determining whether a picture shows a cat or a dog, labeling objects, or locating items of interest.
The tool included real-time carbon-emissions telemetry, which reported the carbon intensity of our local grid while the model was running. Based on that signal, the system would switch to a more energy-efficient version of the model when carbon intensity was high, and to a higher-precision version when it was low.
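The switching logic can be sketched in a few lines. The model identifiers and the 350 gCO2/kWh threshold below are made up for illustration; the interview specifies only the high/low switching behavior, not the actual models or cutoffs used.

```python
# Carbon-aware model selection sketch: serve an efficient variant when the
# grid is dirty, and a higher-precision variant when it is clean.

EFFICIENT_MODEL = "vision-small-int8"  # hypothetical model identifiers
PRECISE_MODEL = "vision-large-fp16"

def select_model(carbon_intensity: float, threshold: float = 350.0) -> str:
    """Choose a model variant from the current grid intensity (gCO2/kWh)."""
    return EFFICIENT_MODEL if carbon_intensity >= threshold else PRECISE_MODEL

select_model(500.0)  # dirty grid  -> "vision-small-int8"
select_model(120.0)  # clean grid  -> "vision-large-fp16"
```

In a deployed system the intensity value would come from a live telemetry feed and the selection would be re-evaluated periodically rather than once per request.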
The results were striking: we measured a carbon reduction of almost 80 percent over a one- to two-day period. We have since extended the approach to other generative AI tasks, such as text summarization, and in some cases the method actually improved performance.
Q: As a consumer, how can I help lessen the environmental impact of generative AI?
A: As users, we can ask our AI providers for greater transparency. Google Flights, for example, shows the estimated carbon emissions of each flight; we should expect similar environmental statistics from our generative AI tools. Knowing the numbers is the first step toward changing behavior.
Education matters, too. Many of us are familiar with vehicle emissions, and comparing generative AI emissions to everyday activities helps make them concrete. For instance, one image-generation task is roughly on par with driving four miles in a gas-powered car, and the energy needed to charge an electric vehicle is about the same as that used to produce 1,500 text summaries.
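To make the second comparison concrete, a back-of-envelope calculation follows. The battery capacity is an assumption (the interview gives only the 1,500-summary equivalence), so the per-summary figure is illustrative:

```python
# Back-of-envelope estimate of energy per text summary, assuming a typical
# EV battery of roughly 75 kWh (an assumption, not a figure from the source).

EV_BATTERY_KWH = 75.0        # assumed battery capacity
SUMMARIES_PER_CHARGE = 1500  # equivalence quoted in the interview

wh_per_summary = EV_BATTERY_KWH * 1000 / SUMMARIES_PER_CHARGE
# -> 50.0 Wh per summary under these assumptions
```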
Q: What lies ahead?
A: Reducing the emissions of generative AI is a worldwide effort, with groups working on it all over the map. We are only scratching the surface at the LLSC, and collaboration will be key to making progress. In the long run, partnerships among data centers, AI developers, and power grids will be essential to perform "energy audits" that uncover new ways to improve computing efficiency. If you would like to learn more or get involved, reach out to Vijay Gadepally.
- Vijay Gadepally of MIT Lincoln Laboratory focuses on enhancing the efficiency of computing systems and the AI workloads that run on them.
- Generative AI learns from data to produce new content and is experiencing a surge in academic computing projects at the LLSC.
- The eco-friendly strategy at the LLSC involves optimizing power consumption and smart habits to save resources.
- An experiment at the LLSC reduced power consumption in graphics processing units by 20-30 percent with minor performance impacts.
- The LLSC employs tactics like training AI models during cooler or low-demand times to conserve energy.
- They developed a computer vision tool that uses real-time carbon-emissions telemetry to switch to a more efficient model version when grid carbon intensity is high.
- Users can demand transparency from AI suppliers for eco-stats and compare generative AI emissions to everyday activities for better understanding.
- Looking ahead, collaboration among data centers, AI developers, and power grids will enable energy audits that improve computing efficiency; Vijay Gadepally welcomes conversations about this effort.