Microchips. Craig Dennis/Pexels

Smaller, faster, greener

Examining the environmental impact of computation and the future of green computing

When you think about your carbon footprint, what comes to mind? Driving and flying, probably. Perhaps home energy consumption or those daily Amazon deliveries. But what about watching Netflix or having Zoom meetings? Ever thought about the carbon footprint of the silicon chips inside your phone, smartwatch or the countless other devices inside your home?

Every aspect of modern computing, from the smallest chip to the largest data center, comes with a carbon price tag. For the better part of a century, the tech industry and the field of computation as a whole have focused on building smaller, faster, more powerful devices — but few have considered their overall environmental impact.

Researchers at the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) are trying to change that.

“Over the next decade, the demand, number and types of devices is only going to grow,” said Udit Gupta, a Ph.D. candidate in Computer Science at SEAS. “We want to know what impact that will have on the environment and how we, as a field, should be thinking about how we adopt more sustainable practices.”

Gupta, along with Gu-Yeon Wei, the Robert and Suzanne Case Professor of Electrical Engineering and Computer Science, and David Brooks, the Haley Family Professor of Computer Science, will present a paper on the environmental footprint of computing at the IEEE International Symposium on High-Performance Computer Architecture on March 3, 2021.

The SEAS research is part of a collaboration with Facebook, where Gupta is an intern, and Arizona State University.

The team not only explored every aspect of computing, from chip architecture to data center design, but also mapped the entire lifetime of a device, from manufacturing to recycling, to identify the stages where the most emissions occur.

They found that most emissions related to modern mobile and data-center equipment come from hardware manufacturing and infrastructure.

“A lot of the focus has been on how we reduce the amount of energy used by computers, but we found that it’s also really important to think about the emissions from just building these processors,” said Brooks. “If manufacturing is really important to emissions, can we design better processors? Can we reduce the complexity of our devices so that manufacturing emissions are lower?”
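To see why manufacturing can dominate, consider a back-of-the-envelope lifetime carbon estimate. The sketch below uses entirely hypothetical numbers — illustrative placeholders, not figures from the paper — to compare a device’s embodied carbon from manufacturing against the operational carbon of powering it over its life:

```python
# Back-of-the-envelope lifetime carbon estimate for a mobile device.
# Every number here is a hypothetical placeholder, not a figure from the paper.

EMBODIED_KG_CO2 = 60.0        # emissions from manufacturing the device
AVG_POWER_WATTS = 3.0         # average power draw while in use
HOURS_PER_DAY = 4.0           # hours of active use per day
LIFETIME_YEARS = 3.0          # years the device stays in service
GRID_KG_CO2_PER_KWH = 0.4     # carbon intensity of the electricity grid

energy_kwh = (AVG_POWER_WATTS / 1000) * HOURS_PER_DAY * 365 * LIFETIME_YEARS
operational_kg_co2 = energy_kwh * GRID_KG_CO2_PER_KWH
total_kg_co2 = EMBODIED_KG_CO2 + operational_kg_co2

print(f"Operational emissions: {operational_kg_co2:.1f} kg CO2e")
print(f"Embodied emissions:    {EMBODIED_KG_CO2:.1f} kg CO2e")
print(f"Manufacturing share:   {EMBODIED_KG_CO2 / total_kg_co2:.0%}")
```

With placeholder numbers in this range, manufacturing accounts for the large majority of lifetime emissions, which is why the team argues that simpler, less carbon-intensive hardware matters as much as lower power draw.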

Take chip design, for example.

Today’s chips are optimized for size, performance and battery life. The typical chip is about 100 square millimeters of silicon and houses billions of transistors. But at any given time, only a portion of that silicon is being used. In fact, if all the transistors were fired up at the same time, the device would exhaust its battery life and overheat. This so-called dark silicon improves a device’s performance and battery life, but it’s wildly inefficient when you consider the carbon footprint of manufacturing the chip.

“You have to ask yourself, what is the carbon impact of that added performance,” said Wei. “Dark silicon offers a boost in energy efficiency but what’s the cost in terms of manufacturing? Is there a way to design a smaller and smarter chip that uses all of the silicon available? That is a really intricate, interesting, and exciting problem.”
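As a toy illustration of that tradeoff — using hypothetical figures, not numbers from the paper — the embodied carbon effectively spent per actively used square millimeter of silicon climbs as more of the die sits dark at any moment:

```python
# Toy view of the dark-silicon tradeoff. All numbers are hypothetical placeholders.

DIE_AREA_MM2 = 100.0              # die size in the ballpark the article cites
EMBODIED_KG_CO2_PER_MM2 = 0.3     # assumed manufacturing carbon per mm^2 of silicon

for active_fraction in (1.0, 0.5, 0.25):
    active_mm2 = DIE_AREA_MM2 * active_fraction
    kg_per_active_mm2 = (DIE_AREA_MM2 * EMBODIED_KG_CO2_PER_MM2) / active_mm2
    print(f"{active_fraction:.0%} of the die active -> "
          f"{kg_per_active_mm2:.2f} kg CO2e of embodied carbon per active mm^2")
```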

The same issues face data centers. Today, data centers, some of which span many millions of square feet, account for 1 percent of global energy consumption, a number that is expected to grow.

As cloud computing continues to grow, decisions about where to run applications — on a device or in a data center — are being made based on performance and battery life, not carbon footprint.

“We need to be asking what’s greener, running applications on the device or in a data center,” said Gupta. “These decisions must optimize for global carbon emissions by taking into account application characteristics, efficiency of each hardware device, and varying power grids over the day.”
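A minimal sketch of that kind of decision might look like the following. The energy figures, grid intensities, and function names are assumptions for illustration, not part of the researchers’ work:

```python
# Pick the lower-carbon place to run a job, given a time-varying grid.
# All energy and carbon-intensity values are hypothetical placeholders.

# Hourly grid carbon intensity (kg CO2e per kWh) near the data center,
# assumed cleaner at midday when more solar generation is on the grid.
DATACENTER_GRID = [0.50] * 6 + [0.30] * 12 + [0.50] * 6
DEVICE_GRID = [0.45] * 24   # local grid for the device, assumed flat all day

def job_emissions(energy_kwh: float, grid_kg_per_kwh: float) -> float:
    """Operational emissions of one run: energy used times grid carbon intensity."""
    return energy_kwh * grid_kg_per_kwh

def greener_place_to_run(hour: int,
                         device_energy_kwh: float = 0.0020,
                         datacenter_energy_kwh: float = 0.0025) -> str:
    """Compare running the job on the device vs. in the data center at this hour.

    The data-center figure is assumed higher because it folds in networking
    and infrastructure overhead; the device figure covers local compute only.
    """
    on_device = job_emissions(device_energy_kwh, DEVICE_GRID[hour])
    in_datacenter = job_emissions(datacenter_energy_kwh, DATACENTER_GRID[hour])
    return "on the device" if on_device < in_datacenter else "in the data center"

for hour in (3, 13):
    print(f"{hour:02d}:00 -> greener to run {greener_place_to_run(hour)}")
```

With these placeholder values, the greener choice flips over the course of the day as the grid mix changes, which is the point Gupta makes about accounting for varying power grids.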

The researchers are also challenging industry to look at the chemicals used in manufacturing. 

Adding environmental impact to the parameters of computational design requires a massive cultural shift at every level of the field, from undergraduate CS students to CEOs.

To that end, Brooks has partnered with Embedded EthiCS, a Harvard program that embeds philosophers directly into computer science courses to teach students how to think through the ethical and social implications of their work. Brooks is including an Embedded EthiCS module on computational sustainability in “COMPSCI 146: Computer Architecture” this spring.

The researchers also hope to partner with faculty from Environmental Science and Engineering at SEAS and the Harvard University Center for the Environment to explore how to enact change at the policy level.

“The goal of this paper is to raise awareness of the carbon footprint associated with computing and to challenge the field to add carbon footprint to the list of metrics we consider when designing new processes, new computing systems, new hardware, and new ways to use devices. We need this to be a primary objective in the development of computing overall,” said Wei.

The paper was co-authored by Sylvia Lee, Jordan Tse, Hsien-Hsin S. Lee and Carole-Jean Wu from Facebook and Young Geun Kim from Arizona State University.