
Powering a Greener Future: How Data Centers Can Slash Emissions

It is not only AI models that affect Earth's carbon footprint. Any workload involving large data sets can have the same effect. Here's what data centers can do to mitigate carbon emissions.

We often overlook the fact that every digital action has a carbon footprint -- chatting online, running a query, watching a video, and so on. The more power a digital task consumes, the larger its carbon footprint. Even sending a standard email yields about 4g of CO2 emissions. Attach a picture, and the emissions rise to an average of 50g. AI workloads, now increasingly common due to the generative AI boom, are adding to this issue. Researchers at Cornell University assessed the carbon footprint of training an intensive machine learning model with 176 billion parameters and found that the process can emit over 50.5 tons of CO2. That's 25% more than the lifetime emissions of the average American car.
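For a sense of scale, here is a quick back-of-envelope conversion of the figures above into a single comparison (a minimal Python sketch using only the estimates cited here; it is an illustration, not a precise carbon accounting):

# Back-of-envelope conversions using the estimates cited above
# (tons are treated as metric tons for simplicity).
EMAIL_G = 4              # grams of CO2 per standard email
EMAIL_WITH_PHOTO_G = 50  # grams of CO2 per email with a picture attached
TRAINING_RUN_T = 50.5    # tons of CO2 to train a 176-billion-parameter model

# How many plain emails produce the same emissions as one training run?
emails_per_training_run = TRAINING_RUN_T * 1_000_000 / EMAIL_G
print(f"{emails_per_training_run:,.0f} emails")  # ~12.6 million emails

# If the training run is 25% more than a car's lifetime emissions,
# the implied lifetime figure for the average American car is:
car_lifetime_t = TRAINING_RUN_T / 1.25
print(f"~{car_lifetime_t:.1f} tons per car")     # ~40.4 tons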


This staggering environmental toll is not unique to AI models. Any workload involving large data sets or complex queries requires immense computing power. A team from Cambridge University determined that a workload analyzing large data sets can emit the carbon equivalent of 346 flights between Paris and London.

Pollution Solutions

As data and analytics have inarguably become the fuel of business success, the rise of data centers is outpacing our ability to mitigate the resultant carbon emissions. If data industry leaders don’t seek new methods of carbon reduction and embrace more energy-efficient processing, the costs will quickly become insurmountable.

Thankfully, companies are increasingly setting specific carbon emission targets, either because of their own environmental, social, and governance (ESG) goals or due to legal requirements or regulations. In fact, these targets may even be good for business. A recent McKinsey study found that companies whose products carried ESG-related claims saw 8% more cumulative growth than companies that did not associate their products with ESG. A recent poll of American consumers found that, despite inflation, 66% are willing to pay more for sustainable products and services.

Many organizations already aim to reach net-zero emissions by 2050, but most are focusing on alternative and renewable energy. That is a good start, but it is insufficient because it misses the core of the problem: the overconsumption of energy in the data center due to misused infrastructure.

Far more action is needed in the short term.

One answer lies at the chip level: processors purpose-built for data-intensive workloads. This approach can complement other forms of green technology, requires far less infrastructure reconfiguration than alternative energy grids or cooling systems, and is ripe for expansion into other areas of processing that have large carbon footprints. Consider that most data centers still run analytics and database workloads on CPUs, general-purpose processors that were never designed to handle such large amounts of data. The result is excessive power consumption and data center footprint, not to mention relatively slow processing speeds.
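To make that concrete, the short sketch below runs a typical analytics aggregation on a GPU using the open source RAPIDS cuDF library. This is purely an illustration of moving an analytics workload from general-purpose CPUs to specialized parallel hardware; it is not a reference to any particular vendor's accelerator, and the file and column names are hypothetical.

import cudf  # GPU-accelerated DataFrame library from the RAPIDS project

# Load a columnar file directly into GPU memory (file name is hypothetical).
df = cudf.read_parquet("clickstream.parquet")

# A routine aggregation: total revenue per region, largest first.
# On the GPU, the group-by and sum run across thousands of parallel threads
# instead of occupying general-purpose CPU cores.
top_regions = (
    df.groupby("region")["revenue"]
    .sum()
    .sort_values(ascending=False)
    .head(10)
)
print(top_regions)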

The fact that Moore's law is slowing necessitates a new solution, especially for analytics, which is the largest workload in the data center and likely accounts for most of the energy consumption behind its carbon emissions. The integration of specialized hardware will not only be more environmentally friendly; it will also improve overall performance and lower data center costs for enterprises everywhere.
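The underlying arithmetic is straightforward: a workload's emissions scale with the energy it draws, which is power multiplied by runtime multiplied by the carbon intensity of the grid feeding the data center. The sketch below applies that formula to a hypothetical comparison between a CPU cluster and a purpose-built accelerator running the same analytics job; every number in it is an illustrative assumption, not a measurement.

# Simple model: CO2 (kg) = power (kW) x runtime (h) x grid intensity (kg CO2/kWh).
# All figures below are illustrative assumptions, not measurements.
GRID_INTENSITY_KG_PER_KWH = 0.4  # assumed average grid carbon intensity

def job_co2_kg(power_watts: float, runtime_hours: float) -> float:
    """Estimate kilograms of CO2 emitted by a job at a given power draw and runtime."""
    return (power_watts / 1000) * runtime_hours * GRID_INTENSITY_KG_PER_KWH

# Hypothetical: the same analytics job on a CPU cluster vs. a specialized accelerator.
cpu_run = job_co2_kg(power_watts=2000, runtime_hours=10)  # 2 kW drawn for 10 hours
accel_run = job_co2_kg(power_watts=600, runtime_hours=1)  # 0.6 kW drawn for 1 hour

print(f"CPU cluster:  {cpu_run:.2f} kg CO2")          # 8.00 kg
print(f"Accelerator:  {accel_run:.2f} kg CO2")        # 0.24 kg
print(f"Reduction:    {1 - accel_run / cpu_run:.0%}") # 97% in this hypothetical case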

There’s No Time to Waste

For all the good that big data and AI can do for countless industries, data centers’ carbon emissions have the potential to cause enormous damage to the environment.

With the impacts of climate change mounting, enterprises, cloud providers, and data center operators alike simply cannot afford to put off aligning their processing capabilities with their most energy-intensive workloads. Doing so will both lower their cost structures and reduce their carbon emissions.

After all, what good is innovative technology if it doesn’t actually change the world?

About the Author

Jonathan Friedmann is the co-founder and CEO of Speedata. Previously, Friedmann was CEO and co-founder of Centipede, which developed IP for general purpose processors. He also served as COO and VP R&D at Provigent, a cellular infrastructure semiconductor company acquired by Broadcom. You can contact the author via LinkedIn.

