
Sponsored Article

Addressing AI’s sustainability conundrum

Sponsored: We may never fully realize AI’s potential to solve profound global challenges if we don’t first address its environmental impact today.

[Image: A glowing multicolored leaf with a computer chip resting on top of it. Courtesy of WEKA.]

This article is sponsored by WEKA.

Artificial intelligence (AI) is transforming our world by dramatically increasing the pace of modern research, discovery and scientific breakthroughs while fueling an unprecedented wave of innovation.

In January, the World Economic Forum heralded AI as a key pillar of "the global growth story of the 21st century" with the promise of not only contributing to the global GDP but also helping to combat global climate change.   

There’s just one problem — AI is contributing to exponential annual increases in global power consumption and carbon emissions.

While there has been robust societal discourse around the ethics of AI, it typically focuses on possible negative societal consequences such as privacy issues, unintentional biases or the potential for bad actors to use it to create chaos. Rarely, if ever, does it touch on AI’s environmental impacts. 

The inconvenient truth is that AI, one of our most powerful tools in the fight against climate change, is also one of its worst offenders. Without intervention, AI will only accelerate the climate crisis; we must commit to quickly taming its insatiable energy demands and carbon footprint. 

But it’s not too late. Curtailing AI’s environmental impact is possible by rethinking how to manage the massive amounts of data and energy required to fuel it with more climate-friendly solutions we can implement today.

AI’s massive appetite for energy

AI and its siblings, machine learning (ML) and high-performance computing (HPC), are exceptionally energy-hungry and performance-intensive. To reach their full productivity and potential, these digital transformation engines require a near-endless supply of data and a significant amount of power to run. 

What’s worse, traditional data architectures only compound the issue, causing latency and bottlenecks in the data pipeline because they weren’t designed to deliver data smoothly and continuously. According to recent research, the graphics processing units (GPUs) that power AI and ML workloads are typically underused up to 70 percent of the time, sitting idle while waiting for data to process. As a result, training an AI model can take days, even weeks, to complete.

From a sustainability perspective, this is a huge problem since underused GPUs consume enormous amounts of energy and spew needless carbon while they idle. While industry estimates vary, roughly 3 percent of global energy consumption today can be attributed to the world’s data centers — double what it was just 10 years ago. The explosion of generative AI, ML and HPC in modern enterprises and research organizations is causing that to accelerate faster than anyone could have anticipated. 
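To make the scale of that waste concrete, consider a rough back-of-envelope sketch. Every figure below is an illustrative assumption (cluster size, per-GPU power draw), not a measurement from this article; the idle fraction echoes the "up to 70 percent" utilization gap cited above, and treating idle draw as equal to the average draw is a deliberate simplification.

```python
# Back-of-envelope estimate of energy spent by underused GPUs.
# All figures are illustrative assumptions, not measured values.

GPU_COUNT = 1_000        # hypothetical training-cluster size
POWER_DRAW_KW = 0.4      # assumed average draw per GPU, in kilowatts
IDLE_FRACTION = 0.70     # GPUs idle "up to 70 percent of the time"
HOURS_PER_YEAR = 24 * 365

wasted_kwh = GPU_COUNT * POWER_DRAW_KW * IDLE_FRACTION * HOURS_PER_YEAR
print(f"Energy spent while idle: {wasted_kwh:,.0f} kWh per year")
```

Even under these simplified assumptions, a single thousand-GPU cluster burns millions of kilowatt-hours a year while waiting for data — energy that produces no useful work.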

In October, independent research firm Gartner Inc. predicted: "By 2025, without sustainable AI practices, AI will consume more energy than the human workforce, significantly offsetting carbon-zero gains."

Curbing AI’s energy consumption and carbon footprint is a challenge we must collectively commit to solving with urgency. As AI and HPC adoption accelerates at breakneck speed, we can no longer ignore their environmental impact.

Rethinking the modern data stack

A primary culprit exacerbating AI’s inefficiencies is traditional data infrastructure and data management, which cannot keep up with AI workloads simply because it was never built to feed next-generation technologies like GPUs with a steady, efficient stream of data at extreme speeds.

In the era of cloud and AI, enterprise data stacks need a complete rethink. To harness next-generation workloads such as AI, ML and HPC, they need to be capable of running seamlessly anywhere data is created, lives, or needs to go — whether on-premises, in the cloud, at the edge or in hybrid and multicloud environments. This requires that they be software-defined and architected for the hybrid cloud.

Rethinking the data stack requires revisiting and reevaluating the data lake. While data lakes proved useful in the past decade, providing a central location to access data more efficiently without creating multiple copies, GPU appetites for data often exceed what the average data lake can deliver — particularly for workloads such as generative AI, with its large-scale data processing requirements.

It’s time to start rearchitecting the stack to support datasets that are orders of magnitude larger than what today’s data lakes can deliver. While we’re at it, we must abandon data storage silos in favor of more dynamic systems that can pipeline data in a continuous, steady stream to meet an AI engine’s insatiable data requirements. This isn’t just another bigger, better data lake — processes must be implemented to better manage the flood of data servicing the ever-hungry GPUs so they’re never left idle again, increasing their efficiency and sustainability.

Charting a path forward in the cloud

Another solution is to integrate the cloud into modern enterprise data architectures. Incorporating a hybrid cloud approach makes infinite sense as our world becomes increasingly distributed. Migrating even some applications and workloads to the cloud can have an immediate and outsized effect on an organization’s energy use and carbon footprint in the short term, especially as more public cloud providers are building their hyperscale data centers to be ultra-efficient and powered partly or entirely by renewable energy sources.

According to a recent study by McKinsey & Company: "With thoughtful migration to and optimized usage of the cloud, companies could reduce the carbon emissions from their data centers by more than 55 percent — about 40 megatons of CO2e worldwide, the equivalent of the total carbon emissions from Switzerland."

Now that’s a tangible impact.

Taking the first step for a positive impact

Reversing climate change will require global action on many fronts. Abatement of the energy use and greenhouse gas emissions associated with AI and enterprise technology stacks is one way that CEOs, CIOs, CDOs and other business and research leaders can reduce their companies’ carbon footprints to support their organization’s — and the world’s — sustainability goals. But this is only the first step.

It’s time we balance AI’s clear potential with raising more awareness for its environmental impact and unite the scientific, business, political and technology communities in finding solutions to harness it more efficiently and sustainably.
