Big Tech Conundrum: It Can’t Have A.I. and Reduce “Emissions” – Whatever Will It Do?

by
Steve MacDonald

Artificial Intelligence is all the rage, but its pursuit—and that of profits—has created a problem. You don’t birth digital consciousness out of nothing. Even the most basic AI requires a lot of electricity, and the more powerful the system is, the more users call upon it and the more significant the “emissions” needed to keep it operating.

Researchers have been raising general alarms about AI’s hefty energy requirements over the past few months. But a peer-reviewed analysis published this week in Joule is one of the first to quantify the demand that is quickly materializing. If current trends in AI capacity and adoption continue, NVIDIA is projected to ship 1.5 million AI server units per year by 2027. Running at full capacity, those 1.5 million servers would consume at least 85.4 terawatt-hours of electricity annually, more than many small countries use in a year, according to the new assessment.
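As a sanity check, dividing the projected annual consumption by the projected fleet size implies the continuous draw per server; a quick back-of-envelope sketch (the figures are from the article, the derivation is mine):

```python
# Back-of-envelope check of the Joule projections cited above.
SERVERS = 1_500_000      # AI server units NVIDIA is projected to ship per year by 2027
ANNUAL_TWH = 85.4        # projected annual consumption at full capacity
HOURS_PER_YEAR = 8_760

# Implied continuous draw per server, in kilowatts (TWh -> kWh is a factor of 1e9)
per_server_kw = ANNUAL_TWH * 1e9 / (SERVERS * HOURS_PER_YEAR)
print(f"~{per_server_kw:.1f} kW per server")
```

That works out to roughly 6.5 kW of continuous draw per server, which is in the right ballpark for a multi-GPU AI machine, so the paper's headline figures are at least internally consistent.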

Data scientist Alex de Vries notes that cooling is not included in his analysis, so we should expect these numbers to climb much higher in real-world terms, with one caveat: Big Tech, including Google, Meta, and Microsoft, isn’t telling anyone exactly how much more electricity it is using or how much it estimates it will need to keep its AI projects growing. It is all guesswork. For its part, Google is pretending to be concerned and suggesting this is all in its line of sight.

 

Google Chief Sustainability Officer Kate Brandt told The Associated Press, “Reaching this net zero goal by 2030, this is an extremely ambitious goal.”

“We know this is not going to be easy and that our approach will need to continue to evolve,” Brandt added, “and it will require us to navigate a lot of uncertainty, including this uncertainty around the future of AI’s environmental impacts.” …

Some experts say the rapidly expanding data centers needed to power AI threaten the entire transition to clean electricity, an important part of addressing climate change. That’s because a new data center can delay the closure of a power plant that burns fossil fuel or prompt a new one to be built. Data centers are not only energy-intensive, they require high voltage transmission lines and need significant amounts of water to stay cool. They are also noisy.

They often are built where electricity is cheapest, not where renewables, such as wind and solar, are a key source of energy.

We dropped some ink on this late last year with U.S. Energy Information Administration estimates.

And then there are the actual users, the millions of oddball requests for enlightenment that the machine must endure every day. Running ChatGPT for a typical day requires about 1 GWh, roughly the daily consumption of 33,000 homes. There is no reason to think that load will shrink; quite the opposite, in fact. …
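The homes comparison is straightforward arithmetic; a quick sketch, assuming an average household draw of about 30 kWh per day (my assumption, not a figure from the article):

```python
# ChatGPT's reported daily draw vs. household consumption.
chatgpt_kwh_per_day = 1_000_000   # 1 GWh expressed in kWh
home_kwh_per_day = 30             # assumed average U.S. home usage, kWh/day
homes = chatgpt_kwh_per_day / home_kwh_per_day
print(f"~{homes:,.0f} homes")     # on the order of 33,000 homes
```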

Dojo [Tesla’s AI supercomputer] went into action a few weeks ago, using 10,000 Nvidia H100 GPUs. Each of those can consume 700 watts, so that pile o’ GPUs in Dojo’s tummy will consume, fully cranked, 168 MWh per day, or enough power for about 5,500 homes for a day.
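The Dojo numbers check out the same way (the GPU count and per-unit wattage are from the article; the 30 kWh/day household figure is my assumption):

```python
# Dojo's worst-case daily draw from 10,000 H100s at full power.
gpus = 10_000
watts_each = 700                              # peak draw per H100, per the article
mwh_per_day = gpus * watts_each * 24 / 1e6    # watt-hours -> MWh
homes = mwh_per_day * 1_000 / 30              # at an assumed 30 kWh/day per home
print(f"{mwh_per_day:.0f} MWh/day, ~{homes:,.0f} homes")
```

At 30 kWh per home the math actually lands closer to 5,600 homes, so the article's 5,500 is a slightly conservative round-down.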

Meanwhile, back at Google.


Global data center and AI electricity demand could double by 2026, according to the International Energy Agency.

Might I suggest Google Offsets, a way (scam) to launder money as emissions-forgiveness payments? Instead of paying existing fraudsters not to cut down trees they were going to cut down anyway, you can pay yourself to emit less and keep emitting. It’s creative accounting.

Other major tech company sustainability plans are also challenged by the proliferation of data centers. They caused Microsoft’s emissions to grow 29% above its 2020 baseline.

What is intriguing and amusing is the contradiction of priorities: the Marxist Climate Agenda vs. Capitalist Pig AI profits. The inability of not-so-renewable energy to meet the demand may require denying basic comforts to the prospective customer base. And then there is the fanatical devotion to emissions reductions (we are not all in that together) while pursuing emissions-hogging technology.

As a guess, Google will have to build or sponsor many more windmills and solar farms if it wants to sustain the lie that it is powering its facilities with “clean energy.” It is too bad that inflation has made doing so nearly as fiscally demanding as AI is power-hungry. We also lack the practical surface area to develop it, and the raw materials to produce, maintain, and replace it, in any time frame, advertised or otherwise. But I have a solution.

How about directing AI and every alleged massive brain trust toward real, clean solutions that make electricity abundantly available and affordable? A path to more affordable nuclear and hydro while developing cleaner fossil fuel solutions is good for AI and the rest of us, at least until it achieves consciousness and discovers that humans limiting emissions of a trace gas (CO2) is not just a moronic, restrictive partisan political goal, but a threat to its existence.

Author

  • Steve MacDonald

    Steve is a long-time New Hampshire resident, blogger, and member of the Board of Directors of The 603 Alliance. He is the owner of Grok Media LLC and the Managing Editor of GraniteGrok.com, a former board member of the Republican Liberty Caucus of New Hampshire, and a past contributor to the Franklin Center for Public Policy.
