Artificial Intelligence, or AI, has become the popular kid in both tech and cultural circles. It is writing papers for students, and it could write blogs for websites, white papers, technical reports, just about anything. But should it?
A lot of the same people who think AI is the bee's knees, the fox's socks, or the cat's pajamas (and aren't worried about it turning into a conscious silicon life force like Skynet) also lean toward the crowd misnamed as "environmentally conscious." They talk about anthropogenic warming and reducing carbon footprints. But those two things cannot coexist: AI and a shrinking carbon footprint.
Artificial intelligence requires a lot of power, and it produces a lot of heat.
According to the U.S. Energy Information Administration, a typical home consumes about 11 MWh per year. A single training run for a model like ChatGPT consumes roughly as much electricity as a thousand homes use in a year. And training isn't a one-time cost: ChatGPT requires continuous updating, because it is only of value if it routinely and often scrapes the web for the latest information.
And then there are the actual users: the millions of oddball requests for enlightenment that the machine must field every day. Running ChatGPT for a typical day requires about 1 GWh, roughly the daily consumption of 33,000 homes. There is no reason to think that load will shrink; quite the opposite, in fact.
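The homes-equivalent arithmetic above can be sketched in a few lines. The 1 GWh/day and 11 MWh/year figures are the ballpark estimates quoted in the text, not precise measurements:

```python
# Ballpark figures quoted in the article (estimates, not measurements)
HOME_ANNUAL_MWH = 11.0       # typical US home, per the EIA
CHATGPT_DAILY_GWH = 1.0      # rough daily ChatGPT consumption

home_daily_mwh = HOME_ANNUAL_MWH / 365        # ~0.03 MWh per home per day
chatgpt_daily_mwh = CHATGPT_DAILY_GWH * 1000  # 1 GWh = 1,000 MWh

homes_equivalent = chatgpt_daily_mwh / home_daily_mwh
print(f"ChatGPT's daily load is equivalent to about {homes_equivalent:,.0f} homes")
```

Running this gives roughly 33,000 homes, matching the figure in the text.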
Forget all the bad things individuals or governments could do with AI (or the good), which are as numerous as the human imagination and its weaknesses. If you are even remotely inclined to believe anything about decarbonization and Net Zero, then AI is a problem. It isn’t going to get less wasteful, and the current energy transition plan can’t even handle a fraction of our current needs.
Consider what Tesla is up to: The pioneering auto firm is building its own AI supercomputer called Dojo. Sounds like a pet, but wow, what an appetite. Dojo went into action a few weeks ago, running on 10,000 Nvidia H100 GPUs. Each of those can draw about 700 watts, so fully cranked, that pile o' GPUs in Dojo's belly will consume about 168 MWh per day, enough to power roughly 5,500 homes for a day.
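The Dojo back-of-envelope works out like this, using the GPU count and per-unit draw from the text (peak draw, assuming all units run flat out all day):

```python
# Dojo's worst-case daily appetite, from the article's figures
NUM_GPUS = 10_000        # Nvidia H100 accelerators
WATTS_PER_GPU = 700      # approximate peak draw per H100
HOURS_PER_DAY = 24

daily_mwh = NUM_GPUS * WATTS_PER_GPU * HOURS_PER_DAY / 1e6  # Wh -> MWh
homes_per_day = daily_mwh / (11.0 / 365)  # 11 MWh/year per home

print(f"{daily_mwh:.0f} MWh/day, about {homes_per_day:,.0f} homes' worth")
```

That yields 168 MWh per day and roughly the 5,500 homes cited in the text.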
As those with the means (and likely every university research department wanting a private or specialized system of its own) build more AI systems, and the competition "heats up" to have the better, faster thing, power demand will climb steeply, even as the reliable generation needed to supply it declines.
What’s the plan?
You would need roughly 3.125 million solar panels (or 333 utility-scale wind turbines) to supply the 1 GWh ChatGPT is using every day, and that is with today's limited adoption. Where do we put them, and then what about powering everything else?
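Those counts can be sanity-checked. The per-unit capacities below are my inference from the article's totals (about 320 W per panel and 3 MW per turbine of nameplate capacity), and the comparison is against 1 GW of nameplate capacity, ignoring capacity factor:

```python
# Sanity check on the renewables math. Per-unit capacities are inferred
# from the article's totals, not stated there; nameplate only, ignoring
# capacity factor (panels and turbines rarely run at full output).
TARGET_W = 1e9       # 1 GW of capacity, in watts
PANEL_W = 320        # assumed nameplate rating per solar panel
TURBINE_W = 3e6      # assumed 3 MW per utility-scale wind turbine

panels = TARGET_W / PANEL_W      # -> 3,125,000 panels
turbines = TARGET_W / TURBINE_W  # -> ~333 turbines

print(f"{panels:,.0f} panels or {turbines:.0f} turbines")
```

Under those assumptions the article's 3.125 million panels and 333 turbines both check out.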
The obvious solution is, once again, nuclear, but that’s not even on the table.
So, what’s the plan?
HT | WUWT (Great piece with a lot more about AI)