Microsoft's ChatGPT-powered Copilot presents a 'fundamental mismatch' between technology and environment, according to research




Microsoft Server farm - Energy consumption

While Microsoft plows forward with its vision of generative AI platforms, it does so using technology that consumes electricity at levels approaching Bitcoin mining's unsustainable draw, according to Dutch researcher Alex de Vries.

In a recent New Yorker piece, Elizabeth Kolbert writes about de Vries's indexing of AI energy consumption, which grew out of his earlier Bitcoin Energy Consumption Index posted to Digiconomist.

de Vries charted Bitcoin mining at forty-five billion kilowatt-hours of electricity per year, dwarfing the consumption of his home nation, the Netherlands. In addition to its draw on the electrical grid, Bitcoin mining also produces eighty-one million tons of CO2 emissions, roughly equivalent to those of the North African country of Morocco.

With Bitcoin as his energy-sink barometer, de Vries noticed alarming similarities in AI energy consumption and began tracking its potential trajectory.

In Joule, a journal dedicated to research on sustainable energy, de Vries writes about the similar energy-intensive trajectories AI and Bitcoin mining are on. He estimates that if Google fully incorporates Gemini into, or replaces, its current search services and platforms, that AI platform could consume twenty-nine billion kilowatt-hours per year, more energy than is used by countries such as Croatia, Guatemala, and Kenya.

As it pertains to Microsoft and ChatGPT, ramping up use of OpenAI's generative platform creates a similar energy sink: it is estimated to process two hundred million requests per day, consuming roughly half a million kilowatt-hours of electricity.

Kolbert offers a comparison between the average household's daily electricity use and ChatGPT's, with the average household at about twenty-nine kilowatt-hours a day versus ChatGPT's five hundred thousand.
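Those figures can be sanity-checked with a quick back-of-envelope calculation, using only the numbers reported in the article (the per-request result and household equivalence are derived here, not stated by Kolbert):

```python
# Back-of-envelope check of the article's ChatGPT energy figures.
requests_per_day = 200_000_000    # estimated daily ChatGPT requests
chatgpt_kwh_per_day = 500_000     # estimated daily consumption, kWh
household_kwh_per_day = 29        # average household's daily use, kWh

# Implied energy per request, in watt-hours
wh_per_request = chatgpt_kwh_per_day * 1000 / requests_per_day

# How many households' daily usage ChatGPT's daily draw equals
household_equivalent = chatgpt_kwh_per_day / household_kwh_per_day

print(f"{wh_per_request:.1f} Wh per request")
print(f"about {household_equivalent:,.0f} households' daily usage")
```

That works out to roughly 2.5 watt-hours per request, and a daily draw equal to about seventeen thousand average households.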

The Verge also carried a piece that attempts to quantify AI energy usage from known quantities for large language models like GPT-3:

Training a large language model like GPT-3, for example, is estimated to use just under 1,300 megawatt hours (MWh) of electricity; about as much power as consumed annually by 130 US homes. To put that in context, streaming an hour of Netflix requires around 0.8 kWh (0.0008 MWh) of electricity. That means you’d have to watch 1,625,000 hours to consume the same amount of power it takes to train GPT-3.

James Vincent – The Verge
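The Verge's Netflix comparison is straightforward to verify from the two figures in the quote:

```python
# Sanity-check The Verge's Netflix comparison.
gpt3_training_mwh = 1300        # estimated training energy for GPT-3, MWh
netflix_mwh_per_hour = 0.0008   # ~0.8 kWh per streamed hour, in MWh

hours = gpt3_training_mwh / netflix_mwh_per_hour
print(f"{hours:,.0f} hours of Netflix")
```

Dividing 1,300 MWh by 0.0008 MWh per hour does indeed give the 1,625,000 streaming hours cited.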

de Vries's research has led him to a conclusion echoed even by OpenAI CEO Sam Altman: he believes "There's a fundamental mismatch between this technology and environmental sustainability."

The Verge also speaks with de Vries about his methods for calculating the energy consumption of LLMs such as ChatGPT.

To estimate the sector's global energy usage, de Vries starts from the supply chain. As he explains in a commentary published in Joule last year, Nvidia accounts for roughly 95 percent of sales in the AI market, and the company releases energy specs for its hardware as well as sales projections.

By combining this data, de Vries calculates that by 2027 the AI sector could consume between 85 and 134 terawatt-hours each year. That's about the same as the annual energy demand of de Vries's home country, the Netherlands.

“You’re talking about AI electricity consumption potentially being half a percent of global electricity consumption by 2027,” de Vries tells The Verge. “I think that’s a pretty significant number.”
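That "half a percent" figure can be roughly reproduced from de Vries's 85–134 TWh range, assuming global electricity consumption of about 25,000 TWh per year (a ballpark assumption for illustration; the article does not give a global figure):

```python
# Rough check of the "half a percent" claim.
# Assumes global electricity consumption of ~25,000 TWh/year (ballpark,
# not from the article).
ai_twh_low, ai_twh_high = 85, 134
global_twh = 25_000

low_pct = ai_twh_low / global_twh * 100
high_pct = ai_twh_high / global_twh * 100
print(f"{low_pct:.2f}% to {high_pct:.2f}% of global consumption")
```

Under that assumption the range lands at roughly 0.3 to 0.5 percent, consistent with de Vries's remark.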

Altman was a bit more diplomatic with his concerns when speaking with the press at Davos, stating that "I think we still don't appreciate the energy needs of this technology," and that "We need fusion or we need, like, radically cheaper solar plus storage, or something, at massive scale, like a scale that no one is really planning for."

Whatever language Altman and de Vries use, it's clear that companies investing in massive server farms that expel extensive amounts of CO2 when cooled will, at this nascent stage of AI adoption, soon come to a crossroads where the questions of growth and sustainability meet.

Kolbert notes that US datacenters alone account for four percent of electricity consumed globally, with that number expected to tip six percent by 2026.

Beyond being disheartened by seeing recent history repeat itself between Bitcoin mining and AI energy consumption, de Vries also questions whether AI should be adopted at scale given the finite resources it's gobbling up.

“Because considering all the limitations AI has, it’s probably not going to be the right solution in a lot of places, and we’re going to be wasting a lot of time and resources figuring that out the hard way.”

Perhaps as a society we'll eventually settle on the most optimized use cases for AI and put our collective resources behind them. The question remains how much energy will be used between now and then, with cautionary examples such as Bitcoin mining still running almost unchecked.
