AI impact on global energy demand. Further incentive to think like the Wright brothers.

On July 11 my inbox contained this contribution from a New York Times (subscription) service, a thoughtful piece by David Gelles entitled A.I.'s insatiable appetite for energy. He cited an interview that he and other reporters had held some weeks prior with Bill Gates on aspects of climate change. He mentioned that Gates was "surprisingly sanguine" when it came to AI's energy use, while noting that Gates had billions of dollars of vested interest in the issue. Gelles provided a nice overview of that particular AI challenge, covering the amounts of energy involved, the sources of energy that were available or contemplated, moves made by the tech firms themselves to minimize this growing energy impact, and more. Along the way he cited a Goldman Sachs estimate to the effect that data centers would account for 8% of US power demand by 2030, compared with only about 3% today.
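To put rough numbers on those percentages, here is a back-of-envelope calculation. The total-consumption figure is my own assumption for illustration (roughly 4,000 terawatt-hours of US electricity use per year, held flat through 2030), not part of the Goldman Sachs estimate:

```python
# Back-of-envelope illustration of the Goldman Sachs shares cited above.
# Assumption (mine, for illustration only): total US electricity consumption
# of roughly 4,000 TWh per year, held flat through 2030.
US_TOTAL_TWH = 4000

share_today = 0.03   # data centers' approximate share of US electricity today
share_2030 = 0.08    # projected share by 2030

today_twh = US_TOTAL_TWH * share_today
future_twh = US_TOTAL_TWH * share_2030

print(f"Data centers today:  ~{today_twh:.0f} TWh/yr")
print(f"Data centers, 2030:  ~{future_twh:.0f} TWh/yr")
print(f"Added demand:        ~{future_twh - today_twh:.0f} TWh/yr")
# ~120 TWh rising to ~320 TWh: an increment of roughly 200 TWh/yr, on the
# order of the annual electricity consumption of a mid-sized industrialized
# country.
```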

The Goldman Sachs piece saw the increasing energy demand as an underappreciated opportunity for investors in the energy sector (fossil fuels as well as renewables), noting: "While investor interest in the AI revolution theme is not new, we believe downstream investment opportunities in utilities, renewable generation and industrials whose investment and products will be needed to support this growth are underappreciated."

The Times is not alone in devoting (virtual) ink and directing eyeballs to the problem. Back in April, The Economist ran a piece entitled Generative AI has a clean-energy problem, concluding:

If shortages of renewable energy occur, it will come at a cost. No one knows yet how generative AI will make money. What people do know is that the cost of acquiring GPUs [graphics processing units] is rocketing. If the energy costs of running them soar, too, it could put the brakes on expansion. In addition, the electrification of the rest of the economy is highly cost-dependent; an AI v EV scramble for clean power would push up prices and serve neither industry well. By all means keep your fingers crossed for Mr Altman's [Sam Altman, the OpenAI entrepreneur] rose-tinted fusion dream to become reality. But don't count on it.

Other examples abound. A check on your favorite news source, whatever it may be, will almost certainly surface similar material.

A serious problem? Or not so much? I decided to go to the source.

I asked the ChatOnAI app on my cellphone: Will AI be just another drain on the world's electrical energy? Or will it help us save more energy than it uses?

ChatOnAI was on the fence: AI has the potential to both consume and save energy, depending on how it is utilized. While AI systems require a significant amount of computational power and electricity to function, they can also help optimize energy usage and reduce waste in various industries. For example, AI can be used to improve energy efficiency in manufacturing processes, optimize transportation routes to reduce fuel consumption, and enable smart grids to better manage and distribute electricity. With careful implementation and sustainable practices, AI has the potential to help us save more energy than it uses.

Hmm. I followed up: Since AI is using so much energy, should we slow down its implementation?

ChatOnAI didn’t exactly rise to the bait: The energy consumption of AI is indeed a concern, as it requires significant computational power. However, there are also efforts being made to develop more energy-efficient AI models and infrastructure. Slowing down its implementation entirely may not be necessary, but it is important to continue researching and adopting sustainable practices in AI development to minimize its environmental impact.

I tried to push it a bit further: Wouldn't that also reduce or slow the stream of AI benefits?

ChatOnAI was bland but soothing: While adopting sustainable practices in AI development may require some adjustments and considerations, it does not necessarily mean a reduction or slowdown in the stream of AI benefits. By prioritizing energy efficiency and exploring alternative energy sources, we can continue to leverage the benefits of AI while minimizing its environmental impact. Additionally, advancements in AI technology can also contribute to finding innovative solutions for energy efficiency and sustainability in various industries. It is about finding a balance between reaping the benefits of AI and being conscious of its environmental implications. [1]

(Hmm. From my perspective, my cellphone AI has all the emotional range of Leonard Nimoy’s Spock in the old Star Trek series. On the other hand, I get the sense my cellphone AI doesn’t exactly see me as the most stimulating conversationalist either. Need serious work on my prompting.)

Bottom line? The AI-energy use problem comes down to a matter of control: whether we control AI, or AI controls us, bringing us back to the Wright brothers. This issue of control matters at several levels. At the individual level, do we get so absorbed in generative AI that we spend inordinate time merely playing with it? Or do we harness it to our day jobs? At the societal level, AI uptake by some sectors – tech, education, finance, healthcare, and retail – probably represents net consumption of energy. AI application to the manufacturing, transportation, and agriculture sectors may lead to varying degrees of energy conservation. Is the power sector, with its potential for energy savings, an early adopter in this race? Or a slow plodder? And how rapidly can energy-saving AI ideas be implemented in practice across the gargantuan, high-inertia energy sector? Somehow it feels like the energy impact of AI should ultimately be a net savings (the Gates-Altman view), but in the short term AI use will add to the emissions problem. It would therefore seem to be to the global benefit to accelerate the business-as-usual pace of investment in applying AI to the energy sector.

In any event, expect these AI impacts to shape both short- and long-term emissions scenarios for future IPCC assessments.


[1] Those interested in putting some flesh on these bones might take a look at the International Energy Agency (IEA) Electricity 2024 report or this IEA 50 piece explaining Why AI and energy are the new power couple. An excerpt:

Power systems are becoming vastly more complex as demand for electricity grows and decarbonisation efforts ramp up. In the past, grids directed energy from centralised power stations. Now, power systems increasingly need to support multi-directional flows of electricity between distributed generators, the grid and users. The rising number of grid-connected devices, from electric vehicle (EV) charging stations to residential solar installations, makes flows less predictable. Meanwhile, links are deepening between the power system and the transportation, buildings and industrial sectors. The result is a vastly greater need for information exchange – and more powerful tools to plan and operate power systems as they keep evolving.

This need arrives just as the capabilities of artificial intelligence (AI) applications are rapidly progressing. As machine learning models have become more advanced, the computational power required to develop them has doubled every five to six months since 2010. AI models can now reliably provide language or image recognition, transform audio sounds into analysable data, power chatbots and automate simple tasks. AI mimics aspects of human intelligence by analysing data and inputs – generating outputs more quickly and at greater volume than a human operator could. Some AI algorithms are even able to self-programme and modify their own code.

It is therefore unsurprising that the energy sector is taking early steps to harness the power of AI to boost efficiency and accelerate innovation. The technology is uniquely placed to support the simultaneous growth of smart grids and the massive quantities of data they generate. Smart meters produce and send several thousand times more data points to utilities than their analogue predecessors. New devices for monitoring grid power flows funnel more than an order of magnitude more data to operators than the technologies they are replacing. And the global fleet of wind turbines is estimated to produce more than 400 billion data points per year.

This volume is a key reason energy firms see AI as an increasingly critical resource. A recent estimate suggests that AI already serves more than 50 different uses in the energy system, and that the market for the technology in the sector could be worth up to USD 13 billion.
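The IEA excerpt is descriptive rather than technical, but for readers who want a concrete feel for one of those uses, here is a minimal sketch (my own illustration, not drawn from the IEA material) of short-term load forecasting from hourly smart-meter-style readings. The data are synthetic and the model is a deliberately simple linear autoregression; real grid forecasting systems are far more elaborate:

```python
# Minimal illustrative sketch (not from the IEA report): predict the next
# hour's electricity demand from the previous 24 hours of smart-meter-style
# readings with a simple linear autoregression. All data are synthetic.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic hourly demand for 60 days: a daily cycle plus noise.
hours = np.arange(24 * 60)
demand = 100 + 20 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 3, hours.size)

# Lagged features: each row holds the 24 hours preceding the target hour.
LAGS = 24
X = np.array([demand[i:i + LAGS] for i in range(len(demand) - LAGS)])
y = demand[LAGS:]

# Hold out the final week for testing.
split = len(y) - 24 * 7
X_train, y_train = X[:split], y[:split]
X_test, y_test = X[split:], y[split:]


def add_intercept(features):
    """Prepend a column of ones so the fit includes an intercept term."""
    return np.column_stack([np.ones(len(features)), features])


# Ordinary least squares fit (a stand-in for the far fancier forecasting
# models utilities actually deploy).
coef, *_ = np.linalg.lstsq(add_intercept(X_train), y_train, rcond=None)
pred = add_intercept(X_test) @ coef

mae = np.mean(np.abs(pred - y_test))
print(f"Mean absolute error on the held-out week: {mae:.2f} demand units")
```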
