China Carbon Credit Platform

Will the rapid rise in AI electricity consumption lead to energy shortages?

Source: cenews.com.cn
Release time: 6 months ago

As artificial intelligence develops, its rapidly rising electricity consumption is drawing global attention. Will there be a "power shortage" in the future, and will energy consumption become a "stumbling block" to the development of AI?

Consumption keeps climbing: has AI become a "power hog"?

With the rapid development of artificial intelligence technology, demand for chips has risen sharply, which in turn has driven a surge in power demand. According to public data, global data center power demand has grown from about 10 gigawatts a decade ago to roughly 100 gigawatts today.

OpenAI's ChatGPT chatbot consumes more than 500,000 kilowatt-hours of electricity per day to handle about 200 million user requests, more than 17,000 times the daily electricity consumption of an average American household, according to the Global Times, citing The New Yorker.
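These figures can be sanity-checked with simple arithmetic (an illustrative back-of-envelope calculation, not from the article):

```python
# Back-of-envelope check of the reported ChatGPT figures.
daily_kwh = 500_000            # reported daily consumption, kWh
daily_requests = 200_000_000   # reported daily user requests

wh_per_request = daily_kwh * 1_000 / daily_requests
print(f"Energy per request: {wh_per_request:.2f} Wh")  # ~2.5 Wh

# The "17,000 households" comparison implies an average daily usage of:
kwh_per_home = daily_kwh / 17_000
print(f"Implied usage per US household: {kwh_per_home:.1f} kWh/day")  # ~29 kWh
```

At roughly 29 kWh per household per day, the implied figure is consistent with typical US residential consumption, so the comparison holds up.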

Recently, Tesla CEO Elon Musk publicly stated that within the next two years the artificial intelligence industry will shift from a shortage of chips ("lack of silicon") to a shortage of electricity. Earlier this year, OpenAI CEO Sam Altman likewise acknowledged that AI will need far more energy than previously expected, which will force greater investment in technologies that can supply it.

According to the Uptime Institute, AI's share of global data center electricity consumption will rise from 2% to 10% by 2025.

A research report from Guojin Securities noted that as models iterate, parameter counts expand, and daily active users grow, demand for the associated computing power will increase exponentially.

Li Xiuquan, deputy director of the Artificial Intelligence Center at the Institute of Scientific and Technical Information of China, said in an interview with China News Finance that the size and number of artificial intelligence models have grown rapidly in recent years, bringing a correspondingly rapid rise in energy demand. Although a "power shortage" is unlikely in the short term, the surge in energy demand that will come with the era of large-scale intelligence cannot be ignored.

A potential brake on AI development, energy issues are being taken seriously

There are many indications that AI's power consumption may exceed expectations, and this bears directly on whether the AI industry can develop smoothly. Li Xiuquan believes that the wave of the intelligent scientific and technological revolution is unstoppable, and that energy is a key issue that must be addressed in parallel.

The issue is growing in importance. According to Agence France-Presse, AI computing has been criticized for its high energy consumption and low energy efficiency compared with conventional computing, and energy consumption was also raised at the recent NVIDIA GTC conference.

The Boston Consulting Group has released a report estimating that by the end of 2030, the electricity consumption of data centers in the United States alone will be triple its 2022 level. The increase is driven mainly by two demand factors: training AI models and serving increasingly frequent AI queries.
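Tripling over that window implies a steep but steady growth rate; a quick calculation (illustrative, not from the report):

```python
# Implied average annual growth rate if US data center electricity use
# triples between 2022 and 2030.
multiple = 3.0
years = 2030 - 2022
cagr = multiple ** (1 / years) - 1
print(f"Implied growth: {cagr:.1%} per year")  # ~14.7% per year
```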

The "Research Report on Green Computing Technology Innovation (2024)" released by the China Academy of Information and Communications Technology on March 15 pointed out that the average annual growth rate of China's total computing power in the past five years is nearly 30%. With the rapid growth of the overall scale of China's computing industry, the overall energy consumption and carbon emissions of computing infrastructure represented by data centers have become more and more prominent, and policies have begun to pay attention to the green energy consumption link.

How to respond? Improving energy efficiency may be the key

The development of AI is inseparable from computing power. As an AI "arms dealer", NVIDIA is already thinking about energy consumption: it recently released a new generation of AI chips that are said to consume less energy than the previous generation.

For example, Nvidia CEO Jensen Huang said that training a ChatGPT-scale model for three months with 8,000 of its first-generation AI chips would draw 15 megawatts of power, while the new generation of chips could perform the same task in the same time with only 2,000 chips drawing 4 megawatts.
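Taken at face value, these figures can be unpacked with simple arithmetic (illustrative; the three-month run is assumed to be 90 days of sustained draw):

```python
# Comparing the two training setups Huang described (figures as reported).
hours = 90 * 24  # three months of continuous training, assumed 90 days

old_mw, old_chips = 15, 8_000
new_mw, new_chips = 4, 2_000

print(f"Per-chip draw, old gen: {old_mw * 1_000 / old_chips:.2f} kW")  # ~1.9 kW
print(f"Per-chip draw, new gen: {new_mw * 1_000 / new_chips:.2f} kW")  # 2.0 kW
print(f"Energy, old gen: {old_mw * hours:,} MWh")   # 32,400 MWh
print(f"Energy, new gen: {new_mw * hours:,} MWh")   # 8,640 MWh
print(f"Reduction: {1 - new_mw / old_mw:.0%}")      # ~73%
```

Note that per-chip power is roughly unchanged; the saving comes from needing a quarter as many chips for the same job.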

In addition, according to Japanese media reports, Nvidia plans to purchase high-bandwidth memory (HBM) chips, a key component of artificial intelligence processors, from Samsung; the Samsung chips are currently undergoing testing.

"HBM is a technological marvel that can improve energy efficiency and help the world become more sustainable as power-hungry AI chips become more common," Huang said. ”

It is worth mentioning that many data centers in China are also designed with energy savings in mind. For example, undersea data centers use the ocean as a natural cooling source: servers are placed in sealed containers on the seabed and cooled by the flow of seawater.

Li Xiuquan said there is no need to panic over energy demand, but the problem must be addressed in step with AI's development. In the future, as chip computing power rapidly improves, the energy consumed per unit of computing power will continue to fall. At the same time, liquid cooling and optical interconnect technologies will further improve the energy efficiency of AI computing clusters, and large models will be compressed through quantization and retrained into specialized models for specific problems, so that many tasks will no longer require large, energy-intensive models.
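To illustrate the quantization idea Li mentions, here is a minimal sketch using PyTorch's dynamic quantization API; the toy model and layer sizes are hypothetical, and this is a generic example rather than the specific method referenced in the article:

```python
import torch
import torch.nn as nn

# A toy stand-in for a large model: a small stack of linear layers.
model = nn.Sequential(
    nn.Linear(1024, 1024), nn.ReLU(),
    nn.Linear(1024, 1024), nn.ReLU(),
    nn.Linear(1024, 10),
)

# Dynamic quantization converts the Linear weights from 32-bit floats
# to 8-bit integers, shrinking the model and cutting the compute (and
# hence energy) cost of each inference on supported CPUs.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

x = torch.randn(1, 1024)
print(quantized(x).shape)  # torch.Size([1, 10])
```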
