China Carbon Credit Platform

Is AI a "maniac eater"? Will it face a "lack of electricity"? China's solution is worth paying attention to

Source: cenews.com.cn
Release time: 5 months ago

"The next shortage will be electricity. ”

Not long ago, Tesla CEO Elon Musk issued just such a warning about the development of artificial intelligence (AI).

He said the computing constraints on AI are predictable: "I predicted the chip shortage more than a year ago, and the next shortage will be electricity. I don't think there will be enough power next year to run all the chips."

OpenAI CEO Sam Altman has likewise said that artificial intelligence will consume more electricity than people expect, and that future development will require a breakthrough in energy.

Behind the rapid development of artificial intelligence, the problem of energy consumption has become increasingly prominent and a focus of industry attention. Some have even suggested that "the end of AI is computing power, and the end of computing power is electricity."

So how much power does artificial intelligence actually consume? Will AI development face an "electricity shortage"? And what solutions has China come up with?

01

How much power does AI consume?

The competition among large AI models currently resembles a "computing power arms race". Driven by the scaling law, companies keep enlarging model parameters and training data in pursuit of "miracles", and demand for computing power has grown exponentially.

Computing power, simply put, is the capacity to process data or information.

Computing power itself is abstract, but its carrier is tangible: the computing infrastructure represented by data centers and intelligent computing centers. And behind computing power stands electricity.

Picture a data center or computing center: thousands of servers and chips lined up in rows, running around the clock.

The NVIDIA H100, currently the mainstream chip for training large AI models, has a maximum power draw of 700 watts, which means one hour of operation at full load consumes 0.7 kWh. OpenAI has reportedly needed tens of thousands of H100 chips to train GPT-5.
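That arithmetic can be sketched in a few lines. The 10,000-chip cluster below is a hypothetical round number for illustration, not a figure from the article:

```python
# Energy arithmetic for the H100 figure above.
H100_MAX_WATTS = 700  # maximum power draw cited in the article

def kwh(watts: float, hours: float) -> float:
    """Energy in kilowatt-hours for a given power draw and duration."""
    return watts * hours / 1000

# One chip at full load for one hour: the article's 0.7 kWh.
kwh_per_chip_hour = kwh(H100_MAX_WATTS, 1)

# Hypothetical: 10,000 chips running around the clock for a day.
cluster_kwh_per_day = kwh(H100_MAX_WATTS * 10_000, 24)

print(kwh_per_chip_hour)    # 0.7
print(cluster_kwh_per_day)  # 168000.0
```

Even this toy cluster, at maximum draw, would burn through 168,000 kWh per day.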

The image of a round-the-clock data center gives a feel for AI's power consumption; the data makes it concrete.

Take the training of GPT-3, for example: the model has 175 billion parameters and is estimated to have used about 1,287 megawatt-hours (1,287,000 kWh) of electricity during training.

How large is that? It is roughly the annual electricity consumption of about 121 American households. Some experts offer another analogy: roughly the energy needed for 3,000 Tesla electric vehicles to drive 200,000 miles each.
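The household analogy can be sanity-checked, assuming roughly 10,600 kWh of annual consumption per US household (a commonly cited average; that figure is our assumption, not the article's):

```python
# Sanity check of the "121 households" analogy.
gpt3_training_kwh = 1_287_000    # ~1,287 MWh, as stated above
household_kwh_per_year = 10_600  # assumed US average annual household use

households = gpt3_training_kwh / household_kwh_per_year
print(round(households))  # 121
```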

GPT-3 was released in 2020, so many will ask about the energy consumption of newer models. The trouble is that in recent years many AI companies have stopped publishing training details, such as what hardware was used and for how long, which makes energy estimates difficult.

Still, GPT-3's energy consumption offers a reference point. GPT-3 has 175 billion parameters, while GPT-4 is reported to contain 1.8 trillion; as parameter counts multiply, energy consumption also rises sharply.

The figures above cover only the training phase. Once training is complete, AI enters another power-hungry stage: inference, the process by which people use the model to produce results.

Training is a one-off event, but use is long-term, and power consumption keeps accumulating as applications spread and user numbers grow.

The International Energy Agency (IEA) noted in a January report that ChatGPT uses an average of 2.9 watt-hours of electricity to answer a single request, the equivalent of lighting a 60-watt bulb for just under three minutes.
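The bulb equivalence follows directly from the units: watt-hours divided by watts gives hours of operation. A quick check:

```python
# 2.9 Wh per request vs. a 60 W incandescent bulb.
wh_per_request = 2.9
bulb_watts = 60

minutes_lit = wh_per_request / bulb_watts * 60  # hours -> minutes
print(round(minutes_lit, 1))  # 2.9, i.e. just under three minutes
```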

US media have also reported that ChatGPT handles about 200 million requests per day and consumes more than 500,000 kWh of electricity, equivalent to the daily consumption of 17,000 American households.

A quick calculation: GPT-3's training consumed about 1.29 million kWh, while ChatGPT burns 500,000 kWh per day answering requests. The entire training run of GPT-3 could not keep ChatGPT running for even three days.
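The "not even three days" claim checks out against the article's own figures:

```python
# One-off training energy vs. ongoing daily inference energy.
gpt3_training_kwh = 1_287_000  # training, from the article
chatgpt_daily_kwh = 500_000    # daily inference, from the article

days_covered = gpt3_training_kwh / chatgpt_daily_kwh
print(round(days_covered, 2))  # 2.57 days
```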

Over time, the power consumption is considerable.

02

Is AI "running out of power"?

All kinds of data seem to show that AI is a "power-hungry monster". The next question, then, is whether its appetite can still be satisfied.

Let the data do the talking.

According to public data, full-scope net electricity generation in the United States in 2023 was about 4,178.171 billion kWh. At ChatGPT's 500,000 kWh per day, its annual consumption over 365 days comes to roughly 182.5 million kWh, only about 0.0044% of US national generation.
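The 0.0044% share can be reproduced from the two figures just quoted:

```python
# ChatGPT's estimated annual consumption as a share of US generation.
us_net_generation_kwh = 4_178_171_000_000  # 2023, ~4,178.171 billion kWh
chatgpt_annual_kwh = 500_000 * 365         # 182.5 million kWh

share = chatgpt_annual_kwh / us_net_generation_kwh
print(f"{share:.4%}")  # 0.0044%
```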

AI is of course more than just ChatGPT, but its figures offer a useful window. They show that although electricity demand keeps growing alongside AI computing power, it currently accounts for only a small share of overall consumption.

If so, are the tech tycoons "selling anxiety" with their frequent warnings about AI energy consumption?

The Boston Consulting Group projects in a report that by the end of 2030, electricity consumption by US data centers alone will be roughly triple the 2022 level, with the increase driven mainly by AI model training and more frequent AI queries.

"The 'lack of electricity' in AI development is not a problem that has emerged now, but a problem that may be faced in the future. ”

Liu Chong, director of the Institute of International Security at the China Institute of Contemporary International Relations, made such a judgment.

In his view, AI's current development path is to keep increasing model parameters and stacking chips. If it continues along this route, it will consume ever more electricity, so the energy problem may become increasingly prominent, especially in countries where power supply is already tight. For now, however, energy has not been a limiting factor for AI development.

Liu Xingliang, a member of the Information and Communication Economy Expert Committee of the Ministry of Industry and Information Technology, similarly said that the tech leaders' predictions of an AI "electricity shortage" may be intended to draw attention to the problem. They show that AI does consume a great deal of electricity and that electricity costs are indeed high, but energy has not yet reached the point of affecting AI's development.

Liu Xingliang agrees there may be trouble ahead: if parameter scales keep growing unchecked, users keep multiplying, and energy-efficiency technology makes no progress, power consumption will soon become a problem. At the same time, he offered a more optimistic outlook, believing that technology can further reduce energy consumption.

In short, an AI "electricity shortage" is a problem that may arrive in the future, and a series of solutions is already on the way before it does.

On the demand side, optimizing AI models, improving chip and algorithm efficiency, and advancing data center hardware and software are all expected to reduce AI's energy consumption.

Looking back may help us think about the future.

According to a study published in the journal Science, between 2010 and 2018 the computing capacity of global data centers grew by 550% and their storage capacity by 2,400%, yet their power consumption grew by only 6%.

On the supply side, the power question involves energy, infrastructure, policy, and technology, and cannot be settled with a simple "yes" or "no", "sufficient" or "insufficient". A more diversified energy mix, innovation in power technology, and state-level coordination will all help address it.

The International Energy Agency, for example, is optimistic about the role of clean energy, noting in a report that low-carbon sources, including renewables and nuclear, are expected to account for 46% of global electricity generation by the end of 2026 and can meet all additional demand growth, including from artificial intelligence, whose electricity demand is expected to double.

All in all, although AI will not face a power shortage in the short term, the discussion does sound a wake-up call for a world vigorously developing AI: as the scale and number of large models grow rapidly, a potential future surge in energy demand cannot be ignored.

03

A Chinese approach to AI's power consumption

At present, China's total computing power ranks second in the world.

Authoritative Chinese institutions estimate that data centers nationwide currently account for 2% of society-wide electricity consumption, and that electricity makes up 50% of data centers' total operating costs.

The China Academy of Information and Communications Technology expects the total energy consumption of China's data centers to reach about 380 billion kWh by 2030.

AI power consumption is also a problem that China needs to deal with.

China holds an advantage in electric power: it has built the world's largest power supply system and clean power generation system, and its installed hydropower, wind power, photovoltaics, biomass power, and nuclear power under construction have ranked first in the world for many years.

It is worth mentioning that beyond improvements to specific AI and power technologies, a macro-level "problem-solving approach" in China also deserves attention.

Consider a recent promotion conference in Qinghai, whose theme was the development of the green computing industry.

We know that Qinghai has abundant green power resources, including photovoltaics, wind power, hydropower, etc.

By the end of 2023, Qinghai's installed clean energy capacity exceeded 51 million kilowatts, 92.8% of its total, and clean energy supplied more than 84.5% of its electricity generation.

Yet this abundant green power faces the problem of being "neither used up nor sent out": because green power is intermittent, much of it cannot be transmitted out of the province and must be consumed locally, while local demand cannot absorb it all.

AI computing consumes enormous amounts of electricity, and Qinghai's green power goes unused. If data centers move to Qinghai, "green electricity" becomes "green computing" and green "watts" become green "bits". Wouldn't that benefit both sides?

At the China Telecom (National) Digital Qinghai Green Big Data Center, complementary clean energy sources such as wind, solar, and hydro have already enabled a 100% clean power supply.

Building data centers in Qinghai, moreover, not only addresses the power problem but also greatly reduces cooling energy use. Qinghai's climate is dry and cool; data centers there can rely on natural cooling 314 days a year, with cooling electricity consumption about 40% below the national average.

With such favorable conditions, Qinghai can of course promote itself with confidence.

Qinghai also has a forerunner in this regard: Guizhou, which enjoys similar power and climate advantages.

Home to one of the world's largest concentrations of hyperscale data centers, Guizhou is pushing ahead on the artificial intelligence track.

A national artificial intelligence training ground was established in Guiyang; Guizhou and Shenzhen signed a strategic cooperation agreement on the coordinated development of computing power; Huawei Cloud partnered with the Gui'an New Area to build a world-leading intelligent computing center; the Pangu and iFLYTEK Xinghuo foundation models were launched in Guizhou; and Guizhou and Zhejiang jointly created "Hang Xiaoyi", a virtual digital human for cultural tourism promotion......

Last year, Guizhou deployed 80,000 intelligent computing chips, and its total computing power grew 28.8-fold. Guizhou's goal is to provide low-cost, high-quality, easy-to-use computing services for model training in the east.

Behind the vigorous growth of the green computing industry in Guizhou and Qinghai lies a more ambitious undertaking: the "Eastern Data and Western Computing" project.

The basic logic of the "Eastern Data and Western Computing" project is:

Constrained by land, water, electricity, operations and maintenance, and other factors, data centers in the eastern region face relatively high operating costs.

The vast western region is relatively rich in renewable energy, clean energy, and land, and its favorable climate can also reduce the energy consumption and carbon emissions of data center operations.

Guiding data centers to cluster in resource-rich western areas can therefore not only promote the low-carbon, green, and sustainable development of data centers in the west, but also meet the computing power needs of the east.

As early as May 2021, the relevant authorities explicitly proposed implementing the "Eastern Data and Western Computing" project.

In February 2022, the National Development and Reform Commission and other departments jointly issued a notice approving the construction of national computing hub nodes in eight regions, including Inner Mongolia, Guizhou, Gansu, and Ningxia, and planning ten national data center clusters. "Eastern Data and Western Computing" was officially under way.

In December 2023, the Implementation Opinions on Deeply Implementing the "Eastern Data and Western Computing" Project and Accelerating the Construction of a National Integrated Computing Power Network was released, proposing "computing-electricity synergy" for the first time.

What is "Computing Power Synergy"?

On one hand, data centers cannot run efficiently without large amounts of electricity; on the other, the smooth, efficient operation of the power system increasingly depends on computing power. Coordinating the layout of computing and electric power helps absorb wind and solar green power and move data centers toward zero-carbon operation.

Of course, objective limits such as network latency mean that not every computing scenario suits "Eastern Data and Western Computing". Low-latency businesses such as autonomous driving and securities trading, for example, require nearby computing.

Latency-tolerant workloads such as AI model training, however, sit squarely in the project's "comfort zone", and "Eastern Data, Western Training" has become a typical application scenario of "Eastern Data and Western Computing".

In recent years, the energy consumption problem raised by artificial intelligence has prompted wide discussion. China began deploying "Eastern Data and Western Computing" as early as 2021, a notably forward-looking move that has helped China gain an advantage in this round of computing power competition.

At present, the "Eastern Data and Western Computing" projects in artificial intelligence are speeding up.

Beijing Capital Online signed a framework agreement on artificial intelligence industry cooperation with Qingyang, Gansu Province; Baidu Intelligent Cloud held a strategic cooperation signing with the Chengdu High-tech Zone to build a large-model industry......

Since the beginning of this year, new instruments have also emerged: "computing power coupons" and "computing power cards".

Beijing, Guizhou, Qingyang in Gansu, and other places have issued "computing power coupons" to enterprises, universities, and research institutions to lower the cost of using computing power and support the development of the artificial intelligence industry.

Some operators have also launched "computing power cards" for ordinary consumers. A manager of the relevant business said that computing services will one day become as commonplace a product as mobile data and broadband.

It is easy to imagine that artificial intelligence will be deeply woven into our lives. Its energy consumption deserves attention, but not excessive worry. It is more constructive to think about how technological progress can solve energy problems than to fret over how energy problems might limit technological development. (CCTV News Client)
