So far this year, US electricity consumption is up almost 4% compared to the same period last year, a sharp contrast to decades of virtually flat use. The increase has been linked to the rapid growth of data centers, many of which are being built to handle the surge in AI use. The environmental impact of AI looks dire, especially since some of this growing demand is being met by burning more coal (as of May, coal's share of generation was up roughly 20% from the previous year).
However, it is hard to be certain without access to the kinds of information that can only be obtained by operating a data center, such as how often the hardware is active and how often it is handling AI queries. So, while researchers can test the power demands of specific AI models, it is difficult to extrapolate those figures to real-world use.
Google, on the other hand, has a wealth of data from actual use, so its publication of a fresh analysis of AI's environmental impact offers a rare glimpse inside. The company's numbers also suggest that any energy estimate is a moving target: according to its data, the energy cost of a query has dropped by a factor of 33 in the last year alone.
One of the big questions is what to include in these studies. The energy used by the processors to handle a request is the obvious part. But those processors also rely on memory, storage, cooling, and more, all of which draw power. Beyond that, energy is needed to manufacture all of that hardware and construct the buildings that house it. And a portion of the energy an AI consumes during training can be amortized across every request made to the model afterward.
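To make that amortization concrete, here is a minimal sketch with entirely hypothetical numbers (none of these figures come from Google's analysis): the one-time training cost is spread evenly across the queries the model is expected to serve, then added to the direct cost of serving one query.

```python
# Hypothetical per-query energy accounting (illustrative numbers only).

def energy_per_query_wh(inference_wh, training_mwh, lifetime_queries):
    """Total energy attributed to one query, in watt-hours.

    inference_wh: energy spent directly serving the query (Wh)
    training_mwh: one-time energy cost of training the model (MWh)
    lifetime_queries: total queries expected over the model's lifetime
    """
    # Convert the training cost to Wh and spread it evenly across queries.
    amortized_training_wh = training_mwh * 1_000_000 / lifetime_queries
    return inference_wh + amortized_training_wh

# Example: 0.3 Wh spent serving a query directly, a 500 MWh training run,
# and 10 billion queries over the model's lifetime.
total = energy_per_query_wh(0.3, 500, 10_000_000_000)
```

With these made-up figures, training adds only 0.05 Wh per query; the ratio shifts entirely depending on how heavily a model is used after training, which is exactly the kind of data only the operator has.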
Any study of energy use involves choosing which of these factors to account for. In previous studies, several have been largely ignored simply because the people conducting the study lacked access to the relevant data: they most likely have no idea how many processors a given operation requires, much less the carbon emissions involved in manufacturing them.
Google, however, has access to almost everything: the hardware needed to process a request, the energy it consumes, the cooling it requires, and more. And, as is now common practice, the company likely tracks both the Scope 2 and Scope 3 emissions generated by its operations (either directly, through things like power generation, or indirectly, through its supply chain).
For the new analysis, Google tracks the energy used by CPUs, dedicated AI accelerators, and memory, both while actively handling queries and while idling between them. It also monitors the data center's overall energy and water use and knows what its other components consume, allowing it to calculate the share dedicated to answering AI queries. Finally, it tracks the carbon emissions of its electricity supply, as well as those from manufacturing all the hardware it uses.
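The bookkeeping described above can be sketched roughly as follows. The component names, numbers, and overhead multiplier here are assumptions for illustration, not Google's actual breakdown: each component contributes both active and idle energy, and the sum is scaled up to account for cooling and other facility-level loads (analogous to a PUE multiplier).

```python
# Hypothetical per-query energy breakdown (illustrative numbers only).

def query_energy_wh(components, overhead_factor):
    """Sum active and idle energy per component, then apply facility overhead.

    components: {name: (active_wh, idle_wh)} energy attributed to one query
    overhead_factor: multiplier for cooling and other facility loads,
                     analogous to PUE (power usage effectiveness)
    """
    it_energy_wh = sum(active + idle for active, idle in components.values())
    return it_energy_wh * overhead_factor

# Made-up figures for a single query:
breakdown = {
    "cpu": (0.02, 0.01),
    "accelerator": (0.20, 0.05),
    "memory": (0.01, 0.01),
}
energy = query_energy_wh(breakdown, overhead_factor=1.1)
```

Note that the idle column matters: hardware waiting between queries still draws power, which is one reason estimates based only on active processing tend to come in low.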