AI Consumption

A recent study found that training a large neural network with 175 billion parameters consumed 1,287 MWh of electricity. This resulted in carbon emissions of 502 metric tons, equivalent to driving 112 gasoline-powered cars for a year.
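A quick back-of-envelope check makes these figures concrete. The script below is a sketch, not from the study itself; the ~4.6 metric tons of CO2 per car per year is an assumed average (in line with commonly cited EPA-style estimates), so the car count it yields is approximate.

```python
# Sanity-check the article's training-run figures.
TRAINING_MWH = 1287        # reported training energy
TRAINING_CO2_TONS = 502    # reported emissions

# Assumption: an average gasoline car emits ~4.6 metric tons CO2/year.
CAR_TONS_PER_YEAR = 4.6

# Implied grid carbon intensity, in kg CO2 per kWh.
implied_intensity = (TRAINING_CO2_TONS * 1000) / (TRAINING_MWH * 1000)

# How many car-years of driving the emissions correspond to.
car_equivalents = TRAINING_CO2_TONS / CAR_TONS_PER_YEAR

print(f"Implied grid intensity: {implied_intensity:.2f} kg CO2/kWh")
print(f"Car-year equivalents:   {car_equivalents:.0f}")
```

The implied intensity of roughly 0.39 kg CO2/kWh is consistent with an average fossil-heavy grid mix, and the car equivalence lands near the article's figure of 112, with the small gap explained by the per-car assumption.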

In the United States, the data centers where AI models are trained are already major consumers of electricity, accounting for approximately 2% of the nation's total usage.

These centers demand significantly more energy than standard office spaces, drawing 10 to 50 times more power per unit of floor area.

Another study highlights the water consumption of AI models like ChatGPT, likening it to "drinking" a 500ml bottle of water for every 20-50 interactions handled, with its successor, GPT-4, showing an even higher demand.
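The bottle-per-interactions figure can be restated per interaction. This is a simple unit conversion from the numbers above, assuming nothing beyond the stated 20-50 range:

```python
# Convert "500ml per 20-50 interactions" into a per-interaction range.
BOTTLE_ML = 500
INTERACTIONS_LOW, INTERACTIONS_HIGH = 20, 50  # range from the study

per_interaction_high = BOTTLE_ML / INTERACTIONS_LOW   # fewer interactions -> more water each
per_interaction_low = BOTTLE_ML / INTERACTIONS_HIGH   # more interactions -> less water each

print(f"~{per_interaction_low:.0f}-{per_interaction_high:.0f} ml of water per interaction")
```

That works out to roughly 10-25 ml per interaction, a useful intuition for how individual queries add up at scale.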
