Water on the brain

Popular large language models (LLMs) like OpenAI's ChatGPT and Google's Bard are energy-intensive, requiring massive server farms to supply the computing power needed to train the programs.

Cooling those same data centers also makes the AI chatbots incredibly thirsty.

New research suggests that training GPT-3 alone consumed 185,000 gallons (700,000 liters) of water. The same study estimates that an average user's conversational exchange with ChatGPT is roughly equivalent to pouring a large bottle of fresh water out on the ground.
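For readers who want to sanity-check those numbers, here is a minimal back-of-the-envelope sketch in Python. The 3.785 liters-per-gallon conversion factor is standard; the 0.5-liter bottle size and the ten-million-conversations-per-day figure are purely illustrative assumptions, not values taken from the study.

```python
# Back-of-the-envelope check of the water figures above.
# The bottle size and daily-conversation count are illustrative
# assumptions, not numbers reported by the study.

LITERS_PER_US_GALLON = 3.785

def liters_to_gallons(liters: float) -> float:
    """Convert liters to US gallons."""
    return liters / LITERS_PER_US_GALLON

# Training: the study's figure for GPT-3.
training_liters = 700_000
print(f"GPT-3 training: ~{liters_to_gallons(training_liters):,.0f} US gallons")
# -> ~184,941 gallons, which the article rounds to 185,000

# Inference: scale an assumed per-conversation "bottle" across many users.
bottle_liters = 0.5               # assumed bottle size (illustrative)
daily_conversations = 10_000_000  # assumed usage level (illustrative)
daily_liters = bottle_liters * daily_conversations
print(f"Hypothetical daily inference use: {daily_liters:,.0f} liters "
      f"(~{liters_to_gallons(daily_liters):,.0f} gallons)")
```

Under these assumptions, a single day of chatbot conversations would consume several times the water used to train the model in the first place, which is why the per-exchange figure matters at scale.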

Given the chatbot’s unprecedented popularity, researchers fear all those spilled bottles could take a troubling toll on water supplies, especially amid historic droughts and looming environmental uncertainty in the US.
