
FILE PHOTO: Smoke and steam billow from the coal-fired power plant owned by Indonesia Power, next to the site of the Java 9 and 10 Coal-Fired Steam Power Plant Project in Suralaya, Banten province, Indonesia, July 11, 2020.

REUTERS/Willy Kurniawan/File Photo

Hard Numbers: It’s electric, OpenAI’s billions, AI-related legislation, Fred Trump ‘returns,’ Multiplication problems

1,300: Training a large language model is estimated to use about 1,300 megawatt-hours of electricity, roughly the amount 130 US homes consume in a year. But that estimate applies to the last generation of LLMs, such as OpenAI’s GPT-3. The electricity needed for GPT-4, the current model, and beyond could be much, much greater.
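The homes comparison follows from simple division. A minimal sketch of the arithmetic, assuming a round figure of about 10 MWh of electricity per US home per year (the figure is an assumption for illustration; official estimates put it in that neighborhood):

```python
# Back-of-envelope check of the 1,300 MWh training-cost comparison.
# Assumption (not from the article): one US home uses ~10 MWh/year.
training_mwh = 1_300       # estimated electricity to train a GPT-3-class LLM
home_mwh_per_year = 10     # assumed annual electricity use of one US home

homes_for_a_year = training_mwh / home_mwh_per_year
print(f"Equivalent to {homes_for_a_year:.0f} US homes for one year")  # → 130
```

Changing the per-home assumption scales the result proportionally, so the "130 homes" figure is a rough order-of-magnitude comparison, not a precise measurement.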
