Simon Hunt

A crackdown on AI energy consumption would be a mistake


AI has an energy problem: it consumes an awful lot of it. Firms like ChatGPT creator OpenAI demand eye-watering levels of energy to develop their models. Training GPT-3 consumed as much electricity as around 120 American homes use in a year, according to one study, while the training of GPT-4 used many multiples more.
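The "120 homes" comparison can be sanity-checked with simple arithmetic. A minimal sketch follows, assuming two figures that are not in the article itself: a widely cited external estimate of roughly 1,287 MWh for GPT-3's training run, and an approximate average of 10.7 MWh of electricity per American household per year.

```python
# Back-of-the-envelope check of the "120 homes" comparison.
# Both inputs are assumed figures, not taken from the article:
#   ~1,287 MWh for GPT-3's training run (a widely cited external estimate)
#   ~10.7 MWh per US household per year (approximate average)

GPT3_TRAINING_MWH = 1_287      # assumed total electricity used to train GPT-3
US_HOME_MWH_PER_YEAR = 10.7    # assumed average annual US household consumption

homes_equivalent = GPT3_TRAINING_MWH / US_HOME_MWH_PER_YEAR
print(f"Roughly {homes_equivalent:.0f} US homes' annual electricity use")
# -> Roughly 120 US homes' annual electricity use
```

On those assumed figures, the claim in the article comes out at about 120 homes, which matches the study it cites.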

The International Energy Agency estimates that AI will cause the number of data centres, the vast server warehouses in which models are trained and run, to double globally at some point in the next ten years, with a corresponding doubling in the electricity needed to run them. There have been warnings that coal-fired power stations may have to stay open for longer to keep pace with demand, and there are concerns over the vast quantities of water used to keep the systems cool.

There are plenty of smaller AI businesses set up with the express purpose of cutting emissions

None of this sounds encouraging.
