During my time in No. 10 as one of Dominic Cummings’s ‘weirdos and misfits’, my team would often speak with frontline artificial intelligence researchers. We grew increasingly concerned about what we heard. Researchers at tech companies believed they were much closer to creating superintelligent AIs than was being publicly discussed. Some were frightened by the technology they were unleashing. They didn’t know how to control it; their AI systems were doing things they couldn’t understand or predict; they realised they could be producing something very dangerous.
This is why the UK’s newly established AI Taskforce is hosting its first summit next week at Bletchley Park, where international politicians, tech firms, academics and representatives of civil society will meet to discuss these dangers.
Without oversight, the range of possible harms will only grow in ways we can’t foresee
Getting to the point of ‘superintelligence’ – when AI exceeds human intelligence – is the stated goal of companies such as Google DeepMind, Anthropic and OpenAI, and they estimate that this will happen in the short term. Demis Hassabis, of DeepMind, says some form of human-equivalent intelligence will be achieved in the next decade. Sam Altman, CEO of OpenAI (ChatGPT’s creator), reckons he’ll achieve it by 2030 or 2031. They may be wrong – but so far their predictions have usually been right.
So you see the conundrum. AI that has the power to damage society is being created by people who know the risks but are locked in a race against each other, unable to slow down because they worry about becoming irrelevant in their field. Yet, even though they aren’t slowing down, all the major lab CEOs signed a letter earlier this year saying that AI was a nuclear-level extinction risk.
The dangers are real. Two years ago, an AI was developed that could, in a few hours, rediscover from scratch internationally banned chemical warfare agents, and invent 40,000 more ‘promising candidate’ toxins.