Jeremy Hunt, as Health Secretary, said something very important by mistake. He told the Commons in May 2018 that ‘a computer algorithm failure’ meant 450,000 patients in England missed breast cancer screenings. As many as 270 women might have had their lives shortened as a result. This point hasn’t received the analysis it deserves. Scores of women died sooner than they should have done, because of an algorithm.
He’s probably right, you know. It really could have been a computer algorithm that was to blame here. But that’s obviously unsatisfactory, since we need humans to hold to account when things go wrong. Let’s say it was a poorly programmed algorithm – who’s at fault? The tech guy who wrote it, years ago? The person who commissioned it? The people feeding in the data?
The more we outsource decisions to machines, the more blurred the lines of accountability will become.