Think of a computer and your mind might conjure the brushed steel contours of the latest must-have laptop or, for those of a certain age, a room full of whirring cabinets and reel-to-reel tape decks. The era of electronic computing has its roots in the code-breaking exploits of Bletchley Park, but the need for repetitive and reliable number-crunching did not suddenly begin with the wartime threat of Nazi submarines. For centuries, such
everyday activities as banking, commerce, engineering and navigation have all relied on computing to manipulate large amounts of numerical information. But before there were machines to do the mathematical
donkey-work, there were human brains, and in the 19th and early 20th centuries a computer was not a device but a person.
The physical sciences have a particular appetite for big numbers, and perhaps none more so than astronomy, in which vast distances and dizzying spans of time are coupled with the sheer quantity of objects that populate the night sky.