Turing's Cathedral: The Origins of the Digital Universe by George Dyson
Daily Telegraph, 24 March 2012

“Princeton is a madhouse,” wrote Robert Oppenheimer in January 1935. Twelve years later, after directing the building of the atom bomb, he would return to the Institute for Advanced Study (IAS) to take charge of this “madhouse”.
One of the permanent residents was Einstein. Another of Oppenheimer’s new charges was a former colleague from the Manhattan Project who was now “thinking about something much more important than bombs”.
The Hungarian-born polymath John von Neumann would make seminal contributions to everything from quantum mechanics to game theory, and had turned his prodigious talent to “thinking about computers”.
On November 12, 1945, he gathered together six people and started the IAS’s Electronic Computer Project to design and construct a programmable electronic digital computer. After five years the Mathematical and Numerical Integrator and Computer (Maniac) was fully functioning, but it had only five kilobytes of storage, less memory than is used to display a single icon on your computer screen.
The rest may be history, but it’s history George Dyson is uniquely qualified to capture in Turing’s Cathedral. The son of the distinguished physicist Freeman Dyson, he grew up in the environs of the IAS, where his father has been a member since 1948. Dyson used his privileged position to gain access to people and to explore archives untouched for decades. The years of research and writing have enabled him to bring to life a cast of extraordinary characters, each of whom contributed to ushering in today’s digital age.
While our universe may have popped out of nothing due to what physicists describe as a quantum fluctuation, the digital universe of 0s and 1s owed its origins to the US military’s desire to be armed with a hydrogen bomb at the beginning of the Cold War: it “had to be squeezed into existence” between simulations of nuclear explosions. Two real-world explosions, in 1952 and 1954, confirmed the correctness of those calculations and the indispensable nature of a computer that could be reprogrammed to carry out different tasks, the theory behind which had first been worked out by the British mathematician Alan Turing.
Turing may have been the intellectual visionary, but Dyson’s book is about von Neumann, the chief architect who oversaw the construction of the hardware and software architecture that allowed sequences of code to be stored, recalled and executed. Yet Dyson acknowledges that Maniac was not the first operational stored-programme computer. That was the Small-Scale Experimental Machine, which first ran in June 1948 at Manchester University, where Turing would soon be based, having helped break the German navy’s Enigma code during the war as a leading member of Bletchley Park.
Turing and Von Neumann were chalk and cheese in everything except their shared interest in computers. Von Neumann always dressed in a suit and spoke with precision; Turing was unkempt and hesitated as if words could not keep up with his thoughts. Von Neumann had an eye for women; Turing’s homosexuality would lead to a conviction for gross indecency. Forced to undergo “therapy” with oestrogen injections, he committed suicide in 1954.
Faced with the tricky task of balancing technical detail against keeping the narrative accessible to the non-computer buff, Dyson probably gives too little to satisfy the aficionado yet too much for the lay reader. “Evolution in the digital universe now drives evolution in our universe,” he says, “rather than the other way around.”
Turing, von Neumann and their colleagues may have let the genie out of the bottle, but Dyson has done the difficult job of reminding us of how much we owe them and how far we have come in such a short time.