Thursday 24 March 2011

The Book of Universes

The Book of Universes by John D Barrow

Independent, 25 March 2011


"Einstein explained his theory to me every day and on my arrival I was fully convinced that he understood it," reported Chaim Weizmann. He would become the first president of Israel, but in 1921 was accompanying Einstein on a transatlantic voyage to New York. The theory in question was general relativity, in which gravity is due to the warping of space caused by the presence of mass. The Earth moves around the Sun not because some mysterious invisible force pulls it, but because of the warping of space due to the Sun's enormous mass.

"The theory is beautiful beyond comparison," Einstein wrote. When, in November 1919, British astronomers announced that they had discovered that gravity bends light – as predicted by general relativity – it made headlines around the world. Yet buried within his greatest achievement was what Einstein called "my greatest blunder".

He knew that his equations could be solved in a number of different ways, with each solution representing a model of a possible universe. Like everyone else at the time, Einstein believed that the actual universe was eternal and unchanging. So he introduced a term (his "greatest blunder") into the equations that ensured exactly that. It was left to others, a Russian mathematician and then a Belgian Jesuit priest, to find and take seriously the solutions that pointed to an expanding universe. Soon this non-static model attracted some experimental support.
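
The "term" Einstein slipped in deserves a footnote for the technically minded. It is the cosmological constant, usually written Λ, and in the standard modern notation (not reproduced in the review) it sits in the field equations of general relativity like this:

$$ G_{\mu\nu} + \Lambda\, g_{\mu\nu} = \frac{8\pi G}{c^{4}} T_{\mu\nu} $$

Choose Λ with just the right value and the equations allow a static universe; drop it, as Einstein later wished he had, and the solutions describe a universe that must expand or contract.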

In the 1920s, the American astronomer Edwin Hubble discovered two remarkable facts. First, what we had long assumed to be the universe was actually our host galaxy and there were many other such "island universes". Second, he found that light from these distant galaxies was stretched towards the red end of the visible spectrum. This so-called redshift is evidence that these galaxies are moving away from our own Milky Way and that the universe is expanding.
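
For readers who want the arithmetic behind the redshift, the standard definition (assumed here, not spelled out in the review) compares the observed and emitted wavelengths, and for nearby galaxies converts into a recession velocity:

$$ z = \frac{\lambda_{\mathrm{obs}} - \lambda_{\mathrm{emit}}}{\lambda_{\mathrm{emit}}}, \qquad v \approx c\,z \quad (z \ll 1) $$

Hubble's further finding was that this velocity grows in proportion to a galaxy's distance, exactly the pattern a uniformly expanding universe would produce.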

Eventually, this led theorists to a universe that exploded into being in a Big Bang some 13 billion years ago from a single point, called a singularity, which was infinitely hot and dense. Add a surge of accelerated expansion only a trillion trillion trillion trillionth of a second after the Big Bang that lasted for only a trillion trillion trillionth of a second, and the discovery that 96 per cent of it is made up of dark matter and dark energy, and we arrive at the most popular model of our universe.

In the 20th century, cosmology became a bona fide scientific discipline, but there remains plenty of room for some metaphysical speculation. What exactly do we mean by "universe"? Is the universe everything that has existed, does exist and will ever exist? asks Cambridge cosmologist John Barrow. What about including all that cannot exist? After all, as he points out, some medieval philosophers "were drawn to this sort of completeness, adding everything that has existed, does exist and will not exist to the catalogue of what was, is and will be".

Barrow and his colleagues are not only interested in the structure and history of our own universe. There are other universes that live inside black holes, or are chaotically unpredictable, or allow time travel into the past. However, the most mind-bending concept of all only emerged in the 1990s: the never-ending "multiverse" – the universe of all possible universes. There can be few better guides to the bewildering array of potential universes, and none so readable or entertaining.

Wednesday 23 March 2011

The Meeting of Minds

The Meeting of Minds
Nature.com, 23 March 2011

I first saw the photograph of those gathered at the fifth Solvay conference, which was held in Brussels from 24 to 29 October 1927, in a biography of Albert Einstein. This was in 1979, when I was just 16. I wondered what brought these people together, and soon learned that the picture included most of the key players involved in the discovery of the quantum, and the subsequent development of quantum physics. With 17 of the 29 invited eventually earning a Nobel Prize, the conference was one of the most spectacular meetings of minds ever held.


When I was 18, I was given a print of the above photograph as a present. Many years later I began to think about it as a possible starting point for a book about the quantum. In the photograph there are nine seated in the front row. Eight men, and one woman; six have Nobel Prizes in either physics or chemistry. The woman has two, one for physics, awarded in 1903, and another for chemistry, awarded in 1911. It could only be Marie Curie. In the centre, the place of honour, sits Albert Einstein. Looking straight ahead, gripping the chair with his right hand, he seems ill at ease. Is it the winged collar and tie that are causing him discomfort, or is it what he has heard during the preceding week? At the end of the second row, on the right, is Niels Bohr, looking relaxed with a half-whimsical smile. It had been a good conference for him. Nevertheless, Bohr would be returning to Denmark disappointed that he had failed to convince Einstein to adopt his Copenhagen interpretation of what quantum mechanics revealed about the nature of reality.

Instead of yielding, Einstein had spent the week attempting to show that quantum mechanics was inconsistent, and that Bohr's 'Copenhagen interpretation' was flawed. As Einstein said years later:

This theory reminds me a little of the system of delusions of an exceedingly intelligent paranoic, concocted of incoherent elements of thoughts.

It was Max Planck, sitting on Marie Curie's right, holding his hat and cigar, who discovered the quantum. In 1900 he was forced to accept that the energy of light, and all other forms of electromagnetic radiation, could only be emitted or absorbed by matter in bits, bundled up in various sizes. 'Quantum' was the name Planck gave to an individual packet of energy, with 'quanta' being the plural. The quantum of energy was a radical break with the long-established idea that energy was emitted or absorbed continuously, like water flowing from a tap. In the everyday world of the macroscopic, where the physics of Newton ruled supreme, water could drip from a tap, but energy was not exchanged in droplets of varying size. However, the atomic and subatomic level of reality was the domain of the quantum.
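
The size of Planck's packets is fixed by a relation he wrote down in 1900, standard in any textbook though not quoted here: a quantum of radiation of frequency ν carries an energy

$$ E = h\nu $$

where h is Planck's constant, roughly 6.6 × 10⁻³⁴ joule-seconds. Its tininess is why the graininess of energy goes unnoticed in the everyday, macroscopic world of taps and flowing water.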

Bohr discovered that the energy of an electron inside an atom was 'quantised'; it could possess only certain amounts of energy and not others. The same was true of other physical properties: the microscopic realm was found to be lumpy and discontinuous, not some shrunken version of the large-scale world that we humans inhabit, where physical properties vary smoothly and continuously and going from A to C means passing through B. Quantum physics, however, revealed that an electron in an atom can be in one place and then, as if by magic, reappear in another without ever being anywhere in between, by emitting or absorbing a quantum of energy.

By the early 1920s, it had long been apparent that the advance of quantum physics on an ad hoc, piecemeal basis had left it without solid foundations or a logical structure. Out of this state of confusion and crisis emerged a bold new theory, known as quantum mechanics, with Werner Heisenberg and Erwin Schrödinger, third and sixth from the right in the back row, leading the way. In 1927 Heisenberg made a discovery so at odds with common sense that he initially struggled to grasp its significance. The uncertainty principle said that if you want to know the exact velocity of a particle, then you cannot know its exact location, and vice versa.
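
In its modern textbook form (tighter than Heisenberg's original 1927 statement, and assumed here for illustration), the uncertainty principle puts a hard lower limit on how precisely position x and momentum p can be known together:

$$ \Delta x\, \Delta p \geq \frac{\hbar}{2} $$

where ħ is Planck's constant divided by 2π. Pin down one of the pair more sharply and the other necessarily blurs.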

Bohr believed he knew how to interpret the equations of quantum mechanics: what the theory was saying about the nature of reality. Questions about cause and effect, or whether the moon exists when no one is looking at it, had been the preserve of philosophers since the time of Plato and Aristotle. However, after the emergence of quantum mechanics they were being discussed by the twentieth century's greatest physicists.

The debate that began between Einstein and Bohr at the Solvay conference in 1927 raised issues that continue to preoccupy many physicists and philosophers to this day: what is the nature of reality, and what kind of description of reality should be regarded as meaningful? 'No more profound intellectual debate has ever been conducted', claimed the scientist and novelist CP Snow. 'It is a pity that the debate, because of its nature, can't be common currency.'

When Einstein and Bohr first met in Berlin in 1920, each found an intellectual sparring partner who would, without bitterness or rancour, push and prod the other into refining and sharpening his thinking about the quantum. 'It was a heroic time,' recalled Robert Oppenheimer, who was a student in the 1920s. 'It was a period of patient work in the laboratory, of crucial experiments and daring action, of many false starts and many untenable conjectures. It was a time of earnest correspondence and hurried conferences, of debate, criticism and brilliant mathematical improvisation. For those who participated it was a time of creation.'

Planck, Einstein, Bohr, Heisenberg, Schrödinger, Born, Pauli, de Broglie, Dirac – the leading lights of the quantum revolution – are all there in that picture.

Sunday 20 March 2011

Geek Nation

Geek Nation: How Indian science is taking over the world by Angela Saini

Financial Times, 19-20 March 2011

‘It shall be the duty of every citizen of India to develop the scientific temper, humanism and the spirit of inquiry and reform.’ The inclusion of this statement in the Indian constitution, which came into effect on 26 January 1950, was insisted upon by Jawaharlal Nehru, India’s first prime minister.

Nehru’s ‘scientific temper’ is a wonderfully concise phrase, which describes his vision of a nation in which people could think independently, employ logic and understand the scientific method. In a land of religion, Nehru put his faith in science and technology. He believed that it was ‘science alone that can solve the problems of hunger and poverty, insanitation and illiteracy, of superstition and deadening custom and tradition’ and that the ‘future belongs to science and to those who make friends with science’. Nehru wanted a nation of geeks.

‘Wherever in the world we live, Indians and people of Indian origin are famous for being swots, nerds, dweebs, boffins, and dorks,’ writes Angela Saini in Geek Nation. A British science journalist of Indian parentage, Saini spent six months in India exploring Nehru’s geek nation almost 50 years after his death.

With a population approaching 1.2 billion, India has the largest pool of scientists and engineers in the world. While the literacy rate hovers around a dismal 60 per cent, some 400 universities produce two million graduates every year, including a staggering 600,000 engineers, the most sought-after of whom come from the 16 Indian Institutes of Technology (IITs). Yet, instead of discovering hothouses of intellectual curiosity and innovation, Saini found drones, not geeks. The relentless pressure on India’s students is ‘disabling imaginations’ and driving hundreds to suicide.

From the vast Soviet-style Bhabha Atomic Research Centre to the Academy of Sanskrit Research, ‘the geeky and the bizarre’ sit side-by-side; wacky ideas are more easily tolerated than in the west. Indians, Saini observes, have ‘a unique freedom to explore the edges of what’s believed to be possible’.

Indian science is far from taking over the world: it currently contributes less than 3 per cent of global research output, lagging far behind the US and UK. Yet an increasing number of Indian researchers, having established reputations abroad, are returning home to lead a younger generation.

Saini’s vivid portrait of hi-tech India reveals a country in a hurry. No one knows how long it will take, but India’s present economic expansion is a reminder that more than 1,000 years ago it had a scientific culture as advanced as any in the world. ‘The Empires of the future,’ Winston Churchill once said, ‘are going to be the empires of the mind.’

Wednesday 2 March 2011

The man who went nuclear


The Man Who Went Nuclear: How Ernest Rutherford Ushered in the Atomic Age


Independent, 3 March 2011

Did the nuclear age begin in 1942, when Chicago Pile-1, a reactor built in a squash court, went "critical" by achieving a self-sustaining chain reaction? Or was it on 16 July 1945, at the Trinity test site in the New Mexico desert, when "The Gadget", the first atomic bomb, was successfully tested and Robert Oppenheimer quoted the Bhagavad Gita? Maybe it was June 1954, when the Russian Obninsk nuclear station first generated electricity for the grid.

In reality, it was during a meeting of the Manchester Literary and Philosophical Society that the nuclear age was announced, on Tuesday, 7 March 1911, by Professor Ernest Rutherford, the 39-year-old head of physics at Manchester University. Rutherford was born in 1871, in Spring Grove, New Zealand. Descended from Scottish emigrants, he grew up in this scattered rural community on the north coast of the South Island, and it was from there that his aptitude for science and maths led in 1895 to a coveted place at Cambridge. There, under the direction of JJ Thomson, Rutherford established a reputation as a fine experimentalist with a study of X-rays.

Though surrounded at Cambridge by all the excitement generated by Thomson's discovery of the electron in 1897, Rutherford opted to investigate radioactivity and soon found that there were two distinct types of radiation emitted from uranium, which he called alpha and beta, before a third, gamma rays, was discovered.

Aged just 27, in 1898, he was appointed professor of physics at McGill University in Montreal, Canada. Among his successes over the next nine years, the most important was the discovery, with his collaborator Frederick Soddy, that radioactivity was the transformation of one element into another due to the emission of an alpha or beta particle. Rutherford regarded "all science as either physics or stamp collecting" but saw the funny side when he received the 1908 Nobel prize for chemistry for this seminal work. By then he was in Manchester.

"Youthful, energetic, boisterous, he suggested anything but the scientist," was how Chaim Weizmann, then a chemist but later the first president of Israel, remembered Rutherford in Manchester. "He talked readily and vigorously on any subject under the sun, often without knowing anything about it. Going down to the refectory for lunch, I would hear the loud, friendly voice rolling up the corridor."

At the time Rutherford was busy using the alpha particle to probe and unlock the secrets of the atom. But what exactly is an alpha particle? It was a question that Rutherford and his German colleague Hans Geiger answered. It was a helium ion; that is, a helium atom that had been stripped of its two electrons. Rutherford had noticed, while still in Montreal, that some alpha particles passing through thin sheets of metal were slightly deflected, causing fuzziness on a photographic plate. It was something he asked Geiger to investigate.

As instructed by Rutherford, he fired beams of alpha particles at some gold foil and, from the tiny flashes of light produced when they struck a zinc sulphide screen, discovered that a few "were deflected through quite an appreciable angle". Soon afterwards Rutherford assigned a research project to a promising undergraduate called Ernest Marsden: "Why not let him see if any alpha particles can be scattered through a large angle?" Marsden found some alpha particles bouncing straight back after hitting the gold foil and Rutherford was shocked: "It was almost as incredible as if you had fired a 15-inch shell at a piece of tissue paper and it came back and hit you."

Marsden and Geiger made comparative measurements using different metals and discovered exactly the same large-angle scattering. In June 1909 they published their extraordinary results, but with Rutherford unable to offer any kind of explanation they attracted little interest.

After decades of intense arguments, by 1910 the reality of atoms was established beyond reasonable doubt. The most widely accepted atomic model was Thomson's so-called "plum pudding". Its ingredients consisted of a ball of diffuse "positive electricity" in which negatively charged electrons were embedded like plums in a pudding. But Rutherford knew that the atom of his old mentor couldn't explain alpha particle scattering. The probability that the accumulated effect of a number of tiny ricochets off electrons in Thomson's atom resulted in even one alpha particle being scattered backwards was almost zero. By December 1910, Rutherford believed that, given the mass and energy of an alpha particle, the large deflections must be the result of a single collision with an atom. It led him "to devise an atom superior to J.J.'s", he said at the time.

Rutherford's atom consisted of a tiny central core containing virtually all the atomic mass, which he later called the nucleus, but it occupied only a minute volume, "like a fly in a cathedral". Most alpha particles would pass straight through Rutherford's atom in any "collision", since they were too far from the tiny nucleus at its heart to suffer any deflection. But if an alpha particle approached the nucleus head-on, the repulsive force between the two would cause it to recoil straight back like a ball bouncing off a brick wall. Rutherford said that such direct hits were "like trying to shoot a gnat in the Albert Hall at night". Rutherford's model allowed him to make definite predictions, using a simple formula he had derived, about the fraction of scattered alpha particles to be found at any angle of deflection.
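
The review does not reproduce the formula, but in its standard modern form Rutherford's scattering law predicts that the fraction of alpha particles deflected through an angle θ falls off steeply with angle:

$$ \frac{d\sigma}{d\Omega} = \left( \frac{Z_{1} Z_{2} e^{2}}{16\pi\varepsilon_{0} E_{k}} \right)^{2} \frac{1}{\sin^{4}(\theta/2)} $$

where E_k is the alpha particle's kinetic energy and Z₁ and Z₂ are the charges of the projectile and the target nucleus. It was the agreement of Geiger and Marsden's counts with this 1/sin⁴(θ/2) dependence that vindicated the nuclear atom.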

Experimental checks performed by Geiger and Marsden confirmed the predictions, but few physicists beyond Manchester gave any serious attention to the nuclear atom. Although Rutherford did not explicitly suggest a planetary model of the atom, there were those who recognised that was exactly what it was. For most, that settled the matter: Rutherford's atom was fatally flawed. A model of the atom with electrons moving around the nucleus, like planets orbiting the sun, would collapse. Any object moving in a circle undergoes acceleration, and if it happens to be a charged particle, like an electron, it continuously loses energy in the form of radiation as it accelerates. An electron in orbit around the nucleus would therefore spiral into it. Rutherford's atom was unstable, and the existence of the material world was compelling evidence against it. Enter Niels Bohr.

Arriving in Manchester in March 1912 to learn about radioactivity, the 26-year-old Dane soon began thinking about how to prevent Rutherford's nuclear atom from collapsing. His solution employed the quantum – the idea that energy comes in packets. Bohr argued that electrons inside an atom could only move in certain orbits in which they did not radiate energy and therefore couldn't spiral into the nucleus. Each orbit had a certain energy associated with it, so the allowed orbits were in effect a series of energy levels, like the rungs of a ladder. For an electron to move between levels – the famous quantum leap – it had to absorb or emit a quantum of energy equivalent to the difference in energy between the two levels.
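
Bohr's ladder can be made concrete with hydrogen, the atom he worked out in detail in 1913 (the figures below are the standard textbook values, not quoted in the article). The allowed energies, and the frequency ν of the light emitted or absorbed in a quantum leap, are

$$ E_{n} = -\frac{13.6\ \mathrm{eV}}{n^{2}}, \quad n = 1, 2, 3, \ldots \qquad h\nu = E_{n_{2}} - E_{n_{1}} $$

which is why hydrogen shines only at certain sharp wavelengths: every photon it emits must match the gap between two rungs.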

"It is difficult to overestimate the scientific importance of the discovery of the nucleus," says Sean Freeman, professor of nuclear physics at Manchester University. "Rutherford's insight, imagination and attention to detail enabled him to make revolutionary discoveries using rather rudimentary technology by modern standards. He was a true pioneer."

One of Rutherford's most important achievements came in his spare time, while he was developing methods for detecting submarines during the First World War – he split the atom. Arriving late for a committee meeting one day, Rutherford didn't apologise, but announced: "I have been engaged in experiments which suggest that the atom can be artificially disintegrated. If it is true, it is of far greater importance than a war!" It was 1919 before he published the results: by knocking positively charged particles, which he called protons, out of nitrogen nuclei with alpha particles, he had effectively split the nucleus, and hence the atom. It was the last work he did at Manchester before moving to Cambridge to take over from Thomson as head of the Cavendish Laboratory.

It was there that, in 1932, his colleagues John Cockcroft and Ernest Walton "split the atom" using the world's first particle accelerator. Also at the Cavendish, James Chadwick used Rutherford's suggestion that there was probably another constituent of heavier nuclei to discover the neutron, the particle that plays the central role in establishing a nuclear chain reaction. The three men were among the 11 former students and colleagues of Rutherford who would win the Nobel prize.

Another of those 11 was Niels Bohr, who said that Rutherford never spoke more angrily to him than he did one evening at a Royal Society dinner. Rutherford had overheard Bohr refer to him by his title (he was awarded a peerage in 1931) and loudly asked the Dane: "Do you Lord me?" Rutherford never cared for the honours and was indifferent to academic or social standing. What mattered most to him were the results of experiments. "I was brought up to look at the atom as a nice hard fellow, red or grey in colour, according to taste," he once said. It was a model he replaced with an atom that began the nuclear age.