Thursday, 29 December 2011

The Infinity Puzzle

                                                                 
The Infinity Puzzle: How the quest to understand quantum field theory led to extraordinary science, high politics, and the world's most expensive experiment by Frank Close

Literary Review, December 2011


‘A desk or table, a chair, paper and pencils,’ was what Einstein asked for in 1933 when he arrived at the Institute for Advanced Study in Princeton. Then he remembered one last item: ‘Oh yes, and a large wastebasket, so I can throw away all my mistakes.’ In the next two decades before his death in 1955 there were plenty of them, but Einstein had earned the right to make those mistakes in search of his holy grail – a unified field theory.

In 1864 the Scottish physicist James Clerk Maxwell showed that electricity and
magnetism were different manifestations of the same underlying phenomenon – electromagnetism. His great achievement was to encapsulate the disparate behaviour of electricity and magnetism into a set of four elegant mathematical equations that were to be the crowning glory of nineteenth-century physics.

Einstein sought a single, all-encompassing theoretical structure that would unify electromagnetism with his theory of gravity, the general theory of relativity. Such a unification was the logical next step for Einstein, but few were convinced, for in the twentieth century two new forces were discovered and given names that alluded to their strengths relative to the electromagnetic: the so-called strong and weak forces.

The strong force is the binding force that holds atomic nuclei together; conversely the weak force destabilises nuclei, causing a form of radioactivity that plays an essential role in the way that the sun produces its energy. As the years passed the belief grew that these four forces – electromagnetism, gravity, and the strong and weak forces – would be reunited in a Theory of Everything.

Physicists have been able to ‘quantize’ three of these four forces, since quantum mechanics deals with the atomic and sub-atomic domain; only gravity, described by general relativity, has so far resisted. In effect they managed to get three trains running on the same size track. The quantum gravity train is still stuck at the station. In The Infinity Puzzle Oxford particle physicist Frank Close tells the tale of quantum field theory – the attempts to understand and then unite electromagnetism and the strong and weak forces.

In the 1930s the union of Maxwell’s theory of electromagnetism, Einstein’s theory of special relativity, and quantum mechanics gave birth to a theory of the electromagnetic force known as quantum electrodynamics, or QED. However, in the bowels of the theory lurked a monster – infinity. The equations of QED kept predicting that the chance of some things occurring was ‘infinite’. When infinity pops up in physics it spells disaster since, as Close explains, it is ‘proof that you are trying to apply a theory beyond its realm of applicability’. In the case of QED, if you can’t calculate something as basic as a photon – a particle of light – interacting with an electron without getting infinity, you haven’t got a theory.

It was the late 1940s before a way was found to solve the infinity puzzle in QED, by a process called renormalisation. Calculations of many properties of atoms and their constituent particles, including those for the mass and charge of the electron, gave infinity as the answer. However, these two quantities of the electron had already been measured to a high degree of precision using other methods, and the results were sufficient to provide benchmarks for anything else physicists wished to compute in QED. Instead of infinity, many of the answers now turned out to be finite and correct. Some physical quantities calculated using renormalisation agree with experiment to an accuracy of one part in a trillion – a precision akin to comparing the diameter of a hair with the width of the Atlantic.

Renormalisation may have been inelegant but its ‘recipe for extracting sensible answers for QED worked’. Those who cooked it up independently of each other – Richard Feynman, Julian Schwinger and Sin-Itiro Tomonaga – won a share of the 1965 Nobel Prize in Physics.

Gerard 't Hooft
When it came to the weak force, infinity was not so easy to banish, even with the efforts of the world’s leading physicists over a quarter of a century. It was the brilliant Dutch postgraduate student Gerard ’t Hooft who finally found a solution. The nature of the problem, how it was solved, and the inevitable jostling for Nobel Prizes are major themes of Close’s gripping and extensively researched narrative history of particle physics over the last sixty years.

It may be a collective enterprise but, as Close’s book reveals, science is full of wrong turns, partial answers, missed opportunities and misunderstandings. How could it be otherwise, since the dispassionate, logic-driven stereotype of the scientist is a fiction? The physicists in The Infinity Puzzle ‘experience the same emotions, pressures and temptations as any other group of people, and respond in as many ways’.

A timeline of who did what when, together with a glossary, could be added to the paperback, to help readers as they grapple with gauge invariance, parity violation, spontaneous symmetry breaking, gluons, colour, the Higgs boson and SU(2)xU(1). Yet Close has succeeded in humanising a dramatic era of physics in what is my science book of the year. Some sections of his narrative are difficult because of the inherent nature of the ideas he’s trying to explain. But then, it took exceedingly clever people to devise them.

‘Hold Infinity in the palm of your hand,’ William Blake wrote in the ‘Auguries of Innocence’. Frank Close does a fabulous job of reconstructing how physicists like Feynman and ’t Hooft managed to do exactly that.

Solvay 1911

The Witches' Sabbath

Nature.com, 21 November 2011


The First Solvay Congress, Brussels, October 1911. Left to right, standing – Robert Goldschmidt, Max Planck, Heinrich Rubens, Arnold Sommerfeld, Frederick Lindemann, Maurice de Broglie, Martin Knudsen, Fritz Hasenöhrl, Georges Hostelet, Edouard Herzen, James Hopwood Jeans, Ernest Rutherford, Heike Kamerlingh Onnes, Albert Einstein, Paul Langevin. Seated – Walther Nernst, Marcel Brillouin, Ernest Solvay, Hendrik Lorentz, Emil Warburg, Jean-Baptiste Perrin (reading), Wilhelm Wien (upright), Marie Curie, Henri Poincaré.


In June 1911 Albert Einstein was a professor of physics in Prague when he received a letter and an invitation from a wealthy Belgian industrialist. Ernest Solvay, who had made a substantial fortune by revolutionizing the manufacture of sodium carbonate, offered to pay him one thousand francs if he agreed to attend a ‘Scientific Congress’ to be held in Brussels from 29 October to 4 November. He would be one of a select group of twenty-two physicists from Holland, France, England, Germany, Austria, and Denmark being convened to discuss ‘current questions concerning the molecular and kinetic theories’. Max Planck, Ernest Rutherford, Henri Poincaré, Hendrik Lorentz and Marie Curie were among those invited. It was the first international meeting devoted to a specific agenda in contemporary physics: the quantum.


Planck and Einstein were among the eight asked to prepare reports on a particular topic. Written in French, German, or English, these were to be sent out to the participants before the meeting and serve as the starting point for discussion during the planned sessions. Planck would discuss his blackbody radiation theory, while Einstein had been assigned the quantum theory of specific heat. Though accorded the honour of giving the final talk, Einstein found there was no room on the proposed agenda for a discussion of his light-quanta – better known these days as photons.
‘I find the whole undertaking extremely attractive,’ Einstein wrote to Walther Nernst, ‘and there is little doubt in my mind that you are its heart and soul.’ Nernst, with his love of motorcars, was more flamboyant than the staid Planck, but was just as highly respected – in 1920 he was awarded the Nobel Prize for chemistry for what became known as the third law of thermodynamics. A decade earlier, in 1910, he had become convinced that the time was ripe to launch a cooperative effort to get to grips with the quantum, which he saw as nothing more than a ‘rule with most curious, indeed grotesque properties’. Nernst put the idea to Planck, who replied that such ‘a conference will be more successful if you wait until more factual material is available’. Planck argued that ‘a conscious need for reform, which would motivate’ scientists to attend the congress was shared by ‘hardly half of the participants’ envisaged by Nernst. Planck was sceptical that the ‘older’ generation would attend or would ‘ever be enthusiastic’. He advised: ‘Let one or even better two years pass by, and then it will be evident that the gap in theory which now starts to split open will widen more and more, and eventually those still remote will be sucked into it. I do not believe that one can hasten such processes significantly, the thing must and will take its course; and if you then initiate such a conference, a hundred times more eyes will be turned to it and, more importantly, it will take place, which I doubt for the present.’
Undeterred by Planck’s response, Nernst convinced Solvay to finance the conference. Interested in physics, and hoping to address the delegates about his own ideas on matter and energy, Solvay spared no expense as he booked the Hotel Metropole. In its luxurious surroundings, with all their needs catered for, Einstein and colleagues spent five days talking about the quantum and, as Lorentz said in his opening remarks, the reasons why the ‘old theories do not have the power to penetrate the darkness that surrounds us on all sides’. However, he continued, the ‘beautiful hypothesis of the energy elements, which was first formulated by Planck and then extended to many domains by Einstein, Nernst, and others’ had opened unexpected perspectives, and ‘even those who regard it with a certain misgiving must recognize its importance and fruitfulness.’
‘We all agree that the so-called quantum theory of today, although a useful device, is not a theory in the usual sense of the word, in any case not a theory that can be developed coherently at present,’ said Einstein. ‘On the other hand, it has been shown that classical mechanics…cannot be considered a generally useful scheme for the theoretical representation of all physical phenomena.’ Whatever slim hopes he harboured for progress at what he called ‘the Witches’ Sabbath’, Einstein returned to Prague disappointed at having learnt nothing new. ‘The h-disease looks ever more hopeless,’ he wrote to Lorentz after the conference.
Nevertheless, Einstein had enjoyed getting to know some of the other ‘witches’. Marie Curie, whom he found to be ‘unpretentious’, appreciated ‘the clearness of his mind, the shrewdness with which he marshalled his facts and the depth of his knowledge’. During the congress it was announced that she had been awarded the Nobel Prize for chemistry. She had become the first scientist to win two, having already won the Physics prize in 1903. It was a tremendous achievement that was overshadowed by the scandal that broke around her during the congress. The French press had learned that she was having an affair with a married French physicist. Paul Langevin was another delegate at the congress and the papers were full of stories that the pair had eloped. Einstein, who had seen no signs of a special relationship between the two, dismissed the newspaper reports as rubbish. Despite her ‘sparkling intelligence’, he thought Curie was ‘not attractive enough to represent a danger to anyone’.
The Solvay Congress was the end of the beginning for the quantum. It dawned on physicists that the quantum was here to stay, even as they struggled to learn how to live with it. The publication of the conference proceedings brought home to others, not yet aware of or engaged in the struggle, just how immense that challenge was. The quantum would again be the focus of attention at the fifth Solvay conference in 1927. What happened in the intervening years is, as they say, history.

The Reason Why


The Reason Why: The Miracle of Life on Earth by John Gribbin

Literary Review, November 2011

In the summer of 1950 the New Yorker magazine published a cartoon suggesting that aliens were behind the mysterious disappearance of rubbish bins from the streets of New York. Not long after the cartoon appeared, a group of scientists at the Los Alamos Laboratory in New Mexico were joking about it with their distinguished visitor, the Italian physicist Enrico Fermi. During lunch Fermi suddenly asked, ‘Where is everybody?’ It took a moment for his colleagues to realise that he was referring to extraterrestrials.

With hundreds of billions of galaxies, the universe could easily be teeming with extraterrestrial life. However, the enormous intergalactic distances involved rule out the possibility of a visit. Yet Fermi was not thinking about the entire universe, only our tiny part of it – the Milky Way. After a quick calculation, then and there, he concluded that space-faring aliens, should they exist, would have colonised our galaxy long ago and therefore have visited Earth. As Fermi was regarded as one of the great physicists of the twentieth century, his back-of-the-envelope reasoning was taken seriously and led to what was called the Fermi Paradox: ‘If they are there, why aren’t they here?’ The reason why, argues science writer John Gribbin, is simple: ‘We are alone, and we had better get used to the idea.’

Gribbin begins his entertaining polemic by referring to an equation devised in 1961 by the American astronomer Frank Drake that attempted to quantify the chances of detecting intelligent life elsewhere in our galaxy. Starting with the total number of stars, Drake calculated how many of them were Sun-like, and then asked what fraction of these had planets, how many of them were capable of supporting life, and so forth. With one guesstimate piled upon another it’s a futile approach that, at best, succeeds only in demonstrating our ignorance, for the answers generated range from zero to the hundreds of billions.
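That spread is easy to reproduce. Below is a minimal sketch of Drake's equation in Python; every guesstimate is a hypothetical value chosen purely to illustrate how the answer swings with the assumptions, not an estimate anyone defends:

```python
# Drake's 1961 equation multiplies seven factors: the rate of star formation,
# the fractions of stars with planets, habitable planets per system, the
# fractions on which life, intelligence and detectable technology arise,
# and the lifetime of a communicating civilisation.
from math import prod

def drake(r_star, f_p, n_e, f_l, f_i, f_c, lifetime):
    """Estimated number of detectable civilisations in the galaxy."""
    return prod([r_star, f_p, n_e, f_l, f_i, f_c, lifetime])

# Two sets of purely illustrative guesstimates:
optimistic = drake(10, 1.0, 2, 1.0, 1.0, 1.0, 1e9)      # every guess generous
pessimistic = drake(1, 0.1, 0.1, 1e-9, 1e-9, 0.1, 100)  # every guess stingy

print(f"optimistic:  {optimistic:,.0f}")   # 20,000,000,000
print(f"pessimistic: {pessimistic:.0e}")   # 1e-19
```

The point is not either answer but the gap between them: stacking guesstimate upon guesstimate spans nearly thirty orders of magnitude, which is exactly the futility Gribbin identifies.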

Gribbin, though, is prepared to play the numbers game while treating Drake’s equation for what it is, ‘a kind of mnemonic to remind us of the sort of things we have to take into account when considering the possibility of finding intelligent life elsewhere’. For Gribbin, the reasonable thing to do is to look at the history and geography of our galaxy and try to understand why, and how, intelligent life emerged on Earth. For he seeks to understand not whether we are alone but why we are alone. It’s Fermi’s paradox repackaged as: ‘If they are not there, why are we here?’

Gribbin sets out the arguments that the Earth occupies a special location in both space and time that has allowed the development of the only technological civilisation in the Milky Way. The Sun, for example, is fortuitously situated in what is called the Galactic Habitable Zone, a kind of Goldilocks region that’s just right for life. Critically, five billion years ago when the solar system was being formed, the zone had an abundance of metallic elements that allowed the formation of a planet like Earth. Any closer to the galactic centre would have been hazardous for life because of radiation from supernovae explosions, while the outer regions were poor in metals.

Gribbin only entertains the possibility of ‘life as we know it’, for which the presence of water is essential. Fortunately for us, Jupiter is of the right size and in the right place to have sent water-rich asteroids and icy comets crashing to Earth early in its history. In yet another piece of luck, the Earth acquired a relatively large moon. Much of what Gribbin describes about the Moon’s essential role in the development of life on Earth has been covered before, but he does so in an easily accessible style befitting the author of more than 100 books. He explains how the fledgling Earth formed close to Theia, another proto-planet the size of Mars, and how, when the two collided, the surface of the Earth turned into magma as it swallowed up Theia’s core.

The vast quantities of debris ejected into space as a result of these worlds colliding eventually formed the Moon. The collision tilted the Earth on its axis and set it spinning much faster than it does today. The Moon has acted as a stabilising influence ever since and continues to shield the Earth from the full effect of Jupiter’s gravity. Our neighbour has also intercepted cosmic debris that would otherwise have struck the planet, says Gribbin. He argues convincingly that the Moon is the single most important factor in making life on Earth possible.

The collision that led to the creation of the Moon left the Earth with a thin crust and, through plate tectonics, the dynamic surface needed to sustain the temperature range required for liquid water – and therefore, eventually, us. Otherwise it would probably have become a hot desert like Venus or a frozen world as cold as the Moon.

Life on Earth, argues Gribbin, is the product of such an improbable sequence of chance events that the possibility of finding any other technological civilisation in the galaxy is effectively nil. Yet the fact remains that we are here to ponder the reason for our existence because things are the way they are, because they were the way they were. Elsewhere they could have been different, leading to life, but not as we know it.

Thursday, 15 December 2011

Art of Science

The Art of Science: A Natural History of Science by Richard Hamblyn

Independent, 4 November 2011                     

The Dirac equation is a thing of beauty. It's the inscription carved on the memorial stone in Westminster Abbey commemorating the life and work of the greatest British physicist of the 20th century, Paul Dirac. By accounting for the spin of the electron, Dirac's equation managed to reconcile Einstein's special theory of relativity with one of the few genuine revolutions in human thought, quantum mechanics. Does it matter, asks Richard Hamblyn, that Dirac's equation remains a closed book to all but a handful of initiates able to translate its compact hieroglyphics into a statement about the nature of the universe?


For Hamblyn, it does. Few of us can read ancient Aramaic or have ever finished Finnegans Wake; nevertheless we manage to struggle along just fine. So why is Dirac's equation, and other such mathematical statements, different? Hamblyn worries that while an inability to read an ancient language or an experimental novel rarely leads to a wholesale rejection of all other languages or literature, there is a tendency among non-scientists to characterise the whole of science as being as reductive, difficult and alien as Dirac's equation.

Introducing this anthology, Hamblyn overstates the case to reinforce his point, especially when the likes of Brian Cox and Alice Roberts make science on TV accessible to all. Intellectual engagement and entertainment are the key ingredients as scientists try to connect with audiences beyond the lab and lecture theatre. Hamblyn achieves that in this collection as he showcases not only readable translations of key scientific ideas but situates those ideas in their cultural and historical context.

The hundred-odd pieces selected either reflect the situation in which a moment of scientific understanding took place or reveal the personalities of the scientists involved. The extract from James Watson's account of the events leading up to the discovery of the structure of DNA, for example, highlights the egotism and insensitivity to be found on virtually every page of The Double Helix – yet these character traits were important factors in Watson's scientific success.

Among the classics Hamblyn has chosen are Tycho Brahe on the supernova, William Harvey on the circulation of blood, Galileo on the moons of Jupiter, Einstein on the quantum theory of light, Fahrenheit on his temperature scale and Darwin on the Origin of Species. However, the strength of the collection lies in the surprises from among the contributions made by amateurs: Seneca on whirlwinds; the schoolteacher and champion of atomism John Dalton on colour blindness; the pharmacist Luke Howard's classification of clouds into cirrus, cumulus and stratus, which remains in use today; the account by the country doctor Gideon Mantell of how he reconstructed the Iguanodon from its fossilised teeth; and – my favourite – a piece on snowflakes by Vermont farmer Wilson Bentley.

With the ingenious aid of a bellows camera rigged up to a microscope, in 1885 Bentley became the first person successfully to photograph snowflakes. Over the next 40 years, having built up thousands of images, Bentley concluded that no two snowflakes are the same. His life's work was "one of the little romances of science". Although there are other such romances, Hamblyn has largely chosen pieces that have documentary value.

James Lind's account of his clinical trials on board HMS Salisbury affords us a surgeon's-eye view of everyday life on an 18th-century warship complete with barrels of baked biscuits and a scurvy-ridden crew. We get a glimpse of the reaction to Copernicus's new ordering of the cosmos through contemporary accounts that also shed light on the means by which his ideas began to spread.

"Art is the Tree of Life," wrote William Blake. "Science is the Tree of Death." This collection proves such accusations to be groundless as it offers ample evidence, to be dipped into at leisure, for what Hamblyn describes as the greatest invention of the human imagination, "the art of scientific thinking".

Dawkins' new book impresses the kids


The Magic of Reality: How we know what's really true by Richard Dawkins
New Humanist, Nov-Dec 2011
Richard Dawkins has a new book out, his first for “a family audience”, called The Magic of Reality: How We Know What’s Really True. But is it any good? I turned it over to the experts to find out – my sons Ravinder, 12, and Jaz, nine. Before I tell you what they said, a brief summary: evolution, dinosaurs, natural selection, time, continental drift, rainbows, earthquakes, DNA and the FoxP2 gene, supernova, the Goldilocks zone, the law of averages are among the array of ideas and concepts explained by Dawkins.
Each chapter is headed by a question, such as “what is reality?”, “what are things made of?”, “how did everything begin?”, “who was the first person?”, and Dawkins begins with mythical answers to them from around the world – because “they are colourful and interesting and real people believed them and some still do” – before explaining the science. Dawkins cleverly uses the fact that we all love a good story to hook the reader before revealing what science has discovered, since “the truth is more magical …than any myth or made-up mystery or miracle”.
My boys, raised on Harry Potter and the Percy Jackson series of books by Rick Riordan about the adventures of a young demi-god, needed no persuading that myths and magic can be fun. Equally, as the children of a science writer, they have no trouble distinguishing myth from science, and recognise that facts can have their own magic: “I didn’t know my 185-million-greats-grandfather was a fish, 417 million years ago!” said Ravinder, complaining that they had never taught him this astonishing fact when he did evolution at school.
They also seemed to grasp that some questions are not amenable to a precise scientific answer. They both readily accepted that the question of who was the first person, and when did they live, can’t have a precise answer because it’s like asking, “When does someone stop being a baby and become a toddler?” Dawkins does narrow it down to somewhere between a million and a hundred thousand years ago “when our ancestors were sufficiently different from us that a modern person wouldn’t have been able to breed with them if they had met”, which is all well and good although it did leave me having to explain to the nine-year-old what “breed” means.
The boys wanted to make sure that Dawkins’s collaborator, illustrator Dave McKean, got a special mention for his “fantastic” and “brilliant” illustrations, “which make the book come alive” (their words), because “just seeing them makes you want to look more closely and read the words”. But they warned me not to look at pages 94 and 95 before going to bed “because reality is a bit too real in that picture”. Like a typical adult I ignored them and regretted it: the close-up of dust mites had me scratching all night long.
For Jaz, Dawkins is fun to read “because he writes like he’s talking to you”. Ravinder’s verdict? “This is the best non-fiction book I’ve ever read.” Coming from someone who reads more than I do, that’s some endorsement. Having written a book on quantum physics, I hope that when the boys get round to reading it one day they will see it for what it clearly is, and revise their all-time list. But it seems I’ll just have to accept that with the help of McKean, Dawkins has conjured up a book that deserves the top slot, at least for the time being.

Wednesday, 14 December 2011

The Quantum Universe


The Quantum Universe: Everything that can happen does happen by Brian Cox and Jeff Forshaw
Daily Telegraph, 22 October 2011
More than 10,000,000,000,000,000,000 transistors are manufactured each year. For an idea of the magnitude of this number, it is roughly 100 times greater than all the grains of rice consumed annually by the people of planet Earth. This astonishing fact about the fundamental building block of all electronic devices is buried deep within The Quantum Universe, the latest book from Brian Cox and Jeff Forshaw. The very first transistor computer, built in 1953, had just 92 transistors, but today more than 100,000 can be bought for the cost of a single grain of rice and there are around a billion of them in a mobile phone. It is easy to see why Cox and Forshaw believe the invention of this device was “the most important application of quantum theory”, while the theory itself is “the prime example of the infinitely esoteric becoming the profoundly useful”.


It is esoteric because the theory describes a reality in which a particle can be in several places at once and moves from one place to another by exploring the entire universe simultaneously. The American physicist Richard Feynman spent much of his career exploring the quantum universe, but nevertheless cautioned: “I think I can safely say that nobody understands quantum mechanics. Do not keep asking yourself, if you can possibly avoid it, ‘But how can it be like that?’ Nobody knows how it can be like that.”
Heeding this advice and sticking to the maxim that “following the rules is far simpler than trying to visualise what they actually mean”, Cox and Forshaw set out to “demystify quantum theory”. If they do not entirely succeed, it says more about the size of the task they have set themselves than its execution. The word “quantum”, they warn at the outset, is at “once evocative, bewildering and fascinating”. Having written a narrative history myself with that one word as a title, I know exactly what they mean.
Peppered with diagrams and equations, The Quantum Universe is not an easy read. We encounter Planck's constant (nature’s own axe for chopping up energy and much else besides); the principle of least action; the wave function; the uncertainty principle; electron standing waves; the exclusion principle; semiconductors; Feynman diagrams; quantum electrodynamics; the Higgs boson and the standard model of particle physics. The reader is made to work along the way and for those prepared to do so there is much to learn. Why, for example, empty space isn’t empty but is a seething maelstrom of subatomic particles.
While they sidestep the question of its interpretation and the decades-long debate between Albert Einstein, Niels Bohr and others, for Cox and Forshaw there is no better demonstration of the power of the scientific method than quantum mechanics. Nobody could have come up with the theory without the aid of detailed experiments, and the physicists who came up with it were forced to suspend and then discard their previously held beliefs to explain the evidence that confronted them. In an attempt to convince any sceptical readers about the power of quantum mechanics, the authors turn to the death of stars and the Chandrasekhar limit as they champion curiosity-driven research.
The sun is a gaseous mix of protons, neutrons, electrons and photons with the volume of a million earths that is slowly collapsing under its own gravity. This compression heats the core to such temperatures that protons fuse together to form helium nuclei. The fusion process releases energy that increases the pressure on the outer layers of the star, thus balancing the inward pull of gravity. And so it will go on for the next five billion years until the sun runs out of material to fuse and ends up as a super dense ball of nuclear matter in a sea of electrons known as a white dwarf. It’s a fate that will befall more than 95 per cent of the stars in our galaxy. In the book’s highlight, confined to the epilogue, Cox and Forshaw show how it’s possible to calculate, approximately, the largest possible mass of these stars.
The detailed and more complex calculation was originally published in 1931 by the Indian astrophysicist, and future Nobel laureate, Subrahmanyan Chandrasekhar. It led to two remarkable predictions: white dwarf stars exist and they cannot have a mass greater than 1.4 times that of the sun. Astronomers have catalogued some 10,000 white dwarfs and the largest recorded mass is just under 1.4 solar masses. Depending only on four of nature’s fundamental numbers – Planck’s constant, the speed of light, Newton’s gravitational constant and the mass of the proton – Chandrasekhar’s limit is a stunning triumph of the scientific method. “The eternal mystery of the world is its comprehensibility,” Einstein wrote. “The fact that it is comprehensible is a miracle.”
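The dependence on those four constants can be sketched with dimensional analysis alone: (ħc/G)^(3/2)/m_p² is the combination with units of mass, and plugging in the numbers lands within a factor of order one of Chandrasekhar's 1.4 solar masses. A rough check, with the dimensionless prefactor of the full calculation simply set to one:

```python
# Back-of-envelope Chandrasekhar limit from the four constants the review
# names; the exact dimensionless prefactor of the detailed 1931 calculation
# is omitted, so this is only an order-of-magnitude estimate.

HBAR = 1.055e-34      # Planck's constant over 2*pi, J*s
C = 2.998e8           # speed of light, m/s
G = 6.674e-11         # Newton's gravitational constant, m^3 kg^-1 s^-2
M_PROTON = 1.673e-27  # proton mass, kg
M_SUN = 1.989e30      # mass of the sun, kg

m_ch = (HBAR * C / G) ** 1.5 / M_PROTON**2
print(f"~{m_ch / M_SUN:.1f} solar masses")  # ~1.9
```

Getting within a factor of order one of the observed 1.4 solar masses from nothing but four constants of nature is the point Cox and Forshaw are making.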

Wednesday, 21 September 2011

Knocking on Heaven's Door

Knocking on Heaven's Door: How physics and scientific thinking illuminate the universe and the modern world by Lisa Randall


Independent, 16 September 2011 

Why do things weigh what they do? It seems like a simple enough question, but physicists don't know for sure why particles weigh anything at all. For the best part of 50 years they have had an answer – the Higgs boson. It plays such a fundamental role in nature that it's been dubbed the "God Particle".

Attempting to answer the question of how the universe got its mass means searching for the Higgs boson. It's a nine billion-dollar enterprise involving thousands of scientists and the largest, most complex machine ever built. The Large Hadron Collider (LHC) occupies an enormous 26.6km circular tunnel that stretches between the Jura Mountains and Lake Geneva across the French-Swiss border. Electric fields inside accelerate two beams of protons as they go around 11,000 times per second.
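Those two figures already imply that the protons are moving at essentially the speed of light. A quick sanity check using only the numbers in the paragraph above:

```python
# A proton circling a 26.6 km ring 11,000 times a second is travelling at
# nearly 98 per cent of the speed of light.

C_LIGHT = 299_792_458   # speed of light, m/s
circumference = 26.6e3  # tunnel circumference, metres
revs_per_sec = 11_000   # revolutions per second

speed = circumference * revs_per_sec
print(f"{speed:.3e} m/s, {speed / C_LIGHT:.1%} of c")  # 2.926e+08 m/s, 97.6% of c
```

With the text's rounded figures the protons come out at about 98 per cent of light speed; the real beams at full energy run even closer to it.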

In this fascinating book, Lisa Randall, professor of theoretical physics at Harvard, explains the experimental research at the LHC and the theories that try to anticipate what they will find: "The goal... is to probe the structure of matter at distances never before measured and at energies higher than have ever been explored." These energies should generate an array of exotic fundamental particles and reveal interactions that occurred early in the universe's evolution, roughly a trillionth of a second after the Big Bang, 13.75 billion years ago. In the debris of colliding protons, physicists hope to find the Higgs boson and glimpse the nature of the dark energy and dark matter that make up 96 per cent of the universe.

It was 1964 when Peter Higgs conceived of an invisible field that filled the cosmos immediately after the Big Bang. As the newborn universe expanded and cooled, the field switched on. At that moment massless particles that had been travelling at the speed of light were caught in the field and became massive. The more strongly they felt the effects of the field, the more massive they became. Without this field, atoms, molecules, galaxies, stars and planets would not exist.

The Higgs field is like a field of snow that stretches forever in all directions. Beams of light move as though they have skis on: they zip through the field as if it weren't there. Some particles have snowshoes while others go barefoot and trudge around. A particle's mass is simply a measure of how much it gets bogged down in the field.

The ripples in the Higgs field appear as particles called Higgs bosons – the snowflakes that make up the cosmic snowfield, and the thing that physicists need in order to explain why stuff weighs anything. The Higgs mechanism explains how elementary particles go from having zero mass in the absence of the Higgs field to having the masses measured in experiments. The Higgs boson is a crucial part of what's called the Standard Model of particle physics. It's a construction made out of 24 fundamental building-blocks of matter: 18 of these particles are quarks – six types, each coming in three varieties. The remaining six are called "leptons", a family that includes electrons.

There are also other particles known as "bosons", responsible for transmitting forces of nature. The electromagnetic force is carried by photons – the particles of light. Inside atomic nuclei, quarks are stuck together by the strong force carried by "gluons". The W and Z bosons carry the weak force that is responsible for radioactive decay. "With these ingredients," explains Randall, "physicists have been able to successfully predict the results of all particle physics experiments to date."

On 10 September 2008, the world's media gathered near Geneva at CERN, the European laboratory for particle physics, to watch the LHC being switched on. "People followed the trajectory of two spots of light on a computer screen with unbelievable excitement," recalls Randall. In the months to follow, the LHC was to be cranked up to energies that would replicate those of the early universe, but nine days later euphoria transformed into despair as a malfunction triggered an emergency shutdown. After a year-long delay and repairs costing $40m, the LHC came back online in November 2009.

Yet there are other, even bigger, problems in particle physics that the LHC should help to solve. One is the hierarchy problem. The Higgs mechanism addresses the question of why fundamental particles have mass. The hierarchy problem asks why those masses are what they are.

Another concerns hints about the "holy grail of physics", the so-called "theory of everything". The best candidate for such a theory is superstrings, in which particles are really little oscillating bits of "string". The different levels of "vibration" of these strings correspond to the different particles. Alas, it was later found that there were at least five different string theories. Physicists were relieved when it was discovered they were all just different approximations to a more fundamental theory called M-theory. However, the theory poses enormous conceptual and mathematical challenges.

The "super" in superstrings refers to something called supersymmetry. The LHC will be used to look for "supersymmetric particles". If found, they would provide the first tangible evidence in support of superstrings and M-theory. The proponents of superstrings and M-theory justify their creation by pointing to their elegance and beauty.

And there's the problem. The "quest for beauty", which elevates aesthetics over empirical evidence in the formulation of a theory, took centre stage in the more esoteric areas of theoretical physics and cosmology, in the absence of experimental data. An appreciation of beauty certainly has a role to play when faced with a blank piece of paper; an appeal to aesthetic criteria is part of the physicists' unshakeable belief in the underlying simplicity and beauty of nature.

It is one of their most powerful guiding principles. Nature should not be more complicated than it has to be, they tell themselves. It is this belief that motivates the search for a "theory of everything". Randall quotes Keats: "Beauty is truth, truth beauty". It can't be denied that "the search for beauty - or at least simplicity - had also led to truth". Yet she finds the assumption "a little slippery" and readily admits that "although everyone would love to believe that beauty is at the heart of great scientific theories, and that the truth will always be aesthetically satisfying, beauty is at least in part a subjective criterion".

There is nothing wrong with speculation; it is a necessary and vital part of any science, as a first step. The danger of "truth through beauty" in physics, as Randall describes it, is that it makes a virtue of necessity. Wherever experimental evidence can be coaxed out of nature, it suffices to corroborate or refute a theory and serves as the sole arbiter of validity. As Darwin's champion Thomas Huxley once said, "science is organized common sense where many a beautiful theory was killed by an ugly fact". Despite the delays, the LHC will be a source of invaluable new data, providing stringent constraints on what phenomena or theories beyond the Standard Model can exist. We may be on the edge of discovery, but for the moment the Higgs boson remains a hypothetical particle on which rests the weight of the universe.

Wednesday, 4 May 2011

Robin Ince: The Science of Comedy


You don’t have to leave your brain at the door when going to a gig. I met up with the comedian who is taking Brian Cox and other scientists on tour.

Daily Telegraph, 30 April 2011

The “free visitor destination for the incurably curious”, otherwise known as the Wellcome Collection, opposite London’s Euston station, seemed an apt place to meet Robin Ince, comedian and co-presenter of Radio 4’s science-meets-humour chat show The Infinite Monkey Cage.

“There are a lot of intelligent, well-read comedians out there who are interested in science and who want to share their passions,” says Ince, who has done more than anyone to help them do just that. He is the brains behind Nine Lessons and Carols for Godless People, a variety show that celebrates science while giving the audience a healthy dose of humour and music.

Each Christmas since 2008 the shows have played to packed houses of non-religious people grabbing the opportunity to laugh out loud at the likes of comedian and trained physicist Dara Ó Briain and being entertained by bite-sized lectures from scientists like the evolutionary biologist Richard Dawkins. “If the Royal Variety Show was put in a matter transportation machine with the Royal Institution Christmas Lectures,” says Ince, “this is what you’d get.” It’s what he calls “reading-list comedy”, because it’s all about ideas that leave the audience wanting more – and a bibliography.



Ince is about to give them more with his new tour, Uncaged Monkeys: A Night of Science and Wonder, opening in Oxford tomorrow and ending with two nights at London’s Hammersmith Apollo on May 16 and 17.

Ince’s fellow “monkeys” will be Brian Cox, recently on our screens presenting Wonders of the Universe; Ben Goldacre, psychiatrist and slayer of bad science; and Simon Singh, the best-selling science writer and celebrated debunker of the claims of alternative medicine. With their guests the quartet will be tackling everything from the Big Bang to bonobo apes and anything else they can cram into two hours.

Once again the driving force, Ince describes himself as “the idiot who will guide the audience”. Though he loved science as a child, he explains that he lost interest in it around the age of 13, “when science seemed to become facts and dull experiments with apparently no link to the world”. There was, he regrets, “no sense that the periodic table is really the ingredients list of the universe so far”.

It was only in his mid twenties that the popular books of Nobel Prize-winning, bongo-drum-playing physicist Richard Feynman rekindled his curiosity for all things scientific. “Taking a tour about science to theatres that seat up to 3,000 people is a project I’ve wanted to do for a long time,” admits Ince. The fact that he can do so may in part be down to an English-born, Canadian journalist and writer living in New York, one Malcolm Gladwell.

In November 2008, Gladwell’s two performances at the Lyceum, one of the largest theatres in London’s West End, were quickly sold out. A staffer at the New Yorker magazine, Gladwell is often described as one of the most brilliant and influential writers of his generation. His bestselling books, such as The Tipping Point and Blink, identify and explore social trends and behaviour in novel ways. After his gigs in London he returned to Britain the following year to play four dates at venues that you’d normally associate with hip indie bands. Gladwell, with his afro and charisma, made ideas sexy, very much as Brian Cox is doing today.

Ince and Cox’s fellow uncaged monkey Simon Singh identifies three distinct types of event that are taking place: listening to scientists (lectures), discussing with scientists and celebrating science. “People have always gone to science lectures,” he says, “but the discussion and celebration of science in pubs and theatres is new.” He recently introduced a lecture by American physicist Brian Greene to an audience of 900 at the Southbank. He admits that big events at big venues, like the Uncaged Monkeys or a lecture by a world-famous scientist, might not be “everybody’s cup of tea”.

For those who prefer things on a smaller scale, there is an ever-growing number of events like The Bright Club, a monthly variety night founded in 2009 by comedy promoter Miriam Miller and Steve Cross, University College London’s head of public engagement, as an arena for UCL staff and students to break free from their desks and labs and perform routines based on their research.

“Physically going out to these events involves a different level of engagement, than, say, watching Horizon at home, because you form part of the evening as an audience member,” says Miller. “You can go with friends and discuss the issues raised in the break or on the bus home, and at some of these events you can even interact with the people presenting information to you.” She believes that we have all the information in the world at our fingertips but that we don’t necessarily spend time discussing it with other people. She also believes that this social aspect is an important one: people who are interested in intelligent things usually don’t get to enjoy them together.

“Traditionally they’d watch TV or read books, both of which are pretty solitary,” argues Cross. “Other than that there are public lectures, which can be great, but most people just aren’t used to being lectured at for an hour.”

It seems more of us are prepared to let loose our inner geek, even if it’s just for the odd night. And it’s something that excites Ince because, “when you go to a well-run science gig, you don’t just come out saying 'That was fun’, you leave with your mind reeling with ideas that haunt and intrigue you”. We are not yet a nation of science-loving geeks, but as Ince says: “People now aren’t afraid to admit they like science. How can someone wilting under a stack of celebrity swimsuit mags belittle someone looking up at the stars?”

Friday, 8 April 2011

From Eternity to Here

From Eternity to Here: The Quest for the Ultimate Theory of Time by Sean Carroll

Daily Telegraph, 9 April 2011


“What is time?” It’s the sort of question asked by philosophers, physicists and, sooner or later, children. While reading From Eternity to Here I was relieved that my eight-year-old was actually asking “What is the time?” That was a question I could answer. As for the other, most of us would side with St Augustine: “If no one asks me, I know. If I wish to explain it to one that asketh, I know not.”

St Augustine, having tackled original sin, contemplated the nature of time and concluded that “neither future nor past exists, and it is inexact language to speak of three times – past, present and future”. There, in a nutshell, is the problem that Sean Carroll, a theoretical physicist at the California Institute of Technology, explores in this fascinating book. Why is there a past, present and future? In other words, why is there an “arrow of time”?

Before Einstein, it had long been assumed that time and space were fixed and distinct, the stage on which the never-ending drama of the cosmos was played out. Einstein discovered space and time were not absolute and unchanging, that spatial distances and time intervals between events depended on the relative motion of observers. He found that space and time were woven together to form the fabric of the universe: space-time.

Yet there is one crucial difference between space and time. While it is possible to move in any direction in space, the ticks of a clock forever march time forward. This inexorable flight of time’s arrow from past to present to future is bound up with the second law of thermodynamics. Put simply, the amount of disorder, what physicists call entropy, increases with the passage of time.

Breaking eggs to make an omelette, stirring milk into coffee or spilling wine all exhibit, says Carroll, “the fundamental irreversibility that is the hallmark of the arrow of time”. It is the increase in entropy, in the disorderliness of the world, which makes these everyday events irreversible and separates the past from the future. Eggs can’t spontaneously unscramble or spilt wine jump back into the bottle because that would lead to a decrease in entropy. But why should entropy always increase?

“Understanding the arrow of time is a matter of understanding the origin of the universe,” Carroll argues. For him the reason we can’t unscramble an egg is due to the low entropy conditions in the early universe some 13 billion years ago. Attempting to explain how such a low entropy state was possible has led Carroll to become one of an increasing number of physicists who in recent years have begun to question whether the Big Bang was really the beginning of the universe. For him it is “simply a plausible hypothesis, not a result established beyond reasonable doubt” and it is conceivable that space and time extend beyond the moment that we identify as “the Big Bang”.

Traditionally, questions about what was there “before the Big Bang” have been dismissed as meaningless: since space and time were deemed to be created at the Big Bang, there simply was no “before”. Instead of the universe, theorists now talk of the “multiverse” and “baby universes” that Carroll believes provide “a natural mechanism for creating more and more entropy in the universe”.

From Eternity to Here is not for the faint-hearted, but it’s a rewarding read because there are no answers yet to some of science’s toughest questions. “There are ideas, and some ideas seem more promising than others, but all of them are somewhat vague, and we certainly haven’t yet put the final pieces together,” admits Carroll as he guides the reader through some of the most exotic parts of the landscape of modern theoretical physics and cosmology: from evaporating black holes to wormhole construction, from the many worlds interpretation to cosmic inflation.

But the question remains: “what is time?” The response of the American physicist John Wheeler is worth remembering: “Time is nature’s way of keeping everything from happening at once.”

Thursday, 24 March 2011

The Book of Universes

The Book of Universes by John D Barrow

Independent, 25 March 2011


"Einstein explained his theory to me every day and on my arrival I was fully convinced that he understood it," reported Chaim Weizmann. He would become the first president of Israel, but in 1921 was accompanying Einstein on a transatlantic voyage to New York. The theory in question was general relativity, in which gravity is due to the warping of space caused by the presence of mass. The Earth moves around the Sun not because some mysterious invisible force pulls it, but because of the warping of space due to the Sun's enormous mass.

"The theory is beautiful beyond comparison," Einstein wrote. When, in November 1919, British astronomers announced that they had discovered that gravity bends light – as predicted by general relativity – it made headlines around the world. Yet buried within his greatest achievement was what Einstein called "my greatest blunder".

He knew that his equations could be solved in a number of different ways, with each solution representing a model of a possible universe. Like everyone else at the time, Einstein believed that the actual universe was eternal and unchanging. So he introduced a term (his "greatest blunder") into the equations that ensured exactly that. It was left to others, a Russian mathematician and then a Belgian Jesuit priest, to find and take seriously the solutions that pointed to an expanding universe. Soon this non-static model attracted some experimental support.

In the 1920s, the American astronomer Edwin Hubble discovered two remarkable facts. First, what we had long assumed to be the universe was actually our host galaxy and there were many other such "island universes". Second, he found that light from these distant galaxies was stretched towards the red end of the visible spectrum. This so-called redshift is evidence that these galaxies are moving away from our own Milky Way and that the universe is expanding.

Eventually, this led theorists to a universe that exploded into being in a Big Bang some 13 billion years ago from a single point, called a singularity, which was infinitely hot and dense. Add a surge of accelerated expansion only a trillion trillion trillion trillionth of a second after the Big Bang that lasted for only a trillion trillion trillionth of a second, and the discovery that 96 per cent of it is made up of dark matter and dark energy, and we arrive at the most popular model of our universe.

In the 20th century, cosmology became a bona fide scientific discipline, but there remains plenty of room for some metaphysical speculation. What exactly do we mean by "universe"? Is the universe everything that has existed, does exist and will ever exist? asks Cambridge cosmologist John Barrow. What about including all that cannot exist? After all, as he points out, some medieval philosophers "were drawn to this sort of completeness, adding everything that has existed, does exist and will not exist to the catalogue of what was, is and will be".

Barrow and his colleagues are not only interested in the structure and history of our universe. There are other universes that live inside black holes, or are chaotically unpredictable or allow time travel into the past. However, the most mind-numbing concept of all only emerged in the 1990s: the never-ending "multiverse" – the universe of all possible universes. There can be few better guides to the bewildering array of potential universes, and none so readable or entertaining.

Wednesday, 23 March 2011

The Meeting of Minds

The Meeting of Minds
Nature.com, 23 March 2011

I first saw the photograph of those gathered at the fifth Solvay conference, which was held in Brussels from 24 to 29 October 1927, in a biography of Albert Einstein. This was in 1979, when I was just 16. I wondered what brought these people together, and soon learned that the picture included most of the key players involved in the discovery of the quantum, and the subsequent development of quantum physics. With 17 of the 29 invited eventually earning a Nobel Prize, the conference was one of the most spectacular meetings of minds ever held.


When I was 18, I was given a print of the above photograph as a present. Many years later I began to think about it as a possible starting point for a book about the quantum. In the photograph there are nine seated in the front row: eight men and one woman; six have Nobel Prizes in either physics or chemistry. The woman has two, one for physics, awarded in 1903, and another for chemistry, awarded in 1911. It could only be Marie Curie. In the centre, the place of honour, sits Albert Einstein. Looking straight ahead, gripping the chair with his right hand, he seems ill at ease. Is it the winged collar and tie that are causing him discomfort, or is it what he has heard during the preceding week? At the end of the second row, on the right, is Niels Bohr, looking relaxed with a half-whimsical smile. It had been a good conference for him. Nevertheless, Bohr would be returning to Denmark disappointed that he had failed to convince Einstein to adopt his Copenhagen interpretation of what quantum mechanics revealed about the nature of reality.

Instead of yielding, Einstein had spent the week attempting to show that quantum mechanics was inconsistent, that Bohr's 'Copenhagen interpretation' was flawed. Einstein said years later that:

This theory reminds me a little of the system of delusions of an exceedingly intelligent paranoic, concocted of incoherent elements of thoughts.

It was Max Planck, sitting on Marie Curie's right, holding his hat and cigar, who discovered the quantum. In 1900 he was forced to accept that the energy of light, and all other forms of electromagnetic radiation, could only be emitted or absorbed by matter in bits, bundled up in various sizes. 'Quantum' was the name Planck gave to an individual packet of energy, with 'quanta' being the plural. The quantum of energy was a radical break with the long-established idea that energy was emitted or absorbed continuously, like water flowing from a tap. In the everyday world of the macroscopic, where the physics of Newton ruled supreme, water could drip from a tap, but energy was not exchanged in droplets of varying size. However, the atomic and subatomic level of reality was the domain of the quantum.

Bohr discovered that the energy of an electron inside an atom was 'quantised'; it could possess only certain amounts of energy and not others. The same was true of other physical properties, as the microscopic realm was found to be lumpy and discontinuous – not some shrunken version of the large-scale world that we humans inhabit, where physical properties vary smoothly and continuously, and where going from A to C means passing through B. Quantum physics revealed that an electron in an atom can be in one place, and then, as if by magic, reappear in another without ever being anywhere in between, by emitting or absorbing a quantum of energy.

By the early 1920s, it had long been apparent that the advance of quantum physics on an ad hoc, piecemeal basis had left it without solid foundations or a logical structure. Out of this state of confusion and crisis emerged a bold new theory, known as quantum mechanics, with Werner Heisenberg and Erwin Schrödinger, third and sixth from the right in the back row, leading the way. In 1927 Heisenberg made a discovery. It was so at odds with common sense that he initially struggled to grasp its significance. The uncertainty principle said that if you want to know the exact velocity of a particle, then you cannot know its exact location, and vice versa.

Bohr believed he knew how to interpret the equations of quantum mechanics; what the theory was saying about the nature of reality. Questions about cause and effect, or whether the moon exists when no one is looking at it, had been the preserve of philosophers since the time of Plato and Aristotle. However, after the emergence of quantum mechanics they were being discussed by the twentieth century's greatest physicists.

The debate that began between Einstein and Bohr at the Solvay conference in 1927 raised issues that continue to preoccupy many physicists and philosophers to this day: what is the nature of reality, and what kind of description of reality should be regarded as meaningful? 'No more profound intellectual debate has ever been conducted', claimed the scientist and novelist CP Snow. 'It is a pity that the debate, because of its nature, can't be common currency.'

When Einstein and Bohr first met in Berlin in 1920, each found an intellectual sparring partner who would, without bitterness or rancour, push and prod the other into refining and sharpening his thinking about the quantum. 'It was a heroic time,' recalled Robert Oppenheimer, who was a student in the 1920s. 'It was a period of patient work in the laboratory, of crucial experiments and daring action, of many false starts and many untenable conjectures. It was a time of earnest correspondence and hurried conferences, of debate, criticism and brilliant mathematical improvisation. For those who participated it was a time of creation.'

Planck, Einstein, Bohr, Heisenberg, Schrödinger, Born, Pauli, de Broglie, Dirac, the leading lights of the quantum revolution, are all there in that picture.

Sunday, 20 March 2011

Geek Nation

Geek Nation: How Indian science is taking over the world by Angela Saini

Financial Times, 19-20 March 2011

‘It shall be the duty of every citizen of India to develop the scientific temper, humanism and the spirit of inquiry and reform.’ The inclusion of this statement in the Indian constitution, which came into effect on January 26 1950, was insisted upon by Jawaharlal Nehru, India’s first prime minister.

Nehru’s ‘scientific temper’ is a wonderfully concise phrase, which describes his vision of a nation in which people could think independently, employ logic and understand the scientific method. In a land of religion, Nehru put his faith in science and technology. He believed that it was ‘science alone that can solve the problems of hunger and poverty, insanitation and illiteracy, of superstition and deadening custom and tradition’ and that the ‘future belongs to science and to those who make friends with science’. Nehru wanted a nation of geeks.

‘Wherever in the world we live, Indians and people of Indian origin are famous for being swots, nerds, dweebs, boffins, and dorks,’ writes Angela Saini in Geek Nation. A British science journalist of Indian parentage, Saini spent six months in India exploring Nehru’s geek nation almost 50 years after his death.

With a population approaching 1.2 billion, India has the largest pool of scientists and engineers in the world. While the literacy rate hovers around a dismal 60 per cent, some 400 universities produce two million graduates every year, including a staggering 600,000 engineers, the most sought-after of whom come from the 16 Indian Institutes of Technology (IITs). Yet, instead of discovering hothouses of intellectual curiosity and innovation, Saini found drones, not geeks. The relentless pressure on India’s students is ‘disabling imaginations’ and driving hundreds to suicide.

From the vast Soviet-style Bhabha Atomic Research Centre to the Academy of Sanskrit Research, ‘the geeky and the bizarre’ sit side-by-side; wacky ideas are more easily tolerated than in the west. Indians, Saini observes, have ‘a unique freedom to explore the edges of what’s believed to be possible’.

Indian science is far from taking over the world: it currently contributes less than 3 per cent of global research output, lagging far behind the US and UK. Yet an increasing number of Indian researchers, having established reputations abroad, are returning home to lead a younger generation.

Saini’s vivid portrait of hi-tech India reveals a country in a hurry. No one knows how long it will take, but India’s present economic expansion is a reminder that more than 1,000 years ago it had a scientific culture as advanced as any in the world. ‘The Empires of the future,’ Winston Churchill once said, ‘are going to be the empires of the mind.’

Wednesday, 2 March 2011

The man who went nuclear


The Man Who Went Nuclear: How Ernest Rutherford Ushered in the Atomic Age


Independent, 3 March 2011

Did the nuclear age begin in 1942, when Chicago Pile-1, a reactor built in a squash court, went "critical" by achieving a self-sustaining chain reaction? Or was it on 16 July 1945 in the Jemez mountains in New Mexico, when "The Gadget", the first atomic bomb, was successfully tested and Robert Oppenheimer quoted the Bhagavad Gita? Maybe it was June 1954, when the Russian Obninsk nuclear station first generated electricity for the grid.

In reality, it was during a meeting of the Manchester Literary and Philosophical Society that the nuclear age was announced, on Tuesday, 7 March 1911, by Professor Ernest Rutherford, the 39-year-old head of physics at Manchester University. Rutherford was born in 1871, in Spring Grove, New Zealand, to a family descended from Scottish emigrants. It was from this scattered rural community on the north coast of the South Island that Rutherford's aptitude for science and maths led in 1895 to a coveted place at Cambridge. There, under the direction of JJ Thomson, Rutherford established a reputation as a fine experimentalist with a study of X-rays.

Though surrounded at Cambridge by all the excitement generated by Thomson's discovery of the electron in 1897, Rutherford opted to investigate radioactivity and soon found that there were two distinct types of radiation emitted from uranium, which he called alpha and beta, before a third was discovered, called gamma rays.

Aged just 27, in 1898, he was appointed professor of physics at McGill University in Montreal, Canada. Among his successes over the next nine years, the most important was the discovery, with his collaborator Frederick Soddy, that radioactivity was the transformation of one element into another due to the emission of an alpha or beta particle. Rutherford regarded "all science as either physics or stamp collecting" but saw the funny side when he received the 1908 Nobel prize for chemistry for this seminal work. By then he was in Manchester.

"Youthful, energetic, boisterous, he suggested anything but the scientist," was how Chaim Weizmann, then a chemist but later the first president of Israel, remembered Rutherford in Manchester. "He talked readily and vigorously on any subject under the sun, often without knowing anything about it. Going down to the refectory for lunch, I would hear the loud, friendly voice rolling up the corridor."

At the time Rutherford was busy using the alpha particle to probe and unlock the secrets of the atom. But what exactly is an alpha particle? It was a question that Rutherford and his German colleague Hans Geiger answered. It was a helium ion; that is, a helium atom that had been stripped of its two electrons. Rutherford had noticed, while still in Montreal, that some alpha particles passing through thin sheets of metal were slightly deflected, causing fuzziness on a photographic plate. It was something he asked Geiger to investigate.

As instructed by Rutherford he fired beams of alpha particles at some gold foil and by the tiny flashes of light when they struck a zinc sulphide screen discovered that a few "were deflected through quite an appreciable angle". Soon afterwards Rutherford assigned a research project to a promising undergraduate called Ernest Marsden: "Why not let him see if any alpha particles can be scattered through a large angle?" Marsden found some alpha particles bouncing straight back after hitting the gold foil and Rutherford was shocked: "It was almost as incredible as if you had fired a 15-inch shell at a piece of tissue paper and it came back and hit you."

Marsden and Geiger made comparative measurements using different metals and they discovered exactly the same large-angle scattering. In June 1909 they published their extraordinary results, but with Rutherford unable to offer any kind of explanation they attracted little interest.

After decades of intense arguments, by 1910 the reality of atoms was established beyond reasonable doubt. The most widely accepted atomic model was Thomson's so-called "plum pudding". Its ingredients were a ball of diffuse "positive electricity" in which negatively charged electrons were embedded like plums in a pudding. But Rutherford knew that the atom of his old mentor couldn't explain alpha particle scattering. The probability that the accumulated effect of a number of tiny ricochets off electrons in Thomson's atom resulted in even one alpha particle being scattered backwards was almost zero. By December 1910, Rutherford believed that given the mass and energy of an alpha particle the large deflections must be the result of a single collision with an atom. It led him "to devise an atom superior to J.J.'s", he said at the time.

Rutherford's atom consisted of a tiny central core containing virtually all the atomic mass, which he later called the nucleus, but it occupied only a minute volume "like a fly in a cathedral". Most alpha particles would pass straight through Rutherford's atom in any "collision", since they were too far from the tiny nucleus at its heart to suffer any deflection. But if an alpha particle approached the nucleus head-on, the repulsive force between the two would cause it to recoil straight back like a ball bouncing off a brick wall. Rutherford said that such direct hits were "like trying to shoot a gnat in the Albert Hall at night". Rutherford's model allowed him to make definite predictions, using a simple formula he had derived, about the fraction of scattered alpha particles to be found at any angle of deflection.
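That simple formula is what physicists now call the Rutherford scattering cross-section. As a sketch in modern notation (not the form Rutherford originally wrote it in), for a projectile of charge ze and kinetic energy E scattering off a nucleus of charge Ze, the fraction deflected through angle θ falls off as:

```latex
% Rutherford differential cross-section (Gaussian units), modern notation
\frac{d\sigma}{d\Omega} \;=\; \left(\frac{zZe^{2}}{4E}\right)^{2}\frac{1}{\sin^{4}(\theta/2)}
% z = 2 for an alpha particle. The steep 1/sin^4 dependence is why
% large-angle deflections are rare but, crucially, not impossible.
```

The steep fall-off with angle explains why Geiger and Marsden saw only a handful of alpha particles bounce back, yet enough of them to rule out Thomson's model.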

Experimental checks performed by Geiger and Marsden confirmed the predictions, but few physicists beyond Manchester paid the nuclear atom any serious attention. Although Rutherford did not explicitly suggest a planetary model of the atom, there were those who knew that's exactly what it was. For most of them that settled the matter: Rutherford's atom was fatally flawed. A model of the atom with electrons moving around the nucleus, like planets orbiting the sun, would collapse. Any object moving in a circle undergoes acceleration, and an accelerating charged particle, like an electron, continuously loses energy in the form of radiation. An electron in orbit around the nucleus would therefore spiral into it. Rutherford's atom was unstable, and the existence of the material world was compelling evidence against it. Enter Niels Bohr.

Arriving in Manchester in March 1912 to learn about radioactivity, it wasn't long before the young Dane began thinking about how to prevent Rutherford's nuclear atom from collapsing. His solution employed the quantum – the idea that energy comes in packets. Bohr argued that electrons inside an atom could only move in certain orbits in which they did not radiate energy and therefore couldn't spiral into the nucleus. Bohr said that each orbit had a certain energy associated with it, so all the allowed orbits were in effect a series of energy levels, like the rungs of a ladder. For an electron to move between levels, the famous quantum leap, required it to absorb or emit a quantum of energy that was equivalent to the difference in energy between the two levels.
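The bookkeeping behind the quantum leap fits in one line. As a sketch in modern notation (not Bohr's original derivation), an electron dropping from level m to level n emits a quantum of light whose frequency ν is fixed by energy conservation:

```latex
% Energy conservation for a quantum jump between Bohr levels
h\nu \;=\; E_{m}-E_{n}
% For hydrogen Bohr found E_n = -13.6\,\text{eV}/n^{2}, and these
% differences reproduced the observed spectral lines of the atom.
```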

"It is difficult to overestimate the scientific importance of the discovery of the nucleus," says Sean Freeman, professor of nuclear physics at Manchester University. "Rutherford's insight, imagination and attention to detail enabled him to make revolutionary discoveries using rather rudimentary technology by modern standards. He was a true pioneer."

One of Rutherford's most important achievements was made in his spare time during the First World War, while he was developing methods for detecting submarines – he split the atom. Arriving late for a committee meeting one day, Rutherford didn't apologise, but announced: "I have been engaged in experiments which suggest that the atom can be artificially disintegrated. If it is true, it is of far greater importance than a war!" Not until 1919 did he publish the results: by using alpha particles to knock positively charged particles, which he called protons, out of nitrogen nuclei, he had effectively split the nucleus and hence the atom. It was the last work he did at Manchester before moving to Cambridge to take over from Thomson as head of the Cavendish Laboratory.

It was there that in 1932 his colleagues John Cockcroft and Ernest Walton "split the atom" using the world's first particle accelerator. Also at the Cavendish, James Chadwick followed up Rutherford's suggestion that heavier nuclei probably contained another constituent and discovered the neutron – the particle that plays the central role in establishing a nuclear chain reaction. The three men were among the 11 former students and colleagues of Rutherford who would win the Nobel prize.

Another of those 11 was Niels Bohr, who said that Rutherford never spoke more angrily to him than he did one evening at a Royal Society dinner. He had overheard Bohr refer to him by his title (Rutherford was awarded a peerage in 1931) and angrily asked the Dane loudly: "Do you Lord me?" Rutherford never cared for the honours and was indifferent to academic or social standing. What mattered most to him were the results of experiments. "I was brought up to look at the atom as a nice hard fellow, red or grey in colour, according to taste," he once said. It was a model he replaced with an atom that began the nuclear age.

Saturday, 12 February 2011

The 4% Universe

The 4% Universe: Dark matter, dark energy and the race to discover the rest of reality by Richard Panek

Times, 12 February 2011


For Galileo seeing was believing. When in 1609 he learnt of the Dutch invention of the telescope, he quickly constructed his own. With no reason to think there was anything to find, he searched the night sky and found that there was far more to the universe than meets the naked eye. He saw that the Moon had mountains, the Sun had spots and he observed the phases of Venus. With the discovery of Jupiter's moons, Galileo found hard evidence that not all heavenly bodies revolved around the Earth. In March 1610 he published The Starry Messenger, his report of what he had seen. All 500 copies were sold within a week.

Four centuries later Galileo's successors know that they cannot see, even using their dazzling variety of modern telescopes, an astonishing 96 per cent of the universe. The tiny fraction that is visible to their fine-tuned instruments is the stuff that we and all the countless planets, stars and galaxies are made from. Get rid of us and of everything else we've ever thought of as the universe, and very little would change. 'We're just a bit of pollution,' one cosmologist says. We may be irrelevant, but the rest of reality has been dubbed 'dark' and for the American science writer Richard Panek it 'could go down in history as the ultimate semantic surrender'. For this is not 'dark' as in distant or invisible, but 'dark' as in unknown – for now at least.

Yet what is known is that almost a quarter of what can’t be seen is something called dark matter. Although its very nature is a mystery, its presence is discernible through its gravitational effect on the movement of galaxies. Without dark matter the astronomical data doesn’t make sense.

From a derelict iron mine in Minnesota to mountaintop observatories, at a pace that would shame many a thriller writer, Panek tells the story of the quest to unlock the secrets of dark matter and the particles that make it up. These weakly interacting massive particles, or WIMPs, have proven so elusive that the possibility that two were detected in November 2009 caused great excitement.

Dark matter is less than half the tale Panek wants to tell. For three quarters of the unknown universe consists of an even stranger substance called dark energy. Its existence was inferred, once again, from the circumstantial evidence gathered by astronomers measuring what could be seen. They didn’t need Sherlock Holmes to remind them that after eliminating the impossible, whatever remains, no matter how improbable, is the truth.

In the late 1990s two rival teams set out to collect data on distant supernovae in an attempt to determine the rate at which the universe was expanding. It was assumed that the pull of gravity would act as a brake on the pace of expansion. To their disbelief they found that space-time was being pushed apart faster than ever before. Something was overwhelming the force of gravity to drive the expansion. Dark energy was winning the cosmic tug-of-war.

With a future Nobel prize at stake, disputes and arguments over who did what and when were inevitable. Panek provides a behind-the-scenes glimpse of science in the raw as alliances are forged and friendships strained. There is a new universe to explore and the latest experiments reveal it is 13.75 billion years old and made up of 72.8 per cent dark energy, 22.7 per cent dark matter and 4.5 per cent ordinary matter. These numbers are 'an exquisitely precise accounting of the depths of our ignorance,' says Panek. 'It's 1610 all over again.'