Wednesday, 21 September 2011

Knocking on Heaven's Door

Knocking on Heaven's Door: How physics and scientific thinking illuminate the universe and the modern world by Lisa Randall


Independent, 16 September 2011 

Why do things weigh what they do? It seems like a simple enough question, but physicists don't know for sure why particles weigh anything at all. For the best part of 50 years they have had an answer – the Higgs boson. It plays such a fundamental role in nature that it's been dubbed the "God Particle".

Attempting to answer the question of how the universe got its mass means searching for the Higgs boson. It's a nine billion-dollar enterprise involving thousands of scientists and the largest, most complex machine ever built. The Large Hadron Collider (LHC) occupies an enormous 26.7km circular tunnel that stretches between the Jura Mountains and Lake Geneva across the French-Swiss border. Electric fields inside accelerate two counter-rotating beams of protons, which lap the ring around 11,000 times per second.
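
The lap rate is easy to check with rough arithmetic (a back-of-the-envelope estimate, not a figure from the book): the protons travel at very nearly the speed of light, about 300,000 km/s, around a ring roughly 26.7km long, so

\[ f \approx \frac{c}{L} \approx \frac{3\times10^{5}\ \text{km/s}}{26.7\ \text{km}} \approx 1.1\times10^{4}\ \text{laps per second}. \]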

In this fascinating book, Lisa Randall, professor of theoretical physics at Harvard, explains the experimental research at the LHC and the theories that try to anticipate what the experiments will find: "The goal... is to probe the structure of matter at distances never before measured and at energies higher than have ever been explored." These energies should generate an array of exotic fundamental particles and reveal interactions that occurred early in the universe's evolution, roughly a trillionth of a second after the Big Bang, 13.75 billion years ago. In the debris of colliding protons, physicists hope to find the Higgs boson and catch a glimpse of the nature of the dark energy and dark matter that make up 96 percent of the universe.

It was 1964 when Peter Higgs conceived of an invisible field that filled the cosmos immediately after the Big Bang. As the newborn universe expanded and cooled, the field switched on. At that moment, massless particles that had been travelling at the speed of light were caught in the field and became massive. The more strongly they felt the effects of the field, the more massive they became. Without this field, atoms, molecules, galaxies, stars and planets would not exist.

 The Higgs field is like a field of snow that stretches forever in all directions. Beams of light move as though they have skis on: they zip through the field as if it weren't there. Some particles have snowshoes while others go barefoot and trudge around. A particle's mass is simply a measure of how much it gets bogged down in the field.

The ripples in the Higgs field appear as particles called Higgs bosons – the snowflakes that make up the cosmic snowfield, and the thing that physicists need in order to explain why stuff weighs anything. The Higgs mechanism explains how elementary particles go from having zero mass in the absence of the Higgs field to having the masses measured in experiments. The Higgs boson is a crucial part of what's called the Standard Model of particle physics. It's a construction made out of 24 fundamental building-blocks of matter: 18 of these particles are the six types of quark, each of which comes in three varieties. The remaining six are called "leptons", a family that includes the electron.

There are also other particles known as "bosons", responsible for transmitting forces of nature. The electromagnetic force is carried by photons – the particles of light. Inside atomic nuclei, quarks are stuck together by the strong force carried by "gluons". The W and Z bosons carry the weak force that is responsible for radioactive decay. "With these ingredients," explains Randall, "physicists have been able to successfully predict the results of all particle physics experiments to date."

On 10 September 2008, the world's media gathered at CERN, the European particle physics laboratory near Geneva, to watch the LHC being switched on. "People followed the trajectory of two spots of light on a computer screen with unbelievable excitement," recalls Randall. In the months to follow, the LHC was to be cranked up to energies that would replicate those of the early universe, but nine days later euphoria turned to despair as a malfunction triggered an emergency shutdown. After a year-long delay and repairs costing $40m, the LHC came back online in November 2009.

Yet there are other, even bigger, problems in particle physics that the LHC should help to solve. One is the hierarchy problem. The Higgs mechanism addresses the question of why fundamental particles have mass; the hierarchy problem asks why those masses are what they are.

Another concerns hints about the "holy grail of physics", the so-called "theory of everything". The best candidate for such a theory is superstrings, in which particles are really little oscillating bits of "string". The different levels of "vibration" of these strings correspond to the different particles. Alas, it was later found that there were at least five different string theories. Physicists were relieved when it was discovered they were all just different approximations to a more fundamental theory called M-theory. However, the theory poses enormous conceptual and mathematical challenges.

The "super" in superstrings refers to something called supersymmetry. The LHC will be used to look for "supersymmetric particles". If found, they would provide the first tangible evidence in support of superstrings and M-theory. The proponents of superstrings and M-theory justify their creation by pointing to its elegance and beauty.

And there's the problem. The "quest for beauty", which elevates aesthetics over empirical evidence in the formulation of a theory, took centre stage in the more esoteric areas of theoretical physics and cosmology, in the absence of experimental data. An appreciation of beauty certainly has a role to play when faced with a blank piece of paper; an appeal to aesthetic criteria is part of the physicists' unshakeable belief in the underlying simplicity and beauty of nature.

It is one of their most powerful guiding principles. Nature should not be more complicated than it has to be, they tell themselves. It is this belief that motivates the search for a "theory of everything". Randall quotes Keats: "Beauty is truth, truth beauty". It can't be denied that "the search for beauty - or at least simplicity - had also led to truth". Yet she finds the assumption "a little slippery" and readily admits that "although everyone would love to believe that beauty is at the heart of great scientific theories, and that the truth will always be aesthetically satisfying, beauty is at least in part a subjective criterion".

There is nothing wrong with speculation; it is a necessary and vital part of any science, as a first step. The danger of "truth through beauty" in physics, as Randall describes it, is that it makes a virtue of necessity. Wherever experimental evidence can be coaxed out of nature, it is what corroborates or refutes a theory and serves as the sole arbiter of validity. As Darwin's champion Thomas Huxley once said, "science is organized common sense where many a beautiful theory was killed by an ugly fact". Despite the delays, the LHC will be a source of invaluable new data that will provide stringent constraints on what phenomena or theories beyond the Standard Model can exist. We may be on the edge of discovery, but for the moment the Higgs boson remains a hypothetical particle on which rests the weight of the universe.

Wednesday, 4 May 2011

Robin Ince: The Science of Comedy


You don’t have to leave your brain at the door when going to a gig. I met up with the comedian who is taking Brian Cox and other scientists on tour.

Daily Telegraph, 30 April 2011

The “free visitor destination for the incurably curious”, otherwise known as the Wellcome Collection, opposite London’s Euston station, seemed an apt place to meet Robin Ince, comedian and co-presenter of Radio 4’s science-meets-humour chat show The Infinite Monkey Cage.

“There are a lot of intelligent, well-read comedians out there who are interested in science and who want to share their passions,” says Ince, who has done more than anyone to help them do just that. He is the brains behind Nine Lessons and Carols for Godless People, a variety show that celebrates science while giving the audience a healthy dose of humour and music.

Each Christmas since 2008 the shows have played to packed houses of non-religious people grabbing the opportunity to laugh out loud at the likes of comedian and trained physicist Dara Ó Briain and to be entertained by bite-sized lectures from scientists like the evolutionary biologist Richard Dawkins. “If the Royal Variety Show was put in a matter transportation machine with the Royal Institution Christmas Lectures,” says Ince, “this is what you’d get.” It’s what he calls “reading-list comedy”, because it’s all about ideas that leave the audience wanting more – and a bibliography.



Ince is about to give them more with his new tour, Uncaged Monkeys: A Night of Science and Wonder, opening in Oxford tomorrow and ending with two nights at London’s Hammersmith Apollo on May 16 and 17.

Ince’s fellow “monkeys” will be Brian Cox, recently on our screens presenting Wonders of the Universe; Ben Goldacre, psychiatrist and slayer of bad science; and Simon Singh, the best-selling science writer and celebrated debunker of the claims of alternative medicine. With their guests the quartet will be tackling everything from the Big Bang to bonobo apes and anything else they can cram into two hours.

Once again the driving force, Ince describes himself as “the idiot who will guide the audience”. Though he loved science as a child, he explains that he lost interest in it around the age of 13, “when science seemed to become facts and dull experiments with apparently no link to the world”. There was, he regrets, “no sense that the periodic table is really the ingredients list of the universe so far”.

It was only in his mid-twenties that the popular books of Nobel Prize-winning, bongo-drum-playing physicist Richard Feynman rekindled his curiosity for all things scientific. “Taking a tour about science to theatres that seat up to 3,000 people is a project I’ve wanted to do for a long time,” admits Ince. The fact that he can do so may in part be down to an English-born Canadian journalist and writer living in New York, one Malcolm Gladwell.

In November 2008, Gladwell’s two performances at the Lyceum, one of the largest theatres in London’s West End, quickly sold out. A staffer at the New Yorker magazine, Gladwell is often described as one of the most brilliant and influential writers of his generation. His bestselling books, such as The Tipping Point and Blink, identify and explore social trends and behaviour in novel ways. After his gigs in London he returned to Britain the following year to play four dates at venues that you’d normally associate with hip indie bands. Gladwell, with his afro and charisma, made ideas sexy, very much as Brian Cox is doing today.

Ince and Cox’s fellow uncaged monkey Simon Singh identifies three distinct types of event taking place: listening to scientists (lectures), discussing with scientists and celebrating science. “People have always gone to science lectures,” he says, “but the discussion and celebration of science in pubs and theatres is new.” He recently introduced a lecture by the American physicist Brian Greene to an audience of 900 at the Southbank. He admits that big events at big venues, like the Uncaged Monkeys or a lecture by a world-famous scientist, might not be “everybody’s cup of tea”.

For those who prefer things on a smaller scale, there is an ever-growing number of events like The Bright Club, a monthly variety night founded in 2009 by comedy promoter Miriam Miller and Steve Cross, University College London’s head of public engagement, as an arena for staff and students from UCL to break free from their desks and labs and perform routines based on their research.

“Physically going out to these events involves a different level of engagement than, say, watching Horizon at home, because you form part of the evening as an audience member,” says Miller. “You can go with friends and discuss the issues raised in the break or on the bus home, and at some of these events you can even interact with the people presenting information to you.” She believes that we have all the information in the world at our fingertips but that we don’t necessarily spend time discussing it with other people. She also believes that this social aspect is an important one: people who are interested in intelligent things usually don’t get to enjoy them together.

“Traditionally they’d watch TV or read books, both of which are pretty solitary,” argues Cross. “Other than that there are public lectures, which can be great, but most people just aren’t used to being lectured at for an hour.”

It seems more of us are prepared to let loose our inner geek, even if it’s just for the odd night. And it’s something that excites Ince because, “when you go to a well-run science gig, you don’t just come out saying 'That was fun’, you leave with your mind reeling with ideas that haunt and intrigue you”. We are not yet a nation of science-loving geeks, but as Ince says: “People now aren’t afraid to admit they like science. How can someone wilting under a stack of celebrity swimsuit mags belittle someone looking up at the stars?”

Friday, 8 April 2011

From Eternity to Here

From Eternity to Here: The Quest for the Ultimate Theory of Time by Sean Carroll

Daily Telegraph, 9 April 2011


“What is time?” It’s the sort of question asked by philosophers, physicists and, sooner or later, children. While reading From Eternity to Here I was relieved that my eight-year-old was actually asking “What is the time?” That was a question I could answer. As for the other, most of us would side with St Augustine: “If no one asks me, I know. If I wish to explain it to one that asketh, I know not.”

St Augustine, having tackled original sin, contemplated the nature of time and concluded that “neither future nor past exists, and it is inexact language to speak of three times – past, present and future”. There, in a nutshell, is the problem that Sean Carroll, a theoretical physicist at the California Institute of Technology, explores in this fascinating book. Why is there a past, present and future? In other words, why is there an “arrow of time”?

Before Einstein, it had long been assumed that time and space were fixed and distinct, the stage on which the never-ending drama of the cosmos was played out. Einstein discovered space and time were not absolute and unchanging, that spatial distances and time intervals between events depended on the relative motion of observers. He found that space and time were woven together to form the fabric of the universe: space-time.

Yet there is one crucial difference between space and time. While it is possible to move in any direction in space, the ticks of a clock forever march time forward. This inexorable flight of time’s arrow from past to present to future is bound up with the second law of thermodynamics. Put simply, the amount of disorder, what physicists call entropy, increases with the passage of time.

Breaking eggs to make an omelette, stirring milk into coffee or spilling wine all exhibit, says Carroll, “the fundamental irreversibility that is the hallmark of the arrow of time”. It is the increase in entropy, in the disorderliness of the world, which makes these everyday events irreversible and separates the past from the future. Eggs can’t spontaneously unscramble or spilt wine jump back into the bottle because that would lead to a decrease in entropy. But why should entropy always increase?
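
In the standard formulation (a textbook statement, not Carroll’s own wording), the second law says the entropy S of a closed system never decreases, and Boltzmann tied S to the number of microscopic arrangements W compatible with what we observe:

\[ \Delta S \geq 0, \qquad S = k_B \ln W . \]

A scrambled egg corresponds to vastly more microscopic arrangements than an intact one, which is why the film only ever runs in one direction.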

“Understanding the arrow of time is a matter of understanding the origin of the universe,” Carroll argues. For him, the reason we can’t unscramble an egg lies in the low-entropy conditions of the early universe some 13 billion years ago. Attempting to explain how such a low-entropy state was possible has led Carroll to become one of an increasing number of physicists who in recent years have begun to question whether the Big Bang was really the beginning of the universe. For him it is “simply a plausible hypothesis, not a result established beyond reasonable doubt” and it is conceivable that space and time extend beyond the moment that we identify as “the Big Bang”.

Traditionally, questions about what was there “before the Big Bang” have been dismissed as meaningless: since space and time were deemed to have been created at the Big Bang, there simply was no “before”. Instead of the universe, theorists now talk of the “multiverse” and “baby universes” that Carroll believes provide “a natural mechanism for creating more and more entropy in the universe”.

From Eternity to Here is not for the faint-hearted, but it’s a rewarding read because there are no answers yet to some of science’s toughest questions. “There are ideas, and some ideas seem more promising than others, but all of them are somewhat vague, and we certainly haven’t yet put the final pieces together,” admits Carroll as he guides the reader through some of the most exotic parts of the landscape of modern theoretical physics and cosmology: from evaporating black holes to wormhole construction, from the many-worlds interpretation to cosmic inflation.

But the question remains: “what is time?” The response of the American physicist John Wheeler is worth remembering: “Time is nature’s way of keeping everything from happening at once.”

Thursday, 24 March 2011

The Book of Universes

The Book of Universes by John D Barrow

Independent, 25 March 2011


"Einstein explained his theory to me every day and on my arrival I was fully convinced that he understood it," reported Chaim Weizmann. He would become the first president of Israel, but in 1921 was accompanying Einstein on a transatlantic voyage to New York. The theory in question was general relativity, in which gravity is due to the warping of space caused by the presence of mass. The Earth moves around the Sun not because some mysterious invisible force pulls it, but because of the warping of space due to the Sun's enormous mass.

"The theory is beautiful beyond comparison," Einstein wrote. When, in November 1919, British astronomers announced that they had discovered that gravity bends light – as predicted by general relativity – it made headlines around the world. Yet buried within his greatest achievement was what Einstein called "my greatest blunder".

He knew that his equations could be solved in a number of different ways, with each solution representing a model of a possible universe. Like everyone else at the time, Einstein believed that the actual universe was eternal and unchanging. So he introduced a term (his "greatest blunder") into the equations that ensured exactly that. It was left to others, a Russian mathematician and then a Belgian priest, to find and take seriously the solutions that pointed to an expanding universe. Soon this non-static model attracted some experimental support.

In the 1920s, the American astronomer Edwin Hubble discovered two remarkable facts. First, what we had long assumed to be the entire universe was actually just our home galaxy, and there were many other such "island universes". Second, he found that light from these distant galaxies was stretched towards the red end of the visible spectrum. This so-called redshift is evidence that the galaxies are moving away from our own Milky Way and that the universe is expanding.

Eventually, this led theorists to a universe that exploded into being in a Big Bang some 13 billion years ago from a single point, called a singularity, which was infinitely hot and dense. Add a surge of accelerated expansion only a trillion trillion trillion trillionth of a second after the Big Bang that lasted for just a trillion trillion trillionth of a second, plus the discovery that 96 per cent of the universe is made up of dark matter and dark energy, and we arrive at the most popular model of our universe.

In the 20th century, cosmology became a bona fide scientific discipline, but there remains plenty of room for some metaphysical speculation. What exactly do we mean by "universe"? Is the universe everything that has existed, does exist and will ever exist? asks Cambridge cosmologist John Barrow. What about including all that cannot exist? After all, as he points out, some medieval philosophers "were drawn to this sort of completeness, adding everything that has existed, does exist and will not exist to the catalogue of what was, is and will be".

Barrow and his colleagues are not only interested in the structure and history of our own universe. There are other universes that live inside black holes, are chaotically unpredictable or allow time travel into the past. However, the most mind-bending concept of all only emerged in the 1990s: the never-ending "multiverse" – the universe of all possible universes. There can be few better guides to this bewildering array of potential universes, and none so readable or entertaining.

Wednesday, 23 March 2011

The Meeting of Minds

The Meeting of Minds
Nature.com, 23 March 2011

I first saw the photograph of those gathered at the fifth Solvay conference, which was held in Brussels from 24 to 29 October 1927, in a biography of Albert Einstein. This was in 1979, when I was just 16. I wondered what brought these people together, and soon learned that the picture included most of the key players involved in the discovery of the quantum, and the subsequent development of quantum physics. With 17 of the 29 invited eventually earning a Nobel Prize, the conference was one of the most spectacular meetings of minds ever held.


When I was 18, I was given a print of the above photograph as a present. Many years later I began to think about it as a possible starting point for a book about the quantum. In the photograph there are nine seated in the front row. Eight men, and one woman; six have Nobel Prizes in either physics or chemistry. The woman has two, one for physics, awarded in 1903, and another for chemistry, awarded in 1911. It could only be Marie Curie. In the centre, the place of honour, sits Albert Einstein. Looking straight ahead, gripping the chair with his right hand, he seems ill at ease. Is it the winged collar and tie that are causing him discomfort, or is it what he has heard during the preceding week? At the end of the second row, on the right, is Niels Bohr, looking relaxed with a half-whimsical smile. It had been a good conference for him. Nevertheless, Bohr would be returning to Denmark disappointed that he had failed to convince Einstein to adopt his Copenhagen interpretation of what quantum mechanics revealed about the nature of reality.

Instead of yielding, Einstein had spent the week attempting to show that quantum mechanics was inconsistent, that Bohr's 'Copenhagen interpretation' was flawed. Einstein said years later that:

This theory reminds me a little of the system of delusions of an exceedingly intelligent paranoic, concocted of incoherent elements of thoughts.

It was Max Planck, sitting on Marie Curie's right, holding his hat and cigar, who discovered the quantum. In 1900 he was forced to accept that the energy of light, and all other forms of electromagnetic radiation, could only be emitted or absorbed by matter in bits, bundled up in various sizes. 'Quantum' was the name Planck gave to an individual packet of energy, with 'quanta' being the plural. The quantum of energy was a radical break with the long-established idea that energy was emitted or absorbed continuously, like water flowing from a tap. In the everyday world of the macroscopic, where the physics of Newton ruled supreme, water could drip from a tap, but energy was not exchanged in droplets of varying size. However, the atomic and subatomic level of reality was the domain of the quantum.
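
In modern notation (a standard textbook statement rather than anything quoted from the essay), the energy of a single quantum is fixed by the frequency ν of the radiation and a new constant of nature, h, now called Planck's constant:

\[ E = h\nu . \]

The higher the frequency, the bigger the packet of energy.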

Bohr discovered that the energy of an electron inside an atom was 'quantised': it could possess only certain amounts of energy and not others. The same was true of other physical properties, as the microscopic realm was found to be lumpy and discontinuous. It is not some shrunken version of the large-scale world that we humans inhabit, where physical properties vary smoothly and continuously and going from A to C means passing through B. Quantum physics revealed that an electron in an atom can be in one place and then, as if by magic, reappear in another without ever being anywhere in between, by emitting or absorbing a quantum of energy.

By the early 1920s, it had long been apparent that the advance of quantum physics on an ad hoc, piecemeal basis had left it without solid foundations or a logical structure. Out of this state of confusion and crisis emerged a bold new theory known as quantum mechanics, with Werner Heisenberg and Erwin Schrödinger, third and sixth from the right in the back row, leading the way. In 1927 Heisenberg made a discovery so at odds with common sense that he initially struggled to grasp its significance. The uncertainty principle said that if you want to know the exact velocity of a particle, then you cannot know its exact location, and vice versa.
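
Stated for position and momentum (momentum being mass times velocity), the uncertainty principle in its familiar modern form (a standard result, not a quotation from the essay) reads

\[ \Delta x \, \Delta p \;\geq\; \frac{\hbar}{2} , \]

where ħ is Planck's constant divided by 2π: the more precisely one quantity is pinned down, the more uncertain the other becomes.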

Bohr believed he knew how to interpret the equations of quantum mechanics; what the theory was saying about the nature of reality. Questions about cause and effect, or whether the moon exists when no one is looking at it, had been the preserve of philosophers since the time of Plato and Aristotle. However, after the emergence of quantum mechanics they were being discussed by the twentieth century's greatest physicists.

The debate that began between Einstein and Bohr at the Solvay conference in 1927 raised issues that continue to preoccupy many physicists and philosophers to this day: what is the nature of reality, and what kind of description of reality should be regarded as meaningful? 'No more profound intellectual debate has ever been conducted', claimed the scientist and novelist CP Snow. 'It is a pity that the debate, because of its nature, can't be common currency.'

When Einstein and Bohr first met in Berlin in 1920, each found an intellectual sparring partner who would, without bitterness or rancour, push and prod the other into refining and sharpening his thinking about the quantum. 'It was a heroic time,' recalled Robert Oppenheimer, who was a student in the 1920s. 'It was a period of patient work in the laboratory, of crucial experiments and daring action, of many false starts and many untenable conjectures. It was a time of earnest correspondence and hurried conferences, of debate, criticism and brilliant mathematical improvisation. For those who participated it was a time of creation.'

Planck, Einstein, Bohr, Heisenberg, Schrödinger, Born, Pauli, de Broglie, Dirac, the leading lights of the quantum revolution, are all there in that picture.

Sunday, 20 March 2011

Geek Nation

Geek Nation: How Indian science is taking over the world by Angela Saini

Financial Times, 19-20 March 2011

‘It shall be the duty of every citizen of India to develop the scientific temper, humanism and the spirit of inquiry and reform.’ The inclusion of this statement in the Indian constitution, which came into effect on January 26 1950, was insisted upon by Jawaharlal Nehru, India’s first prime minister.

Nehru’s ‘scientific temper’ is a wonderfully concise phrase, which describes his vision of a nation in which people could think independently, employ logic and understand the scientific method. In a land of religion, Nehru put his faith in science and technology. He believed that it was ‘science alone that can solve the problems of hunger and poverty, insanitation and illiteracy, of superstition and deadening custom and tradition’ and that the ‘future belongs to science and to those who make friends with science’. Nehru wanted a nation of geeks.

‘Wherever in the world we live, Indians and people of Indian origin are famous for being swots, nerds, dweebs, boffins, and dorks,’ writes Angela Saini in Geek Nation. A British science journalist of Indian parentage, Saini spent six months in India exploring Nehru’s geek nation almost 50 years after his death.

With a population approaching 1.2 billion, India has the largest pool of scientists and engineers in the world. While the literacy rate hovers around a dismal 60 per cent, some 400 universities produce two million graduates every year, including a staggering 600,000 engineers, the most sought-after of whom come from the 16 Indian Institutes of Technology (IITs). Yet, instead of discovering hothouses of intellectual curiosity and innovation, Saini found drones, not geeks. The relentless pressure on India’s students is ‘disabling imaginations’ and driving hundreds to suicide.

From the vast Soviet-style Bhabha Atomic Research Centre to the Academy of Sanskrit Research, ‘the geeky and the bizarre’ sit side-by-side; wacky ideas are more easily tolerated than in the west. Indians, Saini observes, have ‘a unique freedom to explore the edges of what’s believed to be possible’.

Indian science is far from taking over the world: it currently contributes less than 3 per cent of global research output, lagging far behind the US and UK. Yet an increasing number of Indian researchers, having established reputations abroad, are returning home to lead a younger generation.

Saini’s vivid portrait of hi-tech India reveals a country in a hurry. No one knows how long it will take, but India’s present economic expansion is a reminder that more than 1,000 years ago it had a scientific culture as advanced as any in the world. ‘The Empires of the future,’ Winston Churchill once said, ‘are going to be the empires of the mind.’

Wednesday, 2 March 2011

The man who went nuclear


The Man Who Went Nuclear: How Ernest Rutherford Ushered in the Atomic Age


Independent, 3 March 2011

Did the nuclear age begin in 1942, when Chicago Pile-1, a reactor built in a squash court, went "critical" by achieving a self-sustaining chain reaction? Or was it on 16 July 1945 in the Jemez mountains in New Mexico, when "The Gadget", the first atomic bomb, was successfully tested and Robert Oppenheimer quoted the Bhagavad Gita? Maybe it was June 1954, when the Russian Obninsk nuclear station first generated electricity for the grid.

In reality, it was during a meeting of the Manchester Literary and Philosophical Society that the nuclear age was announced, on Tuesday, 7 March 1911, by Professor Ernest Rutherford, the 39-year-old head of physics at Manchester University. Rutherford was born in 1871 in Spring Grove, New Zealand, to a family descended from Scottish emigrants. It was from this scattered rural community on the north coast of the South Island that his aptitude for science and maths led in 1895 to a coveted place at Cambridge. There, under the direction of JJ Thomson, Rutherford established a reputation as a fine experimentalist with a study of X-rays.

Though surrounded at Cambridge by all the excitement generated by Thomson's discovery of the electron in 1897, Rutherford opted to investigate radioactivity and soon found that there were two distinct types of radiation emitted from uranium, which he called alpha and beta, before a third was discovered, called gamma rays.

Aged just 27, in 1898, he was appointed professor of physics at McGill University in Montreal, Canada. Among his successes over the next nine years the most important was the discovery, with his collaborator Frederick Soddy, that radioactivity was the transformation of one element into another due to the emission of an alpha or beta particle. Rutherford regarded "all science as either physics or stamp collecting" but saw the funny side when he received the 1908 Nobel prize for chemistry for this seminal work. By then he was in Manchester.

"Youthful, energetic, boisterous, he suggested anything but the scientist," was how Chaim Weizmann, then a chemist but later the first president of Israel, remembered Rutherford in Manchester. "He talked readily and vigorously on any subject under the sun, often without knowing anything about it. Going down to the refectory for lunch, I would hear the loud, friendly voice rolling up the corridor."

At the time Rutherford was busy using the alpha particle to probe and unlock the secrets of the atom. But what exactly is an alpha particle? It was a question that Rutherford and his German colleague Hans Geiger answered. It was a helium ion; that is, a helium atom that had been stripped of its two electrons. Rutherford had noticed, while still in Montreal, that some alpha particles passing through thin sheets of metal were slightly deflected, causing fuzziness on a photographic plate. It was something he asked Geiger to investigate.

As instructed by Rutherford, Geiger fired beams of alpha particles at gold foil and, from the tiny flashes of light produced when they struck a zinc sulphide screen, discovered that a few "were deflected through quite an appreciable angle". Soon afterwards Rutherford assigned a research project to a promising undergraduate called Ernest Marsden: "Why not let him see if any alpha particles can be scattered through a large angle?" Marsden found some alpha particles bouncing straight back after hitting the gold foil and Rutherford was shocked: "It was almost as incredible as if you had fired a 15-inch shell at a piece of tissue paper and it came back and hit you."

Marsden and Geiger made comparative measurements using different metals and discovered exactly the same large-angle scattering. In June 1909 they published their extraordinary results, but with Rutherford unable to offer any kind of explanation they attracted little interest.

After decades of intense argument, by 1910 the reality of atoms was established beyond reasonable doubt. The most widely accepted atomic model was Thomson's so-called "plum pudding". Its ingredients consisted of a ball of diffuse "positive electricity" in which negatively charged electrons were embedded like plums in a pudding. But Rutherford knew that the atom of his old mentor couldn't explain alpha particle scattering. The probability that the accumulated effect of a number of tiny ricochets off electrons in Thomson's atom would result in even one alpha particle being scattered backwards was almost zero. By December 1910, Rutherford believed that given the mass and energy of an alpha particle, the large deflections must be the result of a single collision with an atom. It led him "to devise an atom superior to J.J.'s", he said at the time.

Rutherford's atom consisted of a tiny central core containing virtually all the atomic mass, which he later called the nucleus, but it occupied only a minute volume, "like a fly in a cathedral". Most alpha particles would pass straight through Rutherford's atom in any "collision", since they were too far from the tiny nucleus at its heart to suffer any deflection. But if an alpha particle approached the nucleus head-on, the repulsive force between the two would cause it to recoil straight back like a ball bouncing off a brick wall. Rutherford said that such direct hits were "like trying to shoot a gnat in the Albert Hall at night". Rutherford's model allowed him to make definite predictions, using a simple formula he had derived, about the fraction of scattered alpha particles to be found at any angle of deflection.
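
The formula in question, in its modern textbook form (a standard result rather than a quotation from the article), predicts how the fraction of alpha particles scattered into an angle θ falls off:

\[ \frac{d\sigma}{d\Omega} \;=\; \left(\frac{Z_1 Z_2 e^2}{4E}\right)^{2} \frac{1}{\sin^{4}(\theta/2)} , \]

where Z1 and Z2 are the charges of the alpha particle and the target nucleus and E is the alpha particle's kinetic energy (the expression is written in Gaussian units). The tell-tale 1/sin⁴(θ/2) dependence is what Geiger and Marsden set out to check.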

Experimental checks performed by Geiger and Marsden confirmed the predictions, but few physicists beyond Manchester gave any serious attention to the nuclear atom. Although Rutherford did not explicitly suggest a planetary model of the atom, there were those who knew that's exactly what it was. For most, that settled the matter: Rutherford's atom was fatally flawed. A model of the atom with electrons moving around the nucleus, like planets orbiting the sun, would collapse. Any object moving in a circle undergoes acceleration; if it happens to be a charged particle, like an electron, then as it accelerates it continuously loses energy in the form of radiation. An electron in orbit around the nucleus would spiral into it. Rutherford's atom was unstable, and the existence of the material world was compelling evidence against it. Enter Niels Bohr.

Arriving in Manchester in March 1912 to learn about radioactivity, the 26-year-old Dane soon began thinking about how to prevent Rutherford's nuclear atom from collapsing. His solution employed the quantum – the idea that energy comes in packets. Bohr argued that electrons inside an atom could only move in certain orbits in which they did not radiate energy and therefore couldn't spiral into the nucleus. Bohr said that each orbit had a certain energy associated with it, so all the allowed orbits were in effect a series of energy levels, like the rungs of a ladder. To move between levels, the famous quantum leap, an electron had to absorb or emit a quantum of energy equivalent to the difference in energy between the two levels.
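
In symbols (the standard Bohr condition, not a quotation from the article), a jump between two allowed levels with energies E_n and E_m emits or absorbs a quantum of light whose frequency ν is fixed by the energy difference:

\[ E_m - E_n = h\nu . \]

For hydrogen the levels work out to E_n = −13.6 eV / n², which is what allowed Bohr to account for the colours of light the atom emits.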

"It is difficult to overestimate the scientific importance of the discovery of the nucleus," says Sean Freeman, professor of nuclear physics at Manchester University. "Rutherford's insight, imagination and attention to detail enabled him to make revolutionary discoveries using rather rudimentary technology by modern standards. He was a true pioneer."

One of his most important achievements was made in his spare time while Rutherford was developing methods for detecting submarines during the First World War – he split the atom. Arriving late for a committee meeting one day, Rutherford didn't apologise, but announced: "I have been engaged in experiments which suggest that the atom can be artificially disintegrated. If it is true, it is of far greater importance than a war!" It was 1919 before he published the results, which showed that the nucleus contained positively charged particles he called protons; by knocking them out of nitrogen nuclei using alpha particles he had effectively split the nucleus, and hence the atom. It was the last work he did at Manchester before moving to Cambridge to take over from Thomson as head of the Cavendish Laboratory.

It was there that in 1932 his colleagues John Cockcroft and Ernest Walton "split the atom" using the world's first particle accelerator. Also at the Cavendish, James Chadwick used Rutherford's suggestion that there was probably another constituent to heavier nuclei to discover the neutron, the particle that plays the central role in establishing a nuclear chain reaction. The three men were among the 11 former students and colleagues of Rutherford who would win the Nobel prize.

Another of those 11 was Niels Bohr, who said that Rutherford never spoke more angrily to him than he did one evening at a Royal Society dinner. He had overheard Bohr refer to him by his title (Rutherford was awarded a peerage in 1931) and angrily asked the Dane loudly: "Do you Lord me?" Rutherford never cared for the honours and was indifferent to academic or social standing. What mattered most to him were the results of experiments. "I was brought up to look at the atom as a nice hard fellow, red or grey in colour, according to taste," he once said. It was a model he replaced with an atom that began the nuclear age.

Saturday, 12 February 2011

The 4% Universe

The 4% Universe: Dark matter, dark energy and the race to discover the rest of reality by Richard Panek

Times, 12 February 2011


For Galileo, seeing was believing. When in 1609 he learnt of the Dutch invention of the telescope, he quickly constructed his own. With no reason to think there was anything to find, he searched the night sky and found that there was far more to the universe than meets the naked eye. He saw that the Moon had mountains and the Sun had spots, and he observed the phases of Venus. With the discovery of Jupiter’s moons, Galileo found hard evidence that not all heavenly bodies revolved around the Earth. In March 1610 he published The Starry Messenger, his report of what he had seen. All 500 copies were sold within a week.

Four centuries later, Galileo’s successors know that they cannot see, even using their dazzling variety of modern telescopes, an astonishing 96 per cent of the universe. The tiny fraction that is visible to their fine-tuned instruments is the stuff that we and all the countless planets, stars and galaxies are made from. Get rid of us and of everything else we’ve ever thought of as the universe, and very little would change. ‘We’re just a bit of pollution,’ one cosmologist says. We may be irrelevant, but the rest of reality has been dubbed ‘dark’, and for the American science writer Richard Panek it ‘could go down in history as the ultimate semantic surrender’. For this is not ‘dark’ as in distant or invisible, but ‘dark’ as in unknown - for now at least.

Yet what is known is that almost a quarter of what can’t be seen is something called dark matter. Although its very nature is a mystery, its presence is discernible through its gravitational effect on the movement of galaxies. Without dark matter the astronomical data doesn’t make sense.

From a derelict iron mine in Minnesota to mountaintop observatories, at a pace that would shame many a thriller writer, Panek tells the story of the quest to unlock the secrets of dark matter and the particles that make it up. These weakly interacting massive particles, or WIMPs, have proven so elusive that the possibility that two were detected in November 2009 caused great excitement.

Dark matter is less than half the tale Panek wants to tell. For three quarters of the unknown universe consists of an even stranger substance called dark energy. Its existence was inferred, once again, from the circumstantial evidence gathered by astronomers measuring what could be seen. They didn’t need Sherlock Holmes to remind them that after eliminating the impossible, whatever remains, no matter how improbable, is the truth.

In the late 1990s two rival teams set out to collect data on distant supernovae in an attempt to determine the rate at which the universe was expanding. It was assumed that the pull of gravity would act as a brake on the pace of expansion. To their disbelief they found that space-time was being pushed apart faster than ever before. Something was overwhelming the force of gravity to drive the expansion. Dark energy was winning the cosmic tug-of-war.

With a future Nobel prize at stake, disputes and arguments over who did what and when were inevitable. Panek provides a behind-the-scenes glimpse of science in the raw as alliances are forged and friendships strained. There is a new universe to explore, and the latest experiments reveal it is 13.75 billion years old and made up of 72.8 per cent dark energy, 22.7 per cent dark matter and 4.5 per cent ordinary matter. These numbers are ‘an exquisitely precise accounting of the depths of our ignorance,’ says Panek. ‘It’s 1610 all over again.’

Thursday, 10 February 2011

Culture Under The Microscope

Culture Under The Microscope
The Visceral Exhibition @ The Science Gallery, Dublin.

Independent, 10 February 2011




What is Life? It's a question that the quantum physicist Erwin Schrödinger tackled in three famous lectures given at Trinity College, Dublin. The first, on 5 February 1943, was heard by an audience that included the entire Irish cabinet led by Éamon de Valera.

Schrödinger is remembered today for making vivid the weirdness of the quantum world with his famous cat-in-the-box thought experiment: the cat is neither dead nor alive but exists in a superposition of states until we open the box and look. Yet when his Trinity College lectures were published as a book, they became influential in persuading many young physicists that Schrödinger's methods might solve some of the problems in the developing field of molecular biology. James Watson and Francis Crick cited the book as a key inspiration for the research that led them to the discovery of the double-helix structure of DNA.

"Schrödinger with his mythical 'semi-living' cat, could be described as a pioneer of BioArt," says Dr Michael John Gorman, the director of Dublin's Science Gallery, which is also located in Trinity College. His tongue is firmly in his cheek as he accompanies 40 people on the short walk from his gallery to the Schrödinger Theatre, to discuss what life is. This is one of the many activities surrounding the gallery's latest exhibition, Visceral: The Living Art Experiment.

"BioArt" was a term coined in 1997 as a number of artists abandoned paints and brush in favour of cells, fragments of DNA, proteins and living tissue. Visceral, a month-long exhibition uses new technologies, tissue and neural engineering to explore the question "what is life?" People may be put off by some of the 15 works, some of which use human tissue as book covers or retinal cells to project film. Gorman admits there is something a little queasy about creating artworks from living tissue. "The very idea of tissue-engineering becoming an art form makes us squirm," he says. However, Visceral is all about provoking the sort of instinctive gut reaction that Gorman hopes will gets visitors asking questions about the ethical implications of manipulating living material and what we mean by "living".

The exhibition's curator, Oron Catts, believes that the "logic that drives things like nanotechnology, synthetic biology and even things like neuroengineering needs to be scrutinised and explored by people other than just scientists and engineers". It was one of the reasons that Catts helped to set up SymbioticA, an artistic lab dedicated to a hands-on engagement with the life sciences, based at the University of Western Australia in Perth.

"Our interest is in life," says Catts, "not only art or science." Yet the exhibition demonstrates the depth of the potential of interactions between art and science. For Gorman, nothing illustrates this better than Silent Barrage, the largest work on show. The product of a collaboration between Neurotica, a group of five artists, and Dr Steve Potter of the Georgia Institute of Technology in Atlanta, its a cutting edge piece of neural engineering. It consists of an array of robotic poles hooked up to neurons from the brains of rats in Potter's lab.

The array responds to the way visitors move through it and sends signals back to the neurons. These neurons then fire, making the robotic poles shudder up and down. Depending on the amount of audience activity, the neurons can undergo what is called a "barrage", when they start firing in a chaotic fashion. This is exactly what happens during an epileptic seizure. With epilepsy affecting over 450,000 people in the UK alone, the scientists involved hope that the data collected might lead to a better understanding of the process by which cells are calmed and seizures mitigated. And it's not the only exhibit that promises something scientifically tangible.

The battlefield of Kathy High's Blood Wars is a Petri dish; the combatants are white blood cells drawn from two different people. After a few hours slugging it out, one set of cells will have destroyed the other. The "winner" of each cellular battle goes on to fight another participant. The concept may sound sinister to those with concerns about eugenics, but it is an ingenious attempt to engage with the age-old debates surrounding traits inherited through blood.

Catts says that cell lines create a form of immortality, since they can live beyond the life of the donor. I'm reminded of the story told by Rebecca Skloot in her bestselling book, The Immortal Life of Henrietta Lacks. Known to scientists as HeLa, Lacks died in 1951, but her cancer cells were taken without her knowledge and became one of the most important tools in medicine. The Vision Splendid, a work by Alicia King, consists of two sealed glass jars, connected by tubes, that contain nutrients and cultured human tissue. The cells were those of an unknown African-American girl aged 13. You're left wondering who owns the stuff our bodies are made of. If that worries you, then Catts offers a way to ease your troubles.

The Semi-Living Worry Dolls by Catts and Ionat Zurr are a modern version of the famous Guatemalan worry dolls, constructed out of degradable polymer on which cells are grown in micro-gravity conditions. You can whisper your troubles to them through a microphone as the cells eventually replace the polymer completely, transforming the piece from fabric to tissue.

With the Irish general election rescheduled for the closing date of Visceral on 25 February, there's a rumour going around that the Silent Barrage installation may be able to predict the outcome – if political candidates are willing to present themselves to the cultured rat neurons in person.





Friday, 4 February 2011

Unnatural

Unnatural: The Heretical Idea of Making People by Philip Ball

Guardian, 5 February 2011

The award of the Nobel Prize, when it came in October 2010, was long overdue. By then there was more than three decades' worth of growing evidence to back up the claim of two British men. Around 4 million people, none older than 33, were living proof of their pioneering work in developing the technique of in vitro fertilisation (IVF). Sadly for the gynaecologist and surgeon Dr Patrick Steptoe, who died in 1988, the Nobel isn't awarded posthumously. The sole recipient was therefore the physiologist Professor Robert Edwards, who at 85 was too ill to travel to Stockholm to collect the prize in person.

Since the birth of Louise Brown on 25 July 1978, IVF has helped and offered hope to some of the 10% of all couples worldwide who suffer from infertility. Yet the birth of the "first test tube baby" at Oldham General hospital outraged many for being the product of an unnatural interference by scientists in the creation of a human being. Repeatedly having to fend off charges that he was playing God, Edwards once complained that the early public response to IVF was conditioned by "fantasies of horror and disaster, and visions of white-coated, heartless men, breeding and rearing embryos in the laboratory to bring forth Frankenstein genetic monsters".

Philip Ball, who in Critical Mass explored how one thing leads to another, points out in his latest book, Unnatural, that traditionally the "natural" end of sex is procreation since the latter requires the former. However, religious objections to IVF, Ball argues, invoke this reasoning in reverse: the natural beginning of procreation is sex – not sex in terms of sperm meets egg, but in the anatomical sense. Hence, Ball's interest in exploring what lies beyond the "this bit goes in here" method. The result is a fascinating and impressive cultural history of anthropoeia – the centuries of myths and tales about the artificial creation of people.

Ball explores what these fables reveal about contemporary views on life, humanity and technology as modern science has turned the fantasy of making people into reality. From the homunculus of the medieval alchemists and the clay golem of Jewish legend to Frankenstein's monster and the babies in jars of Huxley's Brave New World, Ball ranges far and wide to show that the idea that making life is either hubristic or "unnatural" is a relatively recent one.

Until the Enlightenment, it was widely assumed that it was possible to make lower forms of life. For example, a process called bougonia in which bees were created using the carcasses of dead oxen was once accepted as fact. It was only in the 19th century that "spontaneous generation", the belief that life could spring forth from inanimate matter without the need for seeds, eggs or parents, was finally discredited. If there were any doubts about such practices, explains Ball, then they were about the quality and character of "artificial life" – was it inferior, equivalent, or better than "natural" life?

The ultimate "unnatural" act is the artificial creation of humans, since it challenges the conviction that we are God's chosen. Yet Ball makes a persuasive case when he suggests that the response of the medieval mind to the idea of artificial human life was very different from the horror it now typically engenders. This indicates that feelings of revulsion about these "unnatural" creations are not inevitable.

The prefix "un" was only attached to acts that were deemed reprehensible because they were contra naturam, against nature. However, people in the middle ages saw nothing intrinsically wrong in creating human and other forms of life. The problem for them was rather, as the 12th-century Muslim scholar Averroes said, that organisms made by art were like alchemist's gold, a kind of fake. In short, any "unnatural" creation lacked a soul.

Doubts about the possibility of an artificial person having a soul are still with us, though given a modern spin. The fabricated being is denied genuine humanity. He or she is thought to be soulless: lacking in love, warmth and human feeling. This same failing is now imputed to human clones – 21st-century reincarnations of Frankenstein's monster, as the very term carries connotations of spiritual vacancy. A skilled practitioner of the book-length essay, Ball can also be wonderfully succinct: "'Soul' has become a kind of watermark of humanity, a defence against the awful thought that we could be manufactured."

Debates about the pros and cons of human embryo research, cloning and the like require a focus on issues that are rooted in the particularities of our time and culture. Ball's thoughtful book is a reminder that as we try to deal with how to enable and assist people into being, we need to understand and then conquer our fears surrounding the very idea of making people.