The sun shines on a solar quest
Chasing The Sun: The epic story of the star that gave us life by Richard Cohen
Independent, 3 December 2010
A Chinese-made portable solar-powered lamp and charger was the final object chosen by Neil MacGregor, director of the British Museum, in his A History of the World in 100 Objects. It was a choice that few would have predicted, but once its solar panel captures and stores the energy of eight hours of sunshine, this 21st-century lamp converts it into 100 hours of light to be used whenever and wherever needed. "The power of the Sun seems a good place to end this global history," said MacGregor, "because solar energy is a dream of the future that echoes the oldest and most universal of human myths, that of the life-giving Sun."
Richard Cohen has produced an encyclopaedia of a book that could almost serve as a companion to MacGregor's landmark series – an alternative history of the world told through humanity's relationship with a single object.
He begins with the myth of the Sun as the god Inti, of the tribes of Peru and northern Chile, who descended into the ocean every evening, swam back to the east, then reappeared, refreshed by his bath. The Hopi of Arizona claim that they made the Sun by throwing up a buckskin shield along with a fox's coat and a parrot's tail, to make the colours of sunrise and sunset.
"The Sun is 32,000 light years from the centre of its galaxy of a hundred billion stars, which it orbits at about 155 miles a second, taking about 200 million years to complete a revolution," reports Cohen. The Sun has been active for 4.6 billion years and a single particle of light, a photon, from its core takes 150,000 years to reach space. Every second, about five million metric tonnes of mass are converted into nuclear energy: equivalent to the detonation of 90,000 million one-megaton hydrogen bombs. The numbers are mind-boggling, but this constant blast of nuclear reactions pushes energy to the surface, releasing it as light and heat.
The Earth receives more of this energy in just 45 minutes than its inhabitants consume in a year. About 35 per cent is reflected back into space by clouds and the atmosphere absorbs another 19 per cent. This still leaves 12,000 times as much energy as is used by all man-made devices.
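The scale of these figures is easier to appreciate with a quick back-of-the-envelope calculation. The short Python sketch below is mine, not Cohen's: it assumes standard reference values (the speed of light, the energy of a one-megaton bomb, the solar constant, Earth's radius and a rough figure for annual world energy use) and simply applies E = mc² and the Earth's cross-sectional area to check that the quoted numbers are of the right order of magnitude.

```python
import math

# Rough sanity check of the solar figures quoted above. The constants below are
# standard textbook values assumed for this sketch, not figures from Cohen's book.
C = 3.0e8                       # speed of light, m/s
MASS_PER_SECOND = 5.0e9         # ~5 million metric tonnes of mass converted each second, kg
ONE_MEGATON = 4.2e15            # energy released by a one-megaton H-bomb, joules
SOLAR_CONSTANT = 1361.0         # sunlight power at the top of Earth's atmosphere, W/m^2
EARTH_RADIUS = 6.371e6          # metres
WORLD_ENERGY_PER_YEAR = 5.8e20  # rough annual human energy consumption, joules

# E = mc^2 for the mass the Sun converts every second
power = MASS_PER_SECOND * C**2
print(f"Solar output: {power:.1e} watts")
print(f"One-megaton bombs per second: {power / ONE_MEGATON:.1e}")

# Sunlight intercepted by Earth's cross-section, and how long it takes
# to match a year of human energy use
intercepted = SOLAR_CONSTANT * math.pi * EARTH_RADIUS**2
print(f"Minutes of sunshine equal to a year's consumption: {WORLD_ENERGY_PER_YEAR / intercepted / 60:.0f}")
```

Run as written, it gives roughly 100,000 million bombs per second and about an hour of sunshine to match a year's consumption – the same order of magnitude as the figures Cohen cites.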
Effectively harnessing the Sun is not just a modern ambition. The Greeks in the third century BC used "burning mirrors" to focus sunlight on enemy warships. Archimedes, so legend has it, deployed such mirrors in 212 BC to defend Syracuse from a blockading Roman fleet by burning the enemy's sails. The Romans were the first to build greenhouses. In the sixth century, the Emperor Justinian passed a law protecting public and domestic sunrooms from the erection of buildings that obstructed light; a thousand years later, Leonardo da Vinci proposed using a giant mirror as a commercial source of heat.
Like some latter-day Victorian species hunter who travels the globe collecting new specimens, Cohen spent eight years chasing the Sun across 18 countries and six continents. He began with a climb to the top of Mount Fuji to watch the sunrise on the summer solstice and ended in a sunset viewed from a boat on the Ganges at Varanasi.
Among other trips, Cohen recounts seeing an eclipse on the Antarctic ice. He visits the Arctic city of Tromso in Norway, which for ten weeks each year receives virtually no sunlight (a period the locals call "the dark times"), to investigate how we react to the loss of light.
Among Cohen's treasure-trove of solar miscellanea are the "sunspot cycle" in modern economics, the story of sundials and calendars from Julius Caesar to Pope Gregory XIII, the introduction of daylight saving time in 1916 as a wartime economic measure, and a brief history of navigation and cartography. He shows how the Babylonians were the first to record the Sun's movements in detail, while the Greeks went further, inquiring into its size, shape, and distance from the Earth.
Occasionally Cohen's passion for all things solar gets the better of him, as when he suggests that the lingams of Hindu temple domes are "purposely imitated in the design of nuclear-reactor cones, the latest tribute to the Sun's potency". Yet he can be forgiven, as we learn how artists from the Renaissance to Hockney have depicted the Sun, or of Shakespeare's enthusiasm for suntans. We see why Galileo recorded his discovery of sunspots in code, why Matisse rushed out of his dying wife's bedroom, why Wagner hated the Sun and Mozart loved it.
All things come to an end, and Cohen devotes a chapter to the death of the Sun. As I read it I remembered the first part of a poem by Francis William Bourdillon:
The night has a thousand eyes,
And the day but one;
Yet the light of the bright world dies
With the dying of the sun.
Monday, 22 November 2010
Species Seekers
A Breed of their own
The Species Seekers: Heroes, Fools and the Mad Pursuit of Life on Earth by Richard Conniff
Financial Times, 20-21 November 2010
‘Our perfect naturalist,’ wrote the English clergyman, naturalist and novelist Charles Kingsley in 1855, ‘should be strong in body; able to haul a dredge, climb a rock, turn a boulder, walk all day … he should know how to swim for his life, to pull an oar, sail a boat, and ride the first horse which comes to hand; and, finally, he should be a thoroughly good shot, and a skilful fisherman; and, if he go far abroad, be able on occasion to fight for his life.’
Amid the tales of adventure and hardship vividly told by the American science writer Richard Conniff in this marvellous book, most of those naturalists who made it their mission to travel the globe in search of glory and new species were far from perfect. But as he captures the mania for collecting and cataloguing the natural world in the 18th and 19th centuries, Conniff shows that these daredevil amateurs played an invaluable ‘part in building a great and permanent body of knowledge’.
What does it mean to discover a species? Surely local people had known most of these species for many years before they were ‘discovered’. Discovery, explains Conniff, isn’t just a matter of being the first person to lay eyes on an animal, plant or insect. You must recognize that there’s something different about it and explain in print just how and why it’s different. That requires some scheme of classification.
At the beginning of the 18th century, naturalists knew only a few thousand species, and sometimes could not even distinguish plants from animals. That changed in 1735 when the Swedish botanist Carolus Linnaeus published his system for identifying and classifying species.
Armed with the Linnaean system, guns, nets, collecting boxes and an almost missionary sense of purpose, species seekers went everywhere, from the Namib Desert to the Great Barrier Reef, and brought back creatures that even the authors of medieval bestiaries could hardly have imagined.
Naturalists were often caught up in the business of conquest and colonization, using natural history to advance their own careers and to remake the world on European lines. Yet many of us are alive today, for instance, because naturalists identified obscure species that later turned out to cause malaria, yellow fever, typhus and other epidemic diseases.
Modern species seekers still aim to catalogue every species on Earth, even though the tally is nearing 2m, and new ones are found daily. In 2003, the eminent American zoologist E. O. Wilson proposed the Encyclopedia of Life, a web-based project with a page for every species within 25 years. But as Wilson acknowledged at the time, ‘the truth is that we do not know how many species of organisms exist on Earth even to the nearest order of magnitude.’ He thought the final count would be about 10m. Others believe it could be 50m or even 100m – numbers that Conniff’s heroes and fools could not have imagined. We still live in the great age of discovery and this is the story of how it began.
Friday, 19 November 2010
Sleights of Mind
A fascinating look at a new branch of cognitive research: "neuromagic"
Sleights of Mind: What the neuroscience of magic reveals about our everyday deceptions by Stephen L. Macknik, Susana Martinez-Conde and Sandra Blakeslee
New Scientist, 20 November 2010
MAGIC, it mystifies and captivates us. We shake our heads in disbelief as coins are conjured out of thin air, as cards are mysteriously summoned from a pack, and as the magician's assistant vanishes before our eyes. Of course, there is no such thing as "magic", so how does magic work? It's a question that neuroscientists like Stephen Macknik and Susana Martinez-Conde are trying to answer. In the process they have conjured up a new branch of cognitive research called neuromagic.
From misdirection and the magical practice of "forcing", to mirror neurons and synaptic plasticity, Sleights of Mind is a spellbinding mix of magic and science. The authors invite us to sip this heady potion as they show us how understanding the myriad ways in which the brain is deceived by magic may solve some of the mysteries surrounding how it works.
“Magic tricks fool us because humans have hard-wired processes of attention and awareness that are hackable,” say the authors. Magicians use your mind’s intrinsic properties against you. In a magical feat of their own, the authors persuaded magicians such as James Randi and Teller from the Las Vegas headline act Penn and Teller to deconstruct tricks so that Macknik and Martinez-Conde could later attempt to reconstruct what is going on inside your head “as you are suckered”.
Magic, say the neuroscientists, could reveal how the brain functions in everyday situations such as shopping. However, it is a stretch to believe, as the authors do, that if you’ve bought an expensive item that you never intended to buy, then you were probably a victim of the “illusion of choice”, a technique magicians use to rob their dupes of genuine choice.
The magician toys with us when he appears to put a coin into his right hand, closes it, waves his left over it, and then opens the right. The coin, which we feel must still be there, has “vanished”. He makes us experience the impossible by disrupting the expected relationship between a cause and its effect.
What we see, hear, and feel is based on what we expect to see, hear and feel due to our experiences and memories. When these expectations are violated the brain takes more time to process data or our attention focuses on the violation. Success or failure for the magician relies on his skill in diverting our attention away from the method and towards the magical effect.
Great magicians, through countless hours of practice, manipulate our attention, memory and causal inferences using a bewildering combination of visual, auditory and tactile methods. The greatest magic show on earth, though, is the one happening in your brain.
Thursday, 4 November 2010
Neutrino
Ghost Particle
Neutrino by Frank Close
New Scientist, 6 November 2010
For a moment in the late 1920s, Niels Bohr considered the unthinkable: abandoning the notion of conservation of energy. He wasn't calling for its wholesale rejection, only that it be disregarded whenever a neutron decayed into a proton and an electron, as some energy appeared to go missing along the way.
Wolfgang Pauli, who was wont to damn poor ideas as "not even wrong", came up with a solution he called "a terrible thing" - an unknown particle to account for the missing energy. Since it had to be electrically neutral with little or no mass, it was called the neutrino, the "little neutral one".
In this short and informative book, Frank Close recalls those who had the ingenuity and patience to catch and understand this elusive particle that barely interacts with other matter. Their successors are hunting neutrinos left over from the big bang, and no one knows what story these relics will tell.
Thursday, 28 October 2010
The Naked Scientist
Bare Essentials
The Naked Scientist by Chris Smith
New Scientist, 30 October 2010
If you are not a fan of Jamie Oliver, aka the Naked Chef, then the title alone may be enough to put you off this book. But resist the temptation to judge it by its cover. This is science packaged as light entertainment, with flash-facts and bite-sized stories ranging from how fish help pollinate flowers to why booze makes us drunk. Among the more fascinating entries are a study that found that people are less likely to remember brands advertised during violent and sexually explicit programmes, and the possibility that eating curry could help ward off Alzheimer's.
The Naked Scientist is the alter ego of University of Cambridge virologist Chris Smith, who wants to "strip down science to the bare essentials and expose you to what it really is - addictively enjoyable, interesting and occasionally a bit naughty". With Christmas looming, and in search of an audience, Smith bares just enough to pull it off.
Sunday, 17 October 2010
Merchants of Doubt
A lot of hot air from the sceptics
Merchants of Doubt: How a Handful of Scientists Obscured the Truth on Issues from Tobacco Smoke to Global Warming by Naomi Oreskes and Erik M Conway
Independent, 18 October 2010
Public scepticism about climate change is on the rise. The shift is due in part to the publication of the hacked emails of the University of East Anglia's "climategate" scientists. Now climatologists have to live with the fact that most people simply don't trust them. Merchants of Doubt, by two historians of science, may help restore some trust by showing that science is rarely black and white, and how its shades of grey have sometimes been distorted by a few willing hands.
Writing before "climategate", Naomi Oreskes and Erik Conway believe that "We all need a better understanding of what science really is...and how to separate it from the garbage." They tell the story of how for half a century, a small group of scientists in America collaborated with think-tanks and corporations in campaigns to discredit scientific research by creating doubt and manipulating debate.
Manufacturing doubt as an effective corporate strategy was first developed by the American tobacco industry. Determined to stop any government regulation in the face of scientific evidence linking tobacco to lung cancer, the cigarette-makers created the Council for Tobacco Research to discredit the scientists and dispute their findings. "Doubt is our product," boasted a now infamous 1969 industry memo. Doubt would shield the tobacco industry from litigation and regulation for decades to come.
The so-called "Tobacco Strategy" was used to "maintain the controversy" by promoting claims contrary to research. The peddlers of doubt insisted that scientists were wrong about the risks of Ronald Reagan's Strategic Defence Initiative, and that acid rain was caused by volcanoes. They would dismiss global warming by claiming, in turn, that there was none; if there was, it was just natural variation; finally, it didn't matter because humans would adapt. Aided by a complicit media, these claims generated the illusion of genuine scientific debate when there was none at all.
No scientific conclusion can ever be proven with certainty, but it is no more a "belief" to say that the Earth is heating up than to say that continents move or that germs cause disease. Oreskes and Conway warn that, "without some degree of trust in our designated experts we are paralyzed".
Saturday, 16 October 2010
Cycles of Time
Now and Then
Cycles of Time: An Extraordinary New View of the Universe by Roger Penrose
Guardian, 16 October 2010
When I first encountered the work of MC Escher, I couldn't understand how he managed to depict the seemingly impossible. I was nine, and the two pieces that puzzled me were Waterfall and Ascending and Descending. In the first, water at the bottom of a waterfall flows along a channel back to the top without defying gravity in a never-ending cycle. The second is even more striking, with one set of monks climbing an endless staircase while another group walk down it without either ever getting any higher or lower. Years later I learnt that both works were inspired by Roger Penrose.
As a student in 1954, Penrose was attending a conference in Amsterdam when by chance he came across an exhibition of Escher's work. Soon he was trying to conjure up impossible figures of his own and discovered the tri-bar – a triangle that looks like a real, solid three-dimensional object, but isn't. Together with his father, Penrose went on to design a staircase that simultaneously loops up and down. An article followed and a copy was sent to Escher. Completing a cyclical flow of creativity, the Dutch master of geometrical illusions was inspired to produce his two masterpieces.
Doing what most find impossible has long been Penrose's stock in trade in mathematics and physics, even when it comes to publishing. His previous book, The Road to Reality, was a 1,049-page bestseller, although it was mostly a textbook. Penrose doesn't do "popular", as he peppers his books with equation after equation in defiance of the publishing maxim that each one cuts sales in half. By that reckoning Cycles of Time will have about four readers, though it's probably destined to be another bestseller. Before putting forward his truly Extraordinary New View of the Universe – that the big bang is both the end of one aeon and the beginning of another in an Escheresque endless cycling of time – Penrose first outlines the prevailing orthodoxy about the origins of the cosmos.
In the late 1920s it was discovered that the light from distant galaxies was stretched towards the red end of the visible spectrum. This redshift was found to be greater the further away the galaxy was, and was accepted as evidence of an expanding universe. This inevitably led theorists to extrapolate backwards to the big bang – the moment of its birth some 13.7bn years ago, when space and time exploded into being out of a single point, infinitely hot and dense, called a singularity. That at least was the theory, with little more to back it up until 1964, when two American scientists discovered "cosmic background radiation" – the faint echo of the big bang. In the decades since, further evidence has accumulated and theoretical refinements made to accommodate it. Yet in recent years a few physicists have challenged the big bang model by daring to ask and answer questions such as: was the big bang the beginning of the universe?
Traditionally such questions have been dismissed as meaningless – space and time were created at the big bang; there simply was no "before". Although it's possible to work out in incredible detail what happened all the way back to within a fraction of a second of the big bang, at the moment itself the theory of general relativity breaks down, or as Penrose puts it: "Einstein's equations (and physics as a whole, as we know it) simply 'give up' at the singularity." However, he believes we should not conclude from this that the big bang was the beginning of the universe.
Acknowledging that he's not the first to think such heretical thoughts, Penrose looks at earlier "pre-big bang proposals". Finding them "fanciful", he looked anew at the big bang because of an unsolved mystery at its heart involving the Second Law of Thermodynamics. One of the most fundamental in all of physics, it simply says that the amount of disorder, something that physicists label "entropy", increases with the passage of time. Herein lies the mystery for Penrose. The instant after the big bang, "a wildly hot violent event", must have been one of maximum entropy. How can entropy therefore increase? Penrose thinks he has the answer: there must be a pre-big bang era that ensures that entropy is low at the birth of the universe. And here's how.
In what Penrose calls "conformal cyclic cosmology", the beginning and the end of the universe are in effect the same, since these two phases of its evolution contain only massless particles. Between now and a far-distant future, everything from the tiniest particles to the biggest galaxies will have been eaten by black holes. They in turn lose energy in the form of massless particles and slowly disappear. As one black hole after another vanishes the universe loses "information". Since information is linked to entropy, the entropy of the universe decreases with the demise of each black hole.
The strangest thing about massless particles is that for them there is no such thing as time. There is no past or present, only "now", and it stretches for all eternity – but since there is no tick of the clock, what eternity? With some mind-numbing maths, Penrose argues that as time ends in the era of massless particles, the fate of our universe can actually be reinterpreted as the big bang of a new one: "Our universe is what I call an aeon in an endless sequence of aeons." Escher would have approved.
Thursday, 14 October 2010
The Amazing Story of Quantum Mechanics
Captain Quantum
The Amazing Story of Quantum Mechanics: A math-free exploration of the science that made our world by James Kakalios
New Scientist, 16 October 2010
“EXTRAVAGANT Fiction Today, Cold Fact Tomorrow” was the bold claim of Amazing Stories, the first American magazine devoted to science fiction. Beginning in the 1930s, these sci-fi pulps and comics envisaged that by the year 2000 we would be living in a world with domed underwater cities and travelling in flying cars and by jetpacks. Instead we have mobile phones, laptops and DVDs.
The predictions were off, says James Kakalios, because implicit in the promise of flying cars is the availability of lightweight power supplies capable of producing enormous quantities of energy. In fact, the capacity of batteries to act as reservoirs of energy is limited by the chemical and electrical properties of atoms – and we cannot change the physics of atoms.
This is Kakalios’s cue to explain the key concepts of quantum mechanics and show how these ideas account for the properties of metals, insulators and semiconductors – and how they underlie the magnetic properties of atoms that let us store vast amounts of data on computer hard drives and build MRI scanners that can see inside the human body.
The physicists who developed quantum theory and the fans of sci-fi pulps had one thing in common, says Kakalios, and that is a willingness to suspend disbelief as they accepted the impossible as real. Three such quantum facts were: light is an electromagnetic wave that is actually composed of chunks of energy; matter is composed of particles that exhibit a wave-like nature; and both light and matter have a property called spin that can only have certain values.
Having provided the reader with these counter-intuitive notions, Kakalios looks at the problems they solved. To help explain Planck’s discovery of the quantum, the photoelectric effect, the quantum atom, wave-particle duality, Schrödinger’s wave equation, the probabilistic interpretation of the wave function, the uncertainty principle and more besides, comic-loving Kakalios enlists a legion of superheroes, from Superman to Dr Manhattan.
In addition to his bright-blue appearance, Jon Osterman, aka Dr Manhattan, appears to have gained control of his quantum-mechanical wave function. This, the starting point for Kakalios’s highly readable presentation of quantum ideas, gives him the ability to alter his size at will, to teleport himself and others from one place to another, and to experience the past, present and future simultaneously.
The scientist as a world-changing hero is an apt description for the physicists who developed quantum mechanics, Kakalios believes. He has a point. The discoveries by a handful of physicists back in the 1920s and 1930s of the rules that govern how atoms interact with light and each other continue to shape and change the world we live in.
Galileo
Galileo: Watcher of the Skies by David Wootton
Sunday Telegraph, 10 October 2010
It is a little known fact that in 1532 Copernicus’s sun-centred solar system was presented to an audience in the Vatican. Given the storm that was to come, it is barely believable that the then pope, Clement VII, afterwards sent a note of encouragement to Copernicus as the Polish priest laboured to finish his book. On the Revolutions of the Heavenly Spheres was published in 1543 and Copernicus, so the story goes, held the first copy to come off the press just hours before he died. As long as his heliocentric model was presented as hypothetical, the Vatican was unconcerned by Copernicanism. One man changed all that.
Born in February 1564, Galileo Galilei initially set out to be a doctor before switching to mathematics – much to the displeasure of his father. Despite the legend, it is unlikely that he ever dropped balls from the leaning tower of Pisa as he investigated the motion of falling bodies and discovered that all objects fall at the same rate, contradicting what everybody had believed since Aristotle.
When, in 1609, he learnt of the invention of the telescope by a Dutch spectacle maker, Galileo quickly constructed his own. Within a matter of months he had transformed it from a toy into an instrument of scientific discovery and he found that the Milky Way was not a streak across the sky but a multitude of stars; that the Moon had mountains and valleys; and he observed the phases of Venus and the spots on the Sun.
‘For Galileo, seeing was believing,’ says the historian David Wootton. Yet he argues persuasively in this well-researched intellectual biography that Galileo was a Copernican long before his discovery of the moons of Jupiter proved that not all heavenly bodies revolved around the Earth. In March 1610, Galileo published his discoveries in the aptly titled book, The Starry Messenger. All 550 copies were sold within a week and soon the 46-year-old was Europe’s most celebrated natural philosopher.
Faced with the Reformation, the Catholic Church was increasingly less tolerant of dissent. In 1616, Galileo went to Rome after a letter he wrote was brought to the attention of the Holy Office of the Inquisition. In it Galileo argued that although the Bible is the word of God, it is adapted to human capacities. Nature, however, is 'inexorable and immutable’. So, when it comes to certain questions, direct knowledge of nature must always take priority over whatever the Bible may have to say on the subject. And the answer to one of those questions was that it is the Earth that moves around the Sun and not the other way round.
Wootton does a good job of untangling who said what to whom and when in Galileo’s dealings with the Inquisition. To cut a long story short, Galileo was given a formal warning that forbade him from holding, teaching or defending Copernicanism. To complicate matters, around the same time, in March 1616, the Vatican banned all books that held Copernicanism to be true.
Then, in a surprising turn of events, in 1623 Maffeo Barberini, an old friend of Galileo’s, was elected pope. Urban VIII allowed Galileo to re-enter the somewhat muted debate on Copernicanism. Before long, argues Wootton, intellectual ambition and vanity led Galileo to stake everything on facing down his opponents in his book Dialogue on the Two Chief World Systems. In April 1633, Galileo was summoned before the Inquisition and held firm that, although in the Dialogue he discussed Copernicanism, he did not defend it, and he denied any knowledge of the injunction of 1616 not even to do that.
It was at this point that the prosecutor played his trump card – a report that Galileo was guilty in an earlier book of denying transubstantiation. It was a charge that, if proven, implied that Galileo was not a Catholic but a Protestant. While admitting to piling 'conjecture upon conjecture’, Wootton goes further than any enforcer of the Inquisition and accuses Galileo of not being a Christian at all.
Galileo died in 1642, a prisoner of the Inquisition. In 1992, the Catholic Church apologised for its treatment of the secular saint. Only God knows what Clement VII would have made of it all.
Saturday, 18 September 2010
Pathfinders
Arabic Heights
Pathfinders: The Golden Age of Arabic Science by Jim Al-Khalili
The Times, 18 September 2010
Stockholm, December 1979. Ten men wait to receive their Nobel prizes from the King of Sweden. Nine are wearing white tie and tails. Abdus Salam, a devout Muslim, stands out in a turban and traditional Punjabi clothes. The Pakistani physicist is being honoured for his part in formulating the electroweak theory that unites electromagnetism and the weak nuclear force. ‘There is no doubt in my mind that his work places him as the greatest physicist of the Islamic world for a thousand years,’ says Jim Al-Khalili, Iraqi-born physicist-cum-writer.
For many non-Muslims, the term ‘Islam’ evokes a ‘negative stereotype that contrasts with our Western, secular, rational, tolerant and enlightened society’, says Al-Khalili. This captivating book is his timely reminder of the debt owed by the West to the intellectual achievements of Arab, Persian and Muslim scholars, a thousand years before Salam got his Nobel prize, when the roles were reversed.
Al-Khalili has long wanted to tell the tale of the ‘golden age of Arabic science’ that began in the late 8th century and lasted for more than 500 years. Since there is no such thing as ‘Jewish science’ or ‘Christian science’, Al-Khalili explains that by ‘Arabic science’ he means the remarkable body of work produced in Arabic, the lingua franca of science and much else as Europe slumbered through its Dark Ages.
The Koran was the first book written in Arabic, and much effort was spent studying and interpreting it. The wealth and power of the growing Islamic empire made it possible for the Abbasid caliphs to promote an ever-expanding sphere of academic inquiry that had been lost since the glory days of Greek Alexandria. Advances in maths, astronomy, physics, chemistry, medicine and the flourishing of philosophy that took place, first in Baghdad and then across the Islamic world, all have their origins in what historians call the translation movement.
Thanks to the practical benefits it brought in finance, agriculture, engineering and health, the translation movement was a 200-year process during which much of the wisdom of the Greeks, Persians and Indians was translated into Arabic. These translations helped produce a culture of scholarship that became self-sustaining and formed part of a wider quest for knowledge that evolved into a new tradition of intellectual exploration that sparked the beginning of an age of scientific progress. A 9th-century caliph of Baghdad created the House of Wisdom, a centre of learning that some say was home to 400,000 books – at a time when the best European libraries held no more than a few dozen.
In 711, Muslims crossed into Spain and so began almost eight centuries of Islamic influence in Andalusia. Just as Baghdad had been the epicentre of the translation movement from Greek into Arabic, so cities like Córdoba and Toledo became the centres of translation of the great Arabic texts into Latin. One of the first scholars to study these was Gerbert d’Aurillac, a 10th-century French monk. He would later become the first Christian scholar to carry Arabic learning across the Pyrenees. It seems fitting that the man who would later become Pope Sylvester II introduced Christian Europe to the science of the Islamic empire.
Al-Khalili argues that the scientific revolution could not have taken place without the advances of the medieval Islamic world. Ibn al-Haytham dominated the field of optics long before Newton and used the scientific method 600 years before Francis Bacon even thought about it. Abdus Salam, who died in 1996, named some of the giants of Arabic science in his Nobel lecture: al-Biruni, al-Razi, Ibn Sina, Jabir, and al-Khwarizmi. If you want to know what these men did, read this fascinating book and let Al-Khalili tell you their stories.
‘We should not be ashamed to recognise truth and assimilate it, from whatever quarter it may reach us, even though it may come from earlier generations and foreign peoples,’ wrote Ya’qub ibn Ishaq al-Kindi, one of the great polymaths of the ‘golden age’. ‘For the seeker after truth there is nothing of more value than truth itself; it never cheapens or debases the seeker, but ennobles and elevates him.’
Friday, 10 September 2010
The Grand Design
Theories of everything: Has cosmic science written its last word?
Independent, 10 September 2010
God is dead," declared Friedrich Nietzsche, but few listened or cared. "It is not necessary to invoke God to light the blue touch paper and set the universe going," announced Stephen Hawking last week, and it was picked up by the world's media. For over 20 years earlier, the world's most famous scientist had ended his phenomenal bestseller A Brief History of Time with the arresting conclusion that "If we discover a complete theory, it would be the ultimate triumph of human reason - for then we should know the mind of God."
Why is there something rather than nothing? Why do we exist? Why this particular set of laws and not some other? It is these "ultimate questions of life" that Hawking now sets out to answer, with the help of the American physicist and science writer Leonard Mlodinow, in his fascinating new book The Grand Design (Bantam, £18.99). Philosophers have traditionally tackled such questions, while most physicists have steered well clear of the "why" of things and concentrated instead on the "how".
Not any more. "To understand the universe at the deepest level," says Hawking, "we need to know not only how the universe behaves, but why." He believes that "philosophy is dead" because it failed to keep up with the latest developments, especially in physics.
For Hawking, it is possible to answer these questions purely within the realm of science and without resorting to God. And the answers hinge on a candidate for a theory of everything called M-theory, "if it indeed exists", the authors admit. Unfortunately, no one seems to know what "M" stands for; it could be "master", "miracle" or "mystery".
The story of M-theory could be said to begin with the desire of physicists to unify and simplify. Just as ice, water and steam are different manifestations of water, in 1864 James Clerk Maxwell showed that electricity and magnetism were likewise different manifestations of the same underlying phenomenon - electromagnetism. He managed to encapsulate the disparate behaviour of electricity and magnetism into a set of four elegant mathematical equations. Using these equations, Maxwell was able to make the startling prediction that electromagnetic waves travelled at the speed of light, approximately 670 million miles per hour. Light was a form of electromagnetic radiation. Maxwell's unification of electricity, magnetism and light was the crowning achievement of 19th-century physics.
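For readers who want to see where that figure comes from, here is a minimal sketch of the calculation implied by Maxwell's equations; the vacuum constants and unit conversions are standard textbook values, not numbers quoted in the book.

```python
import math

# Speed of electromagnetic waves predicted by Maxwell's equations: c = 1/sqrt(mu_0 * epsilon_0).
# Standard textbook values are assumed here for illustration.
mu_0 = 4 * math.pi * 1e-7        # vacuum permeability, in henries per metre
epsilon_0 = 8.854187817e-12      # vacuum permittivity, in farads per metre

c = 1 / math.sqrt(mu_0 * epsilon_0)   # metres per second, roughly 3.0e8

# Convert to miles per hour (1 mile = 1609.344 metres)
c_mph = c / 1609.344 * 3600
print(f"c = {c:.3e} m/s, or about {c_mph / 1e6:.0f} million miles per hour")
```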
In the 20th century, to go with gravity and electromagnetism, physicists discovered two new forces - the weak, which is responsible for radioactivity, and the strong, which binds together, for example, the nucleus of an atom. They believed that these four forces, which appeared so different, would be united in a single all-encompassing theory of everything. With the exception of general relativity, Einstein's theory of gravity, it is possible to "quantize" the other three forces, since quantum mechanics deals with the atomic and sub-atomic domain. In effect, we have three trains running on the same-sized track.
Unfortunately, Einstein's gravity train was running on a completely incompatible track. Yet the impulse for unity and simplicity is so strong that theorists have pursued a quantum theory of gravity, without success, for decades. Then in the 1980s there appeared a new theory that looked promising - superstrings.
The theory assumes that all observed particles are different manifestations of the same fundamental entity. According to the superstring idea, all particles previously thought of as little points are in fact not points at all but tiny oscillating bits of "string" which move through space. The different levels of "vibration" of these strings correspond to the different particles.
Superstrings vibrate in 10 dimensions. But we don't notice these extra dimensions because they are curled up into a space that's infinitesimally small. Alas, it was discovered that there were at least five different string theories and millions of ways the extra dimensions could be curled up - an embarrassment of riches for those who hoped that string theory was the longed-for theory of everything.
As others despaired, the American physicist Ed Witten led the way, beginning in the mid-1990s, in showing that the different string theories and a theory called "supergravity" were all just different approximations to a more fundamental theory: M-theory.
"M-theory is not a theory in the usual sense," admits Hawking. "It is a whole family of different theories, each of which is a good description of observations only in some range of physical situations. It is a bit like a map." Faithfully to map the entire earth, one has to use a collection of maps, each of which covers a limited region. The maps overlap each other, and where they do, they show the same landscape.
M-theory needs 11 space-time dimensions and contains not just vibrating strings but other objects that are impossible to visualize. The extra space dimensions can be curled up in a mind-blowing 10 to the 500th different ways, each leading to a universe with its own laws. To get an idea of how many that is, Hawking and Mlodinow ask the reader to imagine a being capable of scanning each of those universes in just one millisecond, and who started working on it at the Big Bang. Today that being would have scanned just 10 to the 20th of them.
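The arithmetic behind that comparison is easy to check. A rough sketch, assuming the conventional figure of about 13.8 billion years for the age of the universe (my assumption, not a number given in the book):

```python
import math

# One universe scanned every millisecond since the Big Bang, against 10^500
# ways of curling up the extra dimensions. The 13.8-billion-year age is an
# assumption of this sketch, not a figure from The Grand Design.
seconds_per_year = 365.25 * 24 * 3600
milliseconds_since_big_bang = 13.8e9 * seconds_per_year * 1000

print(f"Universes scanned so far: about {milliseconds_since_big_bang:.1e}")  # ~4.4e+20
print(f"That is roughly 10^{math.floor(math.log10(milliseconds_since_big_bang))} "
      f"out of the 10^500 possibilities.")
```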
This plethora of universes, the multiverse, explains what appears to be the mystery behind the remarkable coincidences that have fine-tuned natural laws to make our universe habitable for us. With so many universes, it is far less remarkable that there is at least one in which conditions are Goldilocks-like: just right to have given rise to us; and since we exist, ours must be that one. This is the anthropic principle, which effectively says that things are the way they are because they were the way they were. From here, Hawking goes on to argue that "Because there is a law like gravity, the universe can and will create itself from nothing."
"'Think of an expanding universe as a surface of a bubble," writes Hawking. "Our picture of the spontaneous quantum creation of the universe is then a bit like the formation of bubbles of steam in boiling water. Many tiny bubbles appear, and then disappear again. These represent mini-universes that expand but collapse again while still of microscopic size. They represent possible alternative universes, but they are not of much interest since they do not last long enough to develop galaxies and stars, let alone intelligent life. A few of the little bubbles will grow long enough so that they will be safe from recollapse. They will continue to expand at an ever-increasing rate and will form the bubbles of steam we are able to see. These correspond to universes that start off expanding at an ever-increasing rate."
Spontaneous creation is the reason there is something rather than nothing; why the universe exists, why we exist. God is surplus to Hawking's requirements.
Why are the fundamental laws as they are? The ultimate theory must be consistent and must predict finite results for quantities that we can measure. There must be a law like gravity and, for a theory of gravity to predict finite quantities, the theory must have what is called "supersymmetry" between the forces of nature and the matter on which they act. "If the theory is confirmed by observation," says Hawking, "it will be the successful conclusion of a search going back more than 3000 years."
"Yet in the history of science," he admits, "we have discovered a sequence of better and better theories or models, from Plato to the classical theory of Newton to modern quantum theories. It is natural to ask: Will this sequence eventually reach an end point, an ultimate theory of the universe, that will include all forces and predict every observation we can make, or will we continue forever finding better theories, but never one that cannot be improved upon?"
Though Hawking is probably being rhetorical, Russell Stannard, a former professor of physics at the Open University, looks at the unanswered questions of modern physics in his book The End of Discovery (Oxford, £14.99). Stannard believes that eventually, though he doesn't know when, fundamental science will reach the limit of what it can explain. On that day, the Scientific Age, like the Stone Age and the Iron Age before it, will come to an end. The obstacles will not only be technological: humanity may simply have reached the limits of its mental capacity to unravel the nature and workings of reality.
Stannard takes readers on a tour of some of the deepest questions facing science: questions to do with consciousness, free will, the nature of space, time, and matter. He covers much of the same territory as Hawking and Mlodinow, and points out that to understand the subatomic world, scientists depend on particle accelerators; but to understand the very smallest units of nature, it has been calculated that we would need an accelerator the size of a galaxy.
In A Brief History of Time, Hawking said that a scientific theory "may originally be put forward for aesthetic or metaphysical reasons, but the real test is whether it makes predictions that agree with observations". While physicists have waited for the next generation of particle accelerators and experiments, research from superstrings to quantum cosmology has, in recent decades, tended to take on a metaphysical character.
So maybe philosophy isn't as dead as Stephen Hawking thinks. For those having a difficult time wrapping their head around "spontaneous creation", he has this tip: "If you like, you can call the laws of science 'God'."
Wednesday, 1 September 2010
Why Beliefs Matter
A Matter of Faith
Why Beliefs Matter: Reflections on the nature of science by E. Brian Davies
New Scientist, 7 August 2010
Albert Einstein once asked, does the moon exist when no one is looking at it? Such questions had been the preserve of philosophers, but with the discovery of quantum mechanics in the 1920s they became legitimate queries for physicists, too.
Niels Bohr, one of the founders of quantum mechanics, did not believe that science grants us access to an objective reality and insisted that the task of physics was not to find out "how nature is" but only "what we can say about nature". Einstein, on the other hand, maintained an unshakeable belief in a reality that exists out there. Otherwise, he said, "I simply cannot see what it is that physics is meant to describe".
Einstein based his view of quantum mechanics on his belief in an independent reality - the moon does exist when no one is looking at it. In contrast, Bohr used the theory to construct and underpin his belief that the atomic realm has no independent reality. The two agreed on the equations but disagreed on what they meant.
"Scientists, like everyone else, have beliefs," writes distinguished mathematician E. Brian Davies in Why Beliefs Matter. He is not only referring to religious beliefs but to philosophical ones, too. While religious beliefs can be easy to leave at the laboratory door, philosophical beliefs are much harder to sideline.
Some mathematicians, for instance, subscribe to a Platonic view in which theorems are true statements about timeless entities that exist independent of human minds. Others believe that mathematics is a human enterprise invented to describe the regularities seen in nature. The very idea that nature has such regularities which render it comprehensible is itself a belief, as is the idea that the world we perceive is not some sort of delusion or practical joke.
The title of Davies's book, significantly, is a statement, not a question. For him, beliefs do matter. Davies offers a series of snapshots of how various philosophical views inform science, rather than a systematic inquiry into the nature of belief. Along the way he discusses the scientific revolution, the mind-body problem, machine intelligence, string theory and the multiverse. The result is a wide-ranging, thought-provoking meditation rather than a populist read. Beliefs, it seems, are a serious business, and they come in all shapes and sizes.
"At the highest level, beliefs become world views, fundamental beliefs that we use to evaluate other beliefs about the world," says Davies. World views can be evaluated, compared and changed, but you cannot avoid having one. Davies is a self-proclaimed pluralist. That is, he believes that humans have a limited mental capacity and will always need a multiplicity of ways of looking at the world in order to understand it. There may be two or more equally valid and complementary descriptions of the same phenomenon, he says - not unlike the concept of wave-particle duality in quantum mechanics. That does not mean that all world views are equally good - some simply don't hold up under the scrutiny of experiment.
The scientific revolution that began in the 16th century was a triumph of rationality and experiment over the superstition and speculation of the Middle Ages. Even so, nearly 40 per cent of Americans believe that God created humans some time within the last 10,000 years.
World views are not founded on logic, so the most that one can demand is that they should be consistent with what science has discovered. Yet, as the writer C. S. Lewis noted, some arguments are impossible to refute. "A belief in invisible cats cannot be logically disproved," he said, although it does "tell us a good deal about those who hold it".
Monday, 2 August 2010
Energy, the Subtle Concept
Elusive Stuff
Energy, the Subtle Concept: The Discovery of Feynman’s Blocks from Leibniz to Einstein by Jennifer Coopersmith
New Scientist, 17 July 2010
MOST of us have a vague idea of what energy is, if only because we have to pay for it. We know that it is the E in Einstein's famous equation, E=mc2, and all of us have an opinion about the pros and cons of nuclear energy. For William Blake's devil in The Marriage of Heaven and Hell, energy was "eternal delight", yet Newton never fully appreciated the importance of a concept that was rarely used until the 19th century.
So, what is energy? Easy to ask the question but, as Jennifer Coopersmith shows in Energy, the Subtle Concept, finding the answer was a messy and tangled affair, involving plenty of argument and controversy. It's a tale of persecuted genius, of royal patronage, of social climbers and dreamers, of rich men and poor men, a foundling, entrepreneurs and industrialists, lawyers, engineers, a taxman, a spy and a brewer. Some were showered with honours, others neglected until long after death.
The concept of energy is hard to grasp because it is something that cannot be directly observed. It was only in the early 19th century that it was even recognised as a distinct physical quantity. Since then it has played a vital role in the development of science and technology. Its importance lies in the fact that it possesses the very rare property of being conserved. Energy cannot be created or destroyed; it can only be converted from one form to another. So fundamental is this property to nature that it is enshrined, in more sober scientific terms, as the first law of thermodynamics.
The first step on the long road to understanding the true nature of this relationship had been taken at the end of the 18th century by Benjamin Thompson, an Anglo-American physicist, inventor and soldier of fortune. While supervising the boring of new cannons, Thompson realised that heat might be a form of motion rather than a special weightless substance called "caloric". Most remained unconvinced, largely because Thompson was a notorious opportunist and spy. The turning point came in the form of experiments performed, in the 1840s, by the English brewer and amateur scientist James Prescott Joule, whose painstaking measurements of how mechanical work converts into heat laid the foundations of thermodynamics.
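Joule's experiments turned on measuring how much heat a given amount of mechanical work produces. A minimal illustrative sketch of that conversion, using textbook values rather than anything taken from Coopersmith's account:

```python
# Mechanical work done on water reappears as heat: the idea behind Joule's
# paddle-wheel experiment. The numbers below are textbook values assumed for
# illustration, not figures from the book under review.
mass_water_kg = 1.0                   # water stirred by the paddle wheel
specific_heat_J_per_kg_K = 4186.0     # specific heat capacity of water
work_done_J = 10_000.0                # mechanical work delivered by falling weights

# If all the work ends up as heat, the temperature rise is:
delta_T = work_done_J / (mass_water_kg * specific_heat_J_per_kg_K)
print(f"Temperature rise: {delta_T:.2f} kelvin")   # about 2.4 K - a small, hard-to-measure effect
```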
The conservation of energy is arguably the most important law in physics. But what exactly is being conserved? Are some forms of energy more fundamental than others? You will have to read the book to find out. Coopersmith sets out to answer such questions and to explain the concept of energy through the history of its discovery. This is neither a straightforward narrative nor one for the faint-hearted. Those not put off by the odd bit of mathematics will be well rewarded by dipping into this book.
The Edge of Reason
Extreme Physics, Extreme Pilgrim
The Edge of Reason: Dispatches from the Frontiers of Cosmology
Tehelka, 17 July 2010
Anatole France said that wandering reestablishes the original harmony that once existed between man and the universe — a particularly apt aim for a traveller trying to explain cosmology.
Anil Ananthaswamy’s The Edge of Reason is an elegant, genre-defying book that’s part travelogue, part popular science. It has come from his journeys to some of the most remarkable and delicate scientific experiments in the world. In Siberia, Ananthaswamy, a consulting editor at the popular science magazine New Scientist, visited the Lake Baikal Neutrino Telescope, which uses the world’s largest freshwater lake to detect neutrinos — ghostlike particles that pass through matter more easily than a knife through butter, carrying information about the furthest reaches of the cosmos. At the Large Hadron Collider, the underground scientific cathedral near Geneva, he writes of how physicists are smashing protons into each other at energies replicating the early universe — hoping that the debris will reveal a telltale sign of ‘the God particle’ — the Higgs boson, physics’ answer to why there’s mass in the universe. At the South Pole he was stunned by “the sheer audacity of the IceCube telescope — searching for outer space neutrinos smashing into a cubic kilometre of clear Antarctic ice”. But even extreme physics took on new meaning, admits Ananthaswamy, when he spent a surreal month in Antarctica. “Nothing in your experience prepares you for a continent of ice where there’s neither vegetation nor trees.”
In person, the London-based writer, 46, is soft- spoken and bespectacled. Born in the small town of Bhilai, Chhattisgarh, Ananthaswamy studied engineering at IIT Madras and did a Master’s in the US. He returned to India to work for a software company that soon shipped him to Silicon Valley. Though the money was great during his 12 years there, life was “emotionally unsatisfying”. He decided to pursue a long-held desire to write. He threw up the high-paying job and eventually headed to London for a much sought-after post at the New Scientist.
Two years later, he embarked on his travels to seek the unsung heroes of science and found himself in desolate deserts, derelict mines, standing on mountaintops and even at the bottom of the world. “It’s not enough to point telescopes from local mountains or conduct lab experiments to understand dark matter and dark energy,” he says of these sites, “There’s something that links all these places — they strip off the unnecessary and leave only the essence to ponder.”
At a Christian monastery near California’s Mount Wilson Observatory, he sensed how similar the monks were to cosmologists. “If solitude and silence engender creativity,” he muses, “then it behoves us to protect not just our own solitude but nature’s as a whole. If we pollute pristine places like Lake Baikal, we’ll deny ourselves any chance of deciphering our own beginnings.” His book is an eloquent description of a scientific pilgrim’s search for this solitary understanding.
Seven Wonders of the Industrial World
Blood, sweat and imagination
Seven Wonders of the Industrial World
by Deborah Cadbury
The Guardian, 8 November 2003
The Great Pyramid of Giza, the Hanging Gardens of Babylon, the Statue of Zeus at Olympia, the Temple of Artemis at Ephesus, the Mausoleum at Halicarnassus, the Colossus of Rhodes and the Pharos of Alexandria. These were the seven wonders of the ancient world. The list was drawn up in the Middle Ages with probably as much rancour surrounding it as any team selection by Sven-Goran Eriksson.
Only the Great Pyramid, built around 2560BC, has survived. It is unlikely that any of Deborah Cadbury's seven wonders of the industrial world will last as long. Her magnificent seven are: the Great Eastern, the Bell Rock Lighthouse, Brooklyn Bridge, the London sewers, the Transcontinental Railroad, the Panama Canal and the Hoover Dam.
In fact, Isambard Kingdom Brunel's SS Great Eastern was sold for scrap within 30 years of being launched in 1859. Dubbed the "Crystal Palace of the Seas", she was almost 700ft long, made of iron and held together by three million rivets. On her maiden voyage to New York, the Great Eastern had on board only 38 passengers and a crew of 418. She was designed to carry 4,000 passengers in luxury all the way to Australia, but a catalogue of disasters and the opening of the Suez Canal turned the Great Eastern into a giant white elephant.
While Brunel was busy building his "great ship" on the Isle of Dogs, London was drowning in a sea of excrement as the city's 200,000 cesspits overflowed. By 1854, three outbreaks of cholera had left 30,000 dead. Something had to be done, but only after the "great stink" had forced MPs to flee both parliament and the city in fear of their lives. Joseph Bazalgette, chief engineer of the Metropolitan Board of Works, proposed an ambitious scheme to build an underground network that linked London's 1,000 miles of street-level sewers. The sewage system took 12 years to complete, and as London breathed easier, it was hailed by the Observer, in 1861, as "the most extensive and wonderful work of modern times".
The London sewers may seem like an odd choice, but Cadbury's selection reflects her desire to tell the story of how the modern world was forged "in blood, sweat, and human imagination". There was plenty of all three.
With the exception of the Hoover Dam - constructed at the height of the Depression, when poverty-stricken workers, paid a few dollars a day, died building it ahead of schedule and under budget - Cadbury's wonders are products of the industrial revolution, when a worker's life was even cheaper.
None cost more in lives than the Panama Canal, begun by the French in 1880. Within 10 years, the jungle, swamps and tropical diseases had left more than 20,000 dead. But what really mattered to investors was the $280m lost by the time the canal company was declared bankrupt. It was the largest financial collapse of the 19th century and led to the downfall of the French government. Work only started again in 1901, when Theodore Roosevelt realised that a canal was vital for US naval supremacy. The Americans would take 12 years, and another 5,000 lives, before the "longest 50 miles in history" was complete.
Cadbury's earliest and smallest wonder was built during the Napoleonic wars. In 1807, Robert Stevenson started work on the Bell Rock Lighthouse off the east coast of Scotland. The Bell Rock, a large reef 11 miles out to sea, had claimed countless lives as it "breathed abroad an atmosphere of terror". The main problem Stevenson faced was the fact that the Bell Rock lies 16ft beneath the sea for all but three hours of each day. It took four years, more than 2,500 tonnes of stone, and a brave team of men, who received 20 shillings a week, to banish "the blackness enveloping the terrible power of the Bell Rock".
Whereas Stevenson and his men toiled above the water, Washington Roebling faced a mysterious new disease, which his men nicknamed "the bends", as they laboured beneath the East River that divides New York and Brooklyn. Roebling's father, John, had designed the Brooklyn Bridge to connect America's two fastest-growing cities, but died in an accident before work began. It was left to his son to oversee the construction of what would be the longest suspension bridge in the world.
Only three years into the project, Roebling suffered a terrible case of the bends. Lucky to survive, he was too weak to leave his house and had to continue working on the bridge by dictating his instructions to his wife. When the bridge opened in 1883, after 14 years of labour, with the loss of 20 men, Roebling could only watch from his bedroom window.
While "practical visionaries" such as Brunel, Stevenson and Roebling may have been "taking risks and taking society with them as they cut a path to the future", Cadbury never forgets those risking their lives just to survive. What makes this book a compelling read is the heroism and desperation of ordinary men.
Faster Than the Speed of Light
Summing up the universe
Faster Than the Speed of Light: The Story of a Scientific Speculation
by João Magueijo
The Guardian, 29 March 2003
John Dryden's poem "Annus Mirabilis: The Year of Wonders", 1666, celebrated the Royal Navy's victory over the Dutch and the failure of the great fire of London to consume the entire city. Yet as significant as these events were, they pale in comparison to one of the true high points of human achievement that occurred during that same year: the 24-year-old Isaac Newton laid the foundations of calculus and the theory of gravity, and outlined his theory of light. Only one other year and one other scientist bear comparison with Newton and his annus mirabilis.
Albert Einstein's "miraculous year" was 1905. The unknown 26-year-old patent clerk produced - in breathtaking succession - the special theory of relativity, the quantum theory of light and a convincing argument for the existence of atoms. As preparations for the centennial celebrations get under way, João Magueijo has written a gripping, no-holds-barred account of his challenge to one of the central tenets of relativity and its implications for our understanding of how the universe works.
That the universe began with a "big bang" is something that most of us now accept without question, yet there remain puzzling features about the universe that the big bang theory cannot explain. Why does the universe look the same over such vast distances? Why is it so large? Why does it have the shape it has? Why does the universe exist at all? For years cosmologists have been looking at the infant universe for "clues to its adult behaviour".
Some of these questions, Magueijo realised, could be answered if he broke just one golden rule. It was a simple solution to the cosmological problems, but it presented him with a problem of his own. For his answer "involved something that for a trained scientist approaches madness". What Magueijo proposed was that light travelled faster in the infant universe than it does now. In doing so, he risked "career suicide" by questioning the validity of perhaps the most fundamental rule of modern physics: that the speed of light is a constant.
Magueijo, a reader in theoretical physics at Imperial College, London, is no madman. But some have called him a heretic and dismissed his theory. After all, the constancy of the speed of light is, as he points out, "woven into the fabric of physics, into the way that the equations are written". Frankly, the reaction of his critics is understandable, since Magueijo's proposal would entail the wholesale revision of the entire framework of 20th-century physics.
Undaunted by the hostile reactions, Magueijo continued to investigate the possible consequences of a varying speed of light (VSL) in the very early universe. Whereas others may have been intimidated, he had the courage to follow where VSL led. Disappointingly, "for a long while it led nowhere".
Once he teamed up with his first collaborator on VSL, the American cosmologist Andy Albrecht, new avenues began to open up through regular brainstorming sessions. At the end of each session, conducted behind locked doors, the blackboard calculations were wiped clean. They wanted to keep their ideas under wraps until they were ready to publish a fully fledged theory, since "publish first or perish" is a sad fact of life for all scientists.
Magueijo provides a highly readable account of the problems besetting modern cosmology and how they appear to be resolved by VSL. Better still, he gives an honest and revealing insight into what it's like to carry out scientific research: the endless frustrations, the fear of being beaten by competitors, the ebb and flow of tension between collaborators, the numerous dead ends, the unexpected moments of inspiration, and the often tedious task of checking and rechecking calculations.
Finally, Magueijo offers a glimpse into the often fraught process of peer review that begins after a finished article is submitted to a journal for publication. He and Albrecht had to bite the bullet, more than once, through a year-long review process, before their paper was finally accepted.
Magueijo finds it difficult to conceal his contempt for the reports written by referees that are at the heart of peer review. For him they are "often empty of scientific content and reflect nothing but the authors' social standing, or their good or bad relations with the referee". In fact, so scathing was he about one well-known journal that the libel lawyers were called out and the original print run of the UK edition of his book had to be shredded.
Whatever the final verdict on VSL, where experimental results will act as the ultimate referee, Magueijo and his collaborators have developed a theory that is now being taken seriously, against all the odds. As the young Einstein once remarked: "Foolish faith in authority is the worst enemy of truth."
Faster Than the Speed of Light: The Story of a Scientific Speculation
by João Magueijo
The Guardian, 29 March 2003
John Dryden's poem "Annus Mirabilis: The Year of Wonders", 1666, celebrated the Royal Navy's victory over the Dutch and the failure of the great fire of London to consume the entire city. Yet as significant as these events were, they pale in comparison to one of the true high points of human achievement that occurred during that same year: the 24-year-old Isaac Newton laid the foundations of calculus and the theory of gravity, and outlined his theory of light. Only one other year and one other scientist bear comparison with Newton and his annus mirabilis .
Albert Einstein's "miraculous year" was 1905. The unknown 26-year-old patent clerk produced - in breathtaking succession - the special theory of relativity, the quantum theory of light and a convincing argument for the existence of atoms. As preparations for the centennial celebrations get under way, João Magueijo has written a gripping, no-holds-barred account of his challenge to one of the central tenets of relativity and its implications for our understanding of how the universe works.
That the universe began with a "big bang" is something that most of us now accept without question, yet there remain puzzling features about the universe that the big bang theory cannot explain. Why does the universe look the same over such vast distances? Why is it so large? Why does it have the shape it has? Why does the universe exist at all? For years cosmologists have been looking at the infant universe for "clues to its adult behaviour".
Some of these questions, Magueijo realised, could be answered if he broke just one golden rule. It was a simple solution to the cosmological problems, but it presented him with a problem of his own. For his answer "involved something that for a trained scientist approaches madness". What Magueijo proposed was that light travelled faster in the infant universe than it does now. In doing so, he risked "career suicide" by questioning the validity of perhaps the most fundamental rule of modern physics: that the speed of light is a constant.
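To give a flavour of the argument (this back-of-the-envelope sketch is mine, not the book's): in standard cosmology the farthest distance a light signal can have travelled by time $t$, the particle horizon, is

$$d_H(t) = a(t)\int_0^{t} \frac{c\,\mathrm{d}t'}{a(t')},$$

where $a(t)$ is the scale factor of the expanding universe. At the moment the cosmic microwave background was released this horizon was far smaller than the patch of sky we observe today, so widely separated regions should never have been able to exchange heat, yet they sit at almost exactly the same temperature. Allow $c$ to be much larger in the infant universe, as VSL does, and the integral, and with it the horizon, grows enormously, letting the whole observable universe come into causal contact before settling down.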
Magueijo, a reader in theoretical physics at Imperial College, London, is no madman. But some have called him a heretic and dismissed his theory. After all, the constancy of the speed of light is, as he points out, "woven into the fabric of physics, into the way that the equations are written". Frankly, the reaction of his critics is understandable, since Magueijo's proposal would entail a wholesale revision of the framework of 20th-century physics.
Undaunted by the hostile reactions, Magueijo continued to investigate the possible consequences of a varying speed of light (VSL) in the very early universe. Where others might have been intimidated, he had the courage to follow where VSL led. Disappointingly, "for a long while it led nowhere".
Once he teamed up with his first collaborator on VSL, the American cosmologist Andy Albrecht, new avenues began to open up through regular brainstorming sessions. At the end of each session, conducted behind locked doors, the blackboard calculations were wiped clean. They wanted to keep their ideas under wraps until they were ready to publish a fully fledged theory, since "publish first or perish" is a sad fact of life for all scientists.
Magueijo provides a highly readable account of the problems besetting modern cosmology and how they appear to be resolved by VSL. Better still, he gives an honest and revealing insight into what it's like to carry out scientific research: the endless frustrations, the fear of being beaten by competitors, the ebb and flow of tension between collaborators, the numerous dead ends, the unexpected moments of inspiration, and the often tedious task of checking and rechecking calculations.
Finally, Magueijo offers a glimpse into the often fraught process of peer review that begins after a finished article is submitted to a journal for publication. He and Albrecht had to bite the bullet more than once during a year-long review process before their paper was finally accepted.
Magueijo finds it difficult to conceal his contempt for the reports written by referees that are at the heart of peer review. For him they are "often empty of scientific content and reflect nothing but the authors' social standing, or their good or bad relations with the referee". In fact, so scathing was he about one well-known journal that the libel lawyers were called in and the original print run of the UK edition of his book had to be shredded.
Whatever the final verdict on VSL, where experimental results will act as the ultimate referee, Magueijo and his collaborators have developed a theory that is now being taken seriously, against all the odds. As the young Einstein once remarked: "Foolish faith in authority is the worst enemy of truth."