  Thonemann came from a comfortable suburban background in Melbourne, with a stockbroker father, mother, two brothers and a sister. At university in Melbourne he had enjoyed tennis and skiing, while in Sydney he made his own surfboard to ride the waves back at Rye near his home town. He was entertaining in company and played the piano well. But he gave up this seemingly idyllic existence to go somewhere he thought would help him in his quest for fusion: Oxford University.

  Stepping off a ship into the Britain of 1946 must have been a shock. Southern Australia was physically untouched by the war, so the bomb-damaged English cities, the rationing of food and clothes, and the general air of exhaustion must have made it seem an alien world. But Oxford did have what he was looking for: the Clarendon Laboratory, chock full of some of the biggest scientific names of the day – many of whom had only recently been released from war work – along with all the experimental apparatus and skilled technicians that he would need. Thonemann had been accepted to study for a doctorate in nuclear physics at a salary of £750 per year. He came armed with notebooks full of calculations describing the conditions necessary to achieve a fusion reaction. But his supervisor, Douglas Roaf, had other ideas and set Thonemann to work developing ion sources. The topics were related, however, and Thonemann was able to simultaneously carry out work on fusion ‘under the counter’ – the hunt was on.

  The roots of the search for fusion stretch back a century before Thonemann began tinkering away in Oxford. At that time an argument developed between physicists, geologists and biologists over the age of the Sun. Physicists of the nineteenth century had made huge strides in understanding the world around them and, emboldened by their success, were applying their theories to ever grander problems. One pivotal achievement was the laws of thermodynamics, the principles that govern the behaviour of heat. According to the First Law of Thermodynamics, energy, or heat, cannot be created or destroyed but can only flow from one place to another and transform from one form to another. Thus the gravitational energy of a ball at the top of a slope is converted into kinetic energy as the ball rolls down the hill. Or the electrical energy of a current in a wire is converted into heat and light by a lightbulb.

  Physicists found that such laws seemed to apply in every situation that they studied, and rightly concluded that they must be universal. But when these fearless physicists applied the First Law to the case of the Sun, it produced troubling conclusions. Scientists could estimate the energy being pumped out by our neighbourhood star by measuring the solar heat falling on a patch of the Earth’s surface and then extrapolating to the total heat passing through a sphere with a radius equal to the Earth-Sun distance. The amount of heat was phenomenal, which raised an awkward question: if heat cannot be created out of nothing, where is it all coming from? The best source of energy at the time was coal, but a ball of coal the size of the Sun, burning to produce heat at the rate scientists had calculated, would be reduced to ash in around 3,000 years – far too short a time to account for the age of the Earth and the rest of the Solar System.
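
  As a rough modern-day sketch of that estimate (the numbers here are illustrative, not the book’s): sunlight delivers about 1,360 watts to each square metre of the Earth, and the Earth sits about 150 million kilometres from the Sun, so the Sun’s total output is roughly

  \[ L \approx 4\pi d^2 S \approx 4\pi \,(1.5\times10^{11}\,\mathrm{m})^2 \,(1.36\times10^{3}\,\mathrm{W\,m^{-2}}) \approx 3.8\times10^{26}\,\mathrm{W}. \]

  A ball of coal with the Sun’s mass (about \(2\times10^{30}\,\mathrm{kg}\), at roughly 30 MJ per kilogram) holds around \(6\times10^{37}\,\mathrm{J}\), which at that rate of output would be exhausted in about \(1.6\times10^{11}\) seconds – a few thousand years, the same order as the figure above.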

  Two of the titans of mid-nineteenth-century science had a different idea. The German physiologist and physicist Hermann von Helmholtz proposed in 1854 that the Sun’s heat came from gravitational energy: as the Sun contracts, this energy is transformed into heat and the body of the Sun glows hot, radiating light. The British physicist William Thomson, later Lord Kelvin, came to a similar conclusion and calculated that with this source of energy a body the size of the Sun could have been shining for around 30 million years.
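
  The Kelvin-Helmholtz argument can be sketched with modern values (an illustration, not the original calculation). The gravitational energy released in assembling a uniform sphere of the Sun’s mass M and radius R is about

  \[ U \sim \frac{3GM^2}{5R} \approx \frac{3\,(6.7\times10^{-11})\,(2\times10^{30})^2}{5\,(7\times10^{8})} \approx 2.3\times10^{41}\,\mathrm{J}, \]

  and radiating that away at the Sun’s present output of about \(3.8\times10^{26}\,\mathrm{W}\) would take roughly \(6\times10^{14}\) seconds – about 20 million years, the same order as Kelvin’s figure.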

  This seemed a much more reasonable figure for the age of the Sun, but it didn’t please everyone. Charles Darwin had published his new theory of Evolution in 1859 in his book On the Origin of Species by Means of Natural Selection, where he included a rough calculation of the age of the Earth made by studying erosion processes in the part of Kent where he lived, known as the Weald. His estimate was 300 million years, and he also concluded that Evolution would need this sort of time to produce the variety of life he saw around him. As the heat of the Sun was needed for life to exist, the Sun must be at least that age. Geologists, too, required an Earth aged in the hundreds of millions of years to explain the transformations of rock that they observed. The debate over the age of the Sun and Earth raged for decades, and Darwin was sufficiently troubled by Thomson’s argument that he removed any mention of timescales from later editions of On the Origin of Species.

  The solution to this mystery began to come together late in the century with a discovery entirely unrelated to astrophysics, geology or Evolution: radioactivity. The French physicist Henri Becquerel noticed that uranium salts left on top of a photographic plate wrapped in black paper produced an image of themselves when the plate was developed. The salts were emitting some sort of invisible radiation that could penetrate the paper and expose the plate. Marie Curie and her husband Pierre, along with others, continued the study of this phenomenon, which the Curies dubbed radioactivity, identifying different sorts of radiation and isolating two new radioactive elements, radium and polonium. Radium, in particular, was intensely radioactive – more than a million times more so than the same mass of uranium – and to those early pioneers it exhibited a fascinating property: it was hot, all the time, irrespective of the surrounding conditions. The metal appeared to be breaking the First Law of Thermodynamics. Where was all the heat coming from?

  That question was answered in the following decade by Albert Einstein as a consequence of his theory of Special Relativity. His famous equation E = mc² broadened the First Law of Thermodynamics to include matter: energy can seem to disappear if it is converted into matter, and matter can likewise be transformed into energy. Because the speed of light (c in the equation) is such a large number, a very small mass (m) of matter converts into a huge amount of energy (E).
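
  A worked example gives a sense of the scale (round figures assumed here, not quoted in the book): converting a single gram of matter entirely into energy yields

  \[ E = mc^2 = (10^{-3}\,\mathrm{kg})\,(3\times10^{8}\,\mathrm{m\,s^{-1}})^2 = 9\times10^{13}\,\mathrm{J}, \]

  roughly the energy released by burning 3,000 tonnes of coal.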

  Scientists soon realised that the substances studied by Becquerel, the Curies and others are radioactive because their atoms are unstable: over time their nuclei split apart into other, smaller nuclei. With each decay, a tiny fraction of the nucleus’s mass is converted into energy, explaining the heat produced by radium and the rays that were blackening photographic plates. What those researchers didn’t know was that the radiation could endanger their health. Marie Curie carried samples around in her pockets and kept them in her desk, enjoying the blue-green light they gave off in the dark. She died in 1934 of aplastic anaemia, almost certainly as a result of her exposure to radiation. Today, her notebooks and even her cookbook from the 1890s are considered too dangerous to handle without protective clothing and are kept in lead-lined boxes.

  Scientists almost immediately began to wonder whether radioactivity was the source of the Sun’s heat. But observations of the Sun showed that it didn’t contain much radioactive material. It was mostly made of hydrogen, the smallest and lightest element, which couldn’t decay into anything smaller.

  The decisive clue to the source of the Sun’s energy was provided by the British chemist Francis Aston who, in 1920, was trying to prove the existence of isotopes, versions of an element that have different masses but identical chemical properties. Later he proved that isotopes do exist and, significantly, that their masses are always rough multiples of the mass of hydrogen. So the most common isotope of carbon has the mass of roughly twelve hydrogens, but there are also isotopes of carbon that weigh thirteen and fourteen hydrogens. Although it wasn’t known at the time, this is because atomic nuclei are made up of protons and neutrons, which have roughly the same mass. The normal hydrogen nucleus is just a single proton, while the carbon nucleus has six protons plus six, seven or eight neutrons. But in 1920, as part of his search for isotopes, Aston took a number of different elements and made very precise measurements of the masses of their atoms. As everyone expected, the mass of helium, the second-smallest nucleus, was around four times the mass of hydrogen. Aston’s measurements were so accurate, however, that it was possible to conclude that while helium’s mass was close to that of four hydrogens, it wasn’t exactly the same – helium weighed slightly less than four hydrogens.
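
  With modern atomic masses (not Aston’s original figures) the deficit works out as follows:

  \[ 4\,m_{\mathrm{H}} \approx 4 \times 1.008\,\mathrm{u} = 4.032\,\mathrm{u}, \qquad m_{\mathrm{He}} \approx 4.003\,\mathrm{u}, \qquad \frac{4.032 - 4.003}{4.032} \approx 0.7\%, \]

  so about 0.7 per cent of the mass goes missing when four hydrogens become one helium – close to the figure implied by Aston’s measurements.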

  At that time, it was thought that helium really was made from four hydrogens, so the fact that the masses were slightly different was significant. One who spotted the importance of this result was Arthur Eddington of Cambridge University, one of the leading astrophysicists of the day. Eddington was an enthusiastic early advocate of the theory of Relativity and maintained contact with Einstein during the First World War when most British scientists shunned any contact with their German colleagues. Eddington was a committed pacifist and when he was called up for military service in 1918 he refused, risking prison. Prominent scientists rallied to his cause and the Astronomer Royal, Frank Watson Dyson, argued that his expertise was essential to an experiment they were to carry out to put Relativity to the test.

  The scientists’ pleas won the day and in 1919 Eddington travelled to the island of Principe off the west coast of Africa to observe the solar eclipse of 29th May, one of two eclipse expeditions organised with Dyson. One of the predictions of Einstein’s General Theory of Relativity is that the gravity of a massive object will deflect the path of a beam of light. The object has to be very, very massive for this weak effect to show up, and the aim was to use the Sun. With the Sun’s light blocked out by the moon during the eclipse, it would be possible to photograph stars whose light passes close to the Sun. If gravity does indeed bend beams of light, those stars would appear shifted slightly from their usual positions, their light curved on its way past the Sun. The stars did appear to move, and when Eddington and Dyson revealed the results back in Britain the news was reported around the world as the first conclusive proof of the truth of Relativity. Eddington, and Einstein, became household names.

  At that time, Eddington was also working on a theoretical model of the interior conditions of stars, even though the source of their energy was still unknown. Some still adhered to Kelvin and Helmholtz’s gravitational explanation, but Eddington was convinced that some kind of nuclear process was more likely. As a result, he seized on the measurements made by Aston and in August 1920, in an address to the British Association for the Advancement of Science, proposed a new theory. He suggested that in the searing heat at the centre of the Sun hydrogen atoms fuse to form helium and that the loss of mass in this process, measured by Aston and converted into energy, could be the Sun’s energy source. Eddington estimated that if 5% of the Sun’s mass is hydrogen (we now know that it is actually around 75%) and if, according to Aston, 0.8% of the hydrogen’s mass is converted into energy during fusion, then the Sun – at its current rate of heat production – would last about 15 billion years. He added, somewhat prophetically:

  If, indeed, the subatomic energy in the stars is being freely used to maintain their great furnaces, it seems to bring a little nearer to fulfilment our dream of controlling this latent power for the well-being of the human race – or for its suicide.

  If the Sun had indeed burned for billions of years, that gave scientists – be they evolutionists, geologists or astrophysicists – all the time they needed.

  Eddington continued to try to incorporate nuclear reactions into his theory of the interior of stars, but it was still beset by problems. Although Aston had shown that combining four hydrogens to produce a helium freed up mass to convert into energy, no one knew how to make that process happen. And of the nuclear reactions that could be performed in the laboratory, none released enough energy to power the Sun.

  Another worry was that, according to the classical physics that prevailed at the time, hydrogen nuclei simply would not fuse. To make hydrogen react it would be necessary to strip off its negatively charged electron, leaving just the tiny exposed nucleus with its positive charge. For fusion, two such nuclei must slam into each other with such force that they get close enough together that it becomes more advantageous for them to merge than to fly apart again. It’s similar to what happens when two droplets of water are pushed together: at first they seem to try to stay separate, as if surrounded by elastic skins, even though they are squashed up against each other, until eventually the best way to relieve the pressure is to merge into a single drop. The problem with two nuclei is that they carry the same positive electric charge and so repel each other – just as bringing the same poles of two magnets together creates a repulsive force – and the closer they get, the stronger the repulsion. Classical physics predicted that it was virtually impossible to bring two nuclei close enough together to fuse.
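
  A rough estimate shows the size of the problem (modern textbook values, not the book’s): two bare hydrogen nuclei must approach to within a few femtometres before the attractive nuclear force can take over, and the electrical repulsion energy at that separation is

  \[ U = \frac{e^2}{4\pi\varepsilon_0 r} \approx \frac{(9\times10^{9})\,(1.6\times10^{-19})^2}{2\times10^{-15}}\,\mathrm{J} \approx 1.2\times10^{-13}\,\mathrm{J} \approx 0.7\,\mathrm{MeV}, \]

  whereas the typical thermal energy of a particle even at fifteen million degrees is only about \(k_B T \approx 1\,\mathrm{keV}\) – hundreds of times too small, which is why classical physics ruled fusion out.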

  In the 1920s, however, a new show arrived in town: quantum mechanics. In quantum mechanics there are fewer yes-or-no answers and more probabilities. Things that are impossible in classical physics are allowed by quantum mechanics; they just have low probabilities of happening. In 1928 a young Russian physicist called Georgii Gamow became the first to apply quantum mechanics to nuclear reactions. He reasoned that it was not impossible for two nuclei to get close enough to fuse, and he developed a formula for the probability of such a reaction.
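
  The key quantity in Gamow’s treatment is the probability of ‘tunnelling’ through that electrical barrier. In its standard modern form (a sketch, not Gamow’s original notation), for two nuclei with charges Z₁e and Z₂e approaching at relative speed v it is roughly

  \[ P \sim e^{-2\pi\eta}, \qquad \eta = \frac{Z_1 Z_2 e^2}{4\pi\varepsilon_0\,\hbar v}, \]

  tiny but not zero, and rising steeply as the collision speed – and hence the temperature – goes up.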

  Using Gamow’s formula, Fritz Houtermans, then at the University of Göttingen in Germany, and Welsh-born astronomer Robert Atkinson began to look at what might happen to nuclei knocking around together under the sort of conditions that Eddington predicted would exist in the heart of the Sun. The two scientists complemented each other perfectly for this task: Houtermans was an experimental physicist who had worked with Gamow at Göttingen and knew about the application of quantum mechanics to the nucleus but not about the interior of the Sun; Atkinson knew all about Eddington’s theory of the Sun but little about quantum mechanics. They calculated that under Eddington’s predicted conditions there would be a healthy rate of reactions between colliding hydrogen nuclei. Their 1929 paper on the topic is thought by many to be the starting point for thermonuclear fusion energy research.

  It was now time for experiments to take centre stage. During a visit to Cambridge, Gamow discussed his work on the quantum mechanics of nuclei with a young physicist at the Cavendish Laboratory there called John Cockcroft. This spurred Cockcroft to develop, along with his colleague Ernest Walton, a device for accelerating hydrogen nuclei or, as they were then becoming known, protons. Protons are a constituent part of all nuclei, along with neutrons; hydrogen, the simplest nucleus, is made up of a single proton. Cockcroft believed that if Gamow was right, his machine would be able to accelerate protons to a high enough speed that, when they collided with other nuclei, some fusion reactions might take place.

  By 1932, Cockcroft and Walton had built their accelerator and used it to fire protons at a sample of the metal lithium. They found that at relatively modest energies the protons were able to penetrate the lithium nuclei and split each one into two helium nuclei. This was hailed as a triumph at the time: the very first ‘splitting of an atom’. Many years later, the pair would share a Nobel Prize for the achievement. It was also significant because this was the first reaction to produce a large amount of energy: the resulting pair of helium nuclei carried more than 100 times the energy of the proton that caused the reaction. The reaction was far too hard to bring about to explain the Sun’s output or to serve as a practical energy source, but it at least showed that it was possible to liberate a lot of energy from nuclei.
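
  In modern notation, the reaction and its energy balance (standard values, not figures quoted in the book) are

  \[ \mathrm{p} + {}^{7}\mathrm{Li} \;\rightarrow\; {}^{4}\mathrm{He} + {}^{4}\mathrm{He} + 17.3\,\mathrm{MeV}, \]

  and since the incoming protons carried only a few hundred keV, each successful collision did indeed give back more than a hundred times the energy put in.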

  Back at the Cavendish Laboratory, a colleague of Cockcroft’s called Mark Oliphant made some improvements to the design of the Cockcroft-Walton accelerator so that it could separate out and accelerate deuterium nuclei. Deuterium, with an extra neutron in its nucleus, is chemically identical to hydrogen but roughly twice as heavy. This similarity makes it very hard to distinguish from ordinary hydrogen – its existence had only been confirmed a couple of years earlier, by the American chemist Harold Urey. The American physical chemist Gilbert Lewis had only just managed, in 1933, to separate out a usable quantity of so-called heavy water, made from oxygen and deuterium rather than ordinary hydrogen. As soon as he had enough, Lewis sent a sample over to Ernest Rutherford, the formidable director of the Cavendish Laboratory.

  Rutherford was a towering figure in early twentieth-century physics, having earned a Nobel Prize in 1908 for the discovery that natural radioactivity was due to atoms disintegrating and that it produced two different sorts of radiation. In 1911 he overturned the prevailing ‘plum pudding’ model of the atom – that it was a positively charged ball peppered with negatively charged electrons – by proving that an atom has a tiny but dense nucleus with electrons orbiting around it, a description that still holds true today. Rutherford had a domineering personality and a very loud voice, and ran the Cavendish as his personal fiefdom. He had overseen the work of Cockcroft and Walton and now, with Oliphant, he was going to see what he could do with deuterium.

  Rutherford and Oliphant fired deuterium nuclei, or deuterons, at lithium and a number of other elements to see what nuclear reactions they could cause. Eventually they collided deuterium with deuterium and found two different reactions: one producing an isotope of helium known as helium-3 (two protons and a neutron) and the other an even heavier isotope of hydrogen (one proton and two neutrons) that would eventually be called tritium. Both of these reactions produced excess energy, roughly ten times that of the incoming deuteron. But Rutherford, for one, was not convinced that this method could ever produce useful amounts of energy because, although each individual reaction released energy, only around one in every 100 million accelerated protons or deuterons actually caused a reaction, so overall there was a huge energy loss. Rutherford famously said at the time:

  The energy produced by the breaking down of the atom is a very poor kind of thing. Anyone who expects a source of power from the transformation of these atoms is talking moonshine.
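
  For reference, the two deuterium-deuterium reactions described above are, in modern notation and with standard energy yields (not figures quoted in the book),

  \[ \mathrm{D} + \mathrm{D} \;\rightarrow\; {}^{3}\mathrm{He} + \mathrm{n} + 3.3\,\mathrm{MeV}, \qquad \mathrm{D} + \mathrm{D} \;\rightarrow\; \mathrm{T} + \mathrm{p} + 4.0\,\mathrm{MeV}, \]

  each releasing a few MeV against incoming deuteron energies of a few hundred keV – energy-rich per reaction, but, as Rutherford’s ‘moonshine’ remark stresses, far too rare per accelerated particle to give any net gain.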