Cold War nuclear tests did change the weather in the 1960s. The Earth did not catch fire, but a hard rain did begin to fall.
LONDON, 19 May, 2020 – Sixty years on, British scientists have confirmed a once-popular belief: that atmospheric tests of early nuclear weapons affected the daily weather. A new study of weather records from 1962 to 1964 reveals the signature of experimental atomic and thermonuclear explosions during the early days of the Cold War.
The scientists compared measurements of atmospheric electric charge with cloud data and found that on days when radioactively generated charge was higher, clouds were thicker and up to a quarter more rain fell than on days when charge was low.
The climate impact of nuclear detonations may not have been as devastating as many people feared at the time, and some good came of the tests: researchers who studied radiation distribution as it spread around the planet from weapons test sites built up a body of data that delivered a new way to follow atmospheric circulation patterns.
“We have now re-used this data to examine the effect on rainfall,” said Giles Harrison of the University of Reading in the UK. “The politically charged atmosphere of the Cold War led to a nuclear arms race and worldwide anxiety. Decades later, that global cloud has yielded a silver lining, in giving us a unique way to study how electric charge affects rain.”
Between 1945 and 1980 US, Soviet, British and French governments exploded 510 megatons of nuclear weaponry underground, under water and in the lower and upper atmosphere. Of this, 428 megatons – the equivalent of 29,000 bombs of the size dropped onto Hiroshima in Japan at the end of the Second World War – was in the open air, and the greatest concentration of tests was in the late 1950s and early 1960s.
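The article's equivalence can be checked with back-of-envelope arithmetic. A minimal sketch, assuming the standard estimate of roughly 15 kilotons for the Hiroshima bomb's yield (a figure not given in the article itself):

```python
# Rough check of the article's figures. The ~15 kt Hiroshima yield is an
# assumed standard estimate; the 428 Mt open-air total is from the article.
atmospheric_megatons = 428
hiroshima_kilotons = 15  # approximate yield of the Hiroshima bomb

equivalent_bombs = atmospheric_megatons * 1000 / hiroshima_kilotons
print(round(equivalent_bombs))  # about 28,500 - consistent with "29,000"
```

The result lands close to the article's round figure of 29,000, which suggests a Hiroshima yield of about 15 kilotons was the basis for the comparison.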
Scientists began to collect strontium-90 isotopes and other radioactive fission products in the rain that fell after such tests. By 1960, people in Europe and the US could be heard grumbling about the supposed impact on the weather of tests carried out 10,000 kilometres away.
British cinemagoers were treated to an improbable vision of climate catastrophe triggered by nuclear tests in the 1961 film The Day the Earth Caught Fire. The US government commissioned the Rand Corporation to deliver an inconclusive report in 1966 on the effect upon weather, but by then an international treaty had banned tests in the atmosphere, in the water and in space.
Very slowly, public concern about radioactive fallout and its consequences for the weather began to fade.
Scientists continued to contemplate the climate effects of nuclear confrontation in other ways: in 1983 US researchers proposed a possible nuclear winter, triggered by radioactive mushroom clouds from burning cities that would reach the stratosphere and dim the sun’s light for a decade.
But long before then, peace and prosperity had created another climatic danger: the accelerating combustion of fossil fuels had begun to raise atmospheric greenhouse gas levels to trigger global warming, and climate scientists began to adopt nuclear yardsticks to measure the effect.
“The atmospheric conditions of 1962-64 were exceptional and it is unlikely they will be repeated, for many reasons”
One calculation is that by flying in jet planes or driving cars or generating electric power, humankind is now adding the equivalent in heat energy of five Hiroshima explosions every second to the world’s atmosphere, thus inexorably altering the global climate.
That has not stopped other scientists from worrying about the chilling effects upon climate and human civilisation of even a limited nuclear exchange. But the supposed impact of bursts of nuclear radiation upon the weather has been more or less forgotten.
Now Professor Harrison and colleagues have returned to the puzzle in the journal Physical Review Letters. They found that the answer could be disentangled from weather records collected at Kew, near London, and 1,000 km away at Lerwick in the Shetland Islands, north-east of Scotland – a site selected because it would be least affected by soot, sulphur particles and other kinds of industrial pollution.
Nuclear radiation ionises the matter in its path to create electrically charged atoms and molecules. Electric charge changes the way water droplets in clouds collide and combine – think of dramatic thunderstorms, lightning and torrential rain – and this affects the size of the droplets and the volume of rain: that is, the rain doesn’t fall at all until the droplets grow big enough.
Usually, the sun does most of the work, but by comparing the weather records from the two stations, the researchers were for the first time able to factor in the contribution of Cold War test explosions – in the Nevada desert, the Siberian Arctic and the faraway south Pacific – to Scottish rainfall between 1962 and 1964.
They identified 150 days on which atmospheric electricity was distinctly high or low while it was cloudy at Lerwick, and found a difference in precipitation between the two sets of days which, they say, disappeared once the build-up of radioactive fallout had vanished.
Their statistical analyses suggest no serious or lasting change, but the connection was there: where radioactivity was high, rainfall increased from 2.1mm per day to 2.6mm – a 24% increase in daily rain. Clouds, too, were thicker.
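The percentage reported above follows directly from the two daily-rainfall figures. A minimal check, using only the numbers quoted in the article:

```python
# Rainfall figures as reported: mm per day on low-charge vs high-charge days.
low_charge_rain = 2.1
high_charge_rain = 2.6

increase_pct = (high_charge_rain - low_charge_rain) / low_charge_rain * 100
print(f"{increase_pct:.0f}%")  # 24%
```

The rise from 2.1 mm to 2.6 mm is about 23.8%, which the article rounds to 24% – in line with the earlier description of "up to a quarter more rain".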
The study stands as one more piece of the climate jigsaw, as a test of measuring technique, and as one more reminder of the lessons still to be learned from the Cold War.
It confirms a deepening understanding of the intricate machinery that delivers the first drops of rain, and ideally scientists won’t get many chances to test their understanding in the same way again.
The authors conclude, in the clipped tones favoured by research publications: “The atmospheric conditions of 1962-64 were exceptional and it is unlikely they will be repeated, for many reasons.” – Climate News Network