[Photo: Nuclear test explosion in Mururoa atoll, French Polynesia, in 1971. Photograph: AFP/Getty Images]
Nuclear explosion tests conducted during the Cold War altered space weather, including the Earth’s magnetic environment, according to a Nasa study that examined newly declassified data.
The study concludes that Cold War-era nuclear tests created layers of artificial radiation belts resembling the natural belts generated by the sun. The tests also produced geomagnetic disturbances in space so extreme that they could be compared to the energy of 10 million atomic bombs.
Not surprisingly, the Soviet Union and the US caused most of the human-made space weather disturbances between 1958 and 1962, when both countries conducted high-altitude nuclear tests.
“The tests were a human-generated and extreme example of some of the space weather effects frequently caused by the sun,” says Dr Phil Erickson, an observatory director at the Massachusetts Institute of Technology (MIT) and co-author of the research.
The Argus tests, for instance, were conducted at higher altitudes than previous nuclear tests, pushing harmful particles much further into space. After the tests, geomagnetic storms were observed from Sweden to Arizona.
Although atmospheric nuclear testing ended in 1962 and the present space environment is no longer dominated by artificial radiation, the data helps scientists understand how such tests would affect activities on Earth. If similar tests were carried out today, scientists say, they could cause $600bn (£460bn) to $2.6tn in damage to the US alone, stalling much of the internet, taking down satellite communications and knocking out most of the global electricity grid. – IBTimes