How the US nuclear test moratorium started a supercomputing revolution

Thirty years ago, on September 23, 1992, the United States conducted its 1,054th nuclear weapons test.

When this test, named Divider, was detonated underground in the Nevada desert that morning, no one knew it would be the last American nuclear test for at least the next three decades. By then, the Soviet Union had formally dissolved, and the United States government declared what was seen at the time as a short-term moratorium on testing, one that continues today.

This moratorium came with an unexpected benefit: the end of nuclear testing ushered in a revolution in high-performance computing, one with far-reaching national and global security impacts of which few are aware. Maintaining our nuclear weapons in the absence of testing has created an unprecedented demand for scientific computing power.

At Los Alamos National Laboratory in New Mexico, where the first atomic bomb was built, our primary mission is to maintain and verify the safety and reliability of the nuclear stockpile. To do this, we use non-nuclear and subcritical experiments coupled with advanced computer modeling and simulations to assess the health and extend the life of US nuclear weapons.

But as we all know, the geopolitical landscape has changed in recent years, and while nuclear threats still loom, a host of other emerging crises threaten our national security.

Pandemics, sea level rise and coastal erosion, natural disasters, cyberattacks, the spread of disinformation, energy shortages: we have seen firsthand how these events can destabilize nations, regions and the world. At Los Alamos, the high-performance computing developed over decades to simulate nuclear weapon explosions with extraordinarily high fidelity is now being used to address these threats as well.

When the Covid pandemic first took hold in 2020, our supercomputers were used to help predict the spread of the disease, model vaccine deployment and the impact and spread of variants, identify counties at high risk of vaccine hesitancy, and assess various vaccine distribution scenarios. They also helped model the effect of public health orders, such as face mask mandates, on stopping or slowing the spread.

This same computing power is used to better understand DNA and the human body at a fundamental level. Los Alamos researchers have created the largest simulation to date of an entire gene of DNA, a feat that required modeling a billion atoms and will help researchers better understand, and develop cures for, diseases such as cancer.

What are Los Alamos supercomputers used for?

The Laboratory also uses the power of secure and classified supercomputers to examine the national security implications of climate change. For years, our climate models have been used to predict Earth’s responses to change with ever-increasing resolution and accuracy, but their usefulness to the national security community has been limited. That is changing, thanks to recent advances in modeling, increases in resolution and computing power, and the coupling of climate models with infrastructure and impact models.

We can now use our computing power to examine climate change at extraordinarily high resolution in areas of interest. Because the work is done on secure computers, we don’t reveal to potential adversaries exactly where (and why) we are looking. Additionally, these supercomputers allow us to incorporate classified data into the models, which can further increase their accuracy.

Los Alamos supercomputers are also used for earthquake prediction, coastal erosion impact assessment, wildfire modeling, and a host of other national security challenges. We also use supercomputers and data analytics to optimize our nonproliferation threat detection efforts.

Of course, our Laboratory is not alone in this effort. Other Department of Energy labs are using their intensive computing power to tackle similar and additional challenges. Private companies pushing the boundaries of computing are likewise helping to advance national security-focused computing efforts, as are our nation’s top universities. As the saying goes, a rising tide lifts all boats.

And we have the moratorium on nuclear weapons testing, at least in part, to thank. We did not know 30 years ago how much we would benefit from the supercomputing revolution that followed. As a nation, continuing to invest in supercomputing not only ensures the safety and efficiency of our nuclear stockpile, but also advances scientific exploration and discovery that benefits everyone. Our national security depends on it.

Bob Webster is the deputy director for weapons at Los Alamos National Laboratory. Nancy Jo Nicholas is the associate director for global security, also at Los Alamos.

Have an opinion?

This article is an op-ed, and the opinions expressed are those of the authors. If you would like to respond or would like to submit your own editorial, please email Cary O’Reilly, C4ISRNET Senior Editor.
