New Twists in the Intertwining of HPC and AI

The alliance between high performance computing and artificial intelligence is proving fruitful for researchers looking to accelerate scientific discovery in fields ranging from climate prediction and genomics to particle physics and drug discovery. The driving idea is to use AI, and machine learning in particular, to augment the predictive capacity of numerical simulations.

The Department of Energy has been particularly active in integrating traditional HPC with AI, in no small part because it operates some of the most powerful supercomputers on the planet. Some of these machines, like the “Summit” system at Oak Ridge National Laboratory and the “Sierra” and “Lassen” systems at Lawrence Livermore National Laboratory, are heavily laden with GPUs, the compute engine of choice for training neural networks. And that’s encouraging researchers with access to those systems to explore the utility of AI in their particular areas of interest.

Lawrence Livermore has been pushing ahead with these technologies on a number of fronts and recently won an HPC Innovation Excellence Award from Hyperion Research for work that applied machine learning to a fusion simulation problem. Specifically, the LLNL team trained a neural network model to help understand the results of Inertial Confinement Fusion (ICF) simulations in order to predict the behavior of fusion implosions. They ran 60,000 simulations on the Trinity supercomputer at Los Alamos National Laboratory and fed the results into their training workflow.
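To make the surrogate-modeling idea concrete, here is a minimal sketch of what training a neural network on a batch of simulation results can look like. Everything in it is an assumption for illustration: the parameter count, network shape, and random stand-in data are hypothetical, and the excerpt does not describe the architecture or inputs of LLNL’s actual model.

```python
import torch
import torch.nn as nn

# Hypothetical surrogate: maps simulation input parameters (e.g., drive
# and capsule design variables) to a predicted implosion outcome (e.g.,
# yield). Dimensions are illustrative, not taken from the LLNL workflow.
N_PARAMS, N_OUTPUTS = 9, 1

surrogate = nn.Sequential(
    nn.Linear(N_PARAMS, 64),
    nn.ReLU(),
    nn.Linear(64, 64),
    nn.ReLU(),
    nn.Linear(64, N_OUTPUTS),
)

# Random stand-in for the 60,000 simulation results; in practice each
# row would be one completed run's inputs paired with its outcome.
inputs = torch.rand(60_000, N_PARAMS)
outcomes = torch.rand(60_000, N_OUTPUTS)

optimizer = torch.optim.Adam(surrogate.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(10):
    # Mini-batch regression over the simulation dataset.
    for i in range(0, len(inputs), 1024):
        x, y = inputs[i:i + 1024], outcomes[i:i + 1024]
        optimizer.zero_grad()
        loss = loss_fn(surrogate(x), y)
        loss.backward()
        optimizer.step()

# Once trained, the surrogate predicts outcomes for new parameter
# settings far faster than running a full simulation.
with torch.no_grad():
    prediction = surrogate(torch.rand(1, N_PARAMS))
```

The appeal of this pattern is speed: a full ICF simulation is expensive, while a trained surrogate evaluates new parameter settings in milliseconds, making it practical to explore the design space far more densely.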

Read the full article, by Michael Feldman, on TheNextPlatform.com.
