To meet the needs of tomorrow’s supercomputers, the National Nuclear Security Administration’s (NNSA’s) Lawrence Livermore National Laboratory (LLNL) has broken ground on its Exascale Computing Facility Modernization (ECFM) project, which will substantially upgrade the mechanical and electrical capabilities of the Livermore Computing Center.
To assist in the COVID-19 research effort, Lawrence Livermore National Laboratory, Penguin Computing and AMD have reached an agreement to upgrade the Lab’s unclassified, Penguin Computing-built Corona high performance computing (HPC) cluster with an in-kind contribution of cutting-edge AMD Instinct™ accelerators, expected to nearly double the peak performance of the machine.
Profiled is Mitch Allmond of Oak Ridge National Laboratory, who conducts experiments and uses theoretical models to advance our understanding of the structure of atomic nuclei.
Lawrence Berkeley National Laboratory’s decades of leadership in designing and enhancing energy-efficient data centers is being applied to NERSC supercomputing resources through a collaboration that is using operational data analytics to optimize cooling systems and save electricity.
Globus, a leading research data management service, today announced the general availability of Globus for Google Cloud, a new solution for accessing and managing data stored in Google Cloud object storage.
Los Alamos National Laboratory and Arm are teaming up to make efficient, workload-optimized processors tailored to the extreme-scale computing requirements of the Laboratory’s national-security mission.
Profiled is physicist Gaute Hagen of the Department of Energy’s Oak Ridge National Laboratory, who runs advanced models on powerful supercomputers to explore how protons and neutrons interact to “build” an atomic nucleus from scratch.
What will scientific computing at scale look like in 2030? With the impending demise of Moore’s Law, there are still…