Today, ultra-fast computers are leading the way in scientific discovery.
A billion-billion floating-point operations per second: that's the power of exascale. The first exascale computer in the world, Frontier, resides at the Department of Energy (DOE) Office of Science Oak Ridge Leadership Computing Facility. The DOE's Office of Science Advanced Scientific Computing Research program has worked for decades to build supercomputers that break barriers in scientific discovery.
Throw in the latest improvements to code, the engine that lets these powerhouse machines compute, and the possibilities are endless.
Three sets of recently updated exascale codes, Cholla, HACC, and Parthenon, now allow researchers to explore virtual domains of the cosmos that were previously far beyond the reach of science. It's the culmination of years of hard work through a variety of funding mechanisms and partnerships. That includes SciDAC (Scientific Discovery through Advanced Computing), a partnership involving all six of the DOE Office of Science programs: Advanced Scientific Computing Research, Basic Energy Sciences, Biological and Environmental Research, Fusion Energy Sciences, High-Energy Physics, and Nuclear Physics. The Office of Nuclear Energy participates as well. Together, these partnerships of applied mathematicians, computer scientists, and scientists from other disciplines aim to dramatically accelerate progress in scientific computing and deliver breakthrough scientific results.
“The new and improved updated astrophysical codes provide some of the clearest demonstrations of the most empowering features of exascale computing for science,” said Bronson Messer, a computational astrophysicist at the DOE’s Oak Ridge National Laboratory. “All these teams are simulating an array of physical processes happening on scales ranging over many orders of magnitude — from the size of stars to the size of the universe — while incorporating feedback between one set of physics to others and vice versa. They represent some of the most challenging problems that will be attacked on Frontier, and I expect the results to be remarkably impactful.”
It’s a leap forward in advancing the understanding of the universe with models of unprecedented scale and resolution.
In this case, researchers are using Frontier to make sense of astronomers' observational data, which is already detailed and quite complex. As telescopes become more powerful, computational tools must become more sophisticated to keep pace.
Astrophysicists build virtual mockups of galaxies and of the universe as a whole. They then compare these simulations to observational data. If the simulations and the measurements don't match, there is a disparity to resolve.
Simulating, or modeling, these different scenarios requires a tremendous amount of computing power and storage. Building on previous generations of supercomputers, exascale machines now allow researchers to expand both the volumes and the physical processes these comparisons require.
It's one more step toward understanding the universe, from our solar system to the farthest reaches of the cosmos.
We can't yet simulate stars, galaxies, or the entire universe in enough detail to answer every question about what we see. But exascale computers, and the codes written for them, are making a huge difference in our understanding.