At the Department of Energy’s Oak Ridge National Laboratory, scientists use artificial intelligence, or AI, to accelerate the discovery and development of materials for energy and information technologies.
The Comet supercomputer will end formal service as an NSF resource and transition to exclusive use by the Center for Western Weather and Water Extremes, which will use its computing capabilities to improve decision-making for reservoir management in California.
ORNL story tips: Volcanic microbes, unbreakable bonds and flood mapping
Connected moments math shortcut shaves time and cost of quantum calculations while maintaining accuracy
A Georgia State University team used the nation’s fastest supercomputer, Summit at the US Department of Energy’s Oak Ridge National Laboratory, to find the optimal transition path an E. coli enzyme uses to switch between building DNA and editing it to rapidly remove misincorporated pieces.
PNNL, in partnership with industry, has developed a computational tool called HIPPO, which accelerates the increasingly complex calculations grid operators must make in scheduling energy resources to meet the next day’s forecasted electricity demand.
A new Physics Frontier Center at UC Berkeley, supported by the National Science Foundation, expands the reach and depth of existing capabilities on campus and at neighboring Berkeley Lab in modeling one of the most violent events in the universe: the merger of neutron stars and its explosive aftermath.
A team used the Summit supercomputer to simulate transition metal systems—such as copper bound to molecules of nitrogen, dihydrogen, or water—and correctly predicted the amount of energy required to break apart dozens of molecular systems, paving the way for a greater understanding of these materials.
A team at Stanford University used the OLCF’s Summit supercomputer to compare simulations of a G protein-coupled receptor with different molecules attached to gain an understanding of how to minimize or eliminate side effects in drugs that target these receptors.
A team of scientists led by Abhishek Singharoy at Arizona State University used the Summit supercomputer at the Oak Ridge Leadership Computing Facility to simulate the structure of a possible drug target for the bacterium that causes rabbit fever.
Major upgrades of particle detectors and electronics prepare CERN experiment to stream a data tsunami
For an experiment that will generate big data at unprecedented rates, physicists led the design, development, mass production and delivery of an upgrade of novel particle detectors and state-of-the-art electronics.
Fighting COVID with computing: Fermilab, Brookhaven, Open Science Grid dedicate computational power to COVID-19 research
Scientists and engineers at Fermilab and Brookhaven are uniting with other organizations in the Open Science Grid to help fight COVID-19, dedicating considerable computational power to researchers studying how to combat the disease.
Candace Culhane, a program/project director in Los Alamos National Laboratory’s Directorate for Simulation and Computation, has been selected as the general chair for the 2022 SC Conference (SC22).
British Petroleum researchers invited ORNL data scientists to give the company’s high-performance computing team a tutorial on the laboratory’s ADIOS I/O middleware. ADIOS has helped researchers achieve scientific breakthroughs by providing a simple, flexible way to describe data in their code that may need to be written, read, or processed outside of the running simulation. ORNL researchers Scott Klasky and Norbert Podhorszki demonstrated how ADIOS could help the BP team accelerate their science by tackling their large, unique seismic datasets.
In this Q&A, Oak Ridge National Laboratory’s Dan Jacobson talks about his team’s work on a genomic selection algorithm, his vision for the future of environmental genomics, and the space where simulation meets AI.
A team at Georgia Tech created a new turbulence algorithm optimized for the Summit supercomputer. The algorithm achieved less than 15 seconds of wall-clock time per time step for more than 6 trillion grid points, a new world record for a problem of this size.
A team used the Summit supercomputer to simulate a 10,000-atom magnesium dislocation system at 46 petaflops, a feat that earned the team an ACM Gordon Bell Prize finalist nomination and could allow scientists to determine which elements to add to improve magnesium alloys.
A team simulated a 10,000-atom 2D transistor slice on the Summit supercomputer and mapped where heat is produced in a single transistor. Using a new data-centric version of the OMEN nanodevice simulator, the team sustained the code at 85.45 petaflops and earned a Gordon Bell Prize finalist nomination.