Argonne National Laboratory (Argonne), in collaboration with Oak Ridge National Laboratory (ORNL), has awarded Codeplay a contract to implement the oneAPI DPC++ compiler, an implementation of the SYCL open standard, to support AMD GPU-based high-performance computing (HPC) supercomputers.
A Lawrence Livermore National Laboratory team has taken a closer look at how nuclear weapon blasts close to the Earth's surface complicate their effects and apparent yields. Attempts to correlate data from events with low heights of burst revealed a need to improve the theoretical treatment of strong blast waves rebounding from hard surfaces.
A team performed simulations on the Summit supercomputer and found that electrons in cuprates interact with phonons much more strongly than was previously thought, leading to experimentally observed “kinks” in the relationship between an electron’s energy and the momentum it carries.
To rapidly advance the field of artificial intelligence and autonomy, FAU’s College of Engineering and Computer Science recently unveiled its “Center for Connected Autonomy and Artificial Intelligence.”
Physicists have been studying the question of how supernova explosions occur for more than 60 years. Thanks to the increasing power of supercomputing resources such as those at the National Energy Research Scientific Computing Center at Lawrence Berkeley National Laboratory, they’re moving ever closer to an answer.
A team used two DOE supercomputers to complete simulations of the full-power ITER fusion device and found that the component that removes exhaust heat from ITER may be more likely to maintain its integrity than extrapolations from current fusion devices had predicted.
Using the nation’s fastest supercomputer, Summit at Oak Ridge National Laboratory, a team of nuclear physicists developed a promising method for measuring quark interactions in hadrons and applied the method to simulations using quarks with close-to-physical masses.
The UC Santa Cruz professor uses computing resources at Brookhaven Lab’s Center for Functional Nanomaterials to run calculations for quantum information science, spintronics, and energy research.
Responding to COVID-19 has required a huge coordinated effort from the scientific community. The Department of Energy’s Office of Science has spearheaded several scientific efforts, including the National Virtual Biotechnology Laboratory.
Kalyan R S Perumalla is a Distinguished Research and Development Staff Member at Oak Ridge National Laboratory, whose work on reversible computing for exascale computers also provides insights applicable to next generation programming.
A Georgia State University team has used the nation’s fastest supercomputer, Summit at the US Department of Energy’s Oak Ridge National Laboratory, to find the optimal transition path that one E. coli enzyme uses to switch between building and editing DNA to rapidly remove misincorporated pieces of DNA.
A machine learning model developed by a team of Lawrence Livermore National Laboratory (LLNL) scientists to aid in COVID-19 drug discovery efforts is a finalist for the Gordon Bell Special Prize for High Performance Computing-Based COVID-19 Research.
ORNL story tips: Ice breaker data, bacterial breakdown, catching heat and finding order
Lawrence Livermore National Laboratory and its partners AMD, Supermicro and Cornelis Networks have installed a new high performance computing (HPC) cluster with memory and data storage capabilities optimized for data-intensive COVID-19 research and pandemic response.
Lawrence Livermore National Laboratory (LLNL) has installed a state-of-the-art artificial intelligence (AI) accelerator from SambaNova Systems, the National Nuclear Security Administration (NNSA) announced today, allowing researchers to more effectively combine AI and machine learning (ML) with complex scientific workloads.
As the future home to the Aurora exascale system, Argonne National Laboratory has been ramping up efforts to ready the supercomputer and its future users for science in the exascale era.
With funding from the Coronavirus Aid, Relief and Economic Security (CARES) Act, Lawrence Livermore National Laboratory, chipmaker AMD and information technology company Supermicro have upgraded the supercomputing cluster Corona, providing additional resources to scientists for COVID-19 drug discovery and vaccine research.
Lawrence Livermore National Laboratory (LLNL) will provide significant computing resources to students and faculty from nine universities that were newly selected for participation in the National Nuclear Security Administration's (NNSA's) Predictive Science Academic Alliance Program (PSAAP).
An international collaboration of theoretical physicists has published a new calculation relevant to the search for an explanation of the predominance of matter over antimatter in our universe. The new calculation gives a more accurate prediction for the likelihood with which kaons decay into a pair of electrically charged pions vs. a pair of neutral pions.
Lawrence Livermore National Laboratory (LLNL) and artificial intelligence computer company Cerebras Systems have integrated the world’s largest computer chip into the National Nuclear Security Administration’s (NNSA’s) Lassen system, upgrading the top-tier supercomputer with cutting-edge AI technology.
Computational scientific research is no longer one-size-fits-all. The massive datasets created by today’s cutting-edge instruments and experiments — telescopes, particle accelerators, sensor networks and molecular simulations — aren’t best processed and analyzed by a single type of machine.
A team led by Dan Jacobson of the Department of Energy’s Oak Ridge National Laboratory used the Summit supercomputer at ORNL to analyze genes from cells in the lung fluid of nine COVID-19 patients compared with 40 control patients.
Scientists at the Department of Energy’s Oak Ridge National Laboratory used neutron scattering and supercomputing to better understand how an organic solvent and water work together to break down plant biomass, creating a pathway to significantly improve the production of renewable biofuels and bioproducts.
A team used the Summit supercomputer to simulate transition metal systems—such as copper bound to molecules of nitrogen, dihydrogen, or water—and correctly predicted the amount of energy required to break apart dozens of molecular systems, paving the way for a greater understanding of these materials.
To meet the needs of tomorrow’s supercomputers, the National Nuclear Security Administration’s (NNSA’s) Lawrence Livermore National Laboratory (LLNL) has broken ground on its Exascale Computing Facility Modernization (ECFM) project, which will substantially upgrade the mechanical and electrical capabilities of the Livermore Computing Center.
A team at Stanford University used the OLCF’s Summit supercomputer to compare simulations of a G protein-coupled receptor with different molecules attached to gain an understanding of how to minimize or eliminate side effects in drugs that target these receptors.
A team of scientists led by Abhishek Singharoy at Arizona State University used the Summit supercomputer at the Oak Ridge Leadership Computing Facility to simulate the structure of a possible drug target for the bacterium that causes rabbit fever.
ORNL story tips: Mining for COVID, rules to grow by and the 3D connection
Using ORNL’s now-decommissioned Titan supercomputer, a team of researchers estimated the combined consequences of many different extreme climate events at the county level, a unique approach that provided unprecedented regional and national climate projections that identified the areas most likely to face climate-related challenges.
Two decades ago, an experiment at Brookhaven National Laboratory pinpointed a mysterious mismatch between established particle physics theory and actual lab measurements. A multi-institutional research team (including Brookhaven, Columbia University, RIKEN, and the universities of Connecticut, Nagoya and Regensburg) has used Argonne National Laboratory's Mira supercomputer to help narrow down the possible explanations for the discrepancy, delivering a newly precise theoretical calculation that refines one piece of this very complex puzzle.
For an experiment that will generate big data at unprecedented rates, physicists led the design, development, mass production and delivery of an upgrade of novel particle detectors and state-of-the-art electronics.
An ORNL team developed the XACC software framework to help researchers harness the potential power of quantum processing units, or QPUs. XACC offloads portions of quantum-classical computing workloads from the host CPU to an attached quantum accelerator, which calculates results and sends them back to the original system.
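The offload pattern described above, where the host CPU expresses a quantum kernel, ships it to an attached accelerator, and post-processes the returned measurements classically, can be illustrated with a minimal sketch. The names and classes below are purely illustrative stand-ins, not XACC's real API, and the "accelerator" here is a trivial simulator:

```python
# Illustrative sketch of the quantum-classical offload pattern that
# frameworks like XACC implement. All names here are hypothetical;
# the accelerator is a toy simulator, not real quantum hardware.
import random

class SimulatedQPU:
    """Stand-in for an attached quantum accelerator."""
    def execute(self, circuit, shots):
        # Toy "measurement": for a Bell-style circuit, outcomes are
        # all-zeros or all-ones; pick randomly per shot (seeded so the
        # example is deterministic).
        counts = {"0" * circuit["qubits"]: 0, "1" * circuit["qubits"]: 0}
        random.seed(42)
        for _ in range(shots):
            counts[random.choice(list(counts))] += 1
        return counts

def offload(circuit, qpu, shots=1024):
    # Host-side entry point: the classical portions of the workload run
    # here on the CPU; the quantum kernel runs on the accelerator, which
    # calculates results and sends them back.
    counts = qpu.execute(circuit, shots)
    # Classical post-processing back on the host: normalize to probabilities.
    return {bits: n / shots for bits, n in counts.items()}

# A two-qubit Bell circuit, described as plain data the host hands off.
bell = {"qubits": 2, "gates": [("H", 0), ("CX", 0, 1)]}
probs = offload(bell, SimulatedQPU())
print(sorted(probs))  # ['00', '11']
```

The key design point is the split of responsibilities: the host never simulates the kernel itself, it only builds the circuit description, dispatches it, and consumes the measurement counts that come back.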
This is a continuing profile series on the directors of the Department of Energy (DOE) Office of Science User Facilities. Michael E. Papka is the director of the Argonne Leadership Computing Facility.
To assist in the COVID-19 research effort, Lawrence Livermore National Laboratory, Penguin Computing and AMD have reached an agreement to upgrade the Lab’s unclassified, Penguin Computing-built Corona high performance computing (HPC) cluster with an in-kind contribution of cutting-edge AMD Instinct™ accelerators, expected to nearly double the peak performance of the machine.
In the race to identify solutions to the COVID-19 pandemic, researchers at the Department of Energy’s Oak Ridge National Laboratory are joining the fight by applying expertise in computational science, advanced manufacturing, data science and neutron science.
Lawrence Livermore National Laboratory scientists are contributing to the global fight against COVID-19 by combining artificial intelligence/machine learning, bioinformatics and supercomputing to help discover candidates for new antibodies and pharmaceutical drugs to combat the disease.
The U.S. Department of Energy (DOE) announced a plan to provide $60 million to establish multidisciplinary teams to develop new tools and techniques to harness supercomputers for scientific discovery.
The Department of Energy has a vital role to play in the national response to COVID-19. Researchers have already used tools at national laboratories to make major inroads to analyzing the virus and its spread.
Researchers at the Department of Energy’s Oak Ridge National Laboratory have used Summit, the world’s most powerful and smartest supercomputer, to identify 77 small-molecule drug compounds that might warrant further study in the fight against the SARS-CoV-2 coronavirus, which is responsible for the COVID-19 disease outbreak.
Lawrence Livermore National Laboratory (LLNL), Hewlett Packard Enterprise (HPE) and Advanced Micro Devices, Inc. (AMD) today announced the selection of AMD as the node supplier for El Capitan, projected to be the world’s most powerful supercomputer when it is fully deployed in 2023.
Valentino Cooper of Oak Ridge National Laboratory uses theory, modeling and computation to improve fundamental understanding of advanced materials for next-generation energy and information technologies.
Lawrence Berkeley National Laboratory's decades of leadership in designing and enhancing energy-efficient data centers is being applied to NERSC supercomputing resources through a collaboration that's using operational data analytics to optimize cooling systems and save electricity.
To better leverage cancer data for research, scientists at ORNL are developing an artificial intelligence (AI)-based natural language processing tool to improve information extraction from textual pathology reports. In a first for cancer pathology reports, the team developed a multitask convolutional neural network (CNN)—a deep learning model that learns to perform tasks, such as identifying key words in a body of text, by processing language as a two-dimensional numerical dataset.
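The idea of treating language as a two-dimensional numerical dataset, with shared convolutional filters feeding several task-specific prediction heads, can be sketched in a few lines. This is a minimal NumPy-only illustration of the general multitask text-CNN pattern, not ORNL's actual model; all dimensions, filter shapes, and task names are assumptions chosen for the example:

```python
# Minimal sketch (NOT the ORNL model) of a multitask text CNN:
# a report becomes a 2D matrix (tokens x embedding dims), shared
# convolutional filters extract features, and separate heads predict
# several pathology fields at once. All sizes are illustrative.
import numpy as np

rng = np.random.default_rng(0)
vocab, embed_dim, n_tokens = 50, 8, 12
embedding = rng.normal(size=(vocab, embed_dim))

def encode(token_ids):
    # The "two-dimensional numerical dataset": one row per token.
    return embedding[token_ids]                 # shape (n_tokens, embed_dim)

def conv_features(x, filters, width=3):
    # Slide each filter over token positions, then max-pool over time,
    # yielding one shared feature per filter.
    feats = []
    for w in filters:                           # each w: (width, embed_dim)
        windows = [np.sum(x[i:i + width] * w)
                   for i in range(len(x) - width + 1)]
        feats.append(max(windows))
    return np.array(feats)

n_filters = 4
filters = rng.normal(size=(n_filters, 3, embed_dim))
# Two hypothetical tasks sharing the same features (multitask learning):
heads = {"site": rng.normal(size=(n_filters, 5)),   # e.g. 5 tumor sites
         "grade": rng.normal(size=(n_filters, 3))}  # e.g. 3 grades

report = rng.integers(0, vocab, size=n_tokens)      # a toy "report"
shared = conv_features(encode(report), filters)
predictions = {task: int(np.argmax(shared @ W)) for task, W in heads.items()}
print(predictions)
```

In a trained model the embeddings, filters, and head weights would be learned jointly, so improvements in the shared feature extractor benefit every task head at once, which is the appeal of the multitask design for pathology reports.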
Researchers at the Department of Energy’s Oak Ridge National Laboratory (ORNL) have developed a quantum chemistry simulation benchmark to evaluate the performance of quantum devices and guide the development of applications for future quantum computers.
Globus, a leading research data management service, today announced the general availability of Globus for Google Cloud, a new solution for accessing and managing data stored in Google Cloud object storage.
In this Q&A, Oak Ridge National Laboratory’s Dan Jacobson talks about his team’s work on a genomic selection algorithm, his vision for the future of environmental genomics, and the space where simulation meets AI.
A team at Georgia Tech created a new turbulence algorithm optimized for the Summit supercomputer. It reached a performance of less than 15 seconds of wall-clock time per time step for more than 6 trillion grid points—a new world record surpassing the prior state of the art in the field for the size of the problem.
Computational scientists, biophysicists and statisticians from Lawrence Livermore National Laboratory (LLNL) and Los Alamos National Laboratory (LANL) are leading a massive multi-institutional collaboration that has developed a machine learning-based simulation for next-generation supercomputers capable of modeling protein interactions and mutations that play a role in many forms of cancers.
A team used the Summit supercomputer to simulate a 10,000-atom magnesium dislocation system at 46 petaflops, a feat that earned the team an ACM Gordon Bell Prize finalist nomination and could allow scientists to understand which alloying materials to add to improve magnesium alloys.
A team simulated a 10,000-atom 2D transistor slice on the Summit supercomputer and mapped where heat is produced in a single transistor. Using a new data-centric version of the OMEN nanodevice simulator, the team sustained the code at 85.45 petaflops and earned a Gordon Bell Prize finalist nomination.