“This is one of the key milestones, and there will be more such milestones from Argonne,” said Sibendu Som, manager of Argonne’s Computational Multi-Physics group in the Energy Systems (ES) division, of the groundbreaking simulation.
About a year and a half ago, Som and Muhsin Ameen, a research scientist at the Center for Transportation Research in ES, came up with the idea of conducting a direct numerical simulation (DNS), which resolves all of the turbulent flow scales inside an internal combustion engine. Before that simulation could be performed, however, smaller simulations were necessary to ensure that the largest-ever run would go as planned, said Ameen.
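To give a sense of why resolving every turbulent scale is so demanding, a rough, back-of-the-envelope estimate (not a figure from the Argonne team) follows from Kolmogorov scaling: the total number of grid points needed for a DNS grows roughly as the Reynolds number raised to the 9/4 power. The short Python sketch below illustrates that scaling for a few assumed Reynolds numbers; the values are illustrative only.

```python
# Illustrative estimate of DNS grid-point requirements from Kolmogorov scaling.
# The Reynolds numbers below are assumed, round values for illustration only;
# they are not figures from the Argonne simulation.

def dns_grid_points(reynolds_number: float) -> float:
    """Approximate total grid points needed for DNS, scaling as Re**(9/4)."""
    return reynolds_number ** 2.25

for re in (1e3, 1e4, 1e5):
    print(f"Re = {re:,.0f} -> roughly {dns_grid_points(re):.2e} grid points")
```

The steep growth is why engine-scale DNS calls for leadership-class machines rather than the clusters typically available to manufacturers.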
Because simulations provide a more detailed view of turbulent flow, automotive manufacturers rely on them to evaluate potential engine designs and identify the best ones, but the computing resources typically available to manufacturers are limited.
Performing simulations at such a grand scale requires bigger and better resources, such as the Theta supercomputer at the Argonne Leadership Computing Facility (ALCF), a DOE Office of Science User Facility.
Ameen and Som collaborated with Saumil Patel, an assistant computational scientist in Argonne’s Computational Science division, who helped with pre- and post-processing as well as algorithm development.
In the summer of 2019, with the help of Patel, Ameen was awarded computing time on Theta through DOE’s Advanced Scientific Computing Research (ASCR) Leadership Computing Challenge.
The calculations on Theta were performed with Argonne’s fluid-thermal simulation code, Nek5000, which was recognized with the Gordon Bell prize for its outstanding scalability on high-performance parallel computers in 1999.
The present-day Nek5000, which scales to millions of processors, was developed primarily at Argonne. A new version, NekRS, is under development to target accelerator-based machines and is supported by the Center for Efficient Exascale Discretizations, part of DOE’s Exascale Computing Project.
The chief architect of Nek5000, Paul Fischer, was consulted in the early stages of developing the present calculations. Fischer is a senior scientist in Argonne’s Mathematics and Computer Science division and a professor in the Departments of Computer Science and Mechanical Science & Engineering at the University of Illinois at Urbana-Champaign.
After years of work to adapt Nek5000 for improved combustion modeling, the scientists performed the DNS of flow inside an internal combustion engine this spring.
“The current simulation effort is the first-ever direct numerical simulation of the flow and heat transfer inside an internal combustion engine for a real engine geometry and operating conditions,” Ameen said.
This simulation required the solution of 2 billion degrees of freedom — which track things such as velocity, pressure and temperature — on 51,328 cores of the Theta supercomputer.
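To put those figures in perspective, a quick arithmetic check using only the numbers quoted above shows how the problem was distributed across the machine:

```python
# Simple arithmetic using the figures quoted in the article.
degrees_of_freedom = 2_000_000_000   # ~2 billion unknowns (velocity, pressure, temperature)
theta_cores = 51_328                 # cores of Theta used for the run

dof_per_core = degrees_of_freedom / theta_cores
print(f"Approximately {dof_per_core:,.0f} degrees of freedom per core")
# -> roughly 39,000 unknowns handled by each core at every time step
```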
“This is one of the most detailed simulations ever of the flow in an internal combustion engine,” Ameen said.
The DNS dataset generated from the current work will be useful to automotive manufacturers in several ways. The detailed information about the velocity, pressure and temperature distribution within the engine will illuminate in-cylinder processes that are inaccessible to experimentation or low-fidelity simulations. Additionally, the dataset will serve as a simulation benchmark that engine modelers can use to assess and improve the accuracy of the engineering submodels.
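As a simple illustration of how such a benchmark might be used, the sketch below compares a hypothetical lower-fidelity velocity profile against a DNS reference and reports a root-mean-square error. The arrays are placeholders invented for this example, not values from the Argonne dataset, and the comparison metric is one reasonable choice among many.

```python
import numpy as np

# Hypothetical arrays standing in for a DNS reference profile and an
# engineering-submodel prediction at the same in-cylinder locations.
# Neither array comes from the actual Argonne dataset.
dns_velocity = np.array([12.1, 10.4, 8.7, 6.2, 3.9])     # m/s, DNS benchmark (placeholder)
model_velocity = np.array([11.5, 10.9, 8.1, 6.6, 4.4])   # m/s, submodel output (placeholder)

rms_error = np.sqrt(np.mean((model_velocity - dns_velocity) ** 2))
print(f"RMS velocity error vs. DNS benchmark: {rms_error:.2f} m/s")
```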
The research might also benefit heavy-duty engine companies.
This project was funded by DOE’s Office of Energy Efficiency and Renewable Energy, Vehicle Technologies Office, under the Partnership to Advance Combustion Engines (PACE) consortium umbrella.
The Office of Energy Efficiency and Renewable Energy supports early-stage research and development of energy efficiency and renewable energy technologies to strengthen U.S. economic growth, energy security, and environmental quality.
The Argonne Leadership Computing Facility provides supercomputing capabilities to the scientific and engineering community to advance fundamental discovery and understanding in a broad range of disciplines. Supported by the U.S. Department of Energy’s (DOE’s) Office of Science, Advanced Scientific Computing Research (ASCR) program, the ALCF is one of two DOE Leadership Computing Facilities in the nation dedicated to open science.
Argonne National Laboratory seeks solutions to pressing national problems in science and technology. The nation’s first national laboratory, Argonne conducts leading-edge basic and applied scientific research in virtually every scientific discipline. Argonne researchers work closely with researchers from hundreds of companies, universities, and federal, state and municipal agencies to help them solve their specific problems, advance America’s scientific leadership and prepare the nation for a better future. With employees from more than 60 nations, Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science.
The U.S. Department of Energy’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit https://energy.gov/science.