
New Great Lakes modeling improves operational forecast system

Though the Great Lakes are called lakes, their sheer size makes them truly inland seas. They affect regional weather patterns, provide drinking water to millions of people and drive the economies of several states.

At Michigan Technological University, Pengfei Xue, associate professor of civil and environmental engineering and director of the Numerical Geophysical Fluid Dynamics Laboratory at the Great Lakes Research Center, is helping improve regional forecast models.

Forecasting the water levels, temperatures and currents of the lakes is critically important because of the myriad ways lake conditions affect commerce, recreation and community well-being. These forecasts make up the Great Lakes Operational Forecast System (GLOFS), an automated model-based prediction system operated by the National Oceanic and Atmospheric Administration (NOAA).

“The system information allows decision makers to make informed decisions and the forecast products have been used by a wide variety of users on a regular basis,” said Philip Chu, supervisory physical scientist of the integrated physical and ecological modeling and forecasting branch of NOAA’s Great Lakes Environmental Research Laboratory (GLERL).

“Water levels are used by power authorities; wave and current conditions are used by the U.S. Coast Guard for search and rescue missions; and temperature profiles have been used by recreational boaters and fishermen,” he said. “The information has also been used to predict harmful algal blooms as well as hypoxia (low dissolved oxygen) conditions in the Great Lakes.”

While NOAA operates its own modeling team to maintain the system, the agency also works with university researchers to continually improve GLOFS. Michigan Tech’s Xue is aiding NOAA by adding a data assimilation component.

“All models contain some uncertainties, and observations also have noise, which can be large or small in fieldwork, depending on the case,” Xue said. “Which should you believe? Your best bet is something in between. When we quantify the model and the observation uncertainties by assessing their historical performance, we can quantitatively combine the observational data and the numerical model results with different weights and give a more accurate estimate.”
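In its simplest form, this “something in between” is an inverse-variance weighted average: the source with the smaller historical error gets the larger weight. The short Python sketch below illustrates the idea; the temperatures and error variances are invented purely for illustration and do not come from GLOFS.

```python
# A minimal sketch of the "something in between" idea Xue describes:
# combine a model estimate and an observation, each weighted by the
# inverse of its (assumed known) error variance. All numbers are
# hypothetical, chosen only for illustration.

model_temp = 18.0      # model-predicted surface temperature (deg C)
model_var = 1.5 ** 2   # model error variance, from historical performance

obs_temp = 16.5        # in situ observation (deg C)
obs_var = 0.5 ** 2     # observation error variance (sensor noise)

# Inverse-variance weighting: the less uncertain source gets more weight.
w_model = (1 / model_var) / (1 / model_var + 1 / obs_var)
w_obs = 1.0 - w_model

analysis = w_model * model_temp + w_obs * obs_temp
analysis_var = 1 / (1 / model_var + 1 / obs_var)

print(f"analysis estimate: {analysis:.2f} deg C")
print(f"analysis variance: {analysis_var:.3f} (smaller than either input)")
```

Here the observation is nine times more certain than the model, so the combined estimate lands close to the observed value, and the variance of the result is smaller than that of either input.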

Computer modeling is much more complicated than this simple example, Xue noted. One key advantage of a model, especially in a large and complex environment like the Great Lakes, is that it can produce continuous fields in 3D space, predicting temperature, water levels and currents at any time and any place. In situ observations, by contrast, provide “ground truth,” but they are often limited in time and space.

“Quantifying the model and observation uncertainties is at the heart of data assimilation techniques,” Xue explained. “The beauty of data assimilation is to use the information of the misfits between the model results and observations, which are only known at limited observation locations, to correct model bias in a 3D space beyond the observation locations. Hence, it improves model accuracy for the entire simulation fields.”
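The article does not say which assimilation scheme the project uses, but one standard technique that works exactly as Xue describes is optimal interpolation: misfits known only at observation points are spread across the whole grid through an assumed error covariance. The sketch below applies it to a synthetic 1-D temperature field; the grid, observation values, length scale and error variances are all assumptions chosen for illustration.

```python
import numpy as np

# Schematic optimal-interpolation step (one common data assimilation
# technique; the article does not specify the scheme GLOFS will use).
# A 1-D model field is corrected everywhere using misfits known only
# at a few observation points. All values are synthetic.

n = 100
x = np.linspace(0.0, 100.0, n)        # model grid (km)
background = 15.0 + 0.03 * x          # model ("background") temperature field

obs_x = np.array([20.0, 55.0, 80.0])  # observation locations (km)
obs_y = np.array([16.2, 17.5, 16.9])  # observed temperatures (deg C)

L = 15.0                              # correlation length scale (km), assumed
sigma_b2 = 1.0                        # background error variance, assumed
sigma_o2 = 0.1                        # observation error variance, assumed

def gauss_cov(a, b):
    """Gaussian background-error covariance between two sets of points."""
    return sigma_b2 * np.exp(-((a[:, None] - b[None, :]) ** 2) / (2 * L**2))

model_at_obs = np.interp(obs_x, x, background)  # model values at obs points
misfit = obs_y - model_at_obs                   # the "innovation" vector

B_oo = gauss_cov(obs_x, obs_x)                  # covariance among obs points
B_go = gauss_cov(x, obs_x)                      # grid-to-obs covariance
R = sigma_o2 * np.eye(len(obs_x))               # obs error covariance

# Analysis = background + gain * misfit; the gain spreads the point
# misfits onto the whole grid, correcting the field beyond the obs sites.
weights = np.linalg.solve(B_oo + R, misfit)
analysis = background + B_go @ weights

print("max correction applied: %.2f deg C" % np.max(np.abs(analysis - background)))
```

The covariance acts as the spreading mechanism: each buoy's misfit nudges nearby grid cells strongly and distant cells hardly at all, which is how a handful of point measurements can correct the simulated field far beyond the instruments themselves.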

Xue’s work uses Superior, a high-performance computing infrastructure at Michigan Tech, to build high-fidelity models. Model results are being used to build a long-term, data-assimilative temperature database for Lake Erie for use by resource managers and researchers in the Great Lakes community. The Lake Erie simulation is a proof of concept before GLOFS is entirely refitted with data assimilation. Xue’s project will also apply machine learning to further enhance model performance and adaptive in situ sampling, with the goal of extending the method to all five Great Lakes.