Marine ecosystems are in the midst of a conservation crisis, with coral reefs in particular facing numerous challenges as a result of climate change. In an effort to better understand these environments and the threats they face, researchers build huge image libraries of the underwater world, combining 3D imagery captured by divers and snorkelers with 2D images taken from satellites. These approaches yield vast amounts of data, but extracting value from the libraries requires a way to quickly analyze the images for patterns, or ‘classifications’.
In a new study in Frontiers in Marine Science, researchers at the Laboratory for Advanced Sensing at NASA’s Ames Research Center automated this process using an artificial intelligence tool called a convolutional neural network (CNN), as lead author Jarrett van den Bergh of the Bay Area Environmental Research Institute explains:
“Vast amounts of 3D coral reef imagery need to be classified so that we can get an idea of how coral reef ecosystems are faring over time. Making this classification process as efficient as possible drove us to look at automation with CNNs.”
A CNN is an artificial intelligence model loosely inspired by biological neurons and the brain. It analyzes an image through a stack of layers, each picking out progressively more complex features, such as the different coral species on a reef or fish swimming through an underwater scene, along with where those features sit in relation to everything else in the image. This layered depth is what makes CNNs such a good fit for analyzing complex scenes like coral reefs.
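To give a concrete (and purely illustrative) sense of what such a model looks like in code, the sketch below builds a small image-classifying CNN in Python with TensorFlow/Keras. The patch size, layer sizes, and class count are assumptions for demonstration, not the architecture used in the NeMO-Net study.

```python
# Minimal illustrative CNN for classifying small image patches.
# NOTE: layer sizes, patch size, and class count are assumptions,
# not the NeMO-Net architecture described in the paper.
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 5        # hypothetical classes, e.g. coral, algae, sand, rock, other
PATCH_SIZE = (64, 64)  # hypothetical input patch size (pixels)

model = models.Sequential([
    layers.Input(shape=(*PATCH_SIZE, 3)),  # RGB image patch
    # Early convolutional layers detect local features (texture, colour, edges)
    layers.Conv2D(32, 3, activation="relu", padding="same"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu", padding="same"),
    layers.MaxPooling2D(),
    # Deeper layers combine local features into larger spatial patterns
    layers.Conv2D(128, 3, activation="relu", padding="same"),
    layers.GlobalAveragePooling2D(),
    # Final layer outputs a probability per class for the whole patch
    layers.Dense(NUM_CLASSES, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Training requires labelled patches, for example:
# model.fit(train_patches, train_labels, validation_split=0.1, epochs=20)
```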
Van den Bergh explains, however, that CNNs bring challenges of their own: “CNNs require lots of training data to function correctly, so it was vital for us to build a large database that we could use to train the CNN on how to classify these complex 3D images of coral reefs.”
To overcome this challenge, the researchers turned to citizen science in the form of a video game called NeMO-Net, which harnesses players around the world to generate training datasets. As players explore virtual underwater worlds, they learn about and classify coral species, and their classification labels are then used to train NeMO-Net’s CNN.
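The study relies on many players labelling the same imagery, and a simple way to turn several noisy crowdsourced labels into one training label is a majority vote. The sketch below is a hypothetical illustration of that idea; the function name and the consensus rule are assumptions, not a description of how NeMO-Net actually combines player input.

```python
# Hypothetical majority-vote aggregation of crowdsourced labels.
# The consensus rule shown here is an assumption for illustration only.
from collections import Counter

def consensus_label(player_labels):
    """Return the most common label for one image region, plus the agreement rate."""
    counts = Counter(player_labels)
    label, votes = counts.most_common(1)[0]
    return label, votes / len(player_labels)

# Example: five players labelled the same patch of seafloor
label, agreement = consensus_label(["coral", "coral", "sand", "coral", "coral"])
print(label, agreement)  # -> coral 0.8
```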
Van den Bergh also highlights the more rewarding aspects of the NeMO-Net project: “NeMO-Net primarily collects data, but it is also an educational tool that gives people a more intimate understanding of our coral reefs. To date, the game has reached over 300 million people in the seven months since release.”
The researchers are hopeful that their work developing the NeMO-Net video game and CNN will prove valuable for other conservation and mapping projects, and that the potential of machine learning in this area deserves further exploration:
“As our technology progresses, machine learning might be able to give us a good estimation of what our coral reefs will look like 2 or 5 years from now. This could be extremely useful for coral reef conservationists who want to see the impact of their work. We are only just beginning to see the impacts of machine learning in conservation.”
###
This information is sourced from https://www.eurekalert.org/pub_releases/2021-04/f-nnv042021.php