NeMO-Net on a tablet (NASA Laboratory for Advanced Sensing/YouTube)

Game generates training data for supercomputer mapping coral reefs

To get a better idea of the changes impacting coastal coral reefs, researchers at NASA’s Ames Research Center are combining remote sensing data, satellite imagery, supercomputer-powered machine learning and crowdsourced gamification. The NeMO-Net project aims to improve scientists’ ability to remotely sense and analyze changes in underwater ecosystems caused by rising ocean temperatures, pollution and ocean acidification.

With funding from NASA’s Earth Science Technology Office, Ved Chirayath, a scientist at the NASA Ames Research Center, first developed the FluidCam and fluid lensing software to collect video through the ocean’s surface. The FluidCam is a high-performance digital camera with 16 cores of processing power and about a terabyte of memory mounted on an orbiting CubeSat. It processes megabytes of imaging data per second, Chirayath said in a video interview.

The fluid lensing software on the camera removes the refraction and distortion in images of the sea floor caused by sunlight and waves at the ocean’s surface, allowing scientists to see objects at the centimeter scale 10 meters below the surface.

Multispectral Imaging, Detection and Active Reflectance (MiDAR), meanwhile, provides real-time video from data on an object’s reflectance. MiDAR uses an array of LED emitters on drones or autonomous underwater vehicles to illuminate targets far below the surface and to measure their spectral reflectance, which is then combined with the FluidCam video to produce 3D multispectral scenes and high-resolution underwater imagery.
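The idea of active-illumination reflectance can be sketched in a few lines: with a known amount of LED light hitting the target, reflectance per band is roughly the sensed return (minus ambient light) divided by the incident illumination. This is an illustrative toy, not NASA's actual MiDAR processing; the band values and dark-frame calibration here are invented.

```python
# Toy per-band reflectance estimate under active LED illumination.
# All numbers are hypothetical; real instruments apply far more
# calibration (sensor response, water attenuation, geometry).

def estimate_reflectance(measured, incident, dark_frame):
    """Estimate reflectance per LED band.

    measured:   radiance sensed back from the target, per band
    incident:   known LED illumination reaching the target, per band
    dark_frame: reading with the emitters off (ambient light + noise)
    """
    return [
        max(0.0, min(1.0, (m - d) / i))  # clamp to the physical range [0, 1]
        for m, i, d in zip(measured, incident, dark_frame)
    ]

# Three hypothetical bands (e.g., blue, green, red)
measured = [0.30, 0.55, 0.20]
incident = [1.00, 1.00, 1.00]
dark = [0.05, 0.05, 0.05]
print(estimate_reflectance(measured, incident, dark))  # ≈ [0.25, 0.5, 0.15]
```

Subtracting a dark frame is what lets an active system like this work at depth: the ambient contribution is measured directly rather than modeled.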

The final piece of the puzzle is NeMO-Net, the Neural Multi-Modal Observation and Training Network for Global Coral Reef Assessment. The machine-learning technology exploits high-resolution data from FluidCam and MiDAR to improve low-resolution sensing data from aircraft and satellites, Chirayath explained in a paper on the technologies.
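Using high-resolution imagery to improve low-resolution sensing typically means building paired training examples: degrade the centimeter-scale data to mimic a coarser satellite sensor, then train a network to learn the mapping back. The block-averaging below is a toy stand-in for that degradation step, not NeMO-Net's actual convolutional architecture.

```python
# Sketch of building a (low-res, high-res) training pair by degrading
# fine-scale imagery; a super-resolution model would then learn to
# invert this mapping. Values and sizes are invented for illustration.

def block_average(image, factor):
    """Downsample a 2D grid by averaging each factor x factor block."""
    h, w = len(image), len(image[0])
    return [
        [
            sum(image[r + i][c + j] for i in range(factor) for j in range(factor))
            / (factor * factor)
            for c in range(0, w, factor)
        ]
        for r in range(0, h, factor)
    ]

# A 4x4 "FluidCam-like" patch becomes a 2x2 "satellite-like" patch;
# together they form one supervised training example.
high_res = [
    [1, 1, 2, 2],
    [1, 1, 2, 2],
    [3, 3, 4, 4],
    [3, 3, 4, 4],
]
low_res = block_average(high_res, 2)
print(low_res)  # → [[1.0, 2.0], [3.0, 4.0]]
```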

NeMO-Net also features an interactive video game by the same name that challenges players to identify and classify images of coral. Players travel the ocean in their own research vessel, the Nautilus, and colorize 3D and 2D images of the ocean floor. 

Interactive tutorials train players on domain-specific knowledge, and the game periodically checks their labeling against pre-classified coral imagery to improve their classification skills, a paper on the project explained. Players can rate other players’ classifications and unlock rewards as they label items in shallow marine environments.

As they play the game, players’ actions help train NASA's Pleiades supercomputer to recognize corals, even in images taken with instruments less powerful than FluidCam and MiDAR. The supercomputer uses machine learning to abstract knowledge from the coral classifications players make by hand so that it can eventually classify images on its own, NASA officials said.
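One common way to turn many players' hand labels into a single training label is simple majority voting, with the vote share doubling as a confidence score. This is a minimal sketch of that idea; the class names are invented, and the article does not say which aggregation scheme NeMO-Net actually uses.

```python
from collections import Counter

def consensus_label(player_labels):
    """Return the most common label and the fraction of players who chose it."""
    counts = Counter(player_labels)
    label, votes = counts.most_common(1)[0]
    return label, votes / len(player_labels)

# Four hypothetical players classify the same patch of sea floor
labels = ["branching coral", "branching coral", "sand", "branching coral"]
print(consensus_label(labels))  # → ('branching coral', 0.75)
```

Low-agreement patches (say, a vote share near 0.5) could be routed back into the game for more players to label, which is one reason a larger player base improves the training data.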

As more people play NeMO-Net, Pleiades' mapping abilities will improve until it can classify corals from low-resolution data, which in turn will allow scientists to more readily see what is happening to coral reefs and find ways to preserve them.

"NeMO-Net leverages the most powerful force on this planet: not a fancy camera or a supercomputer, but people," Chirayath said. "Anyone, even a first grader, can play this game and sort through these data to help us map one of the most beautiful forms of life we know of."

In fact, NeMO-Net was tested by fourth-graders studying coral reef ecosystems at the Town School for Boys in San Francisco. The students identified several bugs and offered valuable feedback about ease of use and replayability.

NeMO-Net is available for iOS devices and Mac computers in the Apple App Store, and a version for Android systems will be released soon.

About the Author

Susan Miller is executive editor at GCN.

Over a career spent in tech media, Miller has worked in editorial, print production and online, starting on the copy desk at IDG’s ComputerWorld, moving to print production for Federal Computer Week and later helping launch websites and email newsletter delivery for FCW. After a turn at Virginia’s Center for Innovative Technology, where she worked to promote technology-based economic development, she rejoined what was to become 1105 Media in 2004, eventually managing content and production for all the company's government-focused websites. Miller shifted back to editorial in 2012, when she began working with GCN.

Miller has a BA and MA from West Chester University and did Ph.D. work in English at the University of Delaware.

Connect with Susan at [email protected] or @sjaymiller.
