
NASA's latest AI will navigate the Moon using landmarks

Compass, protractor and map not required to triangulate lunar coordinates

NASA scientists are developing an AI to help future astronauts traverse the Moon's surface without the aid of satellite fixes, relying instead on landmarks to pinpoint their location.

Anyone who has participated in a land navigation exercise with a compass and topographical map will be familiar with the process NASA is developing its AI to perform: locate an object on the horizon, shoot an azimuth to it, then repeat with a second landmark to triangulate a position.
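For anyone rusty on the drill, that resection step reduces to a small geometry problem: two bearings toward landmarks with known map positions intersect at the observer's location. The rough Python sketch below shows the idea, with made-up crater coordinates; it is purely illustrative and has nothing to do with NASA's actual code, which would have to work from horizon profiles rather than hand-shot compass bearings.

    import math

    def resect(landmark1, landmark2, az1_deg, az2_deg):
        """Estimate the observer's (east, north) position, in meters, from two
        landmarks with known map coordinates and the azimuths (degrees clockwise
        from north) shot from the observer toward each of them."""
        a1, a2 = math.radians(az1_deg), math.radians(az2_deg)
        u1 = (math.sin(a1), math.cos(a1))   # unit vector from observer to landmark 1
        u2 = (math.sin(a2), math.cos(a2))   # unit vector from observer to landmark 2
        # landmark_i = observer + d_i * u_i, so d1*u1 - d2*u2 = landmark1 - landmark2:
        # a 2x2 linear system in the unknown distances d1 and d2.
        det = -u1[0] * u2[1] + u1[1] * u2[0]
        if abs(det) < 1e-9:
            raise ValueError("bearings are nearly parallel; pick landmarks further apart")
        bx = landmark1[0] - landmark2[0]
        by = landmark1[1] - landmark2[1]
        d1 = (-bx * u2[1] + by * u2[0]) / det
        return (landmark1[0] - d1 * u1[0], landmark1[1] - d1 * u1[1])

    # Two imaginary crater rims sighted at azimuths of 45 and 315 degrees put
    # the observer at roughly (0, 0) on this local map.
    print(resect((2121.3, 2121.3), (-1414.2, 1414.2), 45.0, 315.0))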

Of course, this being a NASA project to triangulate and navigate the lunar surface, the endeavor is a bit more complicated.

"While a ballpark location estimate might be easy for a person, we want to demonstrate accuracy on the ground down to less than 30 feet (9 meters)," said NASA Goddard Space Flight Center research engineer Alvin Yew, who is developing the system.

"It's important for explorers to know exactly where they are as they explore the lunar landscape," Yew said. 

Getting the deets from LOLA

The Lunar Reconnaissance Orbiter, which has been photographing the Moon's surface for more than a decade, carries a sensor bundle known as LOLA that's a critical component of Yew's project. More precisely known as the Lunar Orbiter Laser Altimeter, the instrument has been mapping the Moon's topography for as long as the LRO has been in orbit, and the first job of Yew's AI is turning those elevation measurements into surface-level horizon maps.
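LOLA's data products are elevation maps rather than ready-made horizon views, and NASA hasn't published the details of Yew's processing, but the general idea of turning an elevation grid into a horizon profile can be sketched roughly as follows. Everything here (the function name, the assumed eye height, the flat-terrain approximation) is an assumption for illustration, not NASA's method.

    import math
    import numpy as np

    def horizon_profile(dem, row, col, cell_size_m, n_az=360, max_range_m=20_000):
        """Sketch: the elevation angle of the horizon in each azimuth direction,
        as seen from cell (row, col) of a digital elevation model (heights in
        meters). Ignores the Moon's curvature, which a real tool would not."""
        eye_height = dem[row, col] + 1.7         # assumed astronaut eye height
        steps = int(max_range_m / cell_size_m)
        profile = np.full(n_az, -math.pi / 2)
        for i in range(n_az):
            az = 2 * math.pi * i / n_az
            de, dn = math.sin(az), math.cos(az)  # east/north direction of this ray
            for s in range(1, steps):
                r = row - round(s * dn)          # moving north means a smaller row index
                c = col + round(s * de)
                if not (0 <= r < dem.shape[0] and 0 <= c < dem.shape[1]):
                    break
                angle = math.atan2(dem[r, c] - eye_height, s * cell_size_m)
                profile[i] = max(profile[i], angle)
        return profile                           # radians, indexed by azimuth bin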

To quickly analyze the images generated by Yew's AI, an additional software tool built at Goddard, known as the Goddard Image Analysis and Navigation Tool (GIANT), will step in. Armed with the 3D view of the surrounding lunar landscape produced by Yew's AI and GIANT, a lost explorer could scan the horizon and be given a precise location, along with directions back to safety.

"In contrast to radar or laser-ranging tools that pulse radio signals and light at a target to analyze the returning signals, GIANT quickly and accurately analyzes images to measure the distance to and between visible landmarks," NASA said.

And this is the backup?

Yew said that portable devices equipped with local maps would enable the AI to support a variety of missions, but this visual lunar navigation system isn't even intended as a primary system for future Moon missions; it's envisioned as a backup.

"It's critical to have dependable backup systems when we're talking about human exploration," Yew said, and that's precisely what his AI would be: a backup for more the traditional networking solution NASA is planning for the Moon, LunaNet.

Lunar navigation is a central part of LunaNet's planned services, though NASA hasn't described how LunaNet's navigation system would work in much detail. 

The space agency said that LunaNet navigation, like the rest of LunaNet, would maintain "operational independence from data processing on Earth while maintaining high precision," and would otherwise make navigation on and around the Moon more practical.

If and when LunaNet goes down or a surface mission ends up out of range, Yew's landmark navigation service would step in. NASA said it also sees the system having applications on Earth in situations where explorers can't obtain a GPS satellite fix. ®
