NASA’s latest AI will navigate the moon using landmarks • The Register

NASA scientists are developing an AI to help future astronauts traverse the lunar surface without the aid of satellite fixes, relying instead on landmarks to determine their position.

Anyone who has done a land navigation exercise with a compass and a topographic map will be familiar with the process NASA is building its AI around: locate an object on the horizon, shoot an azimuth to it, then repeat with a second landmark to triangulate your position.
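The two-bearing fix described above reduces to intersecting two lines. As a rough illustration (not NASA's implementation), here is a minimal sketch assuming a flat local map with x pointing east and y pointing north, landmarks at known coordinates, and azimuths measured in degrees clockwise from north:

```python
import math

def triangulate(a, az_a, b, az_b):
    """Fix a position from azimuths shot to two landmarks a and b.

    a, b: (x, y) landmark positions; az_a, az_b: degrees clockwise from north.
    Hypothetical helper for illustration only.
    """
    ta, tb = math.radians(az_a), math.radians(az_b)
    # Unit direction from the observer toward each landmark (x=east, y=north).
    dax, day = math.sin(ta), math.cos(ta)
    dbx, dby = math.sin(tb), math.cos(tb)
    # Observer P satisfies P + t*dA = A and P + s*dB = B,
    # so t*dA - s*dB = A - B: a 2x2 linear system solved by Cramer's rule.
    det = dax * (-dby) - (-dbx) * day
    if abs(det) < 1e-12:
        raise ValueError("bearings are parallel; no unique fix")
    rx, ry = a[0] - b[0], a[1] - b[1]
    t = (rx * (-dby) - (-dbx) * ry) / det
    return (a[0] - t * dax, a[1] - t * day)
```

For example, an observer who sights one landmark due north (azimuth 0) and another due east (azimuth 90) sits at the corner those two back-bearings form.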

Of course, since this is a NASA project for triangulating positions on the lunar surface, the task is a bit more complicated.

“While a rough estimate of location may be easy for one person, we want to demonstrate accuracy on the ground to less than 30 feet (9 meters),” said NASA Goddard Space Flight Center research engineer Alvin Yew, who is developing the system.

“It’s important for explorers to know exactly where they are when exploring the lunar landscape,” Yew said.

Getting the deets from LOLA

The Lunar Reconnaissance Orbiter, which has been photographing the lunar surface for more than a decade, carries an instrument central to Yew’s project: the Lunar Orbiter Laser Altimeter, or LOLA. LOLA has gathered topographical data on the lunar surface from orbit, and the first job of Yew’s AI is converting that data into surface-level horizon maps.
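Turning an elevation grid into a horizon map amounts to asking, for each compass direction, how high the terrain appears from a given spot. A toy sketch of that idea (the grid, spacing, and function are hypothetical, not NASA's code):

```python
import math

def horizon_profile(dem, x0, y0, cell=1.0, n_az=36, max_dist=50):
    """Peak terrain elevation angle (radians) per azimuth, seen from (x0, y0).

    dem: 2D grid of heights (rows y, columns x); cell: grid spacing.
    Illustrative sketch only -- real horizon maps need far more care.
    """
    h0 = dem[y0][x0]
    profile = []
    for k in range(n_az):
        az = 2 * math.pi * k / n_az
        dx, dy = math.sin(az), math.cos(az)
        best = -math.inf
        # March outward along this azimuth, tracking the steepest sightline.
        for step in range(1, max_dist):
            x = int(round(x0 + dx * step))
            y = int(round(y0 + dy * step))
            if not (0 <= y < len(dem) and 0 <= x < len(dem[0])):
                break
            angle = math.atan2(dem[y][x] - h0, step * cell)
            best = max(best, angle)
        profile.append(best)
    return profile
```

Matching a measured horizon against a library of such precomputed profiles is one plausible way a landmark-based fix could be made, though the article does not spell out Yew's method.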

To quickly analyze the imagery used by Yew’s AI, an additional software tool developed at Goddard, the Goddard Image Analysis and Navigation Tool (GIANT), steps in. Equipped with GIANT’s 3D view of the surrounding lunar landscape, a lost explorer using Yew’s AI could scan the horizon and get an exact location and directions back to safety.

“Unlike radar or laser rangefinders, which pulse radio signals and light at a target to analyze the returning signals, GIANT quickly and accurately analyzes imagery to measure the distance to and between visible landmarks,” NASA said.

So what’s the use?

Yew said wearable devices loaded with local maps would let the AI support a variety of missions, but this visual lunar navigation system isn’t intended as a primary system for future lunar missions; it’s meant as a backup.

“It’s critical to have reliable backup systems when we’re talking about human exploration,” Yew said, and that’s exactly what his AI would be: a backup for the more traditional networked solution NASA plans for the Moon, LunaNet.

Lunar navigation is a core part of LunaNet’s planned services, although NASA did not detail how LunaNet’s navigation system would work.

The space agency said that LunaNet navigation, like the rest of LunaNet, would preserve “operational independence from computing on Earth while maintaining high precision,” making navigation on the lunar surface and in orbit more practical.

If LunaNet went down or a surface mission went out of range, Yew’s pioneering navigation service would step in. NASA said the system also has applications on Earth in situations where researchers can’t get a GPS satellite fix. ®

Rick Schindler
