NASA and Microsoft have teamed up to develop new software that will enable researchers to work virtually on Mars using wearable technology.
Developed by NASA's Jet Propulsion Laboratory (JPL) in Pasadena, California, the OnSight technology will give scientists the means to plan experiments and, working alongside the Mars Curiosity rover, to conduct them on the Red Planet.
"OnSight gives our rover scientists the ability to walk around and explore Mars right from their offices," said Dave Lavery, programme executive for the Mars Science Laboratory mission at NASA Headquarters in Washington.
It fundamentally changes our perception of Mars and "how we understand the Mars environment surrounding the rover," he added.
OnSight will use real rover data and extend the Curiosity mission's existing planning tools by creating a 3D simulation of the Martian environment where scientists around the world can meet.
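The announcement does not detail how that shared 3D scene is assembled, but the general idea, turning rover terrain data into renderable geometry, can be sketched. The snippet below is a hypothetical illustration in Python, not JPL's code; the elevation grid and the heightmap_to_mesh helper are invented for the example.

```python
# Hypothetical sketch (not NASA's actual pipeline): converting a rover
# elevation grid into the vertices and triangles a 3D simulation could render.
import numpy as np

def heightmap_to_mesh(elevation: np.ndarray, cell_size: float = 1.0):
    """Convert a 2D elevation grid (metres) into vertices and triangle indices."""
    rows, cols = elevation.shape
    # One vertex per grid cell: x, y from the grid position, z from the elevation.
    xs, ys = np.meshgrid(np.arange(cols) * cell_size,
                         np.arange(rows) * cell_size)
    vertices = np.column_stack([xs.ravel(), ys.ravel(), elevation.ravel()])

    # Two triangles per grid square, indexed into the vertex array.
    triangles = []
    for r in range(rows - 1):
        for c in range(cols - 1):
            i = r * cols + c
            triangles.append([i, i + 1, i + cols])             # upper triangle
            triangles.append([i + 1, i + cols + 1, i + cols])  # lower triangle
    return vertices, np.array(triangles)

# Synthetic stand-in for real rover terrain data.
demo_elevation = np.random.default_rng(0).random((4, 4))
verts, tris = heightmap_to_mesh(demo_elevation, cell_size=0.5)
print(verts.shape, tris.shape)  # (16, 3) vertices, (18, 3) triangles
```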
Programme scientists will be able to examine the rover's worksite from a first-person perspective, plan new activities and preview the results of their work firsthand.
"We believe OnSight will enhance the ways in which we explore Mars and share that journey of exploration with the world," said Jeff Norris, JPL's OnSight project manager.
Until now, rover operations have required scientists to examine Mars imagery on a computer screen and make inferences about what they are seeing.
But images, even 3D ones, lack the natural sense of depth that human vision uses to understand spatial relationships.
The OnSight system uses holographic computing to overlay visual information and rover data into the user's field of view.
Holographic computing blends a view of the physical world with computer-generated imagery to create a hybrid of real and virtual.
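As a rough, non-authoritative illustration of that blending, the Python snippet below alpha-composites a computer-generated layer over a camera frame. The arrays and the composite function are invented for the example; a real holographic headset presents imagery optically rather than by mixing video frames, so this shows only the concept of combining real and virtual views.

```python
# Conceptual sketch of blending physical and virtual imagery: alpha-compositing
# a rendered overlay onto a camera frame. Generic graphics technique only.
import numpy as np

def composite(camera_frame: np.ndarray,
              virtual_layer: np.ndarray,
              alpha: np.ndarray) -> np.ndarray:
    """Blend a rendered layer over a real-world frame.

    camera_frame, virtual_layer: (H, W, 3) float arrays in [0, 1]
    alpha: (H, W, 1) per-pixel opacity of the virtual content
    """
    return alpha * virtual_layer + (1.0 - alpha) * camera_frame

# Tiny synthetic example: a 2x2 "camera view" with a half-transparent overlay.
frame   = np.full((2, 2, 3), 0.2)          # what the wearer actually sees
overlay = np.full((2, 2, 3), 0.9)          # rendered Mars terrain / rover data
opacity = np.full((2, 2, 1), 0.5)          # 50% opaque hologram
print(composite(frame, overlay, opacity))  # every pixel blends to 0.55
```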
"Previously, our Mars explorers have been stuck on one side of a computer screen. This tool gives them the ability to explore the rover's surroundings much as an Earth geologist would do field work here on our planet," Norris said.
The OnSight tool will also be useful for planning rover operations. For example, scientists could program activities for many of the rover's science instruments by looking at a target and using gestures to select menu commands.
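The article does not describe how gaze and gestures map to commands, but a toy sketch of the idea follows. The gesture names, the instrument command strings and the pick_target helper are all hypothetical, written only to illustrate selecting whatever target the user is looking at and pairing it with a gesture-chosen command.

```python
# Hypothetical gaze-plus-gesture selection, loosely inspired by the workflow
# described above; none of these names come from the actual OnSight tool.
import numpy as np

INSTRUMENT_COMMANDS = {"air_tap": "ChemCam: raster target",
                       "double_tap": "Mastcam: capture mosaic"}

def pick_target(gaze_origin, gaze_dir, targets, max_angle_deg=5.0):
    """Return the target closest to the gaze ray, if within a small cone."""
    gaze_dir = np.asarray(gaze_dir, dtype=float)
    gaze_dir = gaze_dir / np.linalg.norm(gaze_dir)
    best, best_angle = None, max_angle_deg
    for name, position in targets.items():
        to_target = np.asarray(position, dtype=float) - np.asarray(gaze_origin, dtype=float)
        cos_angle = np.dot(gaze_dir, to_target) / np.linalg.norm(to_target)
        angle = np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))
        if angle < best_angle:
            best, best_angle = name, angle
    return best

# Invented scene: two candidate targets, with the user gazing near the outcrop.
targets = {"rock_outcrop": [3.0, 0.1, 0.0], "sand_ripple": [0.0, 2.0, 1.0]}
looked_at = pick_target([0, 0, 0], [1.0, 0.05, 0.0], targets)
if looked_at:
    # An "air tap" gesture on the gazed-at target selects an instrument command.
    print(f"{looked_at} -> {INSTRUMENT_COMMANDS['air_tap']}")
```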