NASA’s Curiosity Mars rover autonomously selects some targets for the laser and telescopic camera of its ChemCam instrument. For example, on-board software analyzed the Navcam image at left, chose the target indicated with a yellow dot, and pointed ChemCam for laser shots and the image at right. Credit: NASA/JPL-Caltech/LANL/CNES/IRAP/LPGNantes/CNRS/IAS
NASA’s Mars rover Curiosity is now selecting rock targets for its laser spectrometer—the first time autonomous target selection is available for an instrument of this kind on any robotic planetary mission.
Using software developed at NASA’s Jet Propulsion Laboratory, Pasadena, California, Curiosity is now frequently choosing multiple targets per week for a laser and a telescopic camera that are part of the rover’s Chemistry and Camera (ChemCam) instrument. Most ChemCam targets are still selected by scientists discussing rocks or soil seen in images the rover has sent to Earth, but the autonomous targeting adds a new capability.
During Curiosity’s nearly four years on Mars, ChemCam has inspected multiple points on more than 1,400 targets by detecting the color spectrum of plasmas generated when laser pulses zap a target—more than 350,000 total laser shots at about 10,000 points in all. ChemCam’s spectrometers record the wavelengths seen through a telescope while the laser is firing. This information enables scientists to identify the chemical compositions of the targets. Through the same telescope, the instrument takes images that are of the highest resolution available from the rover’s mast.
AEGIS software, for Autonomous Exploration for Gathering Increased Science, had previously been used on NASA’s Mars Exploration Rover Opportunity, though less frequently and for a different type of instrument. That rover uses the software to analyze images from a wide-angle camera as the basis for autonomously selecting rocks to photograph with a narrower-angle camera. Development work on AEGIS won a NASA Software of the Year Award in 2011.
“This autonomy is particularly useful at times when getting the science team in the loop is difficult or impossible—in the middle of a long drive, perhaps, or when the schedules of Earth, Mars and spacecraft activities lead to delays in sharing information between the planets,” said robotics engineer Tara Estlin, the leader of AEGIS development at JPL.
The most frequent application of AEGIS uses onboard computer analysis of images from Curiosity’s stereo Navigation Camera (Navcam), which are taken routinely at each location where the rover ends a drive. AEGIS selects a target and directs ChemCam pointing, typically before the Navcam images are transmitted to Earth. This gives the team a head start in assessing the rover’s latest surroundings and planning operations for upcoming days.
To select a target autonomously, the software analyzes images against adjustable criteria specified by scientists, such as identifying rocks by their size or brightness. The criteria can be changed depending on the rover’s surroundings and the scientific goals of the measurements.
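The article does not describe how AEGIS applies these criteria internally, but a minimal sketch of the general idea might look like the following. The Candidate class, select_target function, thresholds, and weights are illustrative assumptions, not the actual flight software.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class Candidate:
    """A rock candidate found by onboard image analysis (illustrative only)."""
    area_px: int        # apparent size in pixels
    brightness: float   # mean intensity, scaled 0.0-1.0
    x: float            # image column of the candidate's center
    y: float            # image row of the candidate's center

def select_target(candidates: List[Candidate],
                  min_area: int = 50,
                  min_brightness: float = 0.3,
                  weights: Tuple[float, float] = (0.7, 0.3)) -> Optional[Candidate]:
    """Filter candidates by scientist-specified thresholds, then rank them.

    The thresholds and weights stand in for the adjustable criteria the
    science team can tune for different terrain and goals; the real AEGIS
    criteria and scoring are more sophisticated.
    """
    eligible = [c for c in candidates
                if c.area_px >= min_area and c.brightness >= min_brightness]
    if not eligible:
        return None  # nothing meets the criteria; skip autonomous targeting
    max_area = max(c.area_px for c in eligible)
    w_size, w_bright = weights
    # Score each surviving candidate, favoring larger and brighter rocks.
    return max(eligible,
               key=lambda c: w_size * (c.area_px / max_area) + w_bright * c.brightness)

# Example: at this drive stop, prefer larger, brighter rocks.
rocks = [Candidate(120, 0.45, 310.0, 220.5), Candidate(60, 0.90, 150.2, 90.0)]
target = select_target(rocks)  # picks the 120-pixel rock with these weights
```

Changing the thresholds or weights is what lets scientists bias the autonomous choice toward, say, bright outcrop at one site and large float rocks at another.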
Another AEGIS mode starts with images from ChemCam’s own Remote Micro-Imager, rather than the Navcam, and uses image analysis to hone pointing of the laser at fine-scale targets chosen in advance by scientists. For example, scientists might select a threadlike vein or a small concretion in a rock, based on images received on Earth. AEGIS then controls the laser sharpshooting.
“Due to their small size and other pointing challenges, hitting these targets accurately with the laser has often required the rover to stay in place while ground operators fine tune pointing parameters,” Estlin said. “AEGIS enables these targets to be hit on the first try by automatically identifying them and calculating a pointing that will center a ChemCam measurement on the target.”
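As a rough illustration of that pointing refinement idea (not the actual AEGIS algorithm), the sketch below converts the pixel location of a fine-scale feature in a telescope frame into small azimuth and elevation corrections that would center a measurement on it. The frame size, field of view, and pointing_offset function are assumptions made for illustration.

```python
def pointing_offset(feature_px, image_size=(1024, 1024), fov_deg=1.15):
    """Convert a feature's pixel position in a telescope image into small
    azimuth/elevation corrections (in degrees) that would center it.

    The frame size and ~1.15-degree field of view are rough public figures
    for ChemCam's Remote Micro-Imager, used here only for illustration; the
    flight pointing model also accounts for optics, mast kinematics, and
    calibration.
    """
    cx, cy = image_size[0] / 2.0, image_size[1] / 2.0
    deg_per_px = fov_deg / image_size[0]        # assume square pixels
    d_az = (feature_px[0] - cx) * deg_per_px    # positive: move right
    d_el = -(feature_px[1] - cy) * deg_per_px   # image y grows downward
    return d_az, d_el

# Example: a thin vein detected slightly right of and below image center.
d_az, d_el = pointing_offset((540.0, 530.0))   # ~0.03 deg right, ~0.02 deg down
```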
From the top of Curiosity’s mast, the instrument can analyze the composition of a rock or soil target from up to about 23 feet (7 meters) away.
“AEGIS brings an extra opportunity to use ChemCam, to do more, when the interaction with scientists is limited,” said ChemCam Science Operation Lead Olivier Gasnault, at the Research Institute in Astrophysics and Planetology (IRAP), of France’s National Center for Scientific Research (CNRS) and the University of Toulouse, France. “It does not replace an existing mode, but complements it.”
Source: NASA