The remote sensing mast on NASA's Mars rover Curiosity holds two science instruments for studying the rover's surroundings and two stereo navigation cameras used in driving the rover and planning its activities.
The Curiosity Mars rover is now smart enough to pick its own targets for exploration, according to a new study.
The secret to Curiosity’s better brain was a software update sent from the ground in October 2015, called Autonomous Exploration for Gathering Increased Science (AEGIS). It was the first time artificial intelligence had been tried on a remote probe, and the results suggest that similar AI techniques could be applied to future missions, according to the NASA scientists working on the project.
AEGIS allows the rover to be “trained” to identify rocks with certain characteristics that scientists on the ground want to investigate. This is valuable because Curiosity’s human controllers can’t be in direct contact with the rover all the time. Instead of waiting for instructions to “go there and sample that piece of rock,” Curiosity can now look for targets even when it isn’t in contact with its human controllers, according to a new study that describes Curiosity’s use of the software.
“We can’t be in constant contact with the rover — Mars rotates and when [Curiosity is] on the far side we can’t get in touch with it,” Raymond Francis, lead system engineer for the deployment of AEGIS, told Space.com.
According to the study, once the AEGIS system was deployed, it was used 54 times between May 13, 2016, and April 7, 2017. Without intelligent targeting, Curiosity could be expected to hit a target the scientists were interested in about 24 percent of the time; with AEGIS, the rover managed 93 percent, according to the study.
Even when the rover is in contact, signals between Earth and Mars take time to travel. In May 2016, Mars was the closest it had been to Earth in 11 years — 46.8 million miles (75.3 million kilometers). A radio signal would take just over four minutes to get there and four more to get back. So if there is something planetary scientists want a closer look at, it can take a while to send the commands.
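Those delay figures follow directly from dividing the close-approach distance quoted above by the speed of light. A quick sanity check in Python:

```python
# One-way radio delay Earth -> Mars at the May 2016 close-approach
# distance cited above (46.8 million miles).
SPEED_OF_LIGHT_MPS = 299_792_458   # meters per second
METERS_PER_MILE = 1_609.344

distance_miles = 46.8e6
one_way_s = distance_miles * METERS_PER_MILE / SPEED_OF_LIGHT_MPS

print(f"one-way delay: {one_way_s / 60:.1f} minutes")    # ~4.2 minutes
print(f"round trip:    {2 * one_way_s / 60:.1f} minutes")  # ~8.4 minutes
```

At greater Earth-Mars separations the one-way delay stretches past 20 minutes, which is why waiting on ground commands costs so much time.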
Idle time is often lost science time for the rover mission, and because sending a robot to Mars is expensive and difficult, it’s not ideal. A few hours hanging around each day may not seem like much, but it adds up over the course of an entire mission. With AEGIS, the rover could drive to a location, choose targets for investigation and gather data while it waits for radio contact with Earth again. That means Earth-bound scientists are free to choose a new target once they re-establish contact with the rover.
For the study, the NASA team trained Curiosity, with the AEGIS software, to analyze bedrock in a feature called the Murray formation after each drive. The Murray formation is a rocky outcrop with characteristic bands of mudstone, possibly laid down by lakes of liquid water. One question the scientists wanted to answer was whether the chemical composition of the Murray formation changed over time, because that could reveal changes in the water chemistry, divulging more about the history of water on Mars.
Examples of AEGIS target selection, collected from Martian days 1400 to 1660. Targets outlined in blue were rejected; those outlined in red were retained. Top-ranked targets are shaded green, and second-ranked targets are shaded orange.
Credit: Francis et al., Science Robotics
This analysis of the Murray formation required taking many samples of the mudstone, but collecting them would take time away from other experiments and observations. With AEGIS, Curiosity could handle these repetitive observations while it was out of touch with Earth and researchers could not direct it toward more advanced tasks. AEGIS could also be used to train Curiosity to look for other types of rock, Francis said.
The AEGIS system works with two of the rover’s camera systems: the Chemistry and Camera instrument (ChemCam) and the navigation cameras. The software takes images captured by the cameras, tries to recognize the edges of objects in the frame, and looks for edges that connect to form a closed “loop.”
“If you find edges that close into a loop you’ve found an object and on Mars that’s usually a rock,” Francis said. AEGIS can also look at the relative brightness of the various objects in the frame (the navigation cameras don’t have color vision). The combination of edges and brightness allows AEGIS to identify objects.
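The closed-loop idea can be illustrated with a toy sketch (this is not the actual AEGIS code): treat the image as a grid of edge and non-edge pixels, then flood-fill the background inward from the image border. Any non-edge pixel the fill cannot reach is sealed inside a closed loop of edges, marking a candidate object interior.

```python
from collections import deque

def find_enclosed_regions(edge_map):
    """Given a 2D grid where 1 marks an edge pixel, flood-fill the
    background from the image border; any non-edge cell the fill
    cannot reach is enclosed by a closed loop of edges, i.e. a
    candidate object interior (on Mars, usually a rock)."""
    h, w = len(edge_map), len(edge_map[0])
    reached = [[False] * w for _ in range(h)]
    queue = deque()
    # Seed the fill with every non-edge cell on the image border.
    for y in range(h):
        for x in range(w):
            on_border = y in (0, h - 1) or x in (0, w - 1)
            if on_border and edge_map[y][x] == 0:
                reached[y][x] = True
                queue.append((y, x))
    # Spread the fill to 4-connected non-edge neighbors.
    while queue:
        y, x = queue.popleft()
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if (0 <= ny < h and 0 <= nx < w
                    and edge_map[ny][nx] == 0 and not reached[ny][nx]):
                reached[ny][nx] = True
                queue.append((ny, nx))
    # Any unreached non-edge cell sits inside a closed edge loop.
    return [(y, x) for y in range(h) for x in range(w)
            if edge_map[y][x] == 0 and not reached[y][x]]

# A closed square loop of edges (an "object") next to an open arc
# of edges that does not close and so encloses nothing.
edges = [
    [0, 0, 0, 0, 0, 0, 0],
    [0, 1, 1, 1, 0, 1, 0],
    [0, 1, 0, 1, 0, 1, 0],
    [0, 1, 1, 1, 0, 0, 0],
    [0, 0, 0, 0, 0, 0, 0],
]
print(find_enclosed_regions(edges))  # [(2, 2)] -- only the closed loop counts
```

The open arc contributes edges but no enclosed interior, mirroring Francis's point that only edges closing into a loop signal an object.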
The science team sets criteria for what counts as interesting (for example, brightly colored bedrock), and the rover can then use the cameras to “choose” a target. ChemCam then fires a laser at the chosen target and uses a spectrometer to analyze the resulting light, revealing what the target is made of.
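As a rough illustration of that criteria-based selection step (the field names and thresholds here are hypothetical stand-ins, not actual AEGIS parameters), a ranking pass might discard candidates that are too small and prefer brighter ones:

```python
def rank_targets(candidates, min_size_px=20):
    """Rank candidate targets against simple, scientist-set criteria:
    drop objects smaller than min_size_px, then prefer brighter ones.
    (Hypothetical sketch; the real AEGIS criteria are richer.)"""
    viable = [c for c in candidates if c["size_px"] >= min_size_px]
    return sorted(viable, key=lambda c: c["brightness"], reverse=True)

candidates = [
    {"name": "A", "size_px": 120, "brightness": 0.7},
    {"name": "B", "size_px": 10,  "brightness": 0.9},  # too small, rejected
    {"name": "C", "size_px": 300, "brightness": 0.4},
]
best = rank_targets(candidates)[0]
print(best["name"])  # "A"
```

The top-ranked candidate becomes the target ChemCam fires at, without waiting for a command from Earth.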
There are limitations to AEGIS’ abilities; for example, the rover sometimes identified a rock’s shadow as part of the rock’s outline. Even so, the software has proved a useful tool, the study said.
Francis noted that this kind of autonomy will likely become a fixture of many future robotic missions.
“The farther you go in the solar system, the longer the light time delay, the more decisions need to be made on the spot,” he said.
The study appears in the June 21 issue of the journal Science Robotics.