You can help a Mars Rover’s AI learn to tell rocks from dirt

Mars Rover Curiosity has been on the Red Planet for eight years, but its journey is nowhere near finished – and it is still being upgraded. You can help it by spending a few minutes labeling the raw data to feed its geo-scanning AI.

Curiosity does not navigate on its own; there is a whole team of people on Earth who analyze the imagery coming back from Mars and plot the course for the Mars Science Laboratory. To do so, however, they need to carefully examine that imagery to identify rocks, soil, sand, and other features.

This is exactly the kind of work that machine learning systems are good at: you give them lots of images with the key attributes clearly labeled, and they learn to find similar features in unlabeled images.
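That label-then-learn loop can be sketched with a toy classifier. Everything below is invented for illustration: the two "features" per image patch, the numbers, and the training data have nothing to do with SPOC's actual model, though the label names echo the kinds of terrain classes the project asks volunteers to mark.

```python
# Toy supervised learning: learn terrain labels from labeled examples,
# then classify an unlabeled example. Nearest-centroid classifier on
# made-up 2-feature "image patches" (e.g. brightness, texture).
from collections import defaultdict
import math

# Hand-labeled training examples: (features, label). All values are invented.
train = [
    ((0.9, 0.1), "sand"), ((0.8, 0.2), "sand"),
    ((0.2, 0.9), "bedrock"), ((0.3, 0.8), "bedrock"),
]

def fit_centroids(examples):
    """Average the feature vectors of each label's examples."""
    sums = defaultdict(lambda: [0.0, 0.0, 0])
    for (x, y), label in examples:
        s = sums[label]
        s[0] += x; s[1] += y; s[2] += 1
    return {label: (s[0] / s[2], s[1] / s[2]) for label, s in sums.items()}

def classify(features, centroids):
    """Assign an unlabeled patch the label of its nearest class centroid."""
    return min(centroids, key=lambda lbl: math.dist(features, centroids[lbl]))

centroids = fit_centroids(train)
print(classify((0.85, 0.15), centroids))  # a sandy-looking patch -> "sand"
```

The more labeled examples the training set contains, the better the learned class boundaries get, which is exactly why NASA is asking volunteers for annotations.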

The problem is that while there are plenty of ready-made datasets of images with faces, cats, and cars, there are very few images of the Martian surface annotated by terrain type.

“Typically, hundreds of thousands of examples are required to train a deep learning algorithm. Algorithms for self-driving cars, for example, are trained with many images of roads, signs, traffic lights, pedestrians, and other vehicles. Other public datasets for deep learning contain people, animals and buildings – but no Martian landscapes,” said NASA/JPL AI researcher Hiro Ono in a news release.

So NASA is building one, and you can help.

Image Credit: NASA / JPL

To be precise, they already have an algorithm, called Soil Property and Object Classification, or SPOC, but are asking for help improving it.

The agency has uploaded thousands of images from Mars to Zooniverse, and after reading through the tutorial, anyone can take a few minutes to annotate them. Drawing shapes around rocks, sandy stretches, and so on may not sound difficult, but you will immediately run into trouble, as I did. Is that a “big rock” or “bedrock”? Is it more than 50 centimeters wide? How long is it?

So far the project has labeled about half of the 9,000 images it aims to collect (with more probably to come), and you can help reach that goal if you have a few minutes to spare – no commitment is required. The site is currently available in English, with Spanish, Hindi, Japanese, and other translations on the way.

An improved SPOC could tell the rover not only where it can drive, but also how likely it is to lose traction, and other factors that affect individual wheel placement. It would also make things easier for the team planning Curiosity’s movements: if they can trust SPOC’s classifications, they don’t need to spend as much time double-checking them.

Monitor Curiosity’s progress on the mission’s webpage.
