Scientists saw the Earth from a new perspective on 24th October 1946, when a V-2 rocket carried the first-ever space-bound camera above New Mexico on a short sub-orbital flight. From an altitude of 65 miles, a 35-millimetre motion picture camera captured grainy monochrome views of cloud tops over a desert backdrop. This was the beginning of imaging from space.
Rapid technological development soon gave researchers close-up views of the Moon and the inner planets. By the late 1970s, planetary researchers were receiving thousands of colour images of Mars, revealing dry riverbeds that hinted at the planet's wetter, warmer past. These insights into the Martian past demonstrated the scientific potential of investing in imaging the planets from orbit. By the end of their missions in the early 1980s, the Viking landers and orbiters had accumulated over 50,000 pictures of Mars, an unprecedented quantity of close-up planetary image data that kept scientists busy for many years. Imaging continues to play a significant role in space missions today: as technology has improved, ever more images of ever more targets, at ever higher resolutions, have been acquired.
There are now millions of images covering all of the inner planets, several asteroids and comets, the Moon, and the gas giants along with some of their moons. Unfortunately, only a small fraction of these images have been studied in any detail, and many have not been studied at all. Over the past couple of decades the quantity of available imagery has outstripped the ability of planetary researchers to visually analyse the surfaces of bodies within the solar system. With new data arriving almost daily, there is a growing need to close this gap in analytical power.
There are two practical solutions for easing the burden of analysing all of the images being beamed back to Earth: either enlist more people to study the images, or provide an automated alternative. Projects such as Moon Zoo and Planet Four have adopted the former solution by asking members of the public, via websites, to help identify features on the Moon and Mars, respectively. Meanwhile, researchers at the University of Manchester (myself included, under the banner of the "Map the Planets Project", which can be found on Facebook) have been working on the latter solution and have made significant progress towards an automated tool, potentially allowing computers to take over. But now a hybrid solution is beginning to emerge, as Moon Zoo and the University of Manchester join forces…
Our plan is to take crater location and size data from Moon Zoo and use it to train Manchester's automated planetary analysis system. Once the system has learned what craters look like, we hope it will be able to analyse images and find other craters at least as accurately as the Citizen Scientists using the Moon Zoo website. But before this can happen, the raw Moon Zoo data needs to be processed into something more friendly.
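Purely as an illustration (the real Moon Zoo export format is not described here, and these field names are hypothetical), a raw mark-up can be thought of as a single user's click reduced to an image, a position and a diameter, while the "friendly" version we want is one consolidated record per crater:

```python
from dataclasses import dataclass

@dataclass
class MarkUp:
    """One crater annotation from one Moon Zoo user (hypothetical fields)."""
    image_id: str    # which lunar image the user was shown
    x: float         # crater centre, in pixels
    y: float
    diameter: float  # crater diameter, in pixels
    user_id: str

@dataclass
class Crater:
    """One consolidated crater, produced by filtering many overlapping mark-ups."""
    image_id: str
    x: float
    y: float
    diameter: float
    n_markups: int   # how many mark-ups were merged into this crater
```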

Craters from Moon Zoo users, rendered using Manchester's filtering software. Proprietary data provided by the Moon Zoo science team and many thousands of users of the Moon Zoo citizen science project (http://www.moonzoo.org).
The enthusiasm and contributions of Moon Zoo's Citizen Scientists come at a price. Firstly, humans are not perfect and do not all agree on exactly where, or how big, craters are. Many craters are marked up multiple times by different people, but each person will have a slightly different idea about the exact location and diameter. Secondly, humans make mistakes and sometimes indicate craters that are not really there. We call these "false positives". A minority of Moon Zoo users may also intentionally introduce false positives as malicious acts of cyber-vandalism. Before the data can be used, the mark-ups must be filtered to coalesce multiple mark-ups into individual craters, and to remove as many false positives as possible.
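A minimal sketch of how such a filter might work, continuing the hypothetical MarkUp and Crater records above: mark-ups whose centres fall within half a mean diameter of an existing group are merged into it, the group's positions and diameters are averaged, and groups supported by too few distinct users are discarded as likely false positives. This is only one plausible approach, not a description of the actual Moon Zoo or Manchester pipeline.

```python
from math import hypot
from statistics import mean

def filter_markups(markups, min_users=3):
    """Coalesce overlapping mark-ups into single craters, dropping weakly supported ones.

    markups: list of MarkUp records for one image (hypothetical format above).
    min_users: minimum number of distinct users required to keep a crater.
    """
    clusters = []  # each cluster is a list of mark-ups believed to be the same crater
    for m in markups:
        for cluster in clusters:
            cx = mean(c.x for c in cluster)
            cy = mean(c.y for c in cluster)
            cd = mean(c.diameter for c in cluster)
            # treat a mark-up as belonging to this cluster if its centre lies
            # within half the cluster's mean diameter
            if hypot(m.x - cx, m.y - cy) < 0.5 * cd:
                cluster.append(m)
                break
        else:
            clusters.append([m])

    craters = []
    for cluster in clusters:
        if len({c.user_id for c in cluster}) >= min_users:
            craters.append(Crater(
                image_id=cluster[0].image_id,
                x=mean(c.x for c in cluster),
                y=mean(c.y for c in cluster),
                diameter=mean(c.diameter for c in cluster),
                n_markups=len(cluster),
            ))
    return craters
```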
Over the past couple of weeks we have been developing the necessary Moon Zoo data filtering software. Not only is this filtering needed to train our automated system correctly, it is also required before Moon Zoo researchers can use the data to answer planetary science questions. Over the coming weeks we'll be fine-tuning the filtering and feeding the results into our automated analysis software. We're not expecting things to work right away, but we're moving methodically towards what will hopefully be a successful set of experiments.

With time, we hope our methods may lead to new ways of analysing vast quantities of planetary image data reliably and consistently. Once we've cracked craters on the Moon, there's a whole solar system's worth of craters to count. And if crater counting succeeds, we could extend the approach to dune fields, carbon dioxide eruptions, drainage channels and more. With enough Citizen Scientists willing to identify examples of interesting features for use as training data, computers and humans could work together to advance our understanding of the planets, moons and asteroids more quickly and efficiently than ever before.
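By way of illustration only (the details of Manchester's training procedure are not covered here, and the function and parameter names below are invented), the filtered craters could be turned into labelled training examples by cutting an image patch around each consolidated crater, plus randomly placed patches well away from any known crater as negative examples:

```python
import random
from math import hypot

import numpy as np

def extract_patches(image, craters, patch_size=32, n_negatives=100, rng=None):
    """Cut labelled patches from a lunar image (2-D numpy array).

    Returns (patches, labels): label 1 for a patch centred on a filtered crater,
    label 0 for a patch that does not sit on any known crater.
    """
    rng = rng or random.Random(0)
    half = patch_size // 2
    patches, labels = [], []

    def cut(x, y):
        # return a patch_size x patch_size window centred on (x, y), or None if off-image
        x, y = int(round(x)), int(round(y))
        if half <= x < image.shape[1] - half and half <= y < image.shape[0] - half:
            return image[y - half:y + half, x - half:x + half]
        return None

    # positive examples: one patch per consolidated crater
    for c in craters:
        patch = cut(c.x, c.y)
        if patch is not None:
            patches.append(patch)
            labels.append(1)

    # negative examples: random locations at least one diameter from every known crater
    attempts = 0
    while labels.count(0) < n_negatives and attempts < 10 * n_negatives:
        attempts += 1
        x = rng.uniform(half, image.shape[1] - half)
        y = rng.uniform(half, image.shape[0] - half)
        if all(hypot(x - c.x, y - c.y) > c.diameter for c in craters):
            patch = cut(x, y)
            if patch is not None:
                patches.append(patch)
                labels.append(0)

    return np.array(patches), np.array(labels)
```

The positive and negative patches could then be fed to whatever classifier the automated system uses; the key point is simply that the filtered Moon Zoo craters provide the labels.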
Anybody who would like to get involved can contribute by signing up to the Moon Zoo website. People interested in following the daily progress of the automated analysis tools can also "like" the Map the Planets Project Facebook page, where more images, presentation slides and links to technical documents can be found.