In E&T's second International Year of Astronomy feature, we discover that destinations for future space missions may be decided not by agencies like NASA, but by groups of ordinary people engaged in citizen computing.
Question: what kind of person does it take to detect gravitational waves in the universe, or to discover the 46th known Mersenne prime? If your guess is a doctoral-degree-holding scientist or mathematician, you are incorrect.
We live in a world where cutting-edge technical research can be done at home. With nothing but a computer and an Internet connection, anyone can participate in over 50 'citizen-science' projects around the world - whether as an amateur astronomer, an ordinary member of the public, part of a group of volunteers, or an anonymous user.
Today, anyone can donate their computer's spare processing capacity to large-scale computational projects, a practice known as volunteer computing. Such projects require only a machine's idle resources to process data sent from a server. Users can also donate their own spare time to projects that call for active data analysis.
Volunteer computing should not be confused with crowdsourcing. Crowdsourcing is an open-call outsourcing contest issued by a client to the general community, often with prizes for the citizen scientists who find the best solution to a task; it is used for technology development and data-analysis challenges similar to those tackled by volunteer computing projects.
Although both are equally helpful and noble pursuits in the world of citizen science, one particular aspect separates them. In crowdsourcing, users take part in the analysis themselves - finding, categorising and performing other active tasks. In volunteer computing, users participate by volunteering their computers' resources to automatically process data downloaded from a server.
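The fetch-process-report pattern described above can be sketched in a few lines. This is a toy model, not any real project's code: the class and function names (`ProjectServer`, `volunteer_loop`) are illustrative assumptions, and the 'analysis' is just finding the peak value in each chunk of data.

```python
from queue import Queue

class ProjectServer:
    """Toy project server: hands out independent 'work units'
    and collects results reported back by volunteers."""
    def __init__(self, work_units):
        self.pending = Queue()
        for wu in work_units:
            self.pending.put(wu)
        self.results = {}

    def fetch_work(self):
        return None if self.pending.empty() else self.pending.get()

    def report(self, wu_id, result):
        self.results[wu_id] = result

def volunteer_loop(server, process):
    """Runs on a volunteer's machine: fetch, compute, report, repeat."""
    while (wu := server.fetch_work()) is not None:
        wu_id, data = wu
        server.report(wu_id, process(data))

# Each work unit is a chunk of samples; here the 'analysis'
# simply finds the largest value in each chunk.
server = ProjectServer([(0, [1, 5, 3]), (1, [9, 2]), (2, [4, 4, 8])])
volunteer_loop(server, process=max)
print(server.results)  # {0: 5, 1: 9, 2: 8}
```

The key property this illustrates is that work units are independent, so thousands of machines can each take one with no coordination beyond the central server.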
There are about 500,000 volunteers worldwide contributing to a few dozen citizen science projects. Some, such as SETI@home and Einstein@home, perform computational tasks; others, such as the Great World Wide Star Count, Stardust@home and Galaxy Zoo, ask users to actively analyse observational or measurement data. Astronomy offers many of these - from categorising irregular galaxies to searching for extraterrestrial signals.
One volunteer computing project, Einstein@home, hosted by the University of Wisconsin-Milwaukee, searches data from the Laser Interferometer Gravitational Wave Observatory (LIGO). The program is named after Einstein's general theory of relativity, which predicts how gravitational waves distort space and time. This citizen science project harnesses the idle time of volunteers' computers to search for gravitational waves that have been predicted but never directly observed.
Einstein@home's developers hope that at least one million volunteers will donate their computers' idle resources to running the downloadable program. It automatically analyses small chunks of data from a network of three interferometers - instruments that watch for minute changes in the path lengths of laser light travelling through them. The search targets sources predicted to emit cosmic 'ripples' of gravitational energy, such as binary systems, background emissions, gamma-ray bursts and pulsars.
More than 70 teraflops of computational power - well beyond the other computing resources available to the project - are being devoted to the search. According to the American Physical Society (APS), even if one million computers took part, only a small fraction of the total data would be analysed.
ET phone home
The best-known volunteer computing project is SETI@home, developed by the University of California, Berkeley. The Search for Extraterrestrial Intelligence (SETI) seeks out irregular signals in data from the Arecibo radio telescope in Puerto Rico, and SETI@home was launched in 1999 to bring extra computing power to the search. Volunteers download a screensaver that uses their computer's idle time to process small segments of the telescope's signals.
The project runs on BOINC, the Berkeley Open Infrastructure for Network Computing - an open-source software platform for volunteer computing that makes efficient use of donated idle time across stand-alone projects.
BOINC allows users to decide how much idle computer power to allocate for SETI@home and how much for other scientific BOINC programs. They can ration a percentage of their computer's power to run other useful programs such as Docking@home, in which computers process protein-binding data for medical applications, or MilkyWay@home, in which they help build a 3D model of our galaxy.
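The allocation described above can be sketched as a simple proportional split. This is a conceptual illustration of BOINC's resource-share setting, not its actual scheduler; the project names are taken from the article, and the share values and function name are assumptions.

```python
def allocate_idle_time(idle_hours, resource_shares):
    """Split available idle compute time among projects in
    proportion to each project's resource share - a rough
    sketch of how BOINC's resource-share setting divides a
    volunteer's donated capacity."""
    total = sum(resource_shares.values())
    return {name: idle_hours * share / total
            for name, share in resource_shares.items()}

# A volunteer gives SETI@home twice the share of two other projects:
shares = {"SETI@home": 100, "Docking@home": 50, "MilkyWay@home": 50}
print(allocate_idle_time(10.0, shares))
# {'SETI@home': 5.0, 'Docking@home': 2.5, 'MilkyWay@home': 2.5}
```

Because only the ratios matter, a volunteer can add or drop projects without recalculating the other shares.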
SETI@home's 200,000 users have recently delivered as much as 450 teraflops of processing power. "The potential power of volunteer computing exceeds by several orders of magnitude the power available via paradigms like cluster, grid and cloud computing," says David Anderson, BOINC founder and SETI@home director.
Of course, computers do not get all of the fun. Citizen projects like the Great World Wide Star Count rely on active user observation and analysis. The project's goal is to measure how much light pollution affects the night sky. Over 31,000 observers from 60 countries have participated by reporting on the quality of their night-time sky. The initiative was organised by researchers at the University Corporation for Atmospheric Research in Boulder, Colorado, US, and funded by the National Science Foundation.
Contributors observe the constellation Cygnus or Sagittarius and match what they see against magnitude charts downloaded from the project's site, which offers in-depth directions and information for non-experts.
This is a cornerstone project of the 2009 International Year of Astronomy, "a global effort initiated by the International Astronomical Union (IAU) and UNESCO to help the citizens of the world … engage a personal sense of wonder and discovery," according to the official website.
Finding the needle in a haystack
Another Internet astronomy citizen-science project, Stardust@home, lets users worldwide actively take part in the search for interstellar dust grains returned to Earth from space.
NASA's Stardust Mission was launched in 1999 and flew through the coma of comet Wild 2 in 2004. It captured comet and interstellar dust grains - the first interstellar grains ever brought back to Earth. The particles were caught in an array of aerogel tiles on the spacecraft and now need to be extracted. The grains may be microscopic, but they offer some stellar clues to the evolution of stars and solar systems.
Stardust@home's approach differs from that of volunteer computing programs. This "utterly new territory" for doing science, as the project's website puts it, draws on the users' spare resources: time, patience, and communication with the other volunteers.
"We have a huge amount of data and don't have enough people to effectively search it in our research group," says Andrew Westphal, Stardust@home project director at the University of California at Berkeley, who developed the technique that NASA will use to scan the aerogel in which the grains are embedded.
Accuracy is not compromised just because the volunteers are untrained scientists. They must pass an online test to qualify before viewing 'movies' of the images collected by an automated scanning microscope, presented in a Virtual Microscope in a Web browser. "We treat the whole ensemble of volunteers, 'dusters', as one big science instrument," says Dr Westphal. The dusters "are incredibly efficient and have low noise rates". The project's efficiency rate is 90 per cent, according to a calibration technique built into the program that grades dusters on their accuracy.
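The calibration idea mentioned above can be sketched as follows: a volunteer's answers on 'movies' whose correct classification is already known are used to estimate that volunteer's accuracy. This is a rough analogue only - the movie IDs, labels and response format are illustrative assumptions, not Stardust@home's actual scheme.

```python
def score_duster(responses, calibration_truth):
    """Grade a volunteer ('duster') on the subset of movies
    whose correct answer is already known (calibration images).
    Returns the fraction answered correctly, or None if the
    volunteer has seen no calibration movies yet."""
    graded = [(mid, ans) for mid, ans in responses.items()
              if mid in calibration_truth]
    if not graded:
        return None
    correct = sum(ans == calibration_truth[mid] for mid, ans in graded)
    return correct / len(graded)

# Known answers for three calibration movies:
truth = {"cal-1": "track", "cal-2": "no-track", "cal-3": "track"}
# One volunteer's responses; 'img-7' is a real unknown, so it is not graded:
answers = {"cal-1": "track", "cal-2": "track", "cal-3": "track",
           "img-7": "no-track"}
print(round(score_duster(answers, truth), 2))  # 0.67
```

Interleaving known-answer items among real data is a standard way to measure per-volunteer accuracy without needing an expert to re-check every classification.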
The new astronomy
In 1965, Intel co-founder Gordon Moore observed that the number of transistors per square inch on integrated circuits doubled yearly; he later revised the rate to a doubling every two years.
Moore's Law has been generalised to describe the growing power of computation, and volunteer computing puts that growth to work: with over one billion privately owned PCs in the world, together they could provide power orders of magnitude beyond what cluster and grid computing currently offer. Perhaps Moore's Law also predicts the future success of citizen science projects.
But, with great power comes great responsibility. "Several volunteer computing projects have encountered users who falsify results or claim credit for work not actually done. The number of such users is tiny, but the problem must be addressed," according to Dr Anderson.
Such projects are not immune to user error. Still, the pros outweigh the cons: computer-run, user-driven projects build public interest and support, especially for obscure and misunderstood sciences such as cosmology, and allow research to be performed at scales never before possible.
By enabling large-scale collaboration - at home, at any hour, without specialist training, by thousands of people - citizen science can be carried out on a far larger scale than traditional research methods allow, even those backed by supercomputers.
With 22 per cent of the world's population - over one billion people - now using the Internet, and worldwide Internet usage up 305 per cent since 2000, collaborative computing projects seem a natural candidate for science's next growth area.
And other sciences may soon see a boom in citizen projects. According to Dr Westphal, the at-home imaging technique may soon help paleoanthropologists locate rare hominid fossils more efficiently in scanned images of desert terrain.
While the field is still in its infancy, the statistics favour such cutting-edge collaborative successes in the near future. Whether you are interested in helping to verify Einstein's theories, finding interstellar dust from distant stars, or perhaps setting up your own computational Your-Project@home, volunteer computing is just a click, a login and a stand-by away.
Why not help shape the direction of science? Why not donate your idle resources?