Artist's impression of San Francisco in the grip of a gray goo attack. Credit: Jeff Darling.
In the late 1940s the Hungarian-born American mathematician John von Neumann worked out the theory of machines that could make exact copies of themselves; his writings on the subject were published posthumously in 1966. He envisaged a kind of robot equipped with a computer brain that could be programmed to reproduce itself from raw materials taken from its surroundings. The copy would be so perfect that it too would carry the instructions needed for making further clones, so that the process could continue indefinitely.
It wasn't long before some people suggested that von Neumann machines, in the form of robot spacecraft, would be an ideal way for an advanced civilization to learn more about the Galaxy without having to venture out in person. Simply build a von Neumann probe and launch it toward a nearby star. Upon arrival the probe would replicate itself, over and over again, down to the last nut and bolt, from materials found on the surface of one of the star's planets. The new probes would then set off for other stars and their worlds, where they would reproduce again, spawning another generation of self-replicating spacecraft. Given not-unreasonable assumptions about how fast the probes could travel, calculations showed that all the stars in the Galaxy could be visited and explored in this way within a few million years.
But then someone pointed out that if this were such a great way for a smart race to find out about the Milky Way, then we ought to have come across some of these von Neumann probes by now. In fact, given the ease with which they could copy themselves almost ad infinitum, they should be pretty much everywhere. The Solar System ought to be like a junkyard or parking lot of alien self-replicating vessels. Because it obviously isn't, that could be taken as a sign that there aren't any intelligent extraterrestrials anywhere in the Galaxy.
Not so fast, replied astronomer Carl Sagan. Any aliens capable of building star-faring von Neumann probes would be clever enough to realize the danger of launching them in the first place: namely that, in time, in slavish obedience to their programs, the spacecraft would convert almost all the matter in the Galaxy into von Neumann probes!
The information technology that drives our mobile phones or laptops already has nanoscale components. The latest generation of microprocessors, for example, uses transistors that measure only a few tens of nanometers across. Another few decades of development and there'll be true nanocomputers which, working at the heart of tiny assemblers of the type first described by Eric Drexler, may make possible von Neumann machines of microscopic dimensions.
At first sight, such devices look just the job – ideal, for instance, for mopping up after oil spills. A tailor-made von Neumann chemical vacuum cleaner might seem perfect for breaking down the tar and other poisons from an Exxon Valdez or BP Gulf of Mexico disaster. The problems start if such a device goes wrong. What if the program becomes corrupted and the "nanobots" released to clean up an oil spill start to attack all carbon-based molecules, including living things in the ocean, all the while making more and more copies of themselves? "We cannot afford certain kinds of accidents with replicating assemblers," wrote Drexler in 1986. Nanobots whose instructions mutate, or become corrupted, might end up consuming not just the toxins they were meant to attack but the wider environment in a process that's been called ecophagy. This is the dreaded gray goo scenario: the reduction of everything – the eating of the biosphere and perhaps the entire planet – by ravenous, mutant nano-von Neumann replicators.
Bearing in mind such a risk, however remote, it seems likely that nano-devices aimed at cleaning up oil spills and other big tasks won't be allowed to reproduce on their own. Instead the chances are they'll be manufactured in large quantities by "nanofactories" – pieces of equipment that might be small enough to fit on a table-top, but whose fabricating machinery would be inert if removed or unplugged.
Still, there are dangers. Just as some people are both ingenious and irresponsible enough to release computer viruses onto the Internet, so some personality types might be drawn to the challenge of building nanoscale von Neumann self-replicators. A bigger threat, though, would likely come from rogue nations and terrorist groups that developed the technology as a tool for blackmail. A single gray goo outbreak in, say, New York or Washington, D.C., would be disastrous, causing widespread disruption. The entire affected area would have to be isolated and treated just as if there'd been an outbreak of ebola or the plague. The situation would be even worse if the weapon were deployed in the atmosphere or the ocean, where it would be hard to limit its spread. A single replicator able to make a copy of itself every 15 minutes or so could result in a population of around a trillion replicators after just ten hours; within two days, assuming it could find enough material to consume, the von Neumann swarm would be as massive as the Earth; in fact, it would be the Earth – transformed entirely into goo.
Perhaps this fate has already befallen other planets in space. Their inhabitants, just a bit more advanced than ourselves, may have looked forward to a bright, nanoengineered tomorrow in which molecular manufacturing would lead to more efficient solar panels, miracles of medicine, superfast computers, and so forth. But then it all went horribly wrong. Either by accident or malicious design, self-replicating nanobots were set free and went AWOL, eating their designers and their designers' worlds. If this is one of the ways that technological races sometimes meet their ends, those seeking evidence of extraterrestrial intelligence might think about tuning their telescopes to the spectral signatures, not of rocky worlds like the Earth, but of balls of gray goo.