Why the Future Doesn’t Need Us

‘Why the Future Doesn’t Need Us’ is an article written by Bill Joy (then Chief Scientist at Sun Microsystems) in the April 2000 issue of ‘Wired’ magazine. In the article, he argues (quoting the subtitle) that ‘Our most powerful 21st-century technologies — robotics, genetic engineering, and nanotech — are threatening to make humans an endangered species.’ Joy warns: ‘The experiences of the atomic scientists clearly show the need to take personal responsibility, the danger that things will move too fast, and the way in which a process can take on a life of its own. We can, as they did, create insurmountable problems in almost no time flat. We must do more thinking up front if we are not to be similarly surprised and shocked by the consequences of our inventions.’

While some critics have characterized Joy’s stance as obscurantism (the practice of deliberately preventing the facts or the full details of some matter from becoming known) or neo-Luddism, others share his concerns about the consequences of rapidly expanding technology.

Joy argues that emerging technologies pose a far greater danger to humanity than any technology before them. In particular, he focuses on genetics, nanotechnology and robotics. He argues that 20th-century technologies of destruction, such as the nuclear bomb, were limited to large governments because of the complexity and cost of such devices, as well as the difficulty of acquiring the required materials, whereas emerging technologies are increasingly available to small groups and individuals. He references Frank Herbert’s novel ‘The White Plague,’ in which a mad scientist creates a virus capable of wiping out humanity.

He also voices concern about increasing computer power. His worry is that computers will eventually become more intelligent than humans, leading to dystopian scenarios such as robot rebellion. He notably quotes the Unabomber, Ted Kaczynski, on this topic.

In ‘The Singularity Is Near,’ Ray Kurzweil questioned the regulation of potentially dangerous technology, asking ‘Should we tell the millions of people afflicted with cancer and other devastating conditions that we are canceling the development of all bioengineered treatments because there is a risk that these same technologies may someday be used for malevolent purposes?’ However, some authors, such as John Zerzan and Chellis Glendinning, argue that modern technologies harm both human freedom and human health, contributing to diseases such as cancer rather than curing them, and that the two issues are connected.

Martin Ford makes the case that the risk posed by accelerating technology may be primarily economic in nature. Ford argues that before technology reaches the point where it represents a physical existential threat, it will become possible to automate nearly all routine and repetitive jobs in the economy. In the absence of a major reform to the capitalist system, this could result in massive unemployment, plunging consumer spending and confidence, and an economic crisis potentially even more severe than the Great Depression. If such a crisis were to occur, subsequent technological progress would dramatically slow because there would be insufficient incentive to invest in innovation.

After the article’s publication, Bill Joy suggested assessing technologies to gauge their implicit dangers, and proposed that scientists refuse to work on technologies that have the potential to cause harm.
