Roboethics

The term roboethics was coined by roboticist Gianmarco Veruggio in 2002. Veruggio also chaired an Atelier (workshop) funded by the European Robotics Research Network to outline areas where research might be needed. The resulting road map effectively divided the ethics of artificial intelligence into two sub-fields to accommodate researchers' differing interests:

Machine ethics is concerned with the behavior of artificial moral agents (AMAs); and Roboethics is concerned with the behavior of humans, how humans design, construct, use and treat robots and other artificially intelligent beings.

Robotics is rapidly becoming one of the leading fields of science and technology, and humanity will soon coexist with an entirely new class of technological artifacts: robots. This will be an event rich in ethical, social, and economic problems. 'Roboethics is an applied ethics whose objective is to develop scientific/cultural/technical tools that can be shared by different social groups and beliefs. These tools aim to promote and encourage the development of Robotics for the advancement of human society and individuals, and to help prevent its misuse against humankind.' For the first time in history, humanity is approaching the challenge of replicating an intelligent and autonomous entity. This compels the scientific community to examine closely the very concept of intelligence, in humans, animals, and machines, from a cybernetic standpoint.

Complex concepts like autonomy, learning, consciousness, evaluation, free will, decision making, freedom, and emotions will have to be analyzed, taking into account that the same concept does not carry the same reality and semantic meaning in humans, animals, and machines. From this standpoint, it is both natural and necessary that robotics draws on several other disciplines: logic, linguistics, neuroscience, psychology, biology, physiology, philosophy, literature, natural history, anthropology, art, and design. Robotics thus de facto unifies the so-called two cultures, science and the humanities. The effort to design roboethics should account for this specificity: experts must view robotics as a whole, despite its current early stage, which resembles a melting pot, so they can achieve a vision of robotics' future.

Since the 'First International Symposium on Roboethics', held in Italy in 2004, three main ethical positions have emerged within the robotics community: not interested in ethics (those who consider their work strictly technical and do not believe they bear a social or moral responsibility for it); interested in short-term ethical questions (those who express ethical concern in terms of 'good' or 'bad' and refer to particular cultural values and social conventions); and interested in long-term ethical concerns (those who express ethical concern in terms of global, long-term questions).

As roboethics is a human-centered ethics, it must comply with the principles of human rights such as dignity, equality, justice, pluralism, autonomy, informed consent, and privacy. Additionally, roboethics shares with other fields of science and technology most of the ethical problems arising from the Second and Third Industrial Revolutions, such as dual-use technology, environmental impact, effects on the global distribution of wealth, the digital divide, fair access to technological resources, the dehumanization of humans in their relationships with machines, technology addiction, and the anthropomorphization of machines.
