Sunday, June 3, 2012

Three laws for the laws of robotics

Ridley Scott's Prometheus opens next week in the United States.  I recently watched the 1979 classic Alien to get into the mood.  The slimy critter bursting from John Hurt's chest remains unmatched in science fiction horror.

An interesting sub-plot in Alien is the man-made entity - - an android, Ash the science officer, played by Ian Holm.  Ash was designed to do what his human colleagues lacked the cold-blooded guile to pull off - - poaching a lethal species from space.

I bring this up because the current issue of the Economist has a well-timed cover story - - Morals and the machine.  The article looks at a central theme that engineers need to be aware of: as robots grow more autonomous, society needs to develop rules to manage them.  As designers, we have not yet intersected with the Ashes of the world.  But that point is drawing rapidly closer - - society needs to develop ways of dealing with the ethics of robotics.  This needs to go beyond the guidelines written by Isaac Asimov in 1942.  Known as the three laws of robotics, they require that robots protect humans, obey orders, and preserve themselves.

The article highlights three 2012 laws for the laws of robotics.  Robot-designing engineers may want to consider these three points - - and watch Alien.
  1. Laws are needed to determine whether the designer, the programmer, the manufacturer, or the operator is at fault when an autonomous drone strike goes wrong or a driverless car has an accident (or when the "science officer" attempts to choke you to death with a newspaper on a spaceship fleeing a killing machine).  This has implications for the engineer - - you might want to rule out the use of artificial neural networks, decision-making systems that learn from example rather than obeying predefined rules.
  2. Where ethical systems are embedded into robots, the judgements they make need to be the ones that seem right to most people (and not just the Weyland-Yutani Corp.).  This puts the engineers directly into the interdisciplinary matrix.  The techniques of experimental philosophy, which studies how people respond to ethical dilemmas, should be able to help.
  3. More collaboration is required between engineers, ethicists, lawyers, and policymakers, all of whom would draw up very different types of rules if they were left to their own devices.  Both ethicists and engineers stand to benefit from working together: ethicists may gain a greater understanding of their field by trying to teach ethics to machines, and engineers need to reassure society that they are not taking any ethical short-cuts.
As Ash demonstrated 33 years ago, we need to start getting serious about the ethics and responsibilities of new autonomous machines like driverless cars.  Prometheus may start us thinking about new themes and responsibilities.
