
Wired

For years, science fiction author Isaac Asimov's Three Laws of Robotics were regarded as sufficient for robotics enthusiasts. The laws, as first laid out in the short story "Runaround," were simple: A robot may not injure a human being or allow one to come to harm; a robot must obey orders given by human beings; and a robot must protect its own existence. Each of the laws takes precedence over the ones following it, so that under Asimov's rules, a robot cannot be ordered to kill a human, and it must obey orders even if that would result in its own destruction.
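The strict precedence among the laws can be sketched as a simple priority check. This is only an illustrative sketch: the function name, the action fields, and the boolean encoding are assumptions made for this example, not anything from Asimov or the article.

```python
# Hypothetical sketch: Asimov's Three Laws as a strict precedence check.
# The field names below (harms_human, disobeys_order, etc.) are illustrative
# assumptions, not drawn from the article.

def permitted(action):
    """Return True only if the action passes each law, checked in priority order."""
    # First Law: a robot may not injure a human or allow one to come to harm.
    if action["harms_human"]:
        return False
    # Second Law: obey human orders, unless that conflicts with the First Law.
    if action["disobeys_order"]:
        return False
    # Third Law: protect its own existence, unless that conflicts with the
    # First or Second Law -- so self-destruction alone does not forbid an
    # action that a human has ordered.
    if action["destroys_self"] and not action["ordered_by_human"]:
        return False
    return True

# An ordered action that destroys the robot is still permitted (the Second Law
# outranks the Third), but an order to harm a human is not (the First Law wins).
print(permitted({"harms_human": False, "disobeys_order": False,
                 "destroys_self": True, "ordered_by_human": True}))
print(permitted({"harms_human": True, "disobeys_order": False,
                 "destroys_self": False, "ordered_by_human": True}))
```

The point of the ordering is that each law only applies when no higher-priority law is violated, which is why an order to self-destruct binds while an order to kill does not.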

But as robots have become more sophisticated and more deeply integrated into human lives, Asimov's laws are too simplistic, says Chien Hsun Chen, coauthor of a paper published in the International Journal of Social Robotics last month. The paper has sparked a discussion among robotics experts, who say it is time for humans to get to work on these ethical dilemmas.

