“The Devil Made Me Do It”: Computational Ethics for Robots – A Talk by Jerry Kaplan

Before we set robots and other autonomous systems loose in the world, we need to ensure that they will adhere to basic moral principles and human social conventions. This is easier said than done.
Science fiction writer Isaac Asimov famously proposed the Three Laws of Robotics:
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
Less well known was the purpose of Asimov’s proposal: to point out that these simple rules are woefully inadequate as design criteria for building ethical robots. So if his laws aren’t sufficient, what is?
Join me as we window shop through two millennia of moral theories to find a suitable foundation for the emerging discipline of “Computational Ethics” — and explore the darkly hilarious ways these theories often fail in practice!
Come join SAILS and the Robotics Seminar for this fascinating talk by Jerry Kaplan. Lunch will be provided. The event will be held in McCullough Room 115.