Friday, 12 August 2016

RISK: How do you program ethics into a driverless car?


MIT Media Lab’s Moral Machine forces us to consider whose lives we would prioritize


Staff on August 12, 2016



Driverless cars aren’t that far away. In 2035, IHS Automotive predicts, 21 million of them will be sold around the world.

But before that happens, automakers and insurers need to figure out what driverless cars will do when an accident is inevitable. Will they prioritize passengers’ lives? Pedestrians? Animals? Babies? The elderly? Pregnant women? What if the car still needs to choose between killing a cyclist or injuring all its passengers?


Read: How driverless cars will change insurance


And then there’s the matter of individual choice. Will each automaker pre-program all their driverless cars with the same set of ethics and priorities, or will owners get to choose?


As executives, insurers and philosophers debate these questions, and as we wait for the technology to evolve, we can test our own moral fibre with MIT Media Lab’s Moral Machine.




“The greater autonomy given machine intelligence in these roles can result in situations where they have to make autonomous choices involving human life and limb,” the Moral Machine website states. “This calls for not just a clearer understanding of how humans make such choices, but also a clearer understanding of how humans perceive machine intelligence making such choices.”


Read: Canadians split on driverless cars


Users are given 13 scenarios in which they decide who is saved and who is killed when a driverless car’s brakes fail: passengers, pedestrians, cats, dogs, women, men, doctors, executives, homeless people, athletes and large people.


At the end of the “judgement” section, the Moral Machine compares your preferences to those of the average user. “Our goal is not to judge anyone, but to help the public reflect on important and difficult decisions,” the website states in a disclaimer.


So give it a go, and let us know what you think.


Read: What’s it really like in the driver’s seat of a driverless car?


Image: MIT Media Lab’s Moral Machine




