This paper on ethical autonomous cars asks whether autonomous cars should have ethics -essentially, whether they should prioritise the risks of one participant in a collision over another.
Whoever wrote that is fucking naive.
The whole SUV trend is about prioritising the risks of others over those of the owner -or at least appearing to. Look at the safety ratings of cars: they all highlight passenger safety, but nobody highlights their pedestrian safety numbers. Because if they started to do that, either the cars with awful figures -the Jeep Cherokee, the Audi TT- wouldn't sell, or, if they did sell, their owners might be slightly self-conscious that their toys were considered more dangerous than others. Though of course the sales figures of SUVs show that few customers have ethics there.
Everyone who buys an SUV is saying "I don't give a fuck about the safety of others, I want something that bullies them out of the way. And if pedestrians, cyclists or people in small cars die -tough".
Does anyone seriously think the owner of an automated SUV would want one that is prepared to prioritise the lives of others over their own? Not a fucking chance.
But SUVs are about passive safety: the ethics of the situation are implicit in the vehicle, not implemented in software. Write those decisions in code and all of a sudden the authors of the software are the ones making conscious decisions. Which brings one word to the surface: lawsuit.
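To make the point concrete, here is a purely hypothetical sketch -no real vehicle firmware works like this, and every name and number below is invented- of what "ethics written in code" would look like. The moment a manoeuvre is chosen by a weighted harm function, the priorities are explicit, auditable, and attributable to whoever set the weights:

```python
# Hypothetical sketch only: illustrates how a collision trade-off,
# once written as code, becomes an explicit, discoverable decision.
# All names, weights and numbers here are invented for illustration.

def choose_manoeuvre(options, occupant_weight=3.0):
    """Pick the manoeuvre with the lowest weighted expected harm.

    Each option is (name, risk_to_occupants, risk_to_others).
    An occupant_weight above 1.0 literally prioritises the owner
    over everyone else -the choice the text is talking about.
    """
    def harm(option):
        _, occupants, others = option
        return occupant_weight * occupants + others
    return min(options, key=harm)

options = [
    ("swerve_into_cyclist", 0.1, 0.9),  # low risk to occupants, high to others
    ("brake_hard",          0.4, 0.2),  # shared, moderate risk
]
# With owner-favouring weights, the car swerves into the cyclist.
print(choose_manoeuvre(options)[0])  # → swerve_into_cyclist
```

The point is not the numbers: it's that `occupant_weight` would be sitting in a repository somewhere, with an author and a commit date, for a lawyer to find.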
If anyone is in a collision with an autonomous vehicle, an immediate question is going to be "did the car make a conscious decision to crash into me?", or "what were its priorities in this situation?". And if it became clear that the collision was the result of explicit decisions in the firmware: Google, Mercedes, Volvo or whoever are in deep fucking trouble.
That is: unless they can get the legal system set up to grant them indemnity.
Expect the car and computer companies to be talking to the politicians already, saying things like "Autonomous cars will improve road safety -but if we are exposed to lawsuits then those lives won't be saved", and the politicians to nod along as they listen to stories of a brave new world of cars that fend for themselves.
If an autonomous car really did have ethics, would it do the school run? Would it drive to the corner shop? Would it refuse to work on a high-smog day? Because that would be ethical. Are we going to see that? Not a fucking chance.
Surely an ethical car would refuse to start up and say "take a walk, fatty" for the ~40% of UK journeys under 2 miles. It wouldn't take much ethical cogitation to realise that endangering the car occupants and everyone else in the area (plus assorted eco damage) for such a short journey isn't worth it. Mind you, get into an I, Robot situation and they'll probably just shut themselves down permanently for the good of society.
No, I don't think much ethics will be involved.
Donk: either that or the other extreme (the Skynet scenario).