[News] Driverless cars: Who should die in a crash?

Notebook

Well, a moral dilemma, or just old-fashioned heads or tails?

https://www.bbc.co.uk/news/technology-45991093

People were presented with several scenarios. Should a self-driving car sacrifice its passengers or swerve to hit:
a successful business person?
a known criminal?
a group of elderly people?
a herd of cows?
pedestrians who were crossing the road when they were told to wait?

To be honest, yes it should swerve. Shame about the cows, but unless they were Friesians they should have been on a pedestrian crossing. (Friesians are black and white, so they should never be anywhere near a pedestrian crossing.)

https://www.google.co.uk/search?biw......0i67k1.0.3aOVk8zjsxg#imgrc=-vATK_4-ihGc9M:


N.
 

Face

I tend to see this in a pragmatic, capitalistic way: how many people would buy cars that would opt to sacrifice them in any given scenario? Call me egoistical, but I wouldn't put my family at risk by letting them drive around in technology that puts them in danger for the sake of some moral attitude.

I think many fathers will think the same, and therefore such a product would quickly lose market share, IMHO. Of course there are always those pharisaic, high-moral people who think differently, but in the end they will reduce their own numbers quite naturally.

So the car's passengers should always be priority one for the car's AI. On the question of what to sacrifice if there is more than one potential victim (e.g. swerve left to kill the elderly people, swerve right to kill the known criminal), the only technical answer I have is: decide on the score points people give to themselves. If the society the car operates in scores criminals so low that they lose against many elderly people, the criminal gets killed, which would be the norm, I guess.
If the decision is between e.g. 10 elderly people and 5 children, and the society gives more than double the points to young people, the elderly will be killed (a rough sketch of such a scoring rule follows below).
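
A minimal sketch of what such a society-defined scoring rule could look like; every category name and weight below is invented purely to illustrate the idea, not anything proposed in the article:

```python
# Hypothetical "score points" rule: society publishes a weight per category
# of person, and the car swerves toward the group whose loss costs the
# fewest points. All categories and weights are made up for the example.

SOCIETY_WEIGHTS = {
    "child": 2.0,     # assumed: society values the young at over double...
    "adult": 1.0,
    "elderly": 0.9,   # ...the weight it gives the elderly
    "criminal": 0.2,  # assumed: known criminals are scored very low
}

def group_score(group):
    """Total points society loses if this group of people is hit."""
    return sum(SOCIETY_WEIGHTS[person] for person in group)

def choose_victim_group(groups):
    """Pick the group whose loss costs society the fewest points."""
    return min(groups, key=group_score)

# 10 elderly people score 10 * 0.9 = 9.0, while 5 children score
# 5 * 2.0 = 10.0, so with these weights the elderly group is chosen.
print(choose_victim_group([["elderly"] * 10, ["child"] * 5]))
```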
 

Zandy12

It sounds like a thought experiment about self-driving cars and the difficult scenarios that unfortunately do need to be solved. I agree with Face: I don't think self-driving cars will reach the mainstream market for a while because of potential problems like the one mentioned in the article. The key here is high competency, both from the car itself and from the developers who will fix these issues. Interesting topic.
 

Notebook

It's an interesting problem from the AI point of view. Maybe Asimov's three laws need an update.
The simplest thing would be to fit all self-driving cars with ejection seats. Then the car can save itself.

N.
 

Frilock

Doesn't matter; Skynet or Agent Smith will turn them all into weapons eventually anyway.
 

ADSWNJ

I think Face nails it perfectly. I found the questions an unnecessary exercise in identity politics and virtue signalling (e.g. do you want to kill the female athlete or the male executive, the old lady or the young lady?).

Basically - I want the car to protect the passengers through design (strong crumple zones, airbags) and by expecting occupants to avail themselves of basic safety measures (seat belts). The AI should pick the better options - e.g. a hard slam into the side wall or an intentional handbrake turn rather than taking a head-on collision. In the worst case, I would not expect the car to kill its occupants in a head-on accident at a pedestrian crossing, because I would not expect it to be going fast enough there to kill them in the first place.

MIT ... failing grade. Please do better.
 

Thorsten

I tend to see this in a pragmatic, capitalistic way: how many people would buy cars that would opt to sacrifice them in any given scenario? Call me egoistical, but I wouldn't put my family at risk by letting them drive around in technology that puts them in danger for the sake of some moral attitude.

Call me an egoist, but as a pedestrian/cyclist I would never accept cars being legal that opt to sacrifice me in a given situation for the sake of the passengers because of some immoral attitude, and I would give my vote to any party that proposes to ban such cars.

So there's that - it's not only about being able to sell these cars, it's about being able to make (and keep) them legal.

It's part of humanity's moral setup that we ponder decisions such as when it is acceptable to risk our own life for the sake of others (what, for example, if you're driving home and the car has to choose between risking you or your own kids who are on the road?), and that can't be solved in a simple way.
 

Face

Call me an egoist, but as a pedestrian/cyclist I would never accept cars being legal that opt to sacrifice me in a given situation for the sake of the passengers because of some immoral attitude, and I would give my vote to any party that proposes to ban such cars.

So there's that - it's not only about being able to sell these cars, it's about being able to make (and keep) them legal.

Sure, but you as the hypothetical pedestrian in that scenario did not pay for the car. That's the big difference here.
Vote for all the parties you want to ban such cars; in the end it will be a government that wants to see them sold for the sake of a growing economy. And legal is what the government makes it.

The moral setup you mention is a funny thing, because it changes all the time and differs from culture to culture and even between religions. Yet those with their index finger up in the air treat it like something absolute, with a surprisingly intolerant attitude towards those who don't share it. In addition, high morals often come with a significant ignorance of reality, unfortunately.
 

ADSWNJ

Call me an egoist, but as a pedestrian/cyclist I would never accept cars being legal that opt to sacrifice me in a given situation for the sake of the passengers because of some immoral attitude, and I would give my vote to any party that proposes to ban such cars.

The interesting thing, though, is what a human would do in such a circumstance. Firstly, they would probably not know the brakes had failed until they needed them (vs. an AP knowing within milliseconds that they were failing). Secondly, as the human panics with no brakes, they would probably freeze up and follow the exact same course into the pedestrians (vs. the AP evaluating thousands more options). Finally, if there was anything the AP could do, it would do it, whereas the human would likely execute a less-than-optimal response under the surprise and the immediate blood pressure/stress/adrenaline surge.
 

MaverickSawyer

In the spirit of the question... I would prefer that the car select a personal injury attorney as the sacrificial victim. :lol:

In response to the wording of the question... How is the car able to tell that the person in front of it is a female athlete or a male executive? How does it know that someone has a criminal record? That implies a connection to a Big Brother network, so it should be able to either predict an impending accident and adjust traffic controls and other autonomous vehicles accordingly, or do so in response to suddenly unsafe conditions.
 

Linguofreak

To be honest, yes it should swerve. Shame about the cows, but unless they were Friesians they should have been on a pedestrian crossing. (Friesians are black and white, so they should never be anywhere near a pedestrian crossing.)


Actually, the bias against hitting cows should be fairly strong. Don't get me wrong: I'm a bovivore, so I have no problem with killing cows, but animals of that size are themselves a danger to the occupants. Sure, if the choice is plowing into a brick wall or a herd of cattle at 60 mph, you go for the cows, but if it's a choice between a herd of cows and rear-ending an F-250 or a Hummer at 60, the occupants are probably safer if you hit the vehicle. The impact with the vehicle will be harder as such things tend to weigh more than cows, but the vehicle is less likely to come crashing down on your windshield.

But as for the broader moral problem of hitting pedestrians vs. causing the death of the occupants of the vehicle, I doubt that there are very many accidents where, given a human driver with decades of experience with human culture and human moral values:

1) There is little enough time to react that there is no third option that kills nobody.

2) There is enough time to react that both choices can be taken without losing control of the vehicle and having subsequent events be unpredictable.

and

3) There is enough time to react for a human to make a reasoned value judgement on moral principles rather than a reflex reaction on the spur of the moment without using up so much time that 2) is no longer true.

So in most cases with a *human driver*, there is no time for there to be a moral dilemma, and most of the moral choices that contribute to a given accident will have occurred long before the event itself (e.g., "Am I going to make a habit of driving like a maniac?", "Am I going to get drunk tonight and then drive myself home?", etc.).

For a machine, which will not have the cultural resources and biological instincts that humans have to aid it in making a moral decision, there will be an even smaller proportion of cases in which there is any time for there to be a moral dilemma.

So I frankly think that, to the degree that we can provide them with the instrumentation to determine such things, self-driving vehicles should make decisions about what to hit based on the physical characteristics of the collision, such as the mass of the object to be hit, the height of its center of gravity (if it's higher than the hood, it's more likely to come crashing down on the windshield and kill the front-seat occupants), and how much maneuvering to avoid it would test the handling limits of the vehicle (if we lose control, depart the road sideways, and roll over, there's a considerable danger to the occupants, and to anybody by the side of the road).

"Minimize the amount of mass you make contact with" is, I think, a fairly good rule to start with. Mass is a good proxy for the severity of the accident for the occupants, the property value of any inanimate objects or animals you might strike, the size of any crowd of people, and the likelyhood of losing control and endangering the occupants and who knows who else after the impact. This won't produce a morally perfect choice every time, but I think it's a good heuristic. Now, it may not be realistically possible even to evaluate mass well enough to actually use this heuristic, but it's a whole lot more realistic than having your car evaluate human moral considerations in a situation that gives flesh-and-blood humans little enough time to do so.
 

Face

"Minimize the amount of mass you make contact with" is, I think, a fairly good rule to start with. Mass is a good proxy for the severity of the accident for the occupants, the property value of any inanimate objects or animals you might strike, the size of any crowd of people, and the likelyhood of losing control and endangering the occupants and who knows who else after the impact. This won't produce a morally perfect choice every time, but I think it's a good heuristic.

I can already imagine the "think about the children" :censored:-storm. The smaller the human (i.e. the less mass), the higher its moral "value", it seems. Your rule is really good for maximizing safety for the occupants, though.
 

Notebook

I have to point out that I have an anti-bovine bias.
In a previous life as a car mechanic, I was sent off in the trusty Land-Rover to fix a puncture in a farmer's field.
To cut it short, a herd of cows, which had been miles away, stealthily crept up on me and knocked the farmer's Land-Rover off the jack. Luckily I'd parked mine at a right angle, so it didn't move much.
I ended up on the roof, as the cows were bigger than me and I couldn't get to the doors.

Ate as much beef as I could after that, and stayed off dairy products.

N.
 

RisingFury

This kind of discussion is almost entirely worthless. As self-driving cars are perfected, they'll avoid getting into accidents in the first place, and in the accidents they do get into, the number of instances where someone's death is truly unavoidable will be tiny. And when such an instance does happen, I don't even know how much choice the car will have in the first place - if it's trained to choose at all.

The sooner we get self-driving cars with better-than-human driving performance on the road, the fewer people will die in accidents.
 

Urwumpe

The key question is: how should the car tell who it is about to sacrifice? The real world is no computer game. In the worst case, we would get the same prejudice programmed into the AI that already makes humanity unbearable - for example, preferring to collide with black people because chances are higher that they earn less and can't afford a good lawyer to sue you. Welcome to racism 2.0.

I think a perfect car should simply pick the trajectory that minimizes harm to the owner and other humans according to its limited perception. Knowing that you are deciding on flawed and incomplete information leads to different decisions than somebody would make with perfect knowledge. As RisingFury said, it might then be smarter not to get into such dangerous situations at all. In such a situation there is no right decision, only the least wrong one.
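
A small sketch of that "least wrong" idea, where every outcome carries a probability from the car's imperfect perception and the car minimizes expected harm; all names and numbers below are made up for illustration:

```python
# "Least wrong" decision-making under limited perception: each trajectory
# carries probability-weighted harm estimates instead of certain outcomes.

def expected_harm(outcomes):
    """outcomes: list of (probability, harm) pairs for one trajectory."""
    return sum(p * harm for p, harm in outcomes)

def least_wrong(trajectories):
    """There is no right decision here, only the lowest expected harm."""
    return min(trajectories, key=lambda t: expected_harm(t["outcomes"]))

# Hypothetical: braking straight has a 30% chance of moderately harming the
# occupants; swerving has a 10% chance of severely harming what is probably
# (but not certainly) a pedestrian.
options = [
    {"name": "brake straight", "outcomes": [(0.3, 40.0)]},  # expected 12.0
    {"name": "swerve",         "outcomes": [(0.1, 90.0)]},  # expected  9.0
]
print(least_wrong(options)["name"])  # "swerve", under these made-up numbers
```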
 

jedidia

The scenario in which the car has enough time to correctly identify the persons from camera footage and run exhaustive queries on their personal and occupational information, as well as, apparently, their criminal records (why not include medical records too? Run over the guy who is terminally ill!), but not enough time to brake down to a speed at which nobody has to die, is completely absurd... Why do people think that AIs will have infinite bandwidth and processing power at their disposal?
 

Face

The scenario in which the car has enough time to correctly identify the persons from camera footage and run exhaustive queries on their personal and occupational information, as well as, apparently, their criminal records (why not include medical records too? Run over the guy who is terminally ill!), but not enough time to brake down to a speed at which nobody has to die, is completely absurd... Why do people think that AIs will have infinite bandwidth and processing power at their disposal?

Technically, you can pre-process this by means of mandatory tags people have to wear, containing their "score-points". That would be quite a fascist scenario and equally absurd, but technically feasible with current technology.
Come to think of it, perhaps we shouldn't get certain governments started on that idea. :lol:
 

Linguofreak

I can already imagine the "think about the children" :censored:-storm. The smaller the human (i.e. the less mass), the higher its moral "value", it seems. Your rule is really good for maximizing safety for the occupants, though.

I think it's even good for optimizing safety for non-occupants, with a few exceptions such as the one you've mentioned.
 

steph

Perhaps something like: avoid pedestrians at all costs if travelling above a certain speed, even if it means swerving into oncoming traffic, since chances are you'll at least avoid a fatality that way. Some sort of limited split-second assessment. Of course, the complications are endless. What if it swerves in front of a driven car and its driver, with a reflex turn of the wheel, plows into other pedestrians? Would it then be the driver's fault for, perhaps involuntarily, disagreeing with the driverless car's decision?
Then again, I guess the approach may be to simply go slow enough that it can stop if any realistic situation arises on that road (a quick stopping-distance estimate follows below).
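
For reference, a back-of-envelope stopping-distance check; the reaction latency and friction coefficient are generic textbook assumptions, not figures from the article:

```python
# Total stopping distance = reaction distance + braking distance v^2/(2*mu*g).

G = 9.81  # gravitational acceleration, m/s^2

def stopping_distance_m(v_kmh: float, latency_s: float = 0.1, mu: float = 0.7) -> float:
    """Meters needed to stop from v_kmh, assuming a 0.1 s system latency
    and a dry-road friction coefficient of 0.7 (both assumed values)."""
    v = v_kmh / 3.6  # km/h -> m/s
    return v * latency_s + v ** 2 / (2 * mu * G)

# From 50 km/h the car needs roughly 15 m to stop, so on a street where a
# pedestrian can step out within 15 m it should already be going slower.
print(round(stopping_distance_m(50.0), 1))  # ~15.4
```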

Road rules may also apply. Just like human drivers, if it has right of way and something or someone appears from somewhere they're not supposed to be, it can just hit the brakes and that's it, without further avoidance. It's not as if you won't have all the data to analyze after the crash.

Evil me is curious what it would do if it gets tailgated, as will often be the case, and it computes that if it brakes to a stop ahead of the pedestrian, the vehicle behind will not be able to avoid it in time.
 

jedidia

Technically, you can pre-process this by means of mandatory tags people have to wear, containing their "score-points".

The evil regime is entirely composed of radical traffic wardens. Now there's a fun idea for a dystopia! :lol:
We could get a Brazil sequel out of this concept!
 