Philosophy of programming a driverless car. Kill or be killed.

Postby CautiousD » Tue Nov 03, 2015 8:01 am

Should driverless cars kill their own passengers to save a pedestrian?

http://qz.com/536738/should-driverless- ... edestrian/

This apparently assumes a single, fixed response for every situation in which a collision is inevitable, rather than an array of possible outcomes for the programme to evaluate. An interesting philosophical proposition.
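
For what it's worth, the distinction being drawn is between hard-coding one response and having the software score an array of candidate outcomes at the moment of the emergency. A minimal sketch of the second approach in Python; the actions and "harm" scores are entirely made up for illustration, not taken from any real vehicle software:

# Illustrative only: the point is the structure, i.e. score every candidate
# outcome rather than apply one fixed response to every inevitable collision.

def least_harm_action(candidate_outcomes: dict[str, float]) -> str:
    """Pick the action whose predicted outcome carries the lowest harm score."""
    return min(candidate_outcomes, key=candidate_outcomes.get)

outcomes = {
    "brake hard in lane": 0.7,         # hypothetical predicted harm scores
    "swerve left into barrier": 0.9,
    "swerve right onto pavement": 1.0,
}
print(least_harm_action(outcomes))     # -> 'brake hard in lane'

How those harm scores get assigned is, of course, exactly the moral question the thread is about.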

Postby akirk » Tue Nov 03, 2015 8:19 am

The interesting omission in many of these discussions is how predictive an autonomous car could be... i.e. will it ever get into a situation where such a choice is needed, or will it always predict the situation and stop first?
Alasdair
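
To make that point concrete: if the car continuously checks whether it can stop within the clear distance ahead, the "who to hit" choice should rarely, if ever, arise. A minimal sketch in Python, with illustrative numbers and function names that are not from any real vehicle stack:

# Assumed figures: 6 m/s^2 comfortable-to-firm deceleration, 0.2 s system latency.

def stopping_distance_m(speed_mps: float, decel_mps2: float = 6.0,
                        system_latency_s: float = 0.2) -> float:
    """Distance covered while the system reacts, plus braking distance v^2 / (2a)."""
    return speed_mps * system_latency_s + speed_mps ** 2 / (2.0 * decel_mps2)

def can_avoid_by_braking(speed_mps: float, clear_distance_m: float) -> bool:
    return stopping_distance_m(speed_mps) <= clear_distance_m

# Example: 13.4 m/s (~30 mph), needing roughly 17.6 m to stop.
print(can_avoid_by_braking(13.4, 40.0))   # True: brake early, no dilemma
print(can_avoid_by_braking(13.4, 15.0))   # False: the ethical question now bites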

Postby CautiousD » Tue Nov 03, 2015 9:19 am

My opinion is that, unless and until external factors are equally controllable, the only choice available to programmers is to mitigate those factors by assuming responsibility for them. So does the car automatically divert (if safe), brake (if safe) and possibly hit an immovable object, thereby injuring or killing the occupants? Or, if children are on board, does the car 'decide' to opt for binary programme B, protect their lives, and brake before impacting the old lady crossing the road?

A moral conundrum, a moral can of worms, or ethics for our future?
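
The uncomfortable thing about that "binary programme" is how few lines it takes to write down. A deliberately crude Python sketch; the rule and all the names are hypothetical, not a claim about how any real car is programmed:

def choose_action(can_divert_safely: bool, can_brake_in_time: bool,
                  children_on_board: bool) -> str:
    if can_divert_safely:
        return "divert"
    if can_brake_in_time:
        return "brake and stop"
    # Collision is now unavoidable: this is the contested branch.
    if children_on_board:
        return "programme B: brake, protect occupants, accept pedestrian impact"
    return "programme A: swerve into the immovable object, protect the pedestrian"

print(choose_action(False, False, children_on_board=True))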

Postby Horse » Tue Nov 03, 2015 10:20 am

Could be some interesting 'electronic sums' going on rapidly:

> If deer = hit animal
>    Don't swerve to avoid
>    Don't risk head-on collision

All well and good as long as it really is a Muntjac deer and not a guide dog on a lead . . . ;)
Your 'standard' is how you drive alone, not how you drive during a test.
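
Those 'electronic sums' can be sketched in a few lines of Python, with the awkward bit made explicit: the rule is only as good as the classifier feeding it. The labels, confidence threshold and the rule itself are invented for illustration, not drawn from any real system:

def swerve_decision(label: str, confidence: float) -> str:
    SMALL_ANIMALS = {"muntjac deer", "rabbit", "pheasant"}
    if confidence < 0.9:
        return "brake hard, do not swerve: not sure what it is"
    if label in SMALL_ANIMALS:
        return "brake, accept hitting the animal, do not risk a head-on"
    return "treat as a person or large hazard: brake and steer clear if safe"

print(swerve_decision("muntjac deer", 0.97))
print(swerve_decision("guide dog on a lead", 0.97))
print(swerve_decision("muntjac deer", 0.55))   # the honest answer: it doesn't know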

Postby Gareth » Tue Nov 03, 2015 10:29 am

CautiousD wrote:Should driverless cars kill their own passengers to save a pedestrian?

Wrong question ... Should driverless cars kill their own passengers to save a cyclist? :twisted:
there is only the road, nothing but the road ...

Postby Horse » Tue Nov 03, 2015 12:23 pm

FWIW . . .

PRELIMINARY ANALYSIS OF REAL-WORLD CRASHES INVOLVING SELF-DRIVING VEHICLES
Brandon Schoettle, Michael Sivak

The University of Michigan Transportation Research Institute
Ann Arbor, Michigan 48109-2150 U.S.A.
Report No. UMTRI-2015-34 October 2015

This study performed a preliminary analysis of the cumulative on-road safety record of self-driving vehicles for three of the ten companies that are currently approved for such vehicle testing in California (Google, Delphi, and Audi). The analysis compared the safety record of these vehicles with the safety record of all conventional vehicles in the U.S. for 2013 (adjusted for underreporting of crashes that do not involve a fatality).

Two important caveats should be considered when interpreting the findings. First, the distance accumulated by self-driving vehicles is still relatively low (about 1.2 million miles, compared with about 3 trillion annual miles in the U.S. by conventional vehicles). Second, self-driving vehicles were thus far driven only in limited (and generally less demanding) conditions (e.g., avoiding snowy areas). Therefore, their exposure has not yet been representative of the exposure for conventional vehicles.

With these caveats in mind, there were four main findings. First, the current best estimate is that self-driving vehicles have a higher crash rate per million miles traveled than conventional vehicles, and similar patterns were evident for injuries per million miles traveled and for injuries per crash. Second, the corresponding 95% confidence intervals overlap. Therefore, we currently cannot rule out, with a reasonable level of confidence, the possibility that the actual rates for self-driving vehicles are lower than for conventional vehicles. Third, self-driving vehicles were not at fault in any crashes they were involved in. Fourth, the overall severity of crash-related injuries involving self-driving vehicles has been lower than for conventional vehicles.
Your 'standard' is how you drive alone, not how you drive during a test.
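
The report's second finding (overlapping 95% confidence intervals) is easy to illustrate: with only about 1.2 million self-driving miles, a handful of crashes gives a very wide interval. A rough Python sketch; the crash count below is hypothetical, only the ~1.2 million mile exposure figure comes from the summary above, and the interval is a crude normal approximation rather than the method UMTRI used:

import math

def rate_per_million_with_ci(crashes: int, miles_millions: float):
    """Crash rate per million miles with an approximate 95% interval (Poisson count +/- 1.96*sqrt(count))."""
    rate = crashes / miles_millions
    half_width = 1.96 * math.sqrt(crashes) / miles_millions
    return rate, max(0.0, rate - half_width), rate + half_width

# Hypothetical: 11 reported crashes in 1.2 million self-driving miles...
print(rate_per_million_with_ci(11, 1.2))   # ~9.2 per million, roughly 3.7 to 14.6
# ...versus a conventional-fleet figure estimated from billions of miles, whose
# interval is far narrower. A wide interval is not "no clue", but it does mean
# the higher point estimate is not yet statistically conclusive.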

Postby Pyrolol » Tue Nov 03, 2015 1:08 pm

Horse wrote:FWIW . . .
...
Second, the corresponding 95% confidence intervals overlap.


So they have no clue at all then...

Postby akirk » Tue Nov 03, 2015 2:16 pm

The debate about who was at fault is at present too simplistic, from what I have read...
e.g. another car rear-ending an autonomous car... yes, at a simplistic level it is the fault of the other driver, but is it possibly also partly to do with autonomous cars doing unexpected things? There was one example of an autonomous car stopping when it spotted a set of deckchairs and didn't know what they were - if a car stops suddenly with no warning then it is not unlikely that another car will drive into it!

Alasdair

Postby Horse » Tue Nov 03, 2015 2:22 pm

akirk wrote: if a car stops suddenly with no warning then it is not unlikely that another car will drive into it!


So are you saying that more frequent encounters with autonomous vehicles might lead to a general improvement in driving standards, through increased expectation of such events? ;)
Your 'standard' is how you drive alone, not how you drive during a test.

Postby akirk » Tue Nov 03, 2015 2:38 pm

Horse wrote:
akirk wrote: if a car stops suddenly with no warning then it is not unlikely that another car will drive into it!


So are you saying that more frequent encounters with autonomous vehicles might lead to a general improvement in driving standards, through increased expectation of such events? ;)


If logic applied to how people drive, then yes - but sadly perhaps not!

Alasdair

