Autonomous car ethics

akirk
Posts: 1659
Joined: Sun Sep 27, 2015 6:58 pm
Location: Bristol

Autonomous car ethics

Postby akirk » Mon Oct 19, 2015 8:14 am

http://gulfnews.com/business/sectors/fe ... -1.1602196

It’s nothing more than a dune buggy on a cordoned-off street, but it’s headed for trouble.

A jumble of sawhorses and traffic cones simulates a road crew working over a manhole, and the driverless car must decide: obey the law against crossing a double-yellow line, or break the law and spare the crew. It splits the difference, veering at the last moment and nearly colliding with the cones.


On a recent day, Gerdes met separately at his lab with the CEOs of General Motors Co. and Ford Motor Co. That came about a week after he hosted a workshop on driverless ethics for 90 engineers and researchers, including from electric carmaker Tesla Motors and tech giant Google, which has pledged to put out a robot car as soon as 2017. This year, Tesla will introduce an autopilot feature.

GM will debut a 2017 Cadillac that drives hands-free. Ford CEO Mark Fields says driverless cars will arrive by 2020.

Gerdes’ message: not so fast.


“We need to take a step back and say, ‘Wait a minute, is that what we should be programming the car to think about? Is that even the right question to ask?’” Gerdes said. “We need to think about traffic codes reflecting actual behaviour to avoid putting the programmer in a situation of deciding what is safe versus what is legal.”


Moral choice

He soon came to see both its significance and its painful complexity. For example, when an accident is unavoidable, should a driverless car be programmed to aim for the smallest object to protect its occupant? What if that object turns out to be a baby stroller? If a car must choose between hitting a group of pedestrians and risking the life of its occupant, what is the moral choice? Does it owe its occupant more than it owes others?

When human drivers face impossible dilemmas, choices are made in the heat of the moment and can be forgiven. But if a machine can be programmed to make the choice, what should it be?

“It’s important to think about not just how these cars will drive themselves, but what’s the experience of being in them and how do they interact,” Gerdes said. “The technology and the human really should be inseparable.”


Good to see someone talking sense! And someone who isn't making decisions / creating press coverage based on upping their future profits ;)

Alasdair

jont-
Posts: 1522
Joined: Mon Sep 28, 2015 7:12 am
Location: Herefordshire

Re: Autonomous car ethics

Postby jont- » Mon Oct 19, 2015 8:19 am

Straw man arguments. The evidence from the Google car suggests that, in the first instance, the thing would just come to a stop and wait for the roadworks to finish. Similarly, the last paragraph assumes that autonomous cars /will/ crash. Is that really inevitable?

akirk
Posts: 1659
Joined: Sun Sep 27, 2015 6:58 pm
Location: Bristol

Re: Autonomous car ethics

Postby akirk » Mon Oct 19, 2015 8:35 am

If an autonomous car is driving through a town at a steady 15 mph and a pedestrian steps out from in front of a bus etc. (i.e. something the car could not have anticipated), then yes, the car may have a range of choices - from a crash that risks its passengers to hitting the pedestrian...
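
Purely as an illustration of what that "range of choices" might look like inside the software (made-up numbers and names, not anyone's actual logic), the core of it could be as crude as a least-bad-option chooser:

from dataclasses import dataclass

@dataclass
class Option:
    name: str
    harm_to_occupants: float  # assumed 0-1 predicted-injury score
    harm_to_others: float     # assumed 0-1 predicted-injury score

def least_bad(options, occupant_weight=1.0, other_weight=1.0):
    # The weights are exactly the ethical question: does the car owe its
    # occupant more than it owes everyone else?
    return min(options, key=lambda o: occupant_weight * o.harm_to_occupants
                                      + other_weight * o.harm_to_others)

choices = [
    Option("emergency brake in lane", 0.3, 0.6),
    Option("swerve towards parked cars", 0.5, 0.1),
    Option("carry straight on", 0.1, 0.9),
]
print(least_bad(choices).name)  # "swerve towards parked cars" with equal weights

Every number in there is a guess, which is rather the point...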

If we look at all forms of technology in production worldwide, where is the item that works perfectly? It doesn't exist. So whatever is produced will be coded by humans, will have faults, will have system bugs, etc.

The Google cars are also a clear example of the kinds of assumption coders make - they assumed that all you need is for the driverless car to stop when unsure and safety will be maintained... What happened? Other cars kept crashing into the rear of the Google cars, because their drivers didn't expect that reaction or got frustrated at the Google car repeatedly stopping...

There is a lot of optimism in this industry without necessarily the healthy dose of realism we need, so it is good to see someone adding it.

Alasdair

Horse
Posts: 3558
Joined: Mon Sep 28, 2015 9:20 am

Re: Autonomous car ethics

Postby Horse » Mon Oct 19, 2015 9:34 am

akirk wrote: ... What happened? Other cars kept crashing into the rear of the Google cars, because their drivers didn't expect that reaction or got frustrated at the Google car repeatedly stopping...

There is a lot of optimism in this industry without necessarily the healthy dose of realism we need, so it is good to see someone adding it.


IIRC there is research going on to identify how 'other' drivers will react to autonomous vehicles.

However, playing devil's advocate: unless the AV is very obviously different from a 'driver driven' car, how will the following driver 'know' to react any differently to the car ahead slowing and stopping?
Your 'standard' is how you drive alone, not how you drive during a test.

Horse
Posts: 3558
Joined: Mon Sep 28, 2015 9:20 am

Re: Autonomous car ethics

Postby Horse » Mon Oct 19, 2015 9:40 am

News article wrote:
A jumble of sawhorses and traffic cones simulates a road crew working over a manhole, and the driverless car must decide: obey the law against crossing a double-yellow line, or break the law and spare the crew. It splits the difference, veering at the last moment and nearly colliding with the cones.


Really no different from passing a parked car, is it?

If the AV is programmed to use, for example, Highway Code rules, then wouldn't the sequence be:
- Aware of double white line system
- Drive according to rules
- Identify hazard ahead
- Aware of exemptions for crossing dwl
- Check for oncoming traffic etc
- Pass hazard
- Return to normal driving according to dwl rules

:?:
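
For what it's worth, that sequence barely needs "ethics" at all - a rough sketch of it as plain decision logic (hypothetical names, not from any real AV stack):

def dwl_manoeuvre(hazard_ahead: bool,
                  stationary_obstruction: bool,
                  oncoming_lane_clear: bool) -> str:
    # No hazard: drive according to the double white line rules.
    if not hazard_ahead:
        return "drive according to dwl rules"
    # Exemption: a dwl may be crossed to pass a stationary obstruction,
    # provided it is safe to do so.
    if stationary_obstruction and oncoming_lane_clear:
        return "cross dwl, pass hazard, return to normal driving"
    # Otherwise hold back until it is safe.
    return "hold back behind the hazard"

print(dwl_manoeuvre(True, True, True))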
Your 'standard' is how you drive alone, not how you drive during a test.

jont-
Posts: 1522
Joined: Mon Sep 28, 2015 7:12 am
Location: Herefordshire

Re: Autonomous car ethics

Postby jont- » Mon Oct 19, 2015 9:45 am

Horse wrote:
News article wrote:
A jumble of sawhorses and traffic cones simulates a road crew working over a manhole, and the driverless car must decide: obey the law against crossing a double-yellow line, or break the law and spare the crew. It splits the difference, veering at the last moment and nearly colliding with the cones.


Really no different from passing a parked car, is it?

If the AV is programmed to use, for example, Highway Code rules, then wouldn't the sequence be:
- Aware of double white line system
- Drive according to rules
- Identify hazard ahead
- Aware of exemptions for crossing dwl
- Check for oncoming traffic etc
- Pass hazard
- Return to normal driving according to dwl rules

:?:

But that doesn't generate a newsworthy article, does it? ;)

akirk
Posts: 1659
Joined: Sun Sep 27, 2015 6:58 pm
Location: Bristol

Re: Autonomous car ethics

Postby akirk » Mon Oct 19, 2015 9:51 am

It also highlights that the issues don't lie in what the car can do, but in how it is programmed by the manufacturers...
The logic rules it uses will need to cope with every scenario. AI is really not as developed yet as some make out, so they will have to consider coding for every imaginable scenario. If the manufacturers have any sense, they would jointly fund a university to study every accident for which there is analysis, ensure that their rules cover every eventuality down to the smallest detail, and build a common set of rules - and then add in lots more!
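
Purely illustratively (made-up scenario keys, nothing real), the "common set of rules" plus the accident-study feedback loop might amount to a shared table and a coverage check:

common_rules = {
    ("pedestrian steps out", "urban", "low speed"): "emergency brake",
    ("roadworks over manhole", "dwl road", "low speed"): "treat as stationary obstruction",
    ("stopped vehicle ahead", "motorway", "high speed"): "brake and warn following traffic",
}

def uncovered(studied_scenarios):
    # Scenarios seen in accident analysis that the shared rule set doesn't yet cover.
    return [s for s in studied_scenarios if s not in common_rules]

studied = [
    ("pedestrian steps out", "urban", "low speed"),
    ("animal in carriageway", "rural", "high speed"),
]
print(uncovered(studied))  # the rural animal scenario still needs a rule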

Alasdair

jont-
Posts: 1522
Joined: Mon Sep 28, 2015 7:12 am
Location: Herefordshire

Re: Autonomous car ethics

Postby jont- » Mon Oct 19, 2015 9:58 am

akirk wrote: It also highlights that the issues don't lie in what the car can do, but in how it is programmed by the manufacturers...
The logic rules it uses will need to cope with every scenario. AI is really not as developed yet as some make out, so they will have to consider coding for every imaginable scenario. If the manufacturers have any sense, they would jointly fund a university to study every accident for which there is analysis, ensure that their rules cover every eventuality down to the smallest detail, and build a common set of rules - and then add in lots more!

If accidents were studied in the same sort of detail as air accidents, I'd agree, but they aren't, so I'd be very skeptical of their value in drawing general conclusions about future accident avoidance.

Now, where's VW with one set of rules to pass the "autonomous driving test" and another one for the real road :twisted:

akirk
Posts: 1659
Joined: Sun Sep 27, 2015 6:58 pm
Location: Bristol

Re: Autonomous car ethics

Postby akirk » Mon Oct 19, 2015 10:01 am

stresseddave might have more knowledge about that, but wouldn't accidents above a certain level of severity probably be recorded somewhere in police / fire service files? And others might be found in insurance company files... I suspect there is a wealth of information that could be analysed - and presumably you could, to a certain level, explore accidents worldwide to help develop the logic. Traffic laws differ, but humans are quite similar in their irrationality, and it is that aspect which is most challenging from a coding perspective.
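
If that data could ever be pulled together, the first pass would be little more than tallying which scenarios actually produce casualties - a sketch with made-up record fields:

from collections import Counter

records = [
    {"source": "police", "scenario": "pedestrian steps out", "severity": "serious"},
    {"source": "insurer", "scenario": "rear-end at junction", "severity": "slight"},
    {"source": "police", "scenario": "pedestrian steps out", "severity": "slight"},
]

by_scenario = Counter(r["scenario"] for r in records)
for scenario, count in by_scenario.most_common():
    print(scenario, count)  # the scenarios the driving logic most needs explicit rules for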

Alasdair

Strangely Brown
Posts: 1018
Joined: Sun Sep 27, 2015 8:06 pm
Location: Sussex

Re: Autonomous car ethics

Postby Strangely Brown » Mon Oct 19, 2015 10:31 am

akirk wrote: It also highlights that the issues don't lie in what the car can do, but in how it is programmed by the manufacturers...
The logic rules it uses will need to cope with every scenario. AI is really not as developed yet as some make out, so they will have to consider coding for every imaginable scenario. If the manufacturers have any sense, they would jointly fund a university to study every accident for which there is analysis, ensure that their rules cover every eventuality down to the smallest detail, and build a common set of rules - and then add in lots more!


Let's apply that to the wetware. Just for fun:

It also highlights that the issues don't lie in what the car can do, but in how the driver is taught to operate it...
The rules the drivers use will need to cope with every scenario. Most new drivers are really not as developed yet as some make out, so they will have to consider teaching for far more scenarios. If the DVSA have any sense, they would jointly fund a university to study every accident for which there is analysis, ensure that their teaching covers far more eventualities in more detail, and update the teaching - and then add in lots more!

