The all things EV chat thread

PJ87

Journeyman Pro
Joined
Apr 1, 2016
Messages
21,845
Location
Havering
Visit site
I was thinking of a situation where the car is in reasonably fast moving traffic and the cars in front start to brake heavily. In that situation would the autopilot also apply the brakes? If so, if it was turned off just as the cars in front started to brake then the driver would have to be ready to immediately take over.

Autopilot works independently of other systems though, automatic emergency braking for example.

From the manual:

When Full Self-Driving (Beta) (also referred to as Autosteer on City Streets) is engaged, Model 3 attempts to drive to your destination by following curves in the road, stopping at and negotiating intersections, making left and right turns, navigating roundabouts, and entering/exiting highways.

Unlike Traffic-Aware Cruise Control, Autosteer, and Navigate on Autopilot, which are intended for use on multi-lane roadways with clear lane markings, Full Self-Driving (Beta) is meant to work in a variety of driving scenarios. You can use Full Self-Driving (Beta) on any type of roadway, including residential and city streets.

Always remember that Full Self-Driving (Beta) does not make Model 3 autonomous and requires a fully attentive driver who is ready to take immediate action at all times. While Full Self-Driving (Beta) is engaged, you must monitor your surroundings and other road users at all times.

Driver intervention may be required in certain situations, such as on narrow roads with oncoming cars, in construction zones, or while going through complex intersections. For more examples of scenarios in which driver intervention might be required, see Limitations and Warnings.

Full Self-Driving (Beta) uses inputs from cameras mounted at the front, rear, left, and right of Model 3 to build a model of the area surrounding Model 3 (see Cameras). The Full Self-Driving computer installed in Model 3 is designed to use this input, rapidly process neural networks, and make decisions to safely guide you to your destination.

Like other Autopilot features, Full Self-Driving (Beta) requires a fully attentive driver and will display a series of escalating warnings requiring driver response. You must keep your hands on the steering wheel while Full Self-Driving (Beta) is engaged. In addition, the cabin camera monitors driver attentiveness (see Cabin Camera).

Use Full Self-Driving (Beta) in limited Beta only if you will pay constant attention to the road and be prepared to act immediately, especially around blind corners, crossing intersections, and in narrow driving situations. For more information, see Limitations and Warnings.
 

cliveb

Head Pro
Joined
Oct 8, 2012
Messages
2,728
Visit site
You test..and test...and test
Off public roads but in a similar environment until you're 100000000% sure..
Anything else is madness.
A laudable aim in principle, but this is software.
How much testing do you think it's possible to do?
Think how long Windows has been around, and security flaws are still being found on a depressingly regular basis.

There's an emotional aspect at work here. If a self-driving car turns out to be responsible for a single fatal crash, the gut reaction is that the sky is falling in.
But statistically, if self-driving cars kill X people a year through software faults, compared to the same number of normal cars killing 10X people a year through driver error, that's a win, isn't it?

(PS. I'm not a Tesla fan, BTW. I think as a company they seem to have a "we know best" attitude similar to the likes of Apple and Google).
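The X-vs-10X point is easy to make concrete with a back-of-envelope calculation. All numbers below are hypothetical, purely to illustrate the shape of the argument:

```python
# Back-of-envelope comparison of road deaths under human vs software drivers.
# Every figure here is an assumption for illustration, not real accident data.
human_rate = 11.0   # assumed fatalities per billion miles, human drivers
software_rate = 1.1 # assumed fatalities per billion miles, self-driving software
annual_miles = 300e9  # assumed total miles driven per year

human_deaths = human_rate * annual_miles / 1e9
software_deaths = software_rate * annual_miles / 1e9

print(f"Human drivers:   {human_deaths:.0f} deaths/year")
print(f"Software drivers: {software_deaths:.0f} deaths/year")
print(f"Net lives saved:  {human_deaths - software_deaths:.0f}")
```

On these assumed rates the software kills a tenth as many people, yet each of those deaths would make headlines in a way the larger human-error toll never does.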
 

clubchamp98

Journeyman Pro
Joined
Jan 23, 2014
Messages
17,891
Location
Liverpool
Visit site
A laudable aim in principle, but this is software.
How much testing do you think it's possible to do?
Think how long Windows has been around, and security flaws are still being found on a depressingly regular basis.

There's an emotional aspect at work here. If a self-driving car turns out to be responsible for a single fatal crash, the gut reaction is that the sky is falling in.
But statistically, if self-driving cars kill X people a year through software faults, compared to the same number of normal cars killing 10X people a year through driver error, that's a win, isn't it?

(PS. I'm not a Tesla fan, BTW. I think as a company they seem to have a "we know best" attitude similar to the likes of Apple and Google).
Who do you hold to blame in a driverless car for an accident that kills a loved one?

That's going to be an interesting court case!
 

PJ87

Journeyman Pro
Joined
Apr 1, 2016
Messages
21,845
Location
Havering
Visit site
At the moment yes, but when we're sitting in the back?

Tesla should not call it Autopilot as it's clearly not.
They need a better name. It's just ACC.

Maybe it will go down the black-box route like aeroplanes? They've been flying themselves, in the main, for decades.
 

Bunkermagnet

Journeyman Pro
Joined
May 14, 2014
Messages
8,546
Location
Kent
Visit site
I'm surprised there isn't more concern about making changes to the car software remotely. If Tesla can do it, what's to stop some ransomware outfit making changes you don't want unless you pay up?
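The usual defence against exactly this is code signing: the car refuses any update that isn't signed by the manufacturer. A toy sketch of the idea below uses an HMAC as a stand-in for a real public-key signature; this is purely illustrative and not Tesla's actual update mechanism:

```python
# Toy illustration of signed firmware updates (hypothetical scheme, not Tesla's).
# The vehicle only installs an image whose tag verifies against a key it holds,
# so an attacker without the signing key cannot push modified software.
import hashlib
import hmac

SIGNING_KEY = b"manufacturer-held-secret"  # stand-in for a real signing key

def sign(firmware: bytes) -> bytes:
    """Tag a firmware image; only the manufacturer can produce valid tags."""
    return hmac.new(SIGNING_KEY, firmware, hashlib.sha256).digest()

def install(firmware: bytes, signature: bytes) -> bool:
    """Accept the update only if the tag verifies (constant-time compare)."""
    return hmac.compare_digest(sign(firmware), signature)

official = b"v2024.8.1 firmware blob"
assert install(official, sign(official))                    # genuine update accepted
assert not install(b"ransomware payload", sign(official))   # tampered image rejected
```

Real over-the-air systems use asymmetric signatures (the car holds only a public key), but the principle is the same: possession of the update channel alone isn't enough to change the software.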
 

cliveb

Head Pro
Joined
Oct 8, 2012
Messages
2,728
Visit site
Who do you hold to blame in a driverless car for an accident that kills a loved one?

That's going to be an interesting court case!
It will be the driver; all the warnings put out in the manuals etc. clearly shift blame to the driver.
It's not that cut and dried. Future legislation is leaning towards putting the responsibility on the manufacturer rather than the driver:

The legal responsibility for driverless vehicles was considered in the Automated & Electric Vehicles Act 2018, which simply stated that insurers are directly liable for accidents caused by vehicles driving themselves and not passengers being transported in them.

In a nutshell, the Law Commission agrees with the current position that passengers in a vehicle being operated by a fully Automated Driving System (“ADS”), as opposed to a driver support feature such as cruise control, should not be legally responsible for its actions. However, the Law Commission has recommended that under a new Automated Vehicles Act, the manufacturer should be responsible for the actions of automated vehicles and that passengers should be immune from prosecution if the vehicle were to speed, jump a red light, strike a pedestrian or crash.

The manufacturer, or Authorised Self-Driving Entity, would be responsible for putting all automated driving features and systems in a vehicle through a two stage approval and authorisation process before the vehicle was permitted on British roads.

However, the Law Commission has recommended that all Automated Vehicles have a human passenger called a “User-In-Charge” who is qualified to drive and is able to take over in the event of a problem. It is stressed that the User-In-Charge would not need to be actively monitoring the vehicle or what is happening on the road ahead, but would need to be ready to take over after a reasonable time, called the Transition Period, if the Automated Driving System encounters a problem and makes a “Transition Demand” for the human to take over. The Law Commission has stressed that the User-In-Charge would have no legal responsibility until after the expiry of the Transition Period.

The User-in-Charge would remain responsible for maintaining and insuring the vehicle, checking all loads are secure before setting off, ensuring all children are wearing seat belts and exchanging details in the event of a collision.

Sadly, the dream of being able to have a few drinks and get your vehicle to take you home in place of a taxi must stay in the realms of fantasy, as the Law Commission states that the User-In-Charge must be fit to drive at all times! Similarly, if you have hopes of sitting back and letting the vehicle take the strain whilst watching your favourite Netflix series or taking a nap then forget it, as the Law Commission has also recommended that the User-In-Charge should not be allowed to use a mobile, screen device or go to sleep!
 

PJ87

Journeyman Pro
Joined
Apr 1, 2016
Messages
21,845
Location
Havering
Visit site
It's not that cut and dried. Future legislation is leaning towards putting the responsibility on the manufacturer rather than the driver:

The legal responsibility for driverless vehicles was considered in the Automated & Electric Vehicles Act 2018, which simply stated that insurers are directly liable for accidents caused by vehicles driving themselves and not passengers being transported in them.

In a nutshell, the Law Commission agrees with the current position that passengers in a vehicle being operated by a fully Automated Driving System (“ADS”), as opposed to a driver support feature such as cruise control, should not be legally responsible for its actions. However, the Law Commission has recommended that under a new Automated Vehicles Act, the manufacturer should be responsible for the actions of automated vehicles and that passengers should be immune from prosecution if the vehicle were to speed, jump a red light, strike a pedestrian or crash.

The manufacturer, or Authorised Self-Driving Entity, would be responsible for putting all automated driving features and systems in a vehicle through a two stage approval and authorisation process before the vehicle was permitted on British roads.

However, the Law Commission has recommended that all Automated Vehicles have a human passenger called a “User-In-Charge” who is qualified to drive and is able to take over in the event of a problem. It is stressed that the User-In-Charge would not need to be actively monitoring the vehicle or what is happening on the road ahead, but would need to be ready to take over after a reasonable time, called the Transition Period, if the Automated Driving System encounters a problem and makes a “Transition Demand” for the human to take over. The Law Commission has stressed that the User-In-Charge would have no legal responsibility until after the expiry of the Transition Period.

The User-in-Charge would remain responsible for maintaining and insuring the vehicle, checking all loads are secure before setting off, ensuring all children are wearing seat belts and exchanging details in the event of a collision.

Sadly, the dream of being able to have a few drinks and get your vehicle to take you home in place of a taxi must stay in the realms of fantasy, as the Law Commission states that the User-In-Charge must be fit to drive at all times! Similarly, if you have hopes of sitting back and letting the vehicle take the strain whilst watching your favourite Netflix series or taking a nap then forget it, as the Law Commission has also recommended that the User-In-Charge should not be allowed to use a mobile, screen device or go to sleep!

With the cost of insurance already at ridiculous levels surely this tech will take it even further into stupid costs if they have to be liable

Best to just get a cab
 

cliveb

Head Pro
Joined
Oct 8, 2012
Messages
2,728
Visit site
This whole idea of requiring a human to monitor how well the computer is driving is bonkers.
It was demonstrated many years ago that it's far better to have a computer monitor people than the other way around.
Computers don't get bored. People do. Expecting a driver to stay vigilant while the car is driving itself is foolish.
 

PJ87

Journeyman Pro
Joined
Apr 1, 2016
Messages
21,845
Location
Havering
Visit site
There's also a Sky News article about who is liable.

These are the same insurance companies that insured houses with "1 million buildings insurance" in the headline, yet are leaving their customers out of pocket because the rebuild cost has gone up from, say, 300k to 400k.

Yeah, I'm sure they won't wriggle out of liability.
 

clubchamp98

Journeyman Pro
Joined
Jan 23, 2014
Messages
17,891
Location
Liverpool
Visit site
With the cost of insurance already at ridiculous levels surely this tech will take it even further into stupid costs if they have to be liable

Best to just get a cab
Yes, I agree.
The whole point surely is you can get in the back pissed and wake up at home. 🏠

Can't see the point if you're "in charge" all the time.
 

PJ87

Journeyman Pro
Joined
Apr 1, 2016
Messages
21,845
Location
Havering
Visit site
Yes, I agree.
The whole point surely is you can get in the back pissed and wake up at home. 🏠

Can't see the point if you're "in charge" all the time.

I felt exactly the same when they tried to enforce the no-drinking and "don't be too drunk" rules on the Tube.

I mean, come on, it's public transport; the whole point of it is to get home, drunk or not.
 