Team-BHP - NTSB links Tesla crash to Autopilot; 3rd such fatal accident

The National Transportation Safety Board (NTSB) has released a preliminary report on a fatal Tesla crash that happened in Delray Beach, Palm Beach County, Florida.

The incident occurred on March 1, 2019 at 6:17 am, when the Tesla Model 3 was in the southbound right through lane of State Highway 441. The car collided with a semi-trailer that was crossing the highway to make a left turn and join the northbound lanes. Footage from nearby surveillance cameras and Tesla's front camera reportedly shows the semi-trailer slowing down in front of the Tesla and blocking its path. The car struck the left side of the trailer, underrode it and came to a halt 488 m from the crash site. The 50-year-old male driver died in the crash; the driver of the semi-trailer was uninjured.

The NTSB report also states that Autopilot (Tesla's driver-assistance system) was active at the time of the crash. The driver had engaged the system about 10 seconds before the impact, and for the last 8 seconds the driver's hands were not detected on the steering wheel. Neither the driver nor the Autopilot system made any attempt to avoid the collision. The vehicle was travelling at 109 km/h when the accident occurred.

This is only a preliminary report; the investigation into other factors, such as the actions of the tractor-trailer driver, highway conditions and survival factors, will continue. The NTSB added that the probable cause has not yet been determined. Once established, it will help the board issue safety recommendations to prevent similar crashes.

In 2017, the NTSB concluded that over-reliance on technology led to the crash that killed a Tesla driver on May 7, 2016. A second fatal accident, involving a Model X in Autopilot mode, occurred in March 2018 in Mountain View, California.

[Attachments: figures 1 and 2 from NTSB preliminary report HWY19FH008]

Link to the Team-BHP News

Quote:

Originally Posted by blackwasp (Post 4590505)
The National Transportation Safety Board (NTSB) has released a preliminary report on a fatal Tesla crash that happened in Delray Beach, Palm Beach County, Florida.

I have said this before and I will say this again – offering Level 2.5 and Level 3 autonomous technology to the general public is a big mistake. No automaker should offer this technology at this level. This is not because of the technology but because of the people using it. The current level of automation requires constant monitoring and readiness to take over control.

Despite repeated warnings from automakers that the current technology is assistive and that people still have to monitor the vehicle and take control as and when necessary, people continue to misuse it and will keep doing so, for example by handing over control to the vehicle and doing other things (sleeping, looking at the phone or even having intercourse; yes, people do it).

There are already numerous hurdles (especially legislative) for autonomous tech, and more such accidents will only increase fear among the public, delay the launch of this technology, or may even prove to be its death knell.

Quote:

Originally Posted by blackwasp (Post 4590505)
Once established, it will help the board issue safety recommendations to prevent similar crashes.

One very effective solution is to limit speeds (say 60 kmph) when the vehicle is on Autopilot.
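
Purely to illustrate what such a cap would mean in software, here is a rough Python sketch; the names and numbers are made up, and this is obviously not Tesla's actual code:

# Hypothetical illustration only - not Tesla's software or API.
AUTOPILOT_SPEED_CAP_KMPH = 60.0  # the cap suggested above

def clamp_target_speed(requested_kmph, autopilot_engaged):
    """Return the speed the assistance system is allowed to hold."""
    if autopilot_engaged:
        return min(requested_kmph, AUTOPILOT_SPEED_CAP_KMPH)
    return requested_kmph

print(clamp_target_speed(109.0, autopilot_engaged=True))   # 60.0 (109 km/h was the speed in the Delray Beach crash)
print(clamp_target_speed(109.0, autopilot_engaged=False))  # 109.0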

Sad; this will be a major hurdle for Tesla. It's unfortunate that people do not realise that these are assistive technologies and not an autopilot.

Maybe they should stop calling them 'auto pilot' for starters.

Quote:

Originally Posted by katchkamalesh (Post 4590535)
I have said this before and I will say this again – offering Level 2.5 and Level 3 autonomous technology to the general public is a big mistake. No automaker should offer this technology at this level.

While I understand the point that they should not offer it at the current level, I think they will do so anyway, simply because there is no other way to get access to the amount of data needed to reach the next level. I think Tesla is collecting driving data and using it to train and improve its systems. Without community/crowd-sourced data, it would take ages for manufacturers to collect the required data if they relied only on their own test runs. But it is a fair point. The technology at current levels is not for common people.

Quote:

Originally Posted by katchkamalesh (Post 4590535)
Offering Level 2.5 and Level 3 autonomous technology to the general public is a big mistake. No automaker should offer this technology at this level... The current level of automation requires constant monitoring and readiness to take over control.

I agree, but at the same time, there is only so much testing and simulation one can do, right? Not only is it time-consuming, it is also really tricky to come up with all the situations encountered on a day-to-day basis on a closed-off track/area.

Quote:

Originally Posted by SmartCat (Post 4590593)
One very effective solution is to limit speeds (say 60 kmph) when the vehicle is on Autopilot.

That will probably mitigate some dangers, but it won't eliminate them, right? Further, extrapolating low-speed data to predict high-speed behaviour won't work, as the systems behave differently and need different inputs at higher speeds.

Quote:

Originally Posted by mpp19 (Post 4590595)
Maybe they should stop calling them 'auto pilot' for starters.

Yes, the Autopilot name itself is misleading, and the system is over-engineered to some extent. For example, even though it asks the driver to keep their hands on the wheel after some time, the car will keep maintaining lane and speed even if the driver fails to do so. People are simply taking advantage of this to do other things.
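
To make the alternative concrete, here is a rough, entirely hypothetical Python sketch of a stricter hands-off escalation policy (warn, then chime, then slow down and disengage). The thresholds and names are invented; this is not how Tesla's system actually behaves:

# Hypothetical hands-off-wheel escalation policy - illustration only,
# not Tesla's implementation.
def hands_off_response(seconds_hands_off):
    """Map the time since hands were last detected on the wheel to an action."""
    if seconds_hands_off < 5:
        return "none"                    # normal operation
    if seconds_hands_off < 15:
        return "visual_warning"          # nag on the instrument cluster
    if seconds_hands_off < 30:
        return "audible_warning"         # escalate to chimes
    return "slow_down_and_disengage"     # stop maintaining lane/speed

for t in (2, 8, 20, 45):
    print(t, hands_off_response(t))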

Quote:

Originally Posted by padmrajravi (Post 4590722)
I think Tesla is collecting driving data and using it to train and improve its systems. Without community/crowd-sourced data, it would take ages for manufacturers to collect the required data if they relied only on their own test runs. But it is a fair point.

That's the way to go. Imagine the time, effort and resources needed to recreate all possible scenarios in labs.

Quote:

Originally Posted by padmrajravi (Post 4590722)
The technology at current levels is not for common people.

I will disagree. A potential Tesla buyer is not your average 'common person'. That said, I get the intent of your post, and there should be better driver awareness, training, etc., but we humans are a bunch of impatient folks who will try to skip to the end of the training as fast as possible.

Quote:

Originally Posted by blackwasp (Post 4591467)
I agree, but at the same time, there is only so much testing and simulation one can do, right? Not only is it time-consuming, it is also really tricky to come up with all the situations encountered on a day-to-day basis on a closed-off track/area.

I agree with the testing and validation part, but at what cost? Many OEMs (BMW, Mercedes, Volvo, Ford, etc.) are skipping Level 3 for this very reason.

Quoting a study published by Deloitte on Consumers’ appetite for self-driving vehicles:
“A series of high-profile incidents may have contributed to the plateau in consumer trust in this year’s study, but there will likely be a longer-term trend toward gradual acceptance,”

https://www2.deloitte.com/global/en/...-adoption.html

If consumers lose confidence in autonomous technology, then no one will ever be interested even in sitting in one, forget buying one. To gain consumer confidence, it is imperative that autonomous technology be able to eliminate all types of accidents.

In Europe, over 90% of accidents are due to human error.

To put it in perspective: of 100 accidents, 90 are caused by the driver. Let's say autonomous tech is able to eliminate 98.89% of those; that still leaves roughly one (1) accident.
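
Running those numbers as a quick back-of-the-envelope calculation (the 90% figure and the assumed 98.89% reduction are the ones above; everything here is illustrative):

# Back-of-the-envelope check of the numbers above - illustrative only.
total_accidents = 100
human_error_share = 0.90    # "over 90% of accidents are due to human error"
assumed_reduction = 0.9889  # effectiveness assumed in the post above

human_error_accidents = total_accidents * human_error_share  # 90
remaining = human_error_accidents * (1 - assumed_reduction)  # ~1
print(round(remaining, 2))  # 1.0, i.e. roughly one accident still happens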

As a consumer, I wouldn't be willing to take that chance, simply because I do not want to be the guy who gets killed or injured in a car that I have no control over.

Quote:

Originally Posted by katchkamalesh (Post 4590535)
I have said this before and I will say this again – offering Level 2.5 and Level 3 autonomous technology to the general public is a big mistake. No automaker should offer this technology at this level. This is not because of the technology but because of the people using it.

Completely agreed! The technology is so clearly a work in progress. I blame Tesla more than I blame the drivers. With human beings, it should either be "all auto" or "no auto". Having this sort of "partial autonomous driving" is ridiculous. Hell, drivers get distracted or fall asleep when driving normal cars... imagine the situation with autonomous cars.

Quote:

Originally Posted by padmrajravi (Post 4590722)
While I understand the point that they should not offer it at the current level, I think they will do so anyway, simply because there is no other way to get access to the amount of data needed to reach the next level.

If they want data, tell them to run 5,000 cars with a driver sitting inside & observing it, available to step in when required. Giving a regular Joe imperfect technology is inexcusable.

Quote:

Originally Posted by GTO (Post 4592162)
If they want data, tell them to run 5,000 cars with a driver sitting inside & observing it, available to step in when required. Giving a regular Joe imperfect technology is inexcusable.


Tesla delivered close to 2 lakh cars in 2018 alone, so the amount of data it can gather from customers is far more than what it could get from 5,000 cars. And that kind of data is needed to train these systems to get to the next level. But I agree with the point: selling imperfect technology to a regular Joe is inexcusable.

The issue, I feel, is the way the tech is marketed to average consumers. You cannot expect consumers to know the differences between SAE levels.

The use of the word 'Autopilot' is, according to me, incorrect. I drive a Level 1 car with basic ADAS tech. Tesla's system is a more advanced ADAS system, but it is marketed as something much higher. The company is to blame.

Quote:

Originally Posted by padmrajravi (Post 4592279)
Selling imperfect technology to a regular Joe is inexcusable.

Here's one more reason to add to the list:
Quote:

A new automatic lane-change feature of Tesla's Autopilot system doesn't work well and could be a safety risk to drivers, according to tests performed by Consumer Reports.

The magazine and website tested "Navigate on Autopilot" and found it less competent than human drivers, cutting off other cars without leaving enough space.

Senior Director of Auto Testing Jake Fisher said in a statement Wednesday that the system doesn't appear to react to brake lights or turn signals, and it can't anticipate what other drivers will do. "It's incredibly nearsighted," Fisher says. "You constantly have to be one step ahead of it."
Source

Tesla operating on Autopilot slams into police car and ambulance. After the incident, Arizona's Department of Public Safety tweeted that the driver told troopers his Tesla was on Autopilot at the time of the collision.

[Attachments: photo and two screenshots of the Arizona Department of Public Safety tweets]

https://www.carscoops.com/2020/07/te...nto-ambulance/

Tesla's 'Autopilot' misleading, Germany rules.

The court, in Munich, said:
Quote:

"By using the term 'autopilot' and other wording, the defendant suggests that their vehicles are technically able to drive completely autonomously.
Link

It seems the Toyota guy was right: approximately 8.8 billion miles of testing would be needed before he felt autonomous driving would be safe.

Here's the link

https://www.forbes.com/sites/alanohn...ensure-safety/

Quoting from GTO's earlier post #11: "You constantly have to be one step ahead of it."

This is the "101" of all modern automated vehicles, especially airplanes. The only difference is that pilots are trained using state-of-the-art simulators and peer-assessed via check-rides during regular operations.
As we unleash more automation in the automobile world, some serious thought needs to be given to applying rigorous criteria for driver training, fitness and certification, and continuous re-certification on a more regular basis.

