Old 14th September 2017, 10:40   #61
Senior - BHPian
 
blackwasp's Avatar
 
Join Date: Apr 2015
Location: Navi Mumbai
Posts: 2,974
Thanked: 26,325 Times
NTSB (USA): Over-reliance on technology led to Tesla crash

The National Transportation Safety Board (NTSB), an independent U.S. government investigative agency, has released a report on the May 7, 2016 crash that killed a Tesla driver when his car collided with a truck. The car's Autopilot function was engaged at the time of the crash.

In the crash, a 2015 Tesla Model S 70D car struck a semitrailer. At the time of collision, the truck was making a left turn from the highway. The car struck the right side of the truck, crossed below it, and then went off the road. The impact with the underside of the semitrailer tore off the roof of the car.

The report indicates that the crash was a result of the truck driver not yielding to the car and the Tesla driver's inattention and over-reliance on vehicle automation. Frequent and excessive use of the Autopilot function had made the driver too dependent on the technology and he stopped paying attention to the road. The board not only faulted the Tesla driver for not paying attention, but also noted that Autopilot did not do an adequate job of detecting other traffic. The system also did not warn the driver early enough to allow sufficient reaction time.

The board issued seven recommendations at the hearing, including three for Tesla and every other automaker that offers a Level 2 self-driving system. The Society of Automotive Engineers defines 6 levels of automation, from fully manual to fully autonomous. In a Level 2 system, the car controls the steering, acceleration and braking, but it is the human who is responsible for monitoring the surroundings and who must be ready to take over if the system fails. The NTSB suggested that manufacturers should not let a product be used in a manner that is "inconsistent with its design". The board members also called for systems that can determine whether drivers are actually paying attention behind the wheel.
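As an illustrative aside (not from the NTSB report), here is a minimal sketch of that split in plain code, assuming a simplified "who drives / who monitors / who is the fallback" reading of the SAE levels:

Code:
# Simplified, illustrative summary of the SAE J3016 automation levels.
# The driver/system assignments are a plain-language reading, not a legal definition.

SAE_LEVELS = {
    0: ("No Automation",          "driver", "driver"),
    1: ("Driver Assistance",      "shared", "driver"),
    2: ("Partial Automation",     "system", "driver"),   # the level discussed in the NTSB report
    3: ("Conditional Automation", "system", "system, until it asks the driver to take over"),
    4: ("High Automation",        "system", "system, within its design domain"),
    5: ("Full Automation",        "system", "system, everywhere"),
}

def fallback_party(level):
    """Who must deal with the situation when the automation reaches its limits?"""
    return "driver" if level <= 3 else "system"

for level, (name, drives, monitors) in SAE_LEVELS.items():
    print(f"Level {level} ({name}): drives = {drives}, monitors road = {monitors}, "
          f"fallback = {fallback_party(level)}")

The NTSB's point maps to Level 2 above: the system drives, but the human remains both the monitor and the fallback.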

A board member mentioned that Tesla allowed the driver to use the Autopilot system outside the environment for which it was designed. Tesla has said that the company will work towards making customers understand that Autopilot is not a fully autonomous system and that it needs the driver to be attentive at all times.

The NTSB report can be downloaded here.

Link to Team-BHP News

Last edited by blackwasp : 14th September 2017 at 10:44.
blackwasp is offline   (6) Thanks
Old 14th September 2017, 11:34   #62
Senior - BHPian
 
abhishek46's Avatar
 
Join Date: Oct 2010
Location: Bangalore
Posts: 1,813
Thanked: 5,864 Times
Re: NTSB (USA): Over-reliance on technology led to Tesla crash

A Level 2 system, as explained above, requires that the driver stay involved in the act of driving even though he is not actually driving, because the system can 'legally' and 'technically' give up at any time.

In my opinion, this is mentally more difficult than actually driving.

The reason is that the autopilot could give up in such a horrible scenario that it may not leave any room for the alarmed human to react.

Hence, the cautious user will always be wondering along the lines of "what is it doing" or "what is it doing now", just like an airline pilot.

I have utmost respect for NTSB and the way they do their investigation.
And in this instance, just one line summarizes it all.
"manufacturers should not let a product be used, in a manner that is inconsistent with its design".
In my view, the system should either work 'completely' or remain 'switched off', not 'toggle' between the two possibilities (for a road car).
abhishek46 is offline   (4) Thanks
Old 14th September 2017, 12:23   #63
Senior - BHPian
 
Join Date: Nov 2009
Location: RJ-02,DL,MH-12
Posts: 1,331
Thanked: 2,181 Times
Re: NTSB (USA): Over-reliance on technology led to Tesla crash

Level 2 automation is just a few notches above "Cruise Control", going by the investigation of the National Transportation Safety Board.

Yes, autopilot is evolving, yet the present generation is far from being able to handle errors and human behaviour. The investigation clearly stated that the truck driver did not yield the way the Tesla Autopilot assumed (as per its programming) he would. In a hypothetical scenario where the truck was also on autopilot, this situation would not have arisen.

It's basically a machine's expectation vs a human's response.

Let this be a lesson for future development, in which programmers will create new subroutines for machines to analyse human behaviour!

Sad that a life was lost.

Last edited by i74js : 14th September 2017 at 12:25.
i74js is online now   (2) Thanks
Old 14th September 2017, 15:02   #64
BHPian
 
jalajprakash's Avatar
 
Join Date: Aug 2015
Location: Delhi/Jaipur
Posts: 258
Thanked: 719 Times
Re: NTSB (USA): Over-reliance on technology led to Tesla crash

I think the fault in this case lies with Tesla.

Tesla's "Autopilot" naming, as well as the way they depict the system as being closer to autonomous than to simple cruise control, is what gives customers a false sense of security. A lot of other manufacturers also provide similar Level 2 systems, but are not so generous in their naming or descriptions.

A Level 2 system, by definition, is only required to control the steering and acceleration/deceleration, while the human is responsible for everything else, under all circumstances, and is supposed to pay full attention to the road.

A Level 3 system, on the other hand, controls everything about the car, and the human is only expected to intervene if requested by the car.

Tesla cars are a little more advanced than Level 2 but not quite Level 3 yet, and I believe Tesla misleads people into thinking it is a complete Level 3 system when it is not, as is apparent from the crash. If humans are supposed to pay attention at every moment, then it is still a Level 2 system and this should be conveyed to the customer fair and square.

I am a supporter of such systems, since driving on highways abroad can get quite tiring and boring, with one mostly driving in a straight line but having to make small corrections all the time to keep the car from drifting out of the lane.
jalajprakash is offline  
Old 14th September 2017, 16:51   #65
BHPian
 
sudeepg's Avatar
 
Join Date: Feb 2006
Location: Bangalore
Posts: 812
Thanked: 2,454 Times
Re: NTSB (USA): Over-reliance on technology led to Tesla crash

Personally, I find these automatic systems less appealing because they take the fun away from driving. How about that split-second decision to downshift and slam the A-pedal to make a quick, safe overtake? Driving should be fun, and if one doesn't want to drive, it is best to take a taxi or another means of transport.

I agree that the word Autopilot is misleading. While at the time of delivery due diligence may be done to warn the buyer that this is not a fully autonomous system, as days go by, people tend to forget that fact. Then some of these things become routine, and therein lies the danger and the pitfall.
sudeepg is offline   (1) Thanks
Old 14th September 2017, 17:17   #66
Distinguished - BHPian
 
Join Date: Oct 2012
Location: Delhi
Posts: 8,100
Thanked: 50,861 Times
Re: NTSB (USA): Over-reliance on technology led to Tesla crash

This is one of the least understood aspects of self-driving cars: how do you ensure that the driver takes control back in an appropriate manner?

The more we rely on automation, the longer it takes us to respond, take in the situation and take appropriate action. Some years back I was at Stanford, Palo Alto, and this topic was brought up by their researchers as well:

http://www.team-bhp.com/forum/intern...omous-car.html

Quoting from my own post:

Quote:
So when the computer gets confused it tells you to take control again. A lot of research around this handing over of control was done in this simulator. They found it took on average 3.5 - 5.5 seconds for a driver to re-engage. Imagine, you are driving and your car is in autonomous mode. You are maybe looking at your smartphone and the car tells you to take control. It would take you that long to fully understand what is happening, absorb the environment etc. and be in actual control of the car. At a speed of around 100 km/h that also means that the sensors must have the ability to see ahead for several hundred meters!
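A rough back-of-the-envelope check of those numbers (the 3.5 - 5.5 second re-engagement times are from the study quoted above; the 5 m/s² braking deceleration is simply an assumed comfortable value):

Code:
# Distance covered while the driver re-engages at 100 km/h, plus a simple braking distance.
speed_ms = 100.0 / 3.6                      # 100 km/h is roughly 27.8 m/s

for reengage_s in (3.5, 5.5):               # re-engagement times from the simulator study
    travelled = speed_ms * reengage_s       # distance covered before the driver is back in control
    braking = speed_ms ** 2 / (2 * 5.0)     # braking from 100 km/h at an assumed 5 m/s^2
    print(f"{reengage_s} s to re-engage: {travelled:.0f} m travelled, "
          f"about {travelled + braking:.0f} m needed including braking")

That works out to roughly 175 - 230 m in total, which is why the sensors need to look several hundred meters ahead.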
The idea that humans are any good at continuously watching an automated system and stepping in smoothly has been proven wrong over the years in various industries.

Humans can only keep that sort of concentration up for a relatively short period. Then the mind starts to wander and response times start to increase.

Look at commercial planes, for instance. They are typically flown mostly on autopilot. When they are at cruise altitude with not much going on for the pilots, the pilot's response time to a sudden emergency increases. Not too big a problem, because at that altitude a few seconds don't matter that much.

But during, say, an autoland, the pilot will have his/her hands on the yoke/throttle, constantly follow what the automation is doing, and will/can respond near-instantaneously to anything out of the ordinary. But that part of the flight is measured in minutes, 10-15 minutes at most.

Industrial automation has had its share of similar problems. When complex industrial processes were first automated and things went wrong, the operators were too slow to respond.

With aviation and, say, process automation, we have also seen the need for pilots and operators to stay fully current and train manual flying/operation on a regular basis. That might not be a problem yet in automated vehicles, but at some point in time people will simply lack sufficient experience and currency in dealing with a car whose computer has told them to take control.

I'm convinced that fully autonomous cars are in the not so distant future. The weakest part is, and will remain, the human/driver. I see a lot of emphasis on the technical side of fully autonomous cars, but very little on the human behaviour side in the media. Sadly, we will most likely need to see some serious accidents before that starts to change.

Another human trait, especially among engineers (no offence intended): we are not particularly good at learning from other people's mistakes, especially not from other industries. We need to experience it ourselves, so we come up with the "best solution", which with hindsight is often just a variation on solutions that already exist somewhere else.

Jeroen
Jeroen is offline   (5) Thanks
Old 14th September 2017, 19:22   #67
Senior - BHPian
 
amitoj's Avatar
 
Join Date: Nov 2004
Location: Windham, NH USA
Posts: 3,348
Thanked: 3,105 Times
Re: NTSB (USA): Over-reliance on technology led to Tesla crash

Well, no surprises there.
As stated earlier, Autopilot in this case is a misnomer. I am not sure how different this is from the autopilot feature in aircraft, but I remember reading somewhere that it is not very different in terms of what it does. Regular drivers, however, don't go through the same scrutiny and training as pilots.
On a lighter note, maybe Tesla should make it mandatory to attend a refresher course every 6 months or so
amitoj is online now  
Old 15th September 2017, 15:52   #68
BHPian
 
vamsi.kona's Avatar
 
Join Date: Sep 2008
Location: Hyderabad
Posts: 280
Thanked: 568 Times
Re: The Tesla Auto-Pilot Thread

One of the more dangerous aspects of modern cars, especially high-end cars connected to the internet, is the very high risk of malicious elements hacking in and taking control of the vehicle.

https://www.wired.com/2015/07/hacker...l-jeep-highway

https://www.theguardian.com/technolo...rakes#comments

I have read that autopilot cars are actually designed to be more careful and, as such, may not be of much use to hackers even if compromised. However, I think hackers could exploit those very safety features by making the system believe there is an obstacle in front, causing the car to brake hard on a fast-moving highway or swerve sideways, leading to fatal accidents.
vamsi.kona is offline  
Old 24th January 2018, 14:07   #69
Distinguished - BHPian
 
Join Date: Dec 2010
Location: --
Posts: 23,427
Thanked: 67,858 Times
Re: The Tesla Auto-Pilot Thread

Never ending woes for Tesla's autopilot!

U.S. safety board opens probe of second Tesla Autopilot crash.

The U.S. National Transportation Safety Board will investigate an accident involving a Tesla Inc. Model S sedan that rear-ended a firetruck on a freeway near Los Angeles.

[Image attachment: Tesla Model S Autopilot crash into a California firetruck]

Quote:
The investigators will focus on the driver’s actions and how the vehicle performed.
Quote:
The Tesla’s driver said he had the vehicle’s Autopilot driver-assist system engaged when it struck a firetruck while traveling at about 65 miles per hour
Link
volkman10 is offline  
Old 24th January 2018, 14:51   #70
Distinguished - BHPian
 
audioholic's Avatar
 
Join Date: Jun 2012
Location: BengaLuru
Posts: 5,659
Thanked: 19,407 Times
Re: The Tesla Auto-Pilot Thread

Quote:
Originally Posted by volkman10 View Post
Never ending woes for Tesla's autopilot!

U.S. safety board opens probe of second Tesla Autopilot crash.
Finally they have declared their technology to be just an Advanced Driver Assistance System, which they should have done earlier - the hype they built up made it seem as if the car could drive on its own. Hopefully this is made widely known, and they should ensure that the hands-off detection is made stricter and requires regular intervention from the driver. A stationary fire truck is a more basic case than a crossing trailer, and it is a real pity that the car couldn't detect it.

Quote:
Tesla said in a statement that Autopilot is “intended for use only with a fully attentive driver.” The company said it has taken steps to educate drivers about the need to keep their hands on the steering wheel and be prepared to take over from Autopilot, which it calls an "advanced driver assistance system" that is not intended to turn the vehicle into an autonomous car.

Last edited by audioholic : 24th January 2018 at 14:56.
audioholic is offline   (1) Thanks
Old 24th January 2018, 17:22   #71
Distinguished - BHPian
 
Join Date: Oct 2012
Location: Delhi
Posts: 8,100
Thanked: 50,861 Times

Quote:
Originally Posted by audioholic View Post
Finally they declared their technology as just an Advance Driver Assistance System, which they had to do earlier but the kind of hype they built up was as if the car could drive on its own. Hopefully, this should be made aware and they should ensure that the hands off detection is made more strict and require regular intervention from the driver. A stationary fire truck is more basic compared to a crossing trailer and it is a real pity that the car couldnt detect it.

It's probably more a legislative, insurance and liability thing than anything else. Human drivers have been crashing into stationary objects for as long as we have had cars.

But for some reason a lot of folks require autonomous cars to be far superior to human drivers.

Jeroen
Jeroen is offline   (2) Thanks
Old 26th January 2018, 07:05   #72
Senior - BHPian
 
deetjohn's Avatar
 
Join Date: Sep 2006
Location: Kochi
Posts: 4,530
Thanked: 10,583 Times
Re: The Tesla Auto-Pilot Thread

Quote:
Originally Posted by audioholic View Post
Hopefully, this should be made aware and they should ensure that the hands off detection is made more strict and require regular intervention from the driver.
Tesla has tried to make it stricter through over-the-air software updates, but people are also finding ways to trick the sensors.
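To illustrate what such a check might look like (purely a hypothetical sketch - the thresholds, timings and callback names below are my own assumptions, not Tesla's actual logic):

Code:
# Hypothetical torque-based "hands on wheel" check with escalating warnings.
# All thresholds, timings and callbacks are illustrative assumptions, not Tesla's real logic.
import time

TORQUE_THRESHOLD_NM = 0.3   # minimum steering torque counted as "hands detected" (assumed)

def hands_on_monitor(read_torque_nm, show_warning, disengage,
                     warn_after=30.0, disengage_after=60.0, poll_s=0.1):
    """Poll a steering torque reading and escalate if the driver stays hands-off."""
    hands_off_since = None
    while True:
        if abs(read_torque_nm()) >= TORQUE_THRESHOLD_NM:
            hands_off_since = None                      # driver input detected, reset the timer
        else:
            hands_off_since = hands_off_since or time.monotonic()
            elapsed = time.monotonic() - hands_off_since
            if elapsed >= disengage_after:
                disengage()                             # hand control back and lock the feature out
                return
            if elapsed >= warn_after:
                show_warning("Apply slight turning force to the steering wheel")
        time.sleep(poll_s)

# Quick demo with a driver who never touches the wheel and sped-up timings.
hands_on_monitor(read_torque_nm=lambda: 0.0,
                 show_warning=lambda msg: print("WARNING:", msg),
                 disengage=lambda: print("Autosteer locked out for the rest of the drive"),
                 warn_after=0.5, disengage_after=1.5)

A check based purely on steering torque is exactly what wheel weights and similar gadgets defeat, which is presumably the sensor-tricking referred to above.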



Level 2 and 3 autonomous driving is tricky and extremely dangerous. Level 4/5 is much safer.
deetjohn is offline   (1) Thanks
Old 18th April 2018, 09:40   #73
Distinguished - BHPian
 
Join Date: Dec 2010
Location: --
Posts: 23,427
Thanked: 67,858 Times
Re: The Tesla Auto-Pilot Thread

Tesla told to improve Autopilot, release claimed “World’s Safest” data!

Quote:
The Consumers Union (CU) group (a division of Consumer Reports) has called Tesla out for its Autopilot system, obviously due to the recent fatal Model X crash and related media coverage. Tesla has been asked to improve the system, as well as to release a new statement explaining its claims that Autopilot is the “world’s safest” system. The Union wants more public data supporting such claims.
Tesla has admitted that Autopilot was engaged during the deadly crash, and this was also the case in an earlier fatal incident in Florida.

Quote:
According to CU, Autopilot should limit its use to areas in which it can be used successfully. It believes that the safety system is able to be activated when it’s not necessarily safe to use. Additionally, it’s concerned that Tesla’s “hands-on” warning isn’t enough.

Link
volkman10 is offline  
Old 30th May 2018, 13:38   #74
Distinguished - BHPian
 
Join Date: Dec 2010
Location: --
Posts: 23,427
Thanked: 67,858 Times
Re: The Tesla Auto-Pilot Thread

Tesla's Autopilot crashes are not rare, but this crash makes a bigger headline!

Tesla Model S Slams Into Police Ford SUV.

[Image attachment: Tesla Model S crashed into a police Ford SUV]


Quote:
The latest such incident took place in Laguna Beach, California, when a Model S took aim at a parked and unoccupied police SUV and crashed into it.

The investigation into the incident is still ongoing, but the driver of the S already claims not to be at fault and pins the guilt on the car’s Autopilot system.
Quote:
the incident occurred in the exact same spot another Tesla crashed a while ago. It remains to be seen whether the Autopilot was engaged at the time of the crash as the driver claims.
Link
volkman10 is offline  
Old 18th June 2018, 15:31   #75
Distinguished - BHPian
 
Join Date: Dec 2010
Location: --
Posts: 23,427
Thanked: 67,858 Times
Re: The Tesla Auto-Pilot Thread

More trouble for Tesla - this time Autopilot was not enabled, yet a Model S catches fire!

The battery pack of a Tesla Model S caught on fire yesterday in Los Angeles.

Quote:
The owner says that the fire started “out of the blue” without any impact while the vehicle was being driven in traffic. Tesla says that it is investigating the situation.



Link
volkman10 is offline   (1) Thanks