

Old 14th September 2017, 10:40   #61
BHPian
 
blackwasp's Avatar
 
Join Date: Apr 2015
Location: Navi Mumbai
Posts: 585
Thanked: 1,169 Times
NTSB (USA): Over-reliance on technology led to Tesla crash

The National Transportation Safety Board (NTSB), an independent U.S. government investigative agency, has released its report on the May 7, 2016 crash that killed a Tesla driver when his car collided with a truck. The car's Autopilot function was engaged at the time of the crash.

In the crash, a 2015 Tesla Model S 70D struck a semitrailer that was making a left turn across the highway. The car hit the right side of the truck, passed below it, and then went off the road. The impact with the underside of the semitrailer tore off the car's roof.

The report attributes the crash to the truck driver's failure to yield to the car, and to the Tesla driver's inattention and over-reliance on vehicle automation. Frequent and excessive use of the Autopilot function had made the driver too dependent on the technology and inattentive to the road. The board not only faulted the Tesla driver for not paying attention, but also noted that Autopilot did not do an adequate job of detecting crossing traffic, and did not warn the driver early enough to allow sufficient reaction time.

The board issued seven recommendations at the hearing, including three aimed at Tesla and every other automaker that offers a Level 2 partially automated car. The Society of Automotive Engineers (SAE) defines six levels of automation, from fully manual to fully autonomous. In a Level 2 system, the car controls the steering, acceleration and braking, but the human remains responsible for monitoring the surroundings and must be ready to take over if the system fails. The NTSB recommended that manufacturers should not let a product be used "in a manner inconsistent with its design". The board also called for systems that can determine whether drivers are actually paying attention behind the wheel.
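
For illustration, the SAE split of responsibilities can be sketched as a simple lookup in code. This is a rough paraphrase only, not SAE's official J3016 wording:

Code:
# Rough paraphrase of the six SAE automation levels (0-5).
# Illustrative only - see SAE J3016 for the official definitions.
SAE_LEVELS = {
    0: "No automation: the human does everything",
    1: "Driver assistance: the system steers OR controls speed",
    2: "Partial automation: the system steers AND controls speed, "
       "but the human must monitor the road at all times",
    3: "Conditional automation: the system monitors the road; "
       "the human takes over only when requested",
    4: "High automation: no human needed within the design domain",
    5: "Full automation: no human needed anywhere",
}

def who_monitors(level: int) -> str:
    """Return who is responsible for watching the road at a given level."""
    return "human" if level <= 2 else "system"

print(who_monitors(2))  # -> "human": at Level 2, the driver must stay attentive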

A board member noted that Tesla allowed the driver to use the Autopilot system outside the environment for which it was designed. Tesla has said that it will work towards making customers understand that Autopilot is not a fully autonomous system, and that it needs the driver to be attentive at all times.

The NTSB report can be downloaded here.

Link to Team-BHP News

Last edited by blackwasp : 14th September 2017 at 10:44.
blackwasp is offline   (5) Thanks Reply With Quote
Old 14th September 2017, 11:34   #62
BHPian
 
abhishek46's Avatar
 
Join Date: Oct 2010
Location: Bangalore
Posts: 769
Thanked: 1,220 Times
Re: NTSB (USA): Over-reliance on technology led to Tesla crash

A Level 2 system, as explained above, requires that the driver stay involved in the act of driving even when he is not actually driving, because the system can 'legally' and 'technically' give up at any time.

In my opinion, this is mentally more demanding than actually driving.

The reason is that the autopilot could give up in a scenario so horrible that it leaves no room for the alarmed human to react.

Hence, the cautious user will always be wondering along the lines of "what is it doing?" and "what is it doing now?", just like an airline pilot.

I have the utmost respect for the NTSB and the way they conduct their investigations.
And in this instance, just one line summarises it all:
"Manufacturers should not let a product be used in a manner that is inconsistent with its design."
In my view, the system should either work 'completely' or remain 'switched off', not 'toggle' between the two possibilities (for a road car).
abhishek46 is online now   (4) Thanks Reply With Quote
Old 14th September 2017, 12:23   #63
BHPian
 
Join Date: Nov 2009
Location: RJ-02,DL,MH-12
Posts: 870
Thanked: 1,010 Times
Re: NTSB (USA): Over-reliance on technology led to Tesla crash

Going by the National Transportation Safety Board's investigation, Level 2 automation seems just a few notches above cruise control.

Yes, autopilot is evolving, yet the present generation is far from handling human error and behaviour. The investigation clearly stated that the truck driver did not yield, while the Tesla's Autopilot assumed (as per its programming) that he would. In a hypothetical scenario where the truck had also been on autopilot, this situation would not have arisen.

It's basically a machine's expectations vs a human's response.

Let this be a lesson for future development, in which programmers will create new subroutines for machines to analyse human behaviour!

Sad! A life is lost.

Last edited by i74js : 14th September 2017 at 12:25.
i74js is offline   (2) Thanks Reply With Quote
Old 14th September 2017, 15:02   #64
BHPian
 
jalajprakash's Avatar
 
Join Date: Aug 2015
Location: Delhi
Posts: 157
Thanked: 403 Times
Re: NTSB (USA): Over-reliance on technology led to Tesla crash

I think the fault in this case lies with Tesla.

Tesla's "Autopilot" naming, as well as the way they depict the system as being closer to autonomous driving than to simple cruise control, is what gives customers a false sense of security. A lot of other manufacturers offer similar Level 2 systems, but are not so generous in their naming or descriptions.

A Level 2 system, by definition, is only required to control the steering and acceleration/deceleration, while a human is responsible for everything else under all circumstances and is supposed to pay full attention to the road.

A Level 3 system, on the other hand, controls everything about the car, and a human is only expected to intervene when requested by the car.

Tesla's cars are a little more advanced than Level 2, but not quite Level 3 yet. I believe Tesla misleads people into thinking it is a complete Level 3 system when it is not, as is apparent from this crash. If humans are supposed to pay attention at every moment, then it is still a Level 2 system, and this should be conveyed to the customer fair and square.

I am a supporter of such systems, since driving on highways abroad can become quite tiring and boring, with one mostly driving in a straight line but having to make small corrections all the time to keep the car from drifting out of its lane.
jalajprakash is offline   Reply With Quote
Old 14th September 2017, 16:51   #65
BHPian
 
sudeepg's Avatar
 
Join Date: Feb 2006
Location: Bangalore
Posts: 489
Thanked: 1,029 Times
Re: NTSB (USA): Over-reliance on technology led to Tesla crash

Personally, I find these automated systems less appealing because they take the fun away from driving. How about that split-second decision to downshift and slam the A-pedal for a quick, safe overtake? Driving should be fun, and if one doesn't want to drive, it is best to take a taxi or another means of transport.

I agree that the word Autopilot is misleading. While due diligence may be done at the time of delivery to warn the buyer that this is not a fully autonomous system, as days go by, people tend to forget that fact. Then some of these things become routine, and therein lie the danger and the pitfall.
sudeepg is offline   Reply With Quote
Old 14th September 2017, 17:17   #66
Senior - BHPian
 
Join Date: Oct 2012
Location: Delhi
Posts: 2,954
Thanked: 5,880 Times
Re: NTSB (USA): Over-reliance on technology led to Tesla crash

This is one of the least understood aspects of self-driving cars: how do you ensure that the driver takes back control in an appropriate manner?

The more we rely on automation, the longer it takes us to respond, take in the situation and choose an appropriate action. Some years back I was at Stanford, Palo Alto, and this topic was brought up by their researchers as well:

http://www.team-bhp.com/forum/intern...omous-car.html (Silicon Valley and the electric / autonomous car)

Quoting from my own post:

Quote:
So when the computer gets confused, it tells you to take control again. A lot of research around this handing over of control was done in this simulator. They found it took on average 3.5 - 5.5 seconds for a driver to re-engage. Imagine: you are driving and your car is in autonomous mode. You are maybe looking at your smartphone and the car tells you to take control. It would take you that long to fully understand what is happening, absorb the environment etc. and be in actual control of the car. At a speed of around 100 km/h, that also means that the sensors must be able to see ahead for several hundred meters!
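
A quick back-of-the-envelope check of those numbers. The 3.5 - 5.5 second takeover times are from the research quoted above; the braking deceleration is my own assumption:

Code:
# How far ahead must the sensors see at 100 km/h?
# Takeover times (3.5-5.5 s) are from the simulator study quoted
# above; the braking deceleration is an assumed dry-road value.
SPEED_MS = 100.0 / 3.6       # 100 km/h is roughly 27.8 m/s
BRAKE_DECEL = 7.5            # m/s^2, assumed hard braking on a dry road

for takeover_s in (3.5, 5.5):
    takeover_dist = SPEED_MS * takeover_s              # covered before the driver is back in control
    braking_dist = SPEED_MS ** 2 / (2 * BRAKE_DECEL)   # needed to stop once braking starts
    print(f"{takeover_s} s takeover: {takeover_dist:.0f} m "
          f"+ {braking_dist:.0f} m braking = {takeover_dist + braking_dist:.0f} m")
# Prints ~149 m and ~204 m: several hundred meters, as stated above.
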
The idea that humans are any good at continuously watching an automated system and stepping in smoothly has been proven wrong over the years in various industries.

Humans can only keep up that sort of concentration for a relatively short period. Then the mind starts to wander and response times start to increase.

Look at commercial planes, for instance. They are typically flown mostly on autopilot. When they are at cruise altitude, with not much going on for the pilots, the pilots' response time to a sudden emergency increases. Not too big a problem, because at that altitude a few seconds don't matter that much.

But during, say, an autoland, the pilot will have his/her hands on the yoke/throttle, constantly follow what the autopilot is doing, and will/can respond near instantaneously to anything out of the ordinary. But that part of the flight is measured in minutes, 10-15 minutes at most.

Industrial automation has had its share of similar problems. When complex industrial processes were first automated and things did go wrong, the operators were too slow to respond.

With aviation and, say, process automation, we have also seen the need for pilots and operators to stay fully current and to train manual flying/operation on a regular basis. That might not be a problem yet in automated vehicles, but at some point people will simply lack sufficient experience and currency in dealing with a car whose computer has told them to take control.

I'm convinced that fully autonomous cars are in the not-so-distant future. The weakest part is, and will remain, the human driver. I see a lot of emphasis on the technical side of fully autonomous cars, but very little on the human-behaviour side in the media. Sadly, we will most likely need to see some serious accidents before that emphasis starts to emerge.

Another human trait, especially among engineers (no offence intended): we are not particularly good at learning from other people's mistakes, especially not from other industries. We need to experience it ourselves, so that we can come up with the 'best solution', which with hindsight is often just a variation on solutions that already exist somewhere else.

Jeroen
Jeroen is offline   (5) Thanks Reply With Quote
Old 14th September 2017, 19:22   #67
Distinguished - BHPian
 
amitoj's Avatar
 
Join Date: Nov 2004
Location: Nashua, NH
Posts: 2,953
Thanked: 1,587 Times
Re: NTSB (USA): Over-reliance on technology led to Tesla crash

Well, no surprises there.
As stated earlier, Autopilot in this case is a misnomer. I am not sure how different it is from the autopilot feature in aircraft, but I remember reading somewhere that it is not very different in terms of what it does. Regular drivers, however, don't go through the same scrutiny and training as pilots.
On a lighter note, maybe Tesla should make it mandatory to attend a refresher course every 6 months or so.
amitoj is online now   Reply With Quote
Old 15th September 2017, 15:52   #68
BHPian
 
vamsi.kona's Avatar
 
Join Date: Sep 2008
Location: Hyderabad
Posts: 138
Thanked: 107 Times
Re: The Tesla Auto-Pilot Thread

One of the more dangerous aspects of modern cars, especially high-end cars that are connected to the internet, is the very high risk of malicious elements hacking in and taking control of the vehicle.

https://www.wired.com/2015/07/hacker...l-jeep-highway

https://www.theguardian.com/technolo...rakes#comments

I have read that autopilot-equipped cars are actually designed to be more careful, and as such may not be of much use to a hacker even if compromised. However, I think hackers could abuse those very safety features to make a car brake hard on a fast-moving highway, or swerve sideways, by making the system believe there is an obstacle in front, causing fatal accidents.
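
The failure mode I am describing is essentially a false positive fed to the emergency-braking logic. A minimal sketch of the idea, using made-up logic rather than any manufacturer's actual code:

Code:
# Why a spoofed sensor reading is dangerous: automatic emergency
# braking has to trust the reported obstacle distance. Hypothetical
# logic for illustration, not any manufacturer's implementation.
def should_emergency_brake(obstacle_dist_m: float, speed_ms: float,
                           decel_ms2: float = 7.5) -> bool:
    stopping_dist = speed_ms ** 2 / (2 * decel_ms2)   # distance needed to stop
    return obstacle_dist_m <= stopping_dist

# Genuine reading - clear road 200 m ahead at ~100 km/h: no braking.
print(should_emergency_brake(obstacle_dist_m=200.0, speed_ms=27.8))  # False
# Spoofed reading claiming an obstacle 20 m ahead: hard braking in traffic.
print(should_emergency_brake(obstacle_dist_m=20.0, speed_ms=27.8))   # True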
vamsi.kona is offline   Reply With Quote