#16 | BHPian | Bangalore | Joined: Nov 2013 | Posts: 74 | Thanked: 122 Times
I think Tesla's Autopilot system is under-tested and the approach itself is flawed. A study found that this kind of partial automation makes drivers more careless, and that is exactly what happened here. Tesla cannot simply write a few caveats into the user manual/agreement and wash their hands of it; they should have studied, understood and designed around human behaviour. I feel Google has a better approach in this regard: Google doesn't want the driver to play any role at all in their autonomous car, so there is no ambiguity about who should be driving at any given point of time.
#17 | Senior - BHPian
The driver of the car did have a portable DVD player with him. The Tesla also could not detect the tractor trailer because the space under its bed was empty; fault Tesla for that. The driver seemed to have ample time to brake. Autopilot had travelled many millions of miles before this first crash, so asking Tesla to roll it back is not fair. I would suggest Tesla not encourage drivers to take their hands and eyes off the road, though!

Maddy
#18 | Senior - BHPian | Bengaluru | Joined: Nov 2006 | Posts: 4,317 | Thanked: 2,071 Times
Elon Musk, founder and CEO of Tesla Motors, has previously said Autopilot is twice as safe as a human driver. In a statement, Tesla said: "This is the first known fatality in just over 130 million miles where Autopilot was activated. Among all vehicles in the US, there is a fatality every 94 million miles. Worldwide, there is a fatality approximately every 60 million miles."

So much spin. Can you really compare your Autopilot against a pool that includes irresponsible drivers, drunk drivers and sleepy drivers? These systems have not been tested against a general-public standard; they have effectively been run in assist mode only (as Tesla itself says), with attentive testers behind the wheel.
#19 | BHPian | Bangalore-Kochi | Joined: Jul 2015 | Posts: 352 | Thanked: 1,039 Times
What a tragic incident. May his soul rest in peace. I hope development will not be brought to a halt by this.

On a lighter note, I guess autopilot or autonomous driving systems should be developed in India. If they can achieve 20% success on Indian roads, they will be 100% accurate in developed countries.
#20 | Distinguished - BHPian | Bengaluru | Joined: Jun 2012 | Posts: 4,341 | Thanked: 10,251 Times
This incident was quite a surprise for me on the technical front. I am myself a function developer working on autonomous driving, currently focused on sensor fusion. My work involves merging data from various sensors and data sources to provide a digital environment model, so that the other components know what is around the car.

In a situation like this, even if the camera failed to detect the crossing trailer due to contrast issues, the radar would definitely have caught it, and it would have caught it even before the driver or the camera spotted it. So it is quite a surprise that Musk, in one of his tweets, explains that the radar can get confused between an overhead road sign and an unloaded trailer. Radar is admittedly not very accurate in 3D space, particularly in height, but it is certainly not so inaccurate that it cannot distinguish between something roughly 3 ft above the road and a road sign perhaps 15-20 ft above it. In most cases the overhead sign is not detected by the radar at all, unless the road inclines upward; and even if it is, some good filtering would eliminate such false triggers (a toy version of that kind of height filter is sketched below).

Though it is difficult to explain without revealing confidential information, I wanted to contradict the statement given by Musk, since he was talking about use cases and real-world examples. If they have exceptions like these, they shouldn't be calling it 'Autopilot'; they should just term it Driver Assistance, the way other manufacturers like Volvo, Mercedes etc. do. I think the 'first to market' tag caused this.
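To make that filtering argument concrete, here is a minimal, hypothetical Python sketch of height-based gating of radar tracks: returns that look like overhead signs are ignored, while low returns on a closing course are treated as obstacles. The thresholds, field names and the `RadarTrack` structure are assumptions made up for this illustration; they are not the logic of Tesla, its suppliers, or any real sensor-fusion stack.

```python
from dataclasses import dataclass

@dataclass
class RadarTrack:
    range_m: float        # longitudinal distance to the detected object
    height_m: float       # rough estimated height of the object above the road
    rel_speed_mps: float  # closing speed (negative = we are approaching it)

# Assumed thresholds for illustration only: a trailer bed sits roughly
# 1-1.5 m above the road, while overhead signs and bridges clear 4.5 m or more.
OVERHEAD_CLEARANCE_M = 4.0
MAX_VEHICLE_HEIGHT_M = 3.5

def is_braking_relevant(track: RadarTrack) -> bool:
    """Return True if the track should be treated as an obstacle in the path."""
    if track.height_m >= OVERHEAD_CLEARANCE_M:
        # Likely an overhead sign or bridge: the car can pass underneath.
        return False
    if track.height_m <= MAX_VEHICLE_HEIGHT_M and track.rel_speed_mps < 0:
        # Low enough to be a vehicle or trailer bed, and we are closing on it.
        return True
    return False

# Example: a trailer bed ~1.2 m above the road, 60 m ahead, closing at 25 m/s.
print(is_braking_relevant(RadarTrack(range_m=60.0, height_m=1.2, rel_speed_mps=-25.0)))
```

Real fusion stacks weigh many more attributes (radar cross-section, track age, camera confirmation), but the basic idea of rejecting returns above the vehicle's clearance height is the same.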
#21 | Senior - BHPian
In our road conditions, I feel autopilot should be developed to operate only at lower speeds, say below 20 km/h. It would be a good feature in bumper-to-bumper traffic and would also be much safer, since speeds would be quite low if it did fail (a simple speed-gating check along those lines is sketched below).

Off topic: I feel all new cars should come with a built-in front camera inside the cabin. It would be a great help to (honest) authorities in case of accidents.
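As a toy illustration of that low-speed gating idea, here is a minimal sketch. The 20 km/h limit comes from the post above; the function and parameter names are invented for the example and do not correspond to any real traffic-jam-assist implementation.

```python
# Hypothetical speed gate for a low-speed-only traffic-assist mode.
MAX_ASSIST_SPEED_KMH = 20.0

def assist_allowed(current_speed_kmh: float, driver_hands_on_wheel: bool) -> bool:
    """Allow the traffic-jam assist only at crawling speeds with hands on the wheel."""
    return current_speed_kmh <= MAX_ASSIST_SPEED_KMH and driver_hands_on_wheel

print(assist_allowed(12.0, True))   # True  - bumper-to-bumper crawl
print(assist_allowed(45.0, True))   # False - too fast for the assist
```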
#22 | BHPian | Nor Cal | Joined: Aug 2015 | Posts: 75 | Thanked: 129 Times
Tesla is a Level 3 autonomous car. Level 2 is minimal driver supervision; the current Google DL car, which has clocked 1.1 million km, is one example. Level 1 is 'fully' driverless mobility, which is still a distant (5-6 year) AI dream. I can say this since I work in a field that is kind of a super-set of what Tesla Autopilot is, i.e. computer vision with deep machine learning.

Tesla uses an Autopilot system supplied by Mobileye of Israel, whose own DL programme is itself at an embryonic stage. The Mobileye system consists of a radar and a camera, and generally the two complement each other so that there is redundancy during object detection, localization and prediction. During the crash, the radar sensed the truck's bay/deck as some overhead pole or structure and so didn't brake; the camera, meanwhile, couldn't make out the structure against the colour of the sky. So in Tesla's pursuit of making the system 'false-positive safe', the car made a false-negative blunder (a toy illustration of that trade-off is sketched below). Personally, I still blame Tesla for not mounting the radar higher up (if not adding LiDARs), which would be a far more effective position than the current one (the lower part of the bumper, where an intercooler sits in conventional cars), apparently because Elon Musk thinks it would look not-so-cool.

As for the whole "AI replacing drivers" premise, it is going to happen, and within the next 5-6 years according to some pioneers in my field. The current state of the art is the Google DL car, with others like Uber (which recently poached CMU's whole robotics department!), NVIDIA (no radar, only camera) and numerous other Ivy League DL car research projects. The algorithms used to detect objects in these systems (D-E-E-P neural networks) are such a cool thing that they are limited only by compute power (NVIDIA GPUs). I'll go so far as to say NVIDIA has a monopoly here; it is literally controlling the whole deep learning scene, even Google. Any smart guy with a lot of GPUs and a bunch of Velodyne LiDARs can build a self-driving car in a garage today (read George Hotz vs. Musk) with 99% accuracy, but that last 1% is make or break for the autonomous car. As compute and learning algorithms advance, the DL car scene will advance with them, and the field is so dynamic right now that there are breakthroughs almost every week.

Even I am flabbergasted by the fact that a bunch of code is going to save a million lives and, perhaps unfortunately, kill a couple. A fully driverless car is going to be a reality in the coming decade, so much so that, to quote Dr. Y. LeCun: "There will be fines charged towards actual drivers for driving a car in coming 20 years."
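To illustrate the 'false-positive safe' vs. false-negative trade-off described above, here is a toy Python sketch that brakes only when camera and radar agree. The scores, threshold and decision rule are assumptions invented for this example; they are not Mobileye's or Tesla's actual fusion logic.

```python
def should_brake(camera_score: float, radar_says_overhead: bool,
                 threshold: float = 0.8) -> bool:
    """Brake only when both sensors agree there is an obstacle in the path.

    Requiring agreement suppresses false positives (phantom braking for
    overhead signs), but it also means a target missed by either sensor --
    e.g. a bright trailer the camera can't separate from the sky while the
    radar has filed the return under 'overhead structure' -- produces a
    false negative and no braking at all.
    """
    camera_sees_obstacle = camera_score >= threshold
    radar_sees_obstacle = not radar_says_overhead
    return camera_sees_obstacle and radar_sees_obstacle

# Crossing trailer against a bright sky: low camera confidence, radar thinks
# the return is an overhead structure -> no braking (the false negative).
print(should_brake(camera_score=0.2, radar_says_overhead=True))    # False
# Clear-cut stopped car ahead: both sensors agree -> brake.
print(should_brake(camera_score=0.95, radar_says_overhead=False))  # True
```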
#23 | BHPian | Nor Cal | Joined: Aug 2015 | Posts: 75 | Thanked: 129 Times
Please merge this with my original post if required; I can't resist sharing some interesting info about the "unpredictability" raised by GTO. During testing of Google's DL car, this incident took place: http://forums.roadbikereview.com/gen...ay-349240.html. Believe me, DL cars are sensitive to even 1 cm, all around, through 360 degrees. Talking of our Indian road ethics, I would pay a thousand bucks to watch Google's DL car struggle in our conditions, especially on Laxmi Road in Pune.
#24 | BHPian | Bangalore | Joined: Jul 2011 | Posts: 107 | Thanked: 32 Times
Some very clever statements are being made by Elon Musk / Tesla regarding miles per fatality; such statements are very common in the investment industry. To a common person, "the first fatality in 130 million miles with Autopilot active" implies that Tesla's Autopilot is close to 1.4 times better than humans at judging situations (against the US average of one fatality every 94 million miles). The truth is far from it: to cash in on the popular fantasy of driverless cars, the cars are projected through numerous videos and press articles as being capable of driving themselves. Whereas the truth is probably that humans are still better at judging situations like this one, at this point (the raw comparison is worked out below).

http://qz.com/524400/tesla-just-tran...riverless-car/
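For reference, a quick back-of-the-envelope check of the figures quoted in Tesla's statement (earlier in this thread) gives the ratios the post refers to. The arithmetic says nothing about whether the comparison itself is fair, which is precisely the poster's point.

```python
# Figures as quoted in Tesla's statement earlier in the thread.
autopilot_miles_per_fatality = 130e6   # "just over 130 million miles"
us_miles_per_fatality        = 94e6    # "a fatality every 94 million miles" (US)
world_miles_per_fatality     = 60e6    # "approximately every 60 million miles"

print(autopilot_miles_per_fatality / us_miles_per_fatality)     # ~1.38x vs. US average
print(autopilot_miles_per_fatality / world_miles_per_fatality)  # ~2.17x vs. world average
```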
#25 | BHPian | Sydney | Joined: Feb 2015 | Posts: 71 | Thanked: 145 Times
I think it's way too early to be judging Autopilot. It just might be the future, and it needs to be built upon and perfected over time. We used to carry so many spares in the car when we travelled in the 80s and 90s because cars were so unreliable; today, all we do is check the tyre pressure. Over time we have perfected the internal combustion engine, even though it is so much more complicated than an electric motor, and I think this too can be made more reliable. But while waiting for that reliability to arrive, we are risking human lives. Until then, I think any autopilot or driverless feature needs an alert human behind the wheel, just in case.
#26 | Senior - BHPian | Bangalore | Joined: Nov 2011 | Posts: 1,045 | Thanked: 1,646 Times
This is a great opportunity for Tesla to expand their government-funded publicity project. Autopilot is an assistance system that got hyped, and now someone is dead because he bought into the marketing drivel. The only outcome of this is going to be more demands from Tesla for the government to subsidise R&D into autonomous driving, something just about every old-school manufacturer is already doing, and doing pretty well, going by what we've seen from Volvo and Mercedes-Benz.

Do insurance companies cover damage caused when Autopilot is engaged? No media outlet is covering this angle.
#27 | BHPian | Hyderabad | Joined: Nov 2015 | Posts: 510 | Thanked: 994 Times
Guys, Tesla is experimenting with technology that might change the way we view transport, and there are bound to be errors and fatalities during development. Not that I am indifferent to the life of the person involved, but we should recognise that this is the first major change in the way we see cars. All the testing, or for that matter the driving itself, is a risk, and one that is known to the users. I never saw such a hue and cry a few months back when India's accident statistics were published, and none of those deaths involved Autopilot. Imagine how many lives would have been saved if all cars had it. I have no intention of provoking a debate/argument; just consider whether we are missing the bigger picture.
#28 | BHPian | NGP and CBE | Joined: Jan 2016 | Posts: 116 | Thanked: 473 Times
Tesla risked putting a public beta in the hands of customers in exchange for being a tech trailblazer. The responsibility here lies with both the user and the OEM; after all, Tesla buyers are early adopters with good tech awareness. It is quite understandable that the trailer was not detected, given the system's shortcomings: lack of contrast (for the vision system) and lack of a solid object at bumper height (for the radar). Maybe the system interpreted it as a sensor error and ignored it, since it had not been taught this particular scenario.

But what I don't get is how the car could fail to detect the crash itself. I'm sure the passive safety systems (impact sensors) should have picked up the hit and brought the car to a stop, instead of it staying powered and driving on even after the crash. If they did not, that is certainly a failure in itself (a rough sketch of such post-impact logic follows below). Even some mortal (
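As a rough, purely illustrative sketch of the post-impact behaviour the post expects, here is a minimal Python example: once an assumed impact-sensor threshold is crossed, drive power is cut and the car is commanded to stop. Every name, class and number in it is invented for illustration; it is not Tesla's design.

```python
# Hypothetical post-impact shutdown logic. Thresholds and actuator names are
# assumptions for the sketch only.
IMPACT_THRESHOLD_G = 8.0  # assumed deceleration level treated as a crash

class VehicleStub:
    """Stand-in for the vehicle's actuators, just to make the sketch runnable."""
    def disable_drive_power(self):  print("drive power cut")
    def request_full_braking(self): print("full braking requested")
    def unlock_doors(self):         print("doors unlocked")

def on_impact_sample(accel_g: float, vehicle: VehicleStub) -> bool:
    """Return True (and command a stop) if the acceleration sample looks like a crash."""
    if abs(accel_g) < IMPACT_THRESHOLD_G:
        return False
    vehicle.disable_drive_power()   # no propulsion after a confirmed impact
    vehicle.request_full_braking()  # bring the car to a halt
    vehicle.unlock_doors()
    return True

on_impact_sample(14.2, VehicleStub())  # a ~14 g spike: treat as a crash and stop
```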
#29 | Senior - BHPian
Tesla knew of the crash two months before; the news only leaked to the press just before the July 4 weekend, and between the crash and the leak Tesla sold over $2 billion worth of shares. Picture abhi baaki hai — the story isn't over yet.

Someone mentioned that Tesla is a government-funded company using the funds to market itself. I think that's a slightly ridiculous statement to make, simply because the government subsidises every industry through tax breaks or other favourable terms, so singling out this one company is not right. Big oil companies are subsidised; any factory or power plant gets funds and grants from the government; even if you and I decided to open a company to make anything, we could apply for government funds. Tesla is utilising a loophole that any car company, whether from Detroit or from Germany, could have used. Anyway, it's an American car company making cars in America and exporting them to the world, and it legally employs and pays thousands of people.

Maddy
#30 | Senior - BHPian | Mumbai | Joined: Mar 2006 | Posts: 2,046 | Thanked: 2,256 Times
If Autopilot couldn't detect a trailer, it can't detect a container truck bed either; maybe a truck without underrun bars is also invisible to it, and so is a truck or tractor carrying protruding iron bars. Couple this with vehicles running zero tail lights at night and Autopilot becomes a dumb box. For people wanting an autopilot car in India: dream on.