• NTSB 'unhappy' with Tesla release of investigative information in fatal crash
    74 replies
https://www.washingtonpost.com/news/dr-gridlock/wp/2018/04/01/ntsb-unhappy-with-tesla-release-of-investigative-information-in-fatal-crash/?utm_term=.1c89c6ef66b8
I’m sick of seeing Tesla always pointing the finger at someone else whenever these crashes happen. If it were any other maker, they would either say nothing at all and spare the deceased a posthumous shaming, or release a statement acknowledging a deficiency in their design and promising consumers that it would never happen again. Instead, Tesla resorts to victim shaming while refusing to promise to re-evaluate their design. A lot of people on the road are bloody morons. Morons have existed for as long as Homo sapiens have existed. If Tesla’s autopilot system cannot account for people being morons, then they should either improve Autopilot or get rid of it altogether.
The issue is more that if it was any other car, nobody would have reported on it. It's hard for Tesla to ignore direct attacks like that.
Tesla and the autopilot system both tell you to keep your hands on the goddamn wheel. If you can't handle such a basic task, don't drive.
Should we get rid of regular cruise control as well? It doesn't deal with people not paying attention, so it might run into something. No, of course not. Autopilot (which is just advanced cruise control) and regular cruise control both help deal with driver fatigue, and are net positives.
Other car manufacturers don't release statements because they're not pushing the bleeding edge of technology, where one fatal accident could kill the entire project. They don't give statements because they don't have to; tens of thousands of Americans are killed by their own negligence in car accidents every year, and nobody cares and nobody blames the manufacturer. Tesla has to condemn idiots who get themselves killed because lawmakers are looking for excuses to kill their billion-dollar business. Tesla shouldn't have to take the hit because some idiot wasn't paying attention. It's the guy's fault he killed himself in a crash, not Tesla's, and not the Autopilot's. Tesla's semi-autonomous system can't really account for the driver not driving the car. That's not an inability of the system, that's just a negligent driver.
I think Tesla was probably screwed either way in this situation. If they did as the NTSB wanted and didn't release any statement when they knew Autopilot was on during this crash, then down the road everyone would've shit all over them, especially if they end up doing another stock offering before the release of the results.
Tesla really needs to push that auto-pilot != self-driving.
It's not shaming to point out multiple sources of human error, and it's not nebulously pointing the finger to release crash data and statistics that give clarity on the situation.
They do on their own material, but when other people cover the autopilot system they don't emphasize it as much. They should probably change the name though IMO.
I'm confused why people hear Tesla has an 'autopilot' system, and instantly assume it to have more features and abilities than the literal autopilots in aircraft that often just keep the plane on a level heading and altitude. Heck, they have autopilots on some planes that can only correct on the roll axis, to keep the wings level.
Imo, this is a non-story blown up to get clicks. Someone was not using the drive assist correctly and crashed. It's tragic that he didn't make it, but it's no different than someone crashing their car with cruise control on. The incorrectly named "Autopilot" is a driving-assistance tool and doesn't turn the car into a self-driving one. Imagine if any other car company had their cars hit national headlines every time someone crashed while some sort of driver-assistance tool was on. We'd see thousands of articles a day, but because Tesla is an independent, up-and-coming company, they're thrust into the spotlight and have to defend themselves. I think the real focal point should be the fact that the safety equipment on the highway wasn't replaced, and now someone is dead when the crash could have been less severe had the crash attenuator been fixed ASAP.
To people who don't know shit about aircraft or self driving cars, "autopilot" sounds like it automatically pilots the vehicle, which is a problem for Tesla when people assuming this are driving their cars.
More people die every day to negligent human behaviour than have ever died from autonomous driving. Yet, no one gives a fuck. There's a reason Tesla has to defend themselves, and it's because human stupidity is a massive problem we're barely willing to agree exists as a society.
It's because it's a "semi autonomous" system, and happened shortly after the Uber self driving car killed someone.
Tesla is the only car company who gets blamed and has their stock go down for a crash.
Musk's response: https://twitter.com/elonmusk/status/980876926830039041
In fairness though, there's a hell of a lot more human dipshits on the road than there are self-driving cars.
Is the NTSB also 'unhappy' with the fact that the legally mandated energy-dissipation barrier wasn't in place at the time of the crash, despite a span of days since the previous accident at this site? It really boggles my mind that Tesla is receiving any flak at all for someone dying after letting glorified adaptive cruise control with a fancy name drive their car into a sheer concrete barrier whose safety maintenance had been ignored by the state organization responsible for it.
They are currently listing options as "full self-driving capability". Yes, the fact that the tech is not there yet is listed, but the header clearly implies the exact opposite. This is by design. They are marketing it this way because they can get away with it and it drives more sales.
Not sure if this is the same gore point, but I assume this is exactly what happened: the missing line at the gore point causes it to follow the leftmost white line directly into the barrier. https://www.youtube.com/watch?v=6QCF8tVqM3I
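For what it's worth, the failure mode in that video is easy to sketch. This is a hypothetical toy model of naive lane centering, not Tesla's actual algorithm: the system steers toward the midpoint of the two detected lane lines, and when one line fades out at the gore point, it assumes a fixed lane width from whichever line is still visible, so the target drifts wherever that line goes.

```python
# Toy lane-centering sketch (illustrative only, not Tesla's code).
# Positions are lateral offsets in meters from the car: negative = left.

def lane_center(left_line, right_line, half_width=1.8):
    """Return the lateral target the controller steers toward.

    With one line missing, a naive system keeps a fixed offset from
    the line it still sees -- wrong at a gore point, where the visible
    left line is diverging toward the barrier.
    """
    if left_line is not None and right_line is not None:
        return (left_line + right_line) / 2
    if left_line is not None:
        return left_line + half_width   # hug the surviving left line
    if right_line is not None:
        return right_line - half_width  # hug the surviving right line
    return 0.0                          # no lines detected: hold position

print(lane_center(-1.8, 1.8))   # both lines visible: 0.0, dead center
print(lane_center(-1.8, None))  # right line faded: still 0.0, seems fine
print(lane_center(-2.5, None))  # left line diverges at the gore: -0.7,
                                # the target follows it toward the barrier
```

The point of the toy: nothing "breaks" in the code when the marking disappears; the controller keeps confidently tracking a line that no longer bounds the travel lane.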
They are most likely also "unhappy" that the barrier wasn't replaced in time. Tesla isn't getting flak for their Autopilot. They are getting flak for making statements about the investigation before it's concluded, which means that facts could change at any moment if something else pops up during the investigation. The NTSB looks at ALL contributing factors, not just the ones it wants to look at. Tesla basically abused their position to undercut the NTSB. Now the NTSB could tell them to create a system that cuts Tesla out of the entire process, have them sign non-disclosure agreements before they can pull the logs, or not look at the logs and recommend that the Autopilot feature be disabled on roadways until they comply. Seeing as the NTSB has already stood on Tesla's side before, Tesla had no reason to do this.
Honestly, it seems stupid that a system that can kill you or others is actually allowed. I know they tell you to pay attention, but how do you know if it's really about to kill you? And what if you get into an accident trying to stop it, or if you get scared? It seems like an unreliable autopilot is worse than none at all sometimes...
They should call it Copilot.
I think if the autopilot system only fails when a human doesn't respond for more than a second, it could be justifiable to keep imperfect auto-driving legal, because you could say "it's the human's fault" and the autopilot system is pretty much good enough. But anything less than that, and I really feel like we're giving people something that just puts them in worse situations than they would be in normally. I don't really... know if Autopilot is that bad. It doesn't bode well that the system is even capable of crashing into something without having reduced its speed substantially or just plain turning anywhere else, and I don't think it's a stretch to say we should legislate some minimum standards to ensure everyone is safe.
It clearly states "enhanced autopilot" isn't self driving, and that the self driving option is years away.
The full self-driving option is NOT Autopilot and NEVER claims to be. It says in the option that it is currently NOT available. Not to mention it demands you agree that it is not autonomous before using Autopilot. The full self-driving and enhanced Autopilot options are two entirely different options.
But wouldn't a human crash just as badly? The driver ignored multiple warnings to put his hands back on the wheel.
They should rename it because it doesn't matter how clearly you explain something. A moron who sees the word "Auto-pilot" and reads no further isn't going to learn no matter how clear you make the warning. There's no such thing as idiot-proof.
Jesus Christ, that looks scary. Can't imagine how somebody must feel when it just happens like that.