NTSB 'unhappy' with Tesla release of investigative information in fatal crash
Realistically, it isn't reasonable to expect someone to pay equal attention to the road when the car is supposed to account for basically all aspects of driving (speed control, lane control, etc.). That's like inventing a teapot, but saying that people should stand there and watch it because it might explode if left unattended. You can say that, but a tiny minority is actually going to do it because the express purpose of the teapot is to automatically boil water and let you know when it's done. In the same way, the express purpose of the autopilot is to correctly account for speed and lane management.
If I set my cruise control to 70mph, I'm not going to constantly be watching my speedometer to make sure it's still at 70mph. I might take a look every once in a while, but, generally, I expect the cruise control to do what it is supposed to do.
That's Autopilot, dude, lol. And lane assist isn't full-on driving the car for you. An $80,000 Mercedes with lane assist isn't going to completely drive the car if you set the cruise control. It's really dumb to compare Autopilot to cruise control.
Of course Autopilot isn't completely autonomous driving, but it's still driving the car for you as long as you remain attentive to what it's doing.
It does matter that this guy wasn't using the system as it was intended. Would Tesla have to issue a statement if somebody crashed their car because they were drunk? No, because you're not intended to drive a car drunk, just like you're not supposed to use Autopilot while distracted.
So if you light a firework off and the thing ricochets off a lamp post and hits you in the eye, is it the fault of the manufacturer or just a bad set of circumstances?
If anyone should be blamed, it's the state/city for failing to maintain road markings. The autopilot did not create the set of circumstances, it was the victim of them. Someone died because the city/state did not maintain a normal and expected safety standard (road markings).
How about this: your stove was leaking because the wrong fitting was used to connect it to the gas line. The gas leaks until it ignites from the stove's pilot light (or whatever). Whose fault is it? The stove started the fire. Who do you blame?
Should Tesla learn from this event? Absolutely. Should they be shunned or held responsible? No.
I don't know if it's still the case, but the first time you enabled autosteer it literally asked you to agree to testing the beta.
https://farm6.staticflickr.com/5673/22656765915_82bdea8c34.jpg
Also, the barrier had been damaged previously and not replaced. The part that would've absorbed most of the shock from the collision was already crushed and needed replacing, so the accident might have been survivable if the highway had been properly maintained. If anything, this highlights how these systems read the roads and how important it is to make sure they can do it properly. Would vandals be able to fool the cars by replacing a stop sign with a 60mph speed limit sign?
You could say that locking the steering wheel to hold a single heading is autopilot, because that's what early autopilot systems on planes did. A Mercedes might not have the same lane assist as a Tesla, but Volvos, for example, do pretty much the same thing. And they don't market it as Autopilot; they market it as Pilot Assist. It doesn't drive for you. It only assists.
Autopilot informs you it isn't an autopilot, joost.
You've actually been told this already and ignored it.
This was a really unfortunate, horrible accident. And it's a shame that Tesla is getting as much flak as they are for this, but they certainly aren't blameless.
Sadly, every party is at fault in this specific case. I'd even argue that the three parties involved are very close to equally at fault here.
First, there's the highway barrier, which wasn't properly maintained and rebuilt after a previous car accident in the same place about a week earlier.
https://www.tesla.com/sites/default/files/images/blogs/google-street-view.jpg
(What the crumple barrier was supposed to look like.)
https://www.tesla.com/sites/default/files/images/blogs/thursday-march22_0.jpg
(The state of the crumple barrier at the time of the crash, having not been rebuilt after nearly a week.)
If that had been repaired sooner, like it should've been, I'm willing to bet the driver would've survived the crash on that alone. However, the crash shouldn't have happened in the first place, which leads to the other two parties' faults.
As was linked previously in this thread and shown in this video, Tesla's 'autopilot' system very clearly had a MAJOR glitch that somehow completely overlooked the massive barrier it was barreling towards.
https://www.youtube.com/watch?v=6QCF8tVqM3I
This is a very major glitch in the system. The missing road line and the car following the lane to the left is one thing, and from a programming perspective I understand completely why it would follow that path. But the fact that it completely failed to take into account the very large, stationary object clearly in its path is absurd. How it could miss something like that and continue forward at full speed with no attempted braking or steering to avoid it is unbelievable, and Tesla will absolutely have to address the issue and explain how the system overlooked something so massive.
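For what it's worth, one widely reported explanation for this class of failure is that radar-based driver-assist systems often discount stationary returns at highway speed, to avoid constantly braking for overhead signs and roadside clutter. A toy sketch of that kind of filter is below; every name, speed, and threshold here is made up for illustration, not taken from Tesla's actual software.

```python
# Toy illustration of why a radar-based system might ignore a stationary
# obstacle: returns closing at roughly the car's own speed (i.e. objects
# that are not moving) get filtered out to suppress false alarms from
# bridges, signs, and parked cars. All names and numbers are hypothetical.

EGO_SPEED_MPS = 31.0        # our own speed, roughly 70 mph
STATIONARY_TOLERANCE = 1.0  # m/s

def is_stationary(closing_speed_mps: float) -> bool:
    """A return closing at about our own speed is standing still."""
    return abs(closing_speed_mps - EGO_SPEED_MPS) < STATIONARY_TOLERANCE

def filter_targets(targets):
    """Keep only moving targets, the way a naive highway filter might."""
    return [t for t in targets if not is_stationary(t["closing_speed"])]

targets = [
    {"name": "lead car",         "closing_speed": 3.0},   # slowly gaining on it
    {"name": "crash attenuator", "closing_speed": 31.0},  # stationary barrier
    {"name": "overhead sign",    "closing_speed": 31.0},  # also stationary
]

tracked = filter_targets(targets)
print([t["name"] for t in tracked])  # → ['lead car']
```

The problem is visible in the output: a filter like this can't tell a harmless overhead sign from a concrete barrier dead ahead, which is why these systems lean on the driver as the backstop.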
Now, the driver himself. The driver absolutely failed to take appropriate action in response to the 'autopilot' system's failure. From the article:
“The driver had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision,” Tesla said. “The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken.”
I want you to stop and do nothing but count for six seconds. Now imagine being the driver of a vehicle on a highway and not paying attention to the road at all for six seconds straight.
Six seconds is an astronomical amount of time for things to go wrong when it comes to driving on a highway. Even one second is a long time for many things to go wrong. Six seconds where he didn't react at all is just ridiculous. He clearly was not being a responsible driver and absolutely deserves flak for that much inattentiveness at the wheel, especially on the highway.
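As a rough sanity check on the "5 seconds and 150 meters" figure quoted above (my own numbers, not the article's): even at highway speed, 150 m is comfortably more than an attentive driver needs to stop.

```python
# Back-of-the-envelope stopping distance for the quoted "150 meters of
# unobstructed view". Speed, reaction time, and deceleration are my own
# assumed round figures, not values from the article.

speed = 31.0         # m/s, roughly 70 mph
reaction_time = 1.5  # s, a commonly cited driver reaction time
decel = 7.0          # m/s^2, firm braking on dry pavement

reaction_distance = speed * reaction_time    # ground covered before braking starts
braking_distance = speed**2 / (2 * decel)    # v^2 / 2a
total = reaction_distance + braking_distance

print(f"{total:.0f} m needed vs 150 m available")  # → 115 m needed vs 150 m available
```

Under those assumptions there were tens of meters to spare, which is the commenter's point: the driver had the time and distance to react and simply didn't.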
So really, everyone is at fault here. The highway maintainers, Tesla, and the driver.
It's really sad that this death resulted from negligence by every party involved. Hopefully Tesla will learn from this, the highway maintainers will learn from this, and drivers with 'autopilot' systems will learn to respect that they're in a very early trial period of semi-automated driving and need to remain in control at all times, even if it seems like the car has it handled. Systems like this will get better and better over time, but they'll never be perfect. Nothing ever is, but someday they'll save far more lives than human drivers could ever hope to.
I've been rewriting this post over and over trying to find a good way to start, but at this point I'm just going to be blunt: this entire mindset is just an idiotic way to deflect the blame away from irresponsible drivers.
Pay attention on the road. Don't assume you don't have to pay attention. If you're willing to assume you can stop paying attention while driving, for any reason, get the fuck out of your car, sell it, and never drive again, because you are a danger to yourself and everyone around you.
Driving is a complex task: it launches your body at speeds that defy your reaction times and demands a certain set of skills, especially anticipation, moderation, and vigilance. Having a computer handle some of those aspects for you is not a valid reason to doze off. And the product reminds you of that all the time, essentially as a fail-safe, because Tesla knows full well that people are dumb and there are countless idiots hellbent on putting themselves at risk by thinking they're capable of doing anything other than drive when behind the wheel.
Unreliable humans are worse than unreliable autopilots.
I think you're confusing me with Cyke lon bee here, I am saying that it isn't an autopilot, while he's arguing that it is. It is a bad name though, because it gives people the illusion that it is a true autopilot.
Also, this whole idea that deaths are to be expected with early technology like this doesn't make much sense to me. You're all arguing that Tesla is treated worse by the media than other car companies, while you're doing exactly the same. If any other company did this, you'd lose your shit. Imagine Boeing coming out with a new landing autopilot. The sensors fail to properly detect a specific runway in time, the pilots trust the system, and the plane crashes.
But in this case it's Tesla, and one person died. Yet it's fine, because it helps engineers understand the problems. But that's what internal testing is for. If you let consumers do the testing for you and someone dies, it's the company's fault, IMO. They should've tested this thoroughly before allowing consumers to get their hands on it. I understand that eventually there will be a scenario the system isn't prepared for, but the driver isn't the one who steered into the barrier. I completely agree with blaming him for not paying attention and steering away, but the system still malfunctioned.
I find it rather weird that the pre-collision systems didn't sense the barrier. They do detect objects that thin, right? They can also detect people, so I'd assume so.
I don't think I would lose my shit if Ford's autopilot killed someone. That's a pretty empty assumption. I would react the way I am here.
This should've been written like FP's old ban warnings. Here, I even did a mockup for them:
https://i.imgur.com/wBuUYQi.png