• NTSB 'unhappy' with Tesla release of investigative information in fatal crash
But it is literally autopilot... Airplane autopilot still requires a pilot present; it can't detect hazards like mountains or perform complex maneuvers like landing. This is literally people who don't know what autopilot does and assume it works like Homer's semi-truck in The Simpsons.
Yes, but as I said, people are idiots.
Actually, some airplanes do have autoland. With a modern FMS, the autopilot can basically fly the whole route. Autopilot isn't just an altitude-and-heading hold anymore. https://www.wired.com/story/boeing-autonomous-plane-autopilot/
While the driver didn't use the Autopilot function as he was supposed to, keeping both hands on the wheel and paying attention, I don't think we should just ignore how Autopilot caused the crash in the first place. The fact that the driver misused it doesn't mean the system didn't malfunction.

If you're using an autopilot, you're always going to pay less attention to the road than if you're actually driving. You trust the system to work, so it might take you a second to intervene when it starts steering you toward a concrete barrier. And when you do, you're probably a bit more panicked because of your late reaction and the fact that the system is driving into a barrier. Even if you do pay attention and intervene, it's still dangerous.

The fault lies with both the driver, for not paying attention, and Tesla, because their system didn't spot the barrier and somehow steered into it. Dantai's video on the first page shows the system failing. This is definitely not an intended feature and, as the crash showed, potentially deadly.

People make the comparison here to cruise control, which Autopilot sort of is. If adaptive cruise control malfunctions and creates a dangerous situation by accelerating into the car in front of it, who's to blame? The driver should pay attention, but the cruise control is the cause of the situation. The driver could keep the outcome from turning out badly, but he's not the one who accelerated.

I do believe this whole topic has been blown out of proportion, though. The media put the spotlight on it because criticizing autopilot systems generates some quick cash.
Exactly, there's been a strange obsession with absolving Tesla of any blame. I've been noticing it both here and on Reddit.
You can still blame the driver and not Tesla. By using Tesla's system, you have to understand that it's not foolproof and still very much in its early phases. You can't expect a system that drives the car for you to work flawlessly when it's not even done yet. When you buy a Tesla, you're a beta tester. If you use the system, you have to expect it to fail.
Or, we can all be rational adults and say that while the driver was the main cause, the failure of the autopilot system to properly track the road and stop before the car hit a static object IS a factor in this.
It's important to talk about Tesla's side of this because the crash was directly caused by the Autopilot function; it wouldn't have happened if the car weren't equipped with it. For example, say you were using normal cruise control, but the vehicle started to accelerate like crazy instead of keeping a constant speed, which caused you to flip while going around a turn. Sure, you should have noticed and turned off the cruise control, but the cruise control still malfunctioned and is partly responsible for the accident. In the same way, yeah, the driver should have reacted, but the Autopilot function still bears some responsibility.
I never denied that the autopilot may have failed. I'm simply saying it's foolish to incriminate Tesla because their brand-new system failed. It's impossible to make cutting-edge tech like this 100% safe in testing but also make it marketable to your average consumer. You can't have your cake and eat it too.
Similar errors have occurred, and automated throttle control is still a standard component in many vehicles. The important thing is that we have conclusive proof of a reproducible flaw in the Autopilot system, so we can agree that Tesla should acknowledge the issue, release a fix in a timely manner, and let this whole incident serve as a learning experience that will lead to better autopilot systems.
Newness isn't an excuse for malfunctions unless consumers have agreed to be some sort of beta tester.
100% agree. We should expect nothing less than a total commitment on Tesla's part (as well as any car manufacturer) to investigate issues where the autopilot is involved, so that the systems can be improved as fast as humanly possible if the manufacturer wishes to keep them on public roads. We also have to acknowledge the possibility that accidents involving autopilot/self-driving systems may happen while these improvements are being made. If we can accept both of the above, we're on our way to a good future.
I've never bought nor driven a Tesla, but I would assume it's in the paperwork or in on-screen prompts. Even then, you'd have to be a fool not to know that the autodrive system isn't fully functional. You can't not know these things if you're buying a Tesla. "It's the Wright Flyer's fault he died in the crash. There's no way he could have known it wasn't fully safe."
I'm noticing a strange obsession with trying to paint Tesla as the worst of the evil and greedy corporations. I don't want to go on record as a conspiracy theorist, but I don't remember that happening before Tesla's electric cars found commercial success. Anyway, I for one will argue it if I see someone unjustly blamed, as in this case.
Tesla's Autopilot certainly is partly responsible; I don't think anyone is denying that. Even Musk's comment implies that. I don't think it deserves all the blame, though. Even if Autopilot does make unfortunate mistakes like this, it's still a net positive overall, and Tesla's Autopilot team undoubtedly feels like shit about it. But they will figure out exactly why it made this mistake, and they will fix it. Potentially hundreds of thousands of cars around the world will then get that fix automatically overnight and won't make that mistake again.
Here's another video of a Model X on the same stretch of road: https://youtu.be/VVJSjeHDvfY It does look like something that needs to be addressed. I'm curious how other manufacturers like Merc and Cadillac handle the same area. Since these systems use the painted lines as a reference, the car didn't notice the faded line, so it followed the left line as if it marked the lane. There should also be systems in place so the car can realize it's driving towards a wall.
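To illustrate what I mean, here's a purely hypothetical sketch in Python (not Tesla's actual code; the LaneLine type, offsets, and confidence threshold are all made up for the example) of how lane-centering logic that falls back to a single line can go wrong when one marking fades:

```python
# Hypothetical illustration only -- not Tesla's perception stack.
# Assumes a perception module that reports each lane line as a lateral
# offset (meters, negative = left of the car) plus a detection confidence.
from dataclasses import dataclass
from typing import Optional

@dataclass
class LaneLine:
    offset_m: float      # lateral offset of the line from the car
    confidence: float    # 0.0 (not seen) .. 1.0 (clearly painted)

def steering_target(left: LaneLine, right: LaneLine,
                    min_conf: float = 0.5,
                    half_lane_m: float = 1.8) -> Optional[float]:
    """Lateral offset the car should steer toward, or None if lost."""
    left_ok = left.confidence >= min_conf
    right_ok = right.confidence >= min_conf
    if left_ok and right_ok:
        # Normal case: center between the two painted lines.
        return (left.offset_m + right.offset_m) / 2.0
    if left_ok:
        # Right line faded: hold a fixed half-lane offset from the left
        # line. If that line is really the gore-area marking diverging
        # toward a barrier, the car faithfully follows it out of the lane.
        return left.offset_m + half_lane_m
    if right_ok:
        return right.offset_m - half_lane_m
    # Neither line detected: alert the driver / disengage.
    return None

# Right line faded to near-invisibility at the gore point:
print(steering_target(LaneLine(-1.5, 0.9), LaneLine(2.1, 0.2)))  # ~0.3
```

The failure mode is in the single-line fallback: the car keeps a fixed distance from whichever line it can still see, so wherever that line leads, the car goes, which is why a check against static obstacles ahead matters so much.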
It isn't exactly cutting-edge tech. Adaptive cruise control that monitors the speed of the car in front of you has been in use for over 20 years. Systems which check for vehicles in front of you have also been installed in cars since 2004, and there have been systems which track your lane to make sure you keep in it. The only new thing is allowing the computer to take full control of the car, and Tesla's Autopilot has been installed in their cars for several years now. There is no reason not to incriminate Tesla for this failure. They make a system, market it as the safest way of driving, and then it causes a death. Upon that death, they push all the blame onto others. If they marketed it as a beta feature only for a select number of drivers, and made clear that it can't guarantee their safety, then I'd be fine with that excuse. But here you've got a plain malfunction which caused a death, and Tesla barely even acknowledges it. Instead they push a statement saying their cars are much safer and that other brands would fail in their position.
The tech has vaguely been around for years, but it's never been commercially tested until now. Comparing lane assist and adaptive cruise control to Tesla's system is silly. And even with this one death, computer-driven vehicles are still safer. There've been what, two deaths that can be attributed to system failure? Yes, the system failed in this one instance, but the bottom line is the driver wasn't properly operating the vehicle. If he had been, this wouldn't be a headline.
I'm of the belief that people are always going to die doing dangerous things, and driving is statistically very dangerous. We can make systems and cars safer, but there are always going to be shit-happens scenarios, and I think this is one of them. These deaths serve to highlight flaws in engineering and allow them to be fixed, making the system safer.

Autopilot is very safe, but that doesn't mean it's completely safe. You should always get into a car with the understanding that it could be the last thing you do. Even if you're a safe driver, it doesn't mean others are, and your death can be completely not your fault. Self-driving systems should be regarded the same way by their occupants.

But that doesn't mean they aren't safer than human drivers, or that they should be banned, or that the manufacturer should be sued because one of thousands of users found a way to die with it. Unless it could be proved that the manufacturer was aware of the issue (like those who failed to recall faulty airbags) and it wasn't just some unforeseen freak situation where everything lined up to go wrong. Now they can use this tragic situation to make the system safer, like with the last person who died using Autopilot.
The safest way of driving doesn't equal risk free driving, see OvB's post.
The entire thread here has argued that Tesla's Autopilot is not a full autopilot, just lane assist plus adaptive cruise control, so I won't go further into that. Whether or not the driver was paying attention is irrelevant: the Autopilot system failed and created a dangerous, and in this case deadly, situation. There are already several videos of Teslas making the exact same steering correction. But Tesla's statement doesn't really show any concern about that. Instead it boasts about how safe their systems are. They should've released a better statement, something along the lines of: "Our Autopilot system malfunctioned in a recent crash. Our engineers are working to fix this particular problem. Please pay attention while driving."
Well, you're right that there's an effort to smear Tesla as well. It started years ago, but has ramped up quite a bit as Tesla and SpaceX picked up steam. Here are a couple of articles: https://electrek.co/2016/11/22/elon-musk-right-wing-trump-propaganda-campaign-against-tesla-spacex/ and http://www.globalresearch.ca/alternative-energy-whats-really-behind-the-assault-on-tesla-factory-safety/5592073 Tesla shouldn't be blamed for the accident in this case. But valid arguments can be made about its marketing of the "autopilot", about the lack of warnings as shown in the video on the previous page, about the infancy of the technology, and so on. In the end, the driver himself is responsible for what happens with the vehicle.
Isn't that the very definition of an autopilot system? The computer handles the mundane rubbish, and the pilot checks that the autopilot isn't doing anything unexpected, or whatever. Pilots on commercial airliners aren't allowed to go start banging hookers and snorting coke as soon as they turn the autopilot on.
Let's have a list of systems and mechanisms that can kill you if you don't pay attention and commit major user errors:
- Gas stoves
- Any form of fast-moving vehicle
- Power tools
- Industrial equipment
- Lifts/elevators
- Stairs
- Doors
- Power outlets
And all those things have minimum safety standards
So does the Tesla autopilot, what's your point?
This isn't a case of improper use or failing to prevent a crash. It's a case of the autopilot directly causing a crash where one wouldn't have happened otherwise. A comparison with, say, a gas stove would be buying a gas stove and having it leak into your house, causing a gas explosion that kills the resident. Let's say the stove even had a warning light telling the resident that it was leaking. Even with that safety feature, the stove is the major culprit. We would all recognize that it isn't acceptable to sell stoves that kill people through normal usage. In the same way, this death was directly caused by the autopilot. That isn't the same thing as saying it failed to prevent a death or that it was being used improperly.
The car crash happened because the driver was not paying attention. Having an easier time driving does not mean you should stop paying attention, and the car itself makes that abundantly clear. With the number of people who kill themselves or others by being distracted idiots while driving, I think it's highly unfair to blame the Autopilot function specifically and not the inattentive driver. As people have said before, it's like cruise control: it's there to simplify your experience, not take over for you, and someone who gets in a collision because they zoned out after their car started handling their speed for them is directly at fault, not the cruise control.
Looking away from the road for at least 6 seconds is the very definition of improper use of Autopilot.
I really don't think it's a stretch to say that we could, theoretically, create a minimum standard for self-driving cars that might UNFORTUNATELY disallow just about the earliest implementation ever. It just happens sometimes, go figure.