• Self-Driving Uber Car Kills Arizona Pedestrian
    111 replies
Sadly, even if the car isn't at fault, it'll still be spun as "BUT REMEMBER THAT TIME THE DRIVERLESS CAR KILLED SOMEONE IN ARIZONA???"
Lots of AI algorithms previously thought of as 'black boxes' are now pretty well understood: what they do and why they work. That's why lots of AI experts will just tell you AI/ML algos are not intelligent, but very good optimizers.
The police have said that the car is most likely not at fault and that the accident would've happened with a human driver too.
Apologies for my tone, that's a very good explanation. However, are you sure the NN implementation is a total black box? Using Seth Bling's MarI/O as an example, you can see the neural network visualized in real time; you can visually see the connections made and understand its decision-making. Granted, I'm sure MarI/O is a very basic neural network, but I can't see why it couldn't be done with something as complex as a self-driving car.
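For what it's worth, reading out a small network's activations the way those visualizers do is trivial. Here's a toy sketch (completely made-up weights and inputs, not MarI/O's actual Lua code) of the kind of per-neuron readout a MarI/O-style overlay draws:

```python
import numpy as np

# Hypothetical tiny 2-layer feedforward net: 3 inputs -> 2 hidden -> 1 output.
# A real self-driving stack is vastly bigger, but the inspection idea is the same.
W1 = np.array([[ 0.5, -0.2,  0.1],
               [-0.3,  0.8,  0.4]])
W2 = np.array([[ 1.0, -1.5]])

def forward(x):
    """Run the net and return every intermediate activation --
    exactly the values a MarI/O-style visualizer draws on screen."""
    h = np.tanh(W1 @ x)   # hidden-layer activations
    y = np.tanh(W2 @ h)   # output strength (e.g. how hard to "press jump")
    return h, y

x = np.array([1.0, 0.0, -1.0])   # made-up sensor inputs
hidden, out = forward(x)
print("hidden activations:", hidden)
print("output:", out)
```

With deep nets the connections stop being individually interpretable, which is where the "black box" reputation comes from, but the raw activations are always there to look at.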
Video released https://twitter.com/TempePolice/status/976585098542833664/video/1
They do, but as displayed, it's not perfect. This is a sad situation overall, but after seeing the footage, even a human driver would sadly have hit her. She really shouldn't have been crossing the road at that place in the pitch black. Hopefully this will be used as a case to more aggressively test the detection systems; something stopped it from detecting the side profile of the bike and her.
So she was in fact jaywalking, more or less. Either the headlights or the camera sucks. If the headlights were that bad in a non-autonomous vehicle, the driver would probably be deemed at fault for failing to operate at a safe speed. Even disregarding that, there's no way the lidar/radar system didn't or couldn't catch that. There's something seriously wrong here.
Protip: if you don't want to get shwacked by a car, don't jaywalk at night without checking the road, just kind of hoping everyone will telepathically know you're there.
Also don't stare at your center console/cellphone for a full 6 seconds while moving
Yeah, this video is a real mixed bag. On one hand, the woman pretty much just walks into the middle of the road in the pitch black, and even a human driver likely would have hit her given the lighting and the speed of the vehicle. But I would expect the radar/LIDAR to be unaffected and to notice this obvious, slow-moving obstruction. I'm curious to hear what the explanation is here, because this seems like a case that an autonomous vehicle should have been able to handle.
Going to have to disagree. Without high beams on the lights look fine to me.
For what it's worth, the video is probably a big contributor to how dark it seems. I've been down that road at night and it's not quite as pitch black as it looks, at least not with my run-of-the-mill headlights. If anything, I would expect the Uber car's headlights to be brighter than mine. Now that I think about it, a human driver probably would have noticed her, just going off how human eyes adjust way better to darkness than a camera does.
As I said, it could just be a shitty camera. Although I can't fathom why you'd want a camera with a sensor that bad in low light in an autonomous vehicle.
Pretty wild that people are acting like it's somehow OK, or not the fault of the software, if people who jaywalk are hit by a car.
It's still the fault of the software/hardware, and it definitely needs to be fixed. I'm just saying I still think I'd prefer autonomous vehicles to human drivers even with that caveat. This is clearly an anomaly; it's not as though autonomous vehicles aren't already programmed to avoid jaywalkers and various road obstacles. It'll be investigated, addressed, and hopefully fixed. But it's not like the car just spazzed out and swerved into a bike or rear-ended someone at a light; it hit someone who was walking through the middle of the road at night with no regard for traffic, in a situation a human driver would likely also have failed to handle.
A human driver could've at least tried to brake or swerve to avoid her.
If this Reddit comment has any truth to it, it would have been literally impossible for a human to not hit the jaywalker. At the end of the day, the person is a fucking moron for having done what they did.
The first person to die from a corporate drone strike on US soil
If anyone thinks they wouldn't have been able to spot them in time there either, remember that your eyes do not operate at 480p 15FPS with awful compression. That and if the car had actually applied any braking at all there, that woman would probably be alive (though it's an SUV, so she might've gotten crushed under the wheels regardless).
She's an idiot for jaywalking in the middle of the night, but Uber ATG is going to have one hell of a week of going through the sensor data and figuring out why the self-driving system didn't slow down.
The video shows that she came out of nowhere. Maybe some human drivers could have detected her and braked/swerved; most people would probably still have hit her. A lot of the arguments in the comments are about how human eyes could see more than what the video shows. Not sure about that. The obvious question is how LIDAR and the other sensors failed here. For any of the on-board sensors, even the optical ones, the reaction time should have been more than enough to act.
It's fucking dashcam footage, not even the sensor footage from the car: the autofocus is out of control, it's compressed to absolute shit, and the resolution and framerate are awful. Uber might as well have doctored the footage with how bad it is. An actual driver would've been able to spot her from miles away (okay, not literally). But yeah, regardless, the car's LIDAR and IR should've been able to detect her even if a human couldn't. And IIRC it kept going afterwards: it didn't spot her at all. There's something seriously wrong with their sensor software.
It's hard to notice, but there's actually a subtle shift in the car's trajectory when the jaywalker comes into view. It actually veers slightly towards her. Not sure if that's due to something on the road (a bump or whatever). Also, we've finally arrived at the philosophical issue of: who is responsible when an autonomous vehicle kills someone?
Honestly, the fact that a backup driver was also there and didn't respond sort of shows that this might have happened anyway. An event occurred that neither the software nor a person could acknowledge and react to in time. The software should be corrected, obviously, but it does go to show that the person didn't do any better. I feel like the software manufacturers need to take an immediate hit of fines/restitution. Generally it seems like driverless cars are safer than human-driven cars, but there needs to be a legal system in place to ensure that the software engineers are doing their best to prevent accidents, and when they do happen there needs to be big compensation for victims. There's going to be a lot of automation of vehicles in the future, and something needs to be set in stone to make sure the safety is always improving.
Yeah, but the only reason the person was so unattentive and useless was because they were trusting the car to do its job correctly. It's definitely a failure on the car's part, not stopping for an obvious obstruction in the road. I hope they release details on why the LIDAR fucked up so badly here.
"Yeah but the only reason the person was so unattentive and useless" See: The cause of most car accidents
You're going to argue it's not the car's fault it hit someone because some people are bad drivers?
You may not have read the second paragraph of my post. I'm not diverting blame from the obvious software/hardware faults; those are the first things that need to be fixed, even before the backup driver is reprimanded. I'm simply stating that this accident is not a reason for reactionaries to condemn automated driving, given how people are even more prone to making mistakes than the software is. The backup driver also fucking up is just an example; both the machine and the person fucked up. However, now *all* the automated drivers can be corrected to not make the same mistake, but every other person on the road is still capable of making the same mistake the backup driver made (i.e. not paying attention while operating/supervising a vehicle).
Yeah, but don't you think she would have been at least a bit more attentive if she hadn't been sitting idly in an automated car all day? Human drivers are obviously very error prone and often get distracted which is why I still am a believer in autonomous vehicles, but I think it's fair to say that even the worst human driver would at least be trying a little bit harder than this person if they had to manually drive the vehicle themselves.
Well, that would be very true... were it not for the fact that this vehicle was in testing with a backup operator. That person's entire job was to supervise a moving test vehicle in a public space; it doesn't matter if it was boring. Their job was to be there, watching the road, and to use the override controls when necessary. The fact that a backup driver was present shows that the vehicle was acknowledged to have potential faults; that's why a backup driver was there in the first place. It's harsh, but if the backup driver had been paying attention, Uber could have diagnosed the problem in the software without somebody dying in the process. Until these cars are out of testing and are literally "driverless," there's still a need for due diligence on the part of supervisors. I don't even know where I'm going with this anymore, so I'll try to wrap up this rambling by stating that this was a test. The problems with the car need to be fixed, but you can't punish the car for executing a faulty program. The various people who didn't do their due diligence are the ones at fault here; you can't blame some vacuous, intangible concept of autonomous vehicles. It was ultimately people who made the mistakes in the engineering and the testing.