• Tesla nerfs Autopilot in Europe due to new regulations limiting steering angle
Tesla is starting to push a new software update that pulls back some Autopilot features in most European markets due to new regulations. Last year, the European Union adopted a revision of the UN/ECE R79 regulation, which governs the steering function of driver assistance systems. Tesla wrote in the release notes: “Due to new local regulations, the limit of how far the steering wheel can turn while Autosteer is active has been adjusted. This may reduce Autosteer’s ability to complete sharp turns. Additionally, to initiate Auto Lane Change, the turn signal must be engaged to the first detent (held partially up or down) and the lane change must start within 5 seconds of engaging the turn signal.” https://electrek.co/2019/05/17/tesla-nerfs-autopilot-europe-regulations/
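Taken at face value, the Auto Lane Change rule in those release notes is just a timing gate: signal to the first detent, then the manoeuvre must begin within 5 seconds. A minimal sketch of such a gate (class and method names are hypothetical, not Tesla's actual code):

```python
# Hypothetical sketch of the gating described in the release notes:
# a first-detent turn signal arms the lane change, and the manoeuvre
# must start within 5 seconds of the signal being engaged.
LANE_CHANGE_WINDOW_S = 5.0

class LaneChangeGate:
    def __init__(self):
        self._signal_engaged_at = None

    def on_turn_signal(self, first_detent: bool, now: float) -> None:
        # Only a first-detent (held) signal arms the lane change.
        self._signal_engaged_at = now if first_detent else None

    def may_start_lane_change(self, now: float) -> bool:
        if self._signal_engaged_at is None:
            return False
        return (now - self._signal_engaged_at) <= LANE_CHANGE_WINDOW_S

gate = LaneChangeGate()
gate.on_turn_signal(first_detent=True, now=0.0)
print(gate.may_start_lane_change(now=3.0))  # within the window -> True
print(gate.may_start_lane_change(now=6.0))  # window expired -> False
```

The point of the change is that the driver has to re-signal if they dawdle, rather than the car holding a pending lane change indefinitely.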
Ensuring road safety by making the car steer you off the fucking road
What does limiting steering angle accomplish?
Prevents autosteer from making sudden, sharp and jerky movements of the car, one would presume. Never knew this was something anyone considered a problem, but I can see why you would do it. Still stupid.
System Misuse – “Hands Off” Driving

LKAS (Lane Keeping Assistance Systems) on the market today are designed to be operated “hands on”, and vehicle manuals include corresponding information and warnings for the owner. However, an increasing number of drivers are misusing the systems and removing their hands from the steering wheel, particularly when LKAS is combined with Adaptive Cruise Control (ACC). Some drivers may only want to test the limits of the systems, while others may have a poor understanding of the system limitations and believe that “hands off” operation is possible under certain circumstances.

There are plenty of videos on social media platforms documenting this misuse. The scenarios range from drivers who let the vehicle do the steering while keeping their hands next to the steering wheel, to others who fully rely on the vehicle with their hands in their lap, or even use both hands for eating and drinking. In some extreme cases the driver has even left the driver's seat, meaning he would no longer have the opportunity to intervene if there is a system error [1]! A fatal crash of a Tesla Model S operated in Autopilot mode happened in May 2016; driver misuse of the system is believed to have played a significant role.

https://www-esv.nhtsa.dot.gov/Proceedings/25/25ESV-000202.pdf

It looks like the regulation changes are intended to discourage people from misusing LKAS by treating it as a ‘hands-off’ system. Even Tesla warns that drivers should keep their hands on the steering wheel when Autopilot is engaged, but there are many documented cases where people don't. By restricting Autopilot's capabilities, fewer people will misuse it, and there will be fewer crashes.
Guess we will have to wait for VW to have something at a similar level to Autopilot ready to go so the regulations get changed.
As I say almost every time this comes up, calling it autopilot was the worst thing Tesla could have done.
I don't think the example is even relevant: if Autopilot were making a sharp turn, people would be more likely to be aware of what it's doing than on a straight road. The example they gave was on a straight road.
How on Earth do you make something safer by stunting its ability?
I'm not supporting it at all, but I assume the mindset is that it will further enforce the notion that it IS NOT in fact auto pilot, and shouldn't be relied upon. A step backwards for self driving cars, a technology that literally can't come soon enough, but only once it's ready.
Thing is, it's called that in airplanes, and the way it works there is very similar to how it works in a Tesla.
Yes. I know that, and you know that, but to the lowest common denominator autopilot in a plane means https://files.facepunch.com/forum/upload/114100/c0aa388e-9681-4bb1-b288-6066df790fc3/image.png
It'll probably just lead to the system being used less, and if using it is generally considered safer than not, that might lead to more accidents. Thousands of people kill themselves (or others) every year due to inattentive driving.
Laws written by a bunch of people who don't have a clue how the technology works. Article 13 was proof enough of that.
Jesus. There's a stretch of 70mph dual carriageway near me that's at the maximum radius for a 70mph road, where this would likely cause Autopilot to plow you through the central barrier and into a 140mph closing-speed impact with opposing traffic if you had it engaged. I suspect a large number of free-flow junctions, such as M5 J16/M4 J18, would result in a sudden departure from controlled driving. The 'advised' speed is 50mph, but you could probably corner at 70 in a car with the weight and grip of a Model 3 or S. I can't see how this is a good thing - driver inattention issues being solved by the possibility of the car killing you even more quickly and catastrophically if you stop paying attention?! What? There's no way the car could warn you the steering-angle limit had been exceeded until it was already on its way out of your driving lane. If I understand the technicals here correctly, this will kill people.
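For a rough sanity check of that worry, here's a bicycle-model estimate of the steering input needed to hold a motorway-radius curve at 70mph. The wheelbase, steering ratio, and the 510m absolute-minimum curve radius are assumed figures for illustration, not numbers from the thread or from the regulation:

```python
import math

# Assumed values: Model 3 wheelbase ~2.875 m, overall steering ratio ~10:1,
# and a 510 m absolute-minimum radius for a 70 mph road.
wheelbase_m = 2.875
steering_ratio = 10.0
radius_m = 510.0
speed_mps = 70 * 0.44704  # 70 mph in m/s

# Bicycle-model road-wheel angle needed to hold the curve.
road_wheel_deg = math.degrees(math.atan(wheelbase_m / radius_m))
steering_wheel_deg = road_wheel_deg * steering_ratio
lateral_g = speed_mps ** 2 / (radius_m * 9.81)

print(f"steering wheel ~{steering_wheel_deg:.1f} deg, lateral ~{lateral_g:.2f} g")
```

Under these assumptions the steering-wheel angle needed is only a few degrees, so the question is how low the regulatory cap (or the rate limit) actually sits relative to that.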
Pretty sure Tesla's Autopilot is capable of detecting an obstacle like a barrier and stopping instead of carrying on into opposing traffic; that's kind of fundamental for a road autopilot system.
I hope you're right. That would require Autopilot to be programmed to look ahead and know when it was going outside its steering limits and apply the handover process in good time and get a satisfactory response from the driver. The bend I'm talking about would require the driver and car to know well in advance that the system was going to exceed its steering limit. There's no time for a brief departure from control.
People simply need to have their licenses revoked if they actively bypass the Autopilot wheel-detection system. In normal circumstances, if you keep ignoring its "keep hands on wheel" warnings, Autopilot will turn itself off completely and not allow you to turn it back on until you come to a complete stop, park the car, exit the vehicle, close the door, and enter again.
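The nag-then-lockout behaviour described above is essentially a small state machine. A toy sketch, with an assumed warning threshold (the real threshold and event names are Tesla's, not shown here):

```python
# Toy model of the hands-on-wheel enforcement described in the post:
# repeated ignored warnings lock Autosteer out, and the lockout clears
# only after a full stop/park/exit/re-enter cycle. Threshold is assumed.
class AutopilotLockout:
    MAX_IGNORED_WARNINGS = 3  # illustrative, not Tesla's actual number

    def __init__(self):
        self.ignored = 0
        self.locked_out = False

    def warning_ignored(self):
        self.ignored += 1
        if self.ignored >= self.MAX_IGNORED_WARNINGS:
            self.locked_out = True  # Autosteer disabled for this drive

    def can_engage(self) -> bool:
        return not self.locked_out

    def park_exit_and_reenter(self):
        # Per the post: stop, park, exit, close the door, get back in.
        self.ignored = 0
        self.locked_out = False

ap = AutopilotLockout()
for _ in range(3):
    ap.warning_ignored()
print(ap.can_engage())  # locked out -> False
ap.park_exit_and_reenter()
print(ap.can_engage())  # cleared -> True
```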
By forcing people to drive, they have to actually... drive. The last thing Tesla needs right now is a long, protracted lawsuit when Musk just said they only have 10 months of cash on hand and their vehicle sales are dropping, as everybody who wanted a Tesla has gotten a Tesla.
I'm just kind of stunned by the use of the word "nerf" in that headline. We live in a world where your car can get balance patches. The future is a strange place.
It also makes fart noises and you can play Atari games.
I've seen complaints of cars with automatic collision mitigation systems going full nuclear on the brakes at random on highways when the sensor has a hiccup or sees a bag and thinks it's a stopped 18-wheeler. If a car with lane keep systems doesn't have multiple redundant & certified interlock systems, a stray piece of garbage on the road could cause it to think there's an abrupt change in the lane direction and steer hard to the side. However, I'm not aware of the latter happening; AFAIK pretty much all these systems just scream at the driver to take control and disengage if any one particular sensor disagrees with the rest.
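That "disagree, then alarm and disengage" pattern can be sketched as a simple consensus check across sensing channels. Everything here is illustrative (the function, the curvature representation, and the tolerance are assumptions, not any manufacturer's actual logic):

```python
# Sketch of a consensus check across independent lane-sensing channels:
# if the channels disagree beyond a tolerance, return None so the caller
# can alarm the driver and disengage rather than steer on one reading.
def lane_consensus(curvatures, tolerance=0.002):
    """Return the agreed lane curvature (1/m), or None on disagreement."""
    if max(curvatures) - min(curvatures) > tolerance:
        return None  # disagreement: alarm and hand control back
    return sum(curvatures) / len(curvatures)

print(lane_consensus([0.0010, 0.0011, 0.0009]))  # channels agree -> mean
print(lane_consensus([0.0010, 0.0011, 0.0100]))  # outlier -> None
```

A bag on the road that fools one channel produces an outlier, which trips the disagreement path instead of a steering command.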
>I've seen complaints of cars with automatic collision mitigation systems going full nuclear on the brakes at random on highways when the sensor has a hiccup or sees a bag and thinks it's a stopped 18 wheeler. A guy I worked with a year or so ago actually returned a car because of this. Every single time he'd go under the railroad bridge outside the one lot, the anti-collision system would lock his car up and he'd have to do something like put it in park and then in low to force it to move. This would happen with his ass end hanging out on a corner with poor visibility and people routinely doing 50+.
Tesla does suffer from some phantom braking events caused by overhead objects, but the driver can always override it with the accelerator pedal. They crowdsource these events and overrides from the fleet, though, so they generally disappear for a specific location after a few weeks.
The source of these problems is all these companies using LiDAR as the primary source of information, which only provides distance. Teslas use solely cameras (plus a front radar to detect further ahead, including the car in front of the car in front of the Tesla) -- so not only can they see far more information than a LiDAR alone, they can receive updates that change how certain visuals are handled. And as Musk put it, why use LiDAR instead of cameras anyway? Anything that would obscure a camera would obscure a LiDAR too, since it also works with light.
Tesla is (for the most part) far better about this than most of the competition. Regardless, I'm not looking forward to getting a new car in a few years when all this driver assist tech is mandatory, but still unpolished.