Well yeah, that makes perfect sense to me. A bit of a strange analogy, but if your car kills someone, aren't you responsible in the same way as if your pet killed someone?
[QUOTE=Trekintosh;51859163]Well yeah, that makes perfect sense to me. A bit of a strange analogy but if your car kills someone aren't you responsible in the same way as if your pet killed someone?[/QUOTE]
No, see, because you train your dog, right? But you don't train a car. So you're not responsible for the car's actions, because you didn't program the thing. If anyone is, it's the engineers at the car company.
It should be insured, because that's a pretty obvious one, but it's not your fault if the self-driving car gets in an accident, unless you programmed the thing yourself and the other party in the crash wasn't at fault.
[QUOTE=Qbe-tex;51860880]Maybe not the Google Car, but Tesla's automated car still has user control, and the user's commands override the car's commands (IIRC), so I suppose that the owner is responsible for the death of a person if he/she didn't react in time, the same way it would be for a normal car, unless he wasn't paying attention to the road. Much like a pet, I suppose.[/QUOTE]
That's fair, if there's the ability to have complete user control, then yeah.