Google helps Pentagon analyze military drone footage—employees “outraged”
6 replies, posted
[url]https://arstechnica.com/?p=1270035[/url]
wow i am so shocked that one of the best image recognition engines, something that would be an extremely useful asset for identifying targets and automating warfare, is being used by the government for military applications. who would have ever thought this might happen?
[QUOTE=Ninja Gnome;53182096]wow i am so shocked that one of the best image recognition engines, something that would be an extremely useful asset for identifying targets and automating warfare, is being used by the government for military applications. who would have ever thought this might happen?[/QUOTE]
You're right, but that doesn't make it okay. Fuck the thought of drones flying overhead and tracking people automatically.
[QUOTE=Dr. Evilcop;53182144]You're right, but that doesn't make it okay. Fuck the thought of drones flying overhead and tracking people automatically.[/QUOTE]
It actually does make it okay, because they're already tracking you; now they'll just be better able to tell whether you're actually a terrorist or just some deadbeat.
[QUOTE=Doctor Zedacon;53182220]It actually does make it okay, because they're already tracking you; now they'll just be better able to tell whether you're actually a terrorist or just some deadbeat.[/QUOTE]
The engineers working on this software didn't sign up to make deathbots. It isn't fair for them to later find out their code was responsible for erroneously killing some family.
[QUOTE=bitches;53182239]The engineers working on this software didn't sign up to make deathbots. It isn't fair for them to later find out their code was responsible for erroneously killing some family.[/QUOTE]
Whether or not you believe them is something else entirely, but that notion directly contradicts the article:
[QUOTE]A Google spokesperson responded to the report, saying, “We have long worked with government agencies to provide technology solutions. This specific project is a pilot with the Department of Defense, [b]to provide open source TensorFlow APIs that can assist in object recognition on unclassified data.”[/b]
The spokesperson added, [b]“The technology flags images for human review and is for non-offensive uses only.[/b] Military use of machine learning naturally raises valid concerns. We’re actively discussing this important topic internally and with others as we continue to develop policies and safeguards around the development and use of our machine learning technologies.”[/QUOTE]
It's worth noting that the drones fly so high, and are so small relative to their altitude, that nobody on the ground is ever aware they're there, so it's not exactly implausible that anything they spot would be reviewed, even briefly, before a human gives the go-ahead to strike. The system they're using (TensorFlow) is also totally free and open source for anybody to use; Google is just giving them guidance on using it.
[QUOTE=bitches;53182239]The engineers working on this software didn't sign up to make deathbots. It isn't fair for them to later find out their code was responsible for erroneously killing some family.[/QUOTE]
Good thing they aren't making deathbots, then, and their code is ultimately going to help minimize the risk of killing some family. If they care and have even half a brain, they'd understand that.