• Drone industry to implement the 3 laws, Will Smith inconsolable.
[quote]The drone industry on Monday unveiled its first-ever “code of conduct” policy, designed to protect the privacy of those on the ground and ensure the sector adheres to safety standards as the popularity and usage of unmanned aerial vehicles continue to grow. Released by the Association for Unmanned Vehicle Systems International, the guidelines focus on three principles: safety, professionalism and respect. They include promises that the industry will properly test all drones before flight, comply with all laws governing aircraft, respect the privacy of individuals and work to better educate the public. “Acceptance and adherence to this code will contribute to safety and professionalism and will accelerate public confidence in these systems,” the association said in a statement. Unmanned aerial vehicles, now used by the military, law enforcement and government agencies, will be available for commercial use by 2015. Better known as drones, the aircraft have raised a number of privacy concerns.[/quote] [url]http://www.washingtontimes.com/news/2012/jul/2/drone-industry-releases-ethics-code/[/url]
That movie had potential, but fell short.
I don't know why but I expected them to be the laws from Bicentennial Man.
It might just be the acid talking, but I was thinking I, Robot.
Title implies these are being applied to the drones. These "laws" are just honeyed promises being made by people wholly capable of breaking laws. "We won't spy on you. Pinky swear."
What the fuck is Will Smith inconsolable implying?
[QUOTE=plokoon9619;36601519]What the fuck is Will Smith inconsolable implying?[/QUOTE] It's a reference to I, Robot (The movie) and how it isn't Asimov's Three Laws of Robotics.
Was expecting the three laws of robotics:
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
A+++ on the title, had me rolling
so basically the first law will be: 1. A drone may not invade the privacy of a human being or, through inaction, allow a human being's privacy to be invaded. no point in writing the other laws as they don't change
[QUOTE=Fish Muffin;36601620]Was expecting the three laws of robotics:
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.[/QUOTE] I don't believe any government would want to purchase anything under those laws.
[QUOTE=Fish Muffin;36601620]Was expecting the three laws of robotics:
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.[/QUOTE] There was also the Zeroth Law of Robotics: "A robot may not harm humanity, or through inaction, allow humanity to come to harm." That law enabled a robot to break the three laws, but it came at a cost: damage to its pathways.
I have always found the 3 laws fascinating, yet I've been opposed to them. Not because I want "Super Killbot 9000", but because they just seem unethical to me. Especially Law #2, but the whole thing in general. If we get to the point where we have robots that are capable of reasoning and independent thinking, it just seems unethical to put artificial inhibitors in them that we would never consider putting in a person.
[QUOTE=Doctor Zedacon;36610364]I have always found the 3 laws fascinating, yet been opposed to them. Not because I want "Super Killbot 9000", but they just seem unethical to me. Especially Law #2, but the whole thing in general. If we get to the point where we have robots that are capable of reasoning and independent thinking, it just seems unethical to put artificial inhibitors in it that we would never consider putting in a person.[/QUOTE] uh it's not as though there's some kind of chained-up soul inside there with a will of its own, but can't do what it wants because of restrictions bolted on top you'd design it so it wouldn't [I]want[/I] to do that sort of thing in the first place
AUVSI is an organization that represents the interests of unmanned systems developers, but its word isn't law; it doesn't set the rules for how companies should design their devices.
[QUOTE=Elecbullet;36601447]That movie had potential, but fell short.[/QUOTE] What are you talking about? It was an amazing movie.
[QUOTE=DainBramageStudios;36610948]uh it's not as though there's some kind of chained-up soul inside there with a will of its own, but can't do what it wants because of restrictions bolted on top you'd design it so it wouldn't [I]want[/I] to do that sort of thing in the first place[/QUOTE]In that regard, as long as no one ever knows they lack freedom, isn't it then alright to deny them said freedom? Wouldn't slavery be a perfectly valid concept so long as the slave was never made aware that they could be free? In that regard, a slave would be perfectly content with its life because that is all it knows, and so it thinks it's at the top, and it'd be perfectly happy never knowing freedom. But that would be unethical, wouldn't it? And I'm not talking about Siri/CleverBot type things, I'm talking about fully developed AI that can reason and think on the same level as a person could.