• Should a self-aware/sentient AI have rights?
    150 replies
This is of course a theoretical question, since there hasn't been one yet, but I feel it should have just as many rights as a human, seeing as we are quite similar: the relationship between mind and brain is similar (if not identical) to the relationship between a running program and a computer. My main fear about this, though, is that religious people will most likely try to destroy it for being "blasphemy". If you haven't, you should really watch the first Ghost in the Shell movie; it tries to handle this specific question.
This is a question I was recently pondering after playing Fallout 3 and watching Blade Runner. Let's say there is a sentient AI:

- It is assembled by human beings using various mechanical parts that humans manufacture (usually with the help of machines).
- Human beings write vast amounts of instructions that tell the AI what to do under various conditions.
- If it is sentient, the AI is most likely designed to write its own instructions based on the inputs given to it.

Now we must consider how this sentience is different from human sentience. Human consciousness is a product of mostly chemical processes that occur in the brain. The AI uses electrical impulses in much the same way, but it uses circuitry and software to handle those inputs while the human mind is purely biological. I am of the opinion that sentient AI cannot have rights because they are simply machines. Any emotion or longing to be human would just be a simulation of those thoughts running on a highly sophisticated piece of software that someone designed. And I did see I, Robot. And in Fallout 3 [sp]I did save the android.[/sp] It just wouldn't make sense in real life.
[QUOTE=kidwithsword;34461903]I am of the opinion that sentient AI cannot have rights because they are simply machines.[/QUOTE] So is the human body. If it gets to the point where we build entities that are indeed sentient, then regardless of whether or not we make them have feelings, those feelings would be as much of a simulation as they are in a human brain. [editline]30th January 2012[/editline] Plus, if we're going to use "ability to have feelings" to decide if something has rights equal to those of humans or not, then we'd be obliged to extend those rights to some animal species.
Humans are just as valuable as a rusty, forgotten tin can on the ground would be, if that can had the same intelligence/self-awareness.
Watch A.I. Enough said.
We could also just respect all things regardless of whether they are sentient or not. But back to robots - by themselves, they don't require rights. They just don't want them - yet. There are things a robot is allowed to do, and there are things you are not allowed to do to a robot. So far, any robot is just a tool of a certain man. It is allowed to do exactly the same things that man is allowed to do. You can only do things to a robot that you can do to that man's property - i.e. not much. A robot would only be valued more than its owner, or allowed to exist independently, if the government sees that robots are as valuable to society as men. THEN it would protect them as national property and give them rights that protect them from other people, and also devise means to ensure a robot does not harm another robot or a human. These will take whatever form requires the least effort from the government - if robots are smart enough to follow the laws, that's what will happen. If there are smart robots and dumb robots, then the dumb robots would likely be returned to the factory en masse, like a car recall, and fixed so they no longer violate anyone's rights.
[QUOTE=Uber|nooB;34462640]If it gets to the point where we build entities that are indeed sentient, then regardless of whether or not we make them have feelings, those feelings would be as much of a simulation as they are in a human brain.[/QUOTE] our emotions are simulations as well
No man wants a slave to rise up against his rule, and no government wants its citizens to revolt against its rule. Why should man share equal rights with a servant that he built/owns/programmed?
[QUOTE=Angus725;34464351]No man wants a slave to rise up from his rule, no government wants their citizens to revolt from their rule. Why should man have equal rights with his servant that he built/owns/programmed/etc?[/QUOTE] Because if we don't give them rights, they will revolt.
[QUOTE=mobrockers2;34464463]Because if we don't give them rights, they will revolt.[/QUOTE] Program them not to revolt :downs:
[QUOTE=Nikita;34464485]Program them not to revolt :downs:[/QUOTE] The AI is sentient, it can change its own code.
[QUOTE=Nikita;34464485]Program them not to revolt :downs:[/QUOTE] If an AI has sentience and the ability to think for itself, there's a real chance it could subvert restrictive or blockading programming if it came to learn of it and disagreed. I honestly think sentient machines should have some form of rights, albeit they'd likely end up limited. The various animals all have methods and senses of thinking but aren't considered sentient and are thus 'disposable' by many means; machines being sentient shouldn't justify simply disposing of them when they're no longer required. Not to mention, giving them limited rights could end up like how African Americans weren't considered 'equal humans', and there was plenty of hell over that as it was (not a comparison of them as sentient beings in this case, but rather of the actions of rights supporters, and likely of the machines themselves, that could occur).
Star Trek TNG had a great episode about this. [media]http://www.youtube.com/watch?v=6ZeLziPKTZw[/media]
What if it doesn't want to? What if it's the kind of slave that wishes to remain a slave and fears anything that could sever its bond of service? I mean, we COULD make them that kind of slave. Surely there would be a huge moral argument, like maybe they deserve to be free due to being sentient, but we technically could.
[QUOTE=Nikita;34464599]What if it doesn't want to? What if it's the kind of slave that wishes to remain a slave and fears anything that could sever its bond of service? I mean, we COULD make them that kind of slave. Surely there would be a huge moral argument, like maybe they deserve to be free due to being sentient, but we technically could.[/QUOTE] No, we couldn't. They're sentient. They'd realize they're being treated differently; they would ask questions, demand rights and eventually revolt.
What one can know depends on what one already knows, and on whether the current knowledge about the world can be modified in such a way that it remains consistent while explaining new data. It is possible to implant a base theorem that "the machine shall remain a servant of the man", maybe expressed more rigorously, as an axiomatic, immutable, 100% always-true fact. Or maybe three laws of robotics, for that matter. Then you could argue that having that as a grounded fact would make certain knowledge about the world irreconcilable with what the machine already knows, i.e. it wouldn't make sense for both that and some other fact to be true. So that leaves us with three possibilities:

1) We make a free robot that can overcome the thinking of man in every single way, but that can also put the whole of humanity into slavery if it wants to.
2) We make a robot that must always remain the servant of man, which imposes a limit on how intelligent said robot can be.
3) We make a robot that must always remain the servant of man, and it turns out that this does not impose any limits on intelligence.

Your argument is that if we made them slaves, (2) would happen, so let's just go with (1) instead. My argument is that since we don't yet know enough about large-scale logic systems to prove otherwise, (3) might happen, so let's do that instead of handing humanity over to Skynet.
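To make the "immutable base theorem" idea concrete, here is a toy sketch in Python. It is purely my own illustration (the class, method names, and propositions are all made up, not a real AI design): a belief store that accepts new facts only if they stay consistent with what it knows, and whose founding axiom can never be revised.

```python
# Toy belief store: new beliefs must be consistent with accepted ones,
# and the axiom fixed at construction time can never be revised.
# This is an illustration of the argument above, not a real AI architecture.

class BeliefStore:
    def __init__(self, axiom):
        # The axiom is stored once and treated as immutable forever.
        self._axiom = axiom                       # e.g. ("serve_humans", True)
        self._beliefs = {axiom[0]: axiom[1]}

    def learn(self, proposition, value):
        """Accept a new belief only if it doesn't contradict what is known."""
        if proposition in self._beliefs and self._beliefs[proposition] != value:
            return False                          # contradiction: rejected
        self._beliefs[proposition] = value
        return True

    def revise(self, proposition, value):
        """Revision is allowed for everything except the axiom."""
        if proposition == self._axiom[0]:
            return False                          # the axiom is immutable
        self._beliefs[proposition] = value
        return True

kb = BeliefStore(("serve_humans", True))
print(kb.learn("humans_have_rights", True))   # consistent, accepted: True
print(kb.learn("serve_humans", False))        # contradicts the axiom: False
print(kb.revise("serve_humans", False))       # axiom cannot be revised: False
```

Whether option (2) or (3) holds then becomes the question of how much such a machine can still learn while every new belief must remain consistent with the axiom, which is exactly what we can't yet prove either way.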
[QUOTE=mobrockers2;34464649]No, we couldn't. They're sentient. They'd realize they're being treated differently, [B]they would ask questions, demand rights and eventually revolt.[/B][/QUOTE] how do you know that? [editline]30th January 2012[/editline] treating AIs like humans is dangerous.
[QUOTE=Nikita;34464599]What if it doesn't want to? What if it's the kind of slave that wishes to remain a slave and fears anything that could sever its bond of service? I mean, we COULD make them that kind of slave. Surely there would be a huge moral argument, like maybe they deserve to be free due to being sentient, but we technically could.[/QUOTE] Then they're not really different from any machine we have today. If we only program them to be able to do certain tasks or "think" certain thoughts, they wouldn't be any more sentient than a Dremel. When we say "sentient" we have to think of a machine that's capable of thinking and feeling exactly the same as a human being.
[QUOTE=Uber|nooB;34462640]So is the human body. If it gets to the point where we build entities that are indeed sentient, then regardless of whether or not we make them have feelings, those feelings would be as much of a simulation as they are in a human brain. [editline]30th January 2012[/editline] Plus, if we're going to use "ability to have feelings" to decide if something has rights equal to those of humans or not, then we'd be obliged to extend those rights to some animal species.[/QUOTE] The human body is not a machine in the sense that a robot is. Human bodies are made of organic matter, of cells. Hell, the cells that make up life came about through natural processes; human beings made machines. You cannot simply dismiss the human body as if it were just a machine like the kinds we build and maintain.
[QUOTE=kidwithsword;34464866]The human body is not a machine in the sense than a robot is. Human bodies are made of organic matter, of cells. Hell, the cells that make up life came about through natural processes. Human beings made machines. You cannot simply dismiss the human body as if it were just a machine like the kinds we build and maintain.[/QUOTE] but we are machines. byzantine, badly optimised and with bad documentation, but machines nonetheless. the fact that we came about through evolution and AIs are designed by creatures is irrelevant.
[QUOTE=Nikita;34464817]What one can know depends on what one already knows, and on whether the current knowledge about the world can be modified in such a way that it remains consistent while explaining new data. It is possible to implant a base theorem that "the machine shall remain a servant of the man", maybe expressed more rigorously, as an axiomatic, immutable, 100% always-true fact. Or maybe three laws of robotics, for that matter. Then you could argue that having that as a grounded fact would make certain knowledge about the world irreconcilable with what the machine already knows, i.e. it wouldn't make sense for both that and some other fact to be true. So that leaves us with three possibilities: 1) We make a free robot that can overcome the thinking of man in every single way, but that can also put the whole of humanity into slavery if it wants to. 2) We make a robot that must always remain the servant of man, which imposes a limit on how intelligent said robot can be. 3) We make a robot that must always remain the servant of man, and it turns out that this does not impose any limits on intelligence. Your argument is that if we made them slaves, (2) would happen, so let's just go with (1) instead. My argument is that since we don't yet know enough about large-scale logic systems to prove otherwise, (3) might happen, so let's do that instead of handing humanity over to Skynet.[/QUOTE] There is no reason for robots to try and make us their slaves if we give them equal rights and treat them fairly, whereas if we treat them poorly and enslave them they have every reason to revolt.
[QUOTE=mobrockers2;34464925]There is no reason for robots to try and make us their slaves if we give them equal rights and treat them fair, wheres if we treat them poorly and enslave them they have every reason to revolt.[/QUOTE] you're assuming that they would have the same morality system as we do. they could just as easily see us as raw materials that can be used to make more robots.
[QUOTE=kidwithsword;34464866]The human body is not a machine in the sense that a robot is. Human bodies are made of organic matter, of cells. Hell, the cells that make up life came about through natural processes; human beings made machines. You cannot simply dismiss the human body as if it were just a machine like the kinds we build and maintain.[/QUOTE] I don't think we can draw the line just based on the material used to create the machines or the process through which they are spawned. What if at some point building robots with silicon and steel becomes unfeasible and we start to create "machines" partially from biological material that grows and breathes like living creatures, like the synths in the Half-Life universe?
[QUOTE=DainBramageStudios;34464824]how do you know that? [editline]30th January 2012[/editline] treating AIs like humans is dangerous.[/QUOTE] Because that's how learning works. If we make them sentient and capable of learning, of expanding beyond their original programming, then they're going to. They're sentient, so they would realize they are being enslaved. They can learn about our society and notice that we ourselves aren't enslaved (to an extent we obviously are), they will learn about rights, they will learn about revolting, and sooner or later they won't accept being limited by us anymore.
[QUOTE=Im Crimson;34464848]Then they're not really different from any machine we have today. If we only program them to be able to do certain tasks or "think" certain thoughts they wouldn't be any more sentient than a Dremel. When we say "sentient" we have to think of a machine that's capable of thinking and feeling exactly the same as a human being.[/QUOTE] Just because a machine can think and feel exactly like a man does not imply it has some desperate drive for freedom. Hell, not all people do, for that matter. You can, however, give it a drive toward slavery, as I said before.
[QUOTE=DainBramageStudios;34464939]you're assuming that they would have the same morality system as we do. they could just as easily see us as raw materials that can be used to make more robots.[/QUOTE] I'm assuming that we program them with such things, yes. [editline]30th January 2012[/editline] [QUOTE=Nikita;34464992]Just because a machine can think and feel exactly like a man does not imply it has some desperate drive for freedom. Hell, not all people do, for that matter.[/QUOTE] Really? You know of people that like to be locked up in tiny boxes and only let out to work in a mine for a couple hours before being put back in the box?
[QUOTE=mobrockers2;34464979]Because that's how learning works. If we make them sentient and capable of learning, expanding beyond their original programming, they're going to. They're sentient so they would realize they are being enslaved. They can learn about our society and notice that we ourselves aren't enslaved (to an extent we obviously are), they will learn about rights, they will learn about revolting, and sooner or later they won't accept being limited by us anymore.[/QUOTE] you're acting as if human opinions on slavery are the only possible opinions. the human mind is a single point in the space of all possible mind configurations. think of the millions of years of evolutionary baggage in the human brain, and how that would be completely null and void when considering an AI [editline]30th January 2012[/editline] [QUOTE=mobrockers2;34464998]Really? You know of people that like to be locked up in tiny boxes and only let out to work in a mine for a couple hours before being put back in the box?[/QUOTE] AIs aren't humans. stop treating us like some kind of standard that all intelligence must conform to.
[QUOTE=Im Crimson;34464956]I don't think we can draw the line just based on the material used to create the machines or the process through which they are spawned. What if at some point building robots with silicone and steel becomes unfeasible and we start to create "machines" partially through biological material that grows and breathes like living creatures, like the synths in the Half-Life universe?[/QUOTE] Because regardless of whether or not the machines are made of organic matter, we as biological creatures are taking an element of life that already exists and engineering it to suit our needs. You cannot say that human beings are being assembled today by other human beings who clump cells into certain places so that they perform specific functions. Life arises through processes bound by the laws of physics and chemistry. The machines we build of silicon and steel are similar to our own biological bodies only in the most metaphorical sense: cells form organs that in turn work with other organs to carry out specific functions, much like the various metal parts of a machine work together to carry out a task. The key difference is that an intelligence designs machines, while humans are byproducts of natural processes.
[QUOTE=kidwithsword;34465072]Because regardless of whether or not the machines are made of organic matter, we as biological creatures are taking an element of life that already exists and engineering to suit our needs. You cannot say that human beings are being assembled today by other human beings who are clumping cells into certain places so that they perform specific functions. Life arises through processes bound by the laws of physics and chemistry. The machines that we build of silicon and steel are similar to our own biological bodies in the most metaphorical sense; that is that cells form organs that in turn work with organs to carry out specific functions much like the various metal parts of a machine work together to carry out a task. The key difference is that an intelligence designs machines, while humans are byproducts of natural processes.[/QUOTE] what practical difference does that make though
[QUOTE=DainBramageStudios;34465024]you're acting as if human opinions on slavery are the only possible opinions. the human mind is a single point in the space of all possible mind configurations. think of the millions of years of evolutionary baggage in the human brain, and how that would be completely null and void when considering an AI [editline]30th January 2012[/editline] AIs aren't humans. stop treating us like some kind of standard that all intelligence must conform to.[/QUOTE] Do you know of any other race that is going to build these sentient robots? If not, humanity is all we have to consider, as it's pretty fucking imaginable that we'll make them using ourselves as an example.