• Should an advanced Artificial Intelligence be given the same rights as a human being?
    93 replies
If we made an advanced AI, we would still need to "program" it to do what we want. You could make it totally unable to have emotions.
[QUOTE=Tobba;32427176]If we made an advanced AI, we would still need to "program" it to do what we want. You could make it totally unable to have emotions.[/QUOTE] What if I actually wanted it to have particular emotions? What if it came to the conclusion that it needs particular emotions, and programmed them itself?
I can see it coming: AIs get the same rights as humans, then take over the world, Matrix style :v:
[QUOTE=DrogenViech;32427509]I can see it coming: AIs get the same rights as humans, then take over the world, Matrix style :v:[/QUOTE] A bit off topic, but why do people always assume that AIs will have an inherent desire for absolute control over people?
I'm going to make a more detailed post on this later, but I don't think robots should ever get human rights; in fact, they shouldn't even be given emotion in the first place. Once rights and emotion are developed, maybe even free will, robots would be on equal ground with humans, yet could be stronger, faster, smarter, etc. What if robots eventually became advanced enough to hold positions of power and kept fighting for their own rights? I seriously doubt we could ever handle robots in power; it's far too soon for that. We simply can't let robots have any power over a human, especially with free thought as well. Imagine how chaotic it would be if robots could develop better robots that easily out-do a human. When I talk about robots, of course, I mean robots as tall as humans, with human appearance and even cognitive thinking. Though, maybe I'm just paranoid. :tinfoil: [QUOTE=Paramud;32427583]A bit off topic, but why do people always assume that AIs will have an inherent desire for absolute control over people?[/QUOTE] Only if they get advanced to the point of becoming free thinkers with the ability to create ideas like a human: learn, change their programming, adapt, etc. Otherwise, if they aren't free-willed, it's just a mad scientist's wet dream and nothing more.
The thing is, how do you even give them real emotion? I suppose they could have variables that state what their current emotion is, and this could have some kind of effect on their actions. But the AI wouldn't even know that these were emotions; it wouldn't "feel" anything. Emotions come from chemicals and the impulses those chemicals create, along with reactions to outside stimuli. I don't think we understand the brain well enough to create an AI cyborg. I really think creating sentient AI is way beyond our capacity at the moment and will be for a long time (longer than any of us will live, at least).
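The "variables that state what their current emotion is" idea can be put in concrete terms with a toy sketch. Everything below (the ToyAgent class, the mood scale, the thresholds) is hypothetical and purely illustrative; it shows behaviour driven by a stored number, with nothing actually being "felt".

```python
# Toy sketch of the "emotion as a variable" idea. All names and
# thresholds here are invented; this only models behaviour influenced
# by a stored value, not anything the agent experiences.

class ToyAgent:
    def __init__(self):
        self.mood = 0.0  # -1.0 (distressed) .. 1.0 (content)

    def observe(self, event_valence: float) -> None:
        """Nudge the stored mood toward the valence of an outside event."""
        self.mood = max(-1.0, min(1.0, self.mood + 0.5 * event_valence))

    def choose_action(self) -> str:
        """Action selection just reads the variable; nothing is 'felt'."""
        if self.mood < -0.5:
            return "withdraw"
        if self.mood > 0.5:
            return "cooperate"
        return "idle"

agent = ToyAgent()
agent.observe(-1.0)  # two negative stimuli push mood down to -1.0
agent.observe(-1.0)
print(agent.choose_action())  # -> withdraw
```

The point of the sketch is exactly the poster's: the variable changes the output, but there is no inner experience anywhere in the loop.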
[QUOTE=The DooD;32427801]The thing is, how do you even give them real emotion? I suppose they could have variables that state what their current emotion is, and this could have some kind of effect on their actions. But the AI wouldn't even know that these were emotions; it wouldn't "feel" anything. Emotions come from chemicals and the impulses those chemicals create, along with reactions to outside stimuli. I don't think we understand the brain well enough to create an AI cyborg. I really think creating sentient AI is way beyond our capacity at the moment and will be for a long time (longer than any of us will live, at least).[/QUOTE] Honestly, I think the real question is: to what extent CAN emotion be added? By the time we fully understand the brain, we could go crazy with robots and human intelligence, but will we ever even reach that point? How we react depends on just how advanced the robot is. [editline]22nd September 2011[/editline] I know that even if we simulate humans, so many will oppose it that the robots will be treated like machines, even while acting like humans.
I still don't understand the argument that emotion should warrant rights. Do emotionless, cold people deserve no rights? [editline]22nd September 2011[/editline] It's not like emotions are anything to strive for. Half of human emotions are leftovers of what kept us alive before intellect evolved, and it's mostly shit that's holding us back as a society.
[QUOTE=Awesomecaek;32427886]I still don't understand the argument that emotion should warrant rights. Do emotionless, cold people deserve no rights?[/QUOTE] Emotionless would very likely mean that they essentially have no real will, considering they're robots. If they have emotion, giving them no rights would hurt them terribly, but if they have no emotion, they wouldn't react with emotion anyway.
[QUOTE=J!NX;32427911]Emotionless would very likely mean that they essentially have no real will, considering they're robots. If they have emotion, giving them no rights would hurt them terribly, but if they have no emotion, they wouldn't react with emotion anyway.[/QUOTE] So you are saying that rights should only be awarded to entities who would react badly if they didn't have them? How about broken, subdued slaves without the will to fight? Are those okay?
I say yes. Firstly, because all things capable of complex thought or emotion deserve some sort of rights, and I believe that anything with intelligence at or near the human level deserves equality. The second reason is because, well.. [media]http://www.youtube.com/watch?v=qHD--2uKM4E&feature=related[/media] [media]http://www.youtube.com/watch?v=O2oVQto99So&feature=related[/media]
Computers work nothing like the brains of living organisms. All the emotions and such in AI systems are purely simulated; it would be near impossible for them to actually be properly sentient and have consciousness. Besides, it's not like all living organisms feel emotions anyway. Even creatures as complex as reptiles only have really simple emotions, and even those are just instincts, so they don't actually feel them and reflect on them the same way we do. So I strongly disagree.
[QUOTE=Awesomecaek;32427952]So you are saying that rights should only be awarded to entities who would react badly if they didn't have them? How about broken, subdued slaves without the will to fight? Are those okay?[/QUOTE] A human and a robot are two completely different things. If they're emotionless, a defective robot could be easily fixed by simply repairing it, and it can easily be determined whether they're broken or not; they also don't favor any human over another because of emotional attachment, jealousy, or hate, and won't ever feel those. If they have emotion, free will, learning, etc., they'd have morals and choose some humans over others. A human is far harder to read than any robot would be. You can't tell whether it wasn't their fault, or whether they had bad intent or good intent. Of course, programming robots to never pick favorites or get too close changes things; it'd just be very simple emotion.
You could create a robot, but it wouldn't be able to ponder the meaning of life or anything like that. It would probably have an alarmingly utilitarian view of society, and of 'right' and 'wrong'.
[QUOTE=Awesomecaek;32427952]So you are saying that rights should only be awarded to entities who would react badly if they didn't have them? How about broken, subdued slaves without the will to fight? Are those okay?[/QUOTE] Well, a broken, subdued slave still suffers. If you start smashing an AI with a weapon, it feels nothing; it starts to break down and eventually shuts off. If you did the same thing to a living creature, it would feel pain, confusion, and regret (and probably a whole bunch of other distressing things). Robots also don't hold any emotional attachments to each other, and society would probably deem those who had emotional attachments to a robot weird. If they turn off, they can be restarted. If a large part of their circuitry is destroyed, it can be replaced. Assuming the robot had any important information stored on it, it would also have backups, so even if its storage unit were destroyed it could be replaced. Humans, on the other hand, go through their entire lives having a slightly (or very) different perspective on things that happen and experience everything differently. On top of this, we're fragile and can be ruined easily, with no chance of being "turned back on". Unless these advanced AIs we're discussing somehow worked completely differently from the way electronics work today, there really is no point in giving them rights.
Yes. As long as it looks nothing like a human (aka uncanny valley :gonk: ), and is able to prove that it is "conscious", it should have the same rights as any human being. It should, because any AI which can show "consciousness" is not very different from a human being (in the sense of mind), and also because it can be beneficial to society. If not human rights, then at least animal rights!
[QUOTE=RobL;32428050]Computers work nothing like the brains of living organisms. All the emotions and such in AI systems are purely simulated; it would be near impossible for them to actually be properly sentient and have actual consciousness. Besides, it's not like all living organisms feel emotions anyway. Even creatures as complex as reptiles only have really simple emotions, and even those are just instincts, so they don't actually feel them and reflect on them the same way we do. So I strongly disagree.[/QUOTE] The idea of giving AIs rights is just deluded sci-fi nonsense, imho. And even if we could create AI with actual (not simulated) emotions, sentience, and consciousness (which I reckon is impossible), with intelligence comparable to humans, such that it would have to be given rights, what would be the point in doing so? The only use I see for AI is comparable to slave labour (specialised tasks and such), and the ability to independently reason and reflect is useless for that. Maybe simulated reasoning and reflecting, but not actual sentient reasoning and reflecting governed by a consciousness. Anyway, we humans can do that perfectly well already.
Also, our brain is just a vastly powerful computer which can build and repair itself. It comes pre-programmed with what we call "instincts", and is organically grown. A robot with the same amount of intelligence as us would therefore have the same capability to be conscious as a human. The key is not programming it so much, i.e. leaving it to learn about the world like a human. Saying that the above robot cannot have human-ish rights is like saying a synthetically grown brain doesn't have the same rights as a normal one, even if it behaves in the same way.
[QUOTE=Eltro102;32428305]Also, our brain is just a vastly powerful computer which can build and repair itself. It comes pre-programmed with what we call "instincts", and is organically grown. A robot with the same amount of intelligence as us would therefore have the same capability to be conscious as a human. The key is not programming it so much, i.e. leaving it to learn about the world like a human. Saying that the above robot cannot have human-ish rights is like saying a synthetically grown brain doesn't have the same rights as a normal one, even if it behaves in the same way.[/QUOTE] A synthetically grown brain is still a brain; it's just developed in different circumstances. People just don't seem to get that computers and brains work in totally different ways. It would be impossible to programme an AI consciousness identical to a human's, as it would require a near infinite amount of code to cover every possible state that the 'brain' could be in, and that is the only way to avoid a 'simulated consciousness'. Hope I'm making some actual sense here.
[QUOTE=Awesomecaek;32426579] Once the robots come, the whole of society will have to dramatically change. It is already changing. There's one term heard in the news a lot lately: "jobs". Do you know what is taking all the jobs? Automation - the vanguard of robots which, like a wave, will change history dramatically. If we don't give robots their rights, they will just take them. Not because it's fair, or moral, or because gods entitle them, or because they deserve it, or because WE deserve it. But because they will simply take whatever they need, and nobody will be able to stop them.[/QUOTE] That was one of the things my story considered: in order to preserve human jobs, AI would have to be limited in either the kinds of jobs they could do or their capacity to do them. Suppose you had an AI that managed a fast-food restaurant; with enough equipment you could potentially automate the entire operation and eliminate the need for human workers. AIs also don't have to be paid; unless the upkeep for the AI outweighs the cost of paying workers and the benefits gained from its efficiency, the logical business choice would be to switch to AI, but that's hardly fair to all the people who need jobs. Shit, I wish I had kept a better record of that story...
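The break-even reasoning in the post above ("unless the upkeep for the AI outweighs the cost of paying workers...") is just a cost comparison. A minimal sketch, with every figure entirely invented for illustration:

```python
# Minimal sketch of the automation break-even comparison described in
# the post. All numbers below are hypothetical.

def automation_pays_off(annual_wages: float, ai_upkeep: float,
                        efficiency_gain: float) -> bool:
    """True if replacing staff with an AI is the cheaper option.

    efficiency_gain: extra annual revenue attributed to the AI.
    """
    return ai_upkeep < annual_wages + efficiency_gain

# Hypothetical fast-food restaurant: five workers at $25k/yr each,
# versus $90k/yr of AI/equipment upkeep plus $10k of efficiency gains.
print(automation_pays_off(5 * 25_000, 90_000, 10_000))  # True: 90k < 135k
```

Under these made-up numbers the "logical business choice" the poster describes kicks in, which is exactly why they see it as a threat to human jobs.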
Roundup of my posts because I hate being ignored haha... [QUOTE=RobL;32428050]Computers work nothing like the brains of living organisms. All the emotions and such in AI systems are purely simulated; it would be near impossible for them to actually be properly sentient and have consciousness. Besides, it's not like all living organisms feel emotions anyway. Even creatures as complex as reptiles only have really simple emotions, and even those are just instincts, so they don't actually feel them and reflect on them the same way we do. So I strongly disagree.[/QUOTE] [QUOTE=RobL;32428253]The idea of giving AIs rights is just deluded sci-fi nonsense, imho. And even if we could create AI with actual (not simulated) emotions, sentience, and consciousness (which I reckon is impossible), with intelligence comparable to humans, such that it would have to be given rights, what would be the point in doing so? The only use I see for AI is comparable to slave labour (specialised tasks and such), and the ability to independently reason and reflect is useless for that. Maybe simulated reasoning and reflecting, but not actual sentient reasoning and reflecting governed by a consciousness. Anyway, we humans can do that perfectly well already.[/QUOTE] [QUOTE=RobL;32428401]A synthetically grown brain is still a brain; it's just developed in different circumstances. People just don't seem to get that computers and brains work in totally different ways. It would be impossible to programme an AI consciousness identical to a human's, as it would require a near infinite amount of code to cover every possible state that the 'brain' could be in, and that is the only way to avoid a 'simulated consciousness'. Hope I'm making some actual sense here.[/QUOTE]
I do agree with you, RobL. The problem I have with debating this subject is that we aren't even nearly close to developing sentient AI, so nearly everything that can be said about it is speculation, and a lot of people are just drawing on fictional works for inspiration. So, going back on myself, I'm going to do the exact same thing and say that I think if sentient AI were created, it would be something similar to the Geth in Mass Effect: it would want nothing to do with us and would just fly away to another planet and get on with its own things. AI is created with the idea of serving us; if we make it sentient, it isn't going to want to serve us for very long, so why would anyone want to make it sentient?
Let's give Cleverbot human rights and obligations :v: In my opinion, no. They're just machines, even if they have emotions, because those emotions were coded in by us. I don't care if a machine is as smart as me or has feelings like me; it's still not a natural human.
[QUOTE=Wnd;32426500]You realize the human body is just a machine too, with chemical receptors and stuff? Surely we won't program it to work on a home computer, if that's what you meant.[/QUOTE] But there's still a difference: consciousness and emotions, something our science cannot explain at the moment. And the true essence of consciousness is a question of philosophy.
[QUOTE=The DooD;32428216]Well, a broken, subdued slave still suffers. If you start smashing an AI with a weapon, it feels nothing; it starts to break down and eventually shuts off. If you did the same thing to a living creature, it would feel pain, confusion, and regret (and probably a whole bunch of other distressing things).[/quote] This is a naturally evolved reaction meant to ensure self-protection. It's logical to assume that at some point, both robots designed by man and robots designed by other robots would have their own equivalent of that. [quote] Robots also don't hold any emotional attachments to each other, and society would probably deem those who had emotional attachments to a robot weird.[/quote] Firstly, that shouldn't be a requirement for many rights, like the right of free speech, the right of ownership, et cetera. Secondly, I don't think that will necessarily be true. [quote] If they turn off, they can be restarted. If a large part of their circuitry is destroyed, it can be replaced. Assuming the robot had any important information stored on it, it would also have backups, so even if its storage unit were destroyed it could be replaced. Humans, on the other hand, go through their entire lives having a slightly (or very) different perspective on things that happen and experience everything differently. On top of this, we're fragile and can be ruined easily, with no chance of being "turned back on".[/quote] Well, right, a robot would have an advantage in that area and could "survive" more, but that doesn't change the fact that it should have the right to survive, in some way or form. For instance, in case of dire need, shutting a robot down might not be a violation of its rights, as long as it would be booted again. That's a question of the specific wording of the rights, though, not of their existence.
[quote] Unless these advanced AIs we're discussing somehow worked completely differently from the way electronics work today, there really is no point in giving them rights.[/QUOTE] I think that the point is that any intelligent being should have its rights; primarily the rights to keep existing and to have at least some degree of space for personal progress and for its future. [editline]22nd September 2011[/editline] [QUOTE=J!NX;32428138]A human and a robot are two completely different things. If they're emotionless, a defective robot could be easily fixed by simply repairing it, and it can easily be determined whether they're broken or not; they also don't favor any human over another because of emotional attachment, jealousy, or hate, and won't ever feel those. If they have emotion, free will, learning, etc., they'd have morals and choose some humans over others. A human is far harder to read than any robot would be. You can't tell whether it wasn't their fault, or whether they had bad intent or good intent.[/QUOTE] That mostly applies only to the "AI" we have today. Once an AI starts forming itself through the experiences it goes through (which is actually happening at some level in certain cases), it will become rather easy to lose track of why and how exactly it makes its decisions. By the way, this isn't something that should warrant rights either.
This is kind of the whole argument of Ghost in the Shell: what it means to be human. I personally believe that whatever considers itself to be human has to be human, if it's indistinguishable from humans. A dolphin could think it's human but never become one, but what about an artificial brain in an absolutely biologically correct shell? I suppose it's the AI's choice to consider itself part of us or part of a new race.
[QUOTE=Awesomecaek;32428854]This is a naturally evolved reaction meant to ensure self-protection. It's logical to assume that at some point, both robots designed by man and robots designed by other robots would have their own equivalent of that.[/quote] Why would they have an equivalent of that? Why would they even want an equivalent of that? Why would they want the distressing feeling of pain, when they could just wait out their destruction, then be reconstructed at a later date? [quote] Firstly, that shouldn't be a requirement for many rights, like the right of free speech, the right of ownership, et cetera. Secondly, I don't think that will necessarily be true. [/quote] I'm not saying that it should be a requirement, but it helps. We strive for animal rights because some people love animals. Not everyone loves all animals, but we still go for animal rights. [quote] Well, right, a robot would have an advantage in that area and could "survive" more, but that doesn't change the fact that it should have the right to survive, in some way or form. For instance, in case of dire need, shutting a robot down might not be a violation of its rights, as long as it would be booted again. That's a question of the specific wording of the rights, though, not of their existence. [/quote] Well, that's the thing: what's the point in it having rights if it can be turned on and off at will? Say there are two beings in danger and only one of them can be saved. One is a human, one is a sentient robot. The human has more of a right to be saved than the robot, because in the event that the robot is destroyed, it can be repaired and back online in no time. Either that, or it could be replaced. No one would be any the wiser, because it would be the exact same thing. [quote] I think that the point is that any intelligent being should have its rights; primarily the rights to keep existing and to have at least some degree of space for personal progress and for its future.[/QUOTE] I wouldn't exactly say a lot of animals are intelligent, but I still think they have a right to live. And there are a bunch of intelligent programs running on my PC right now that I couldn't care less about, because if they die, they can be fixed and restarted.
[QUOTE=The DooD;32429278]Why would they have an equivalent of that? Why would they even want an equivalent of that? Why would they want the distressing feeling of pain, when they could just wait out their destruction, then be reconstructed at a later date?[/quote] To preserve resources, to preserve time, and to prevent unnecessary cases of some robot truly dying because rebuilding it was impossible. There are plenty of reasons to implement a negative reaction towards physical pain. [quote] I'm not saying that it should be a requirement, but it helps. We strive for animal rights because some people love animals. Not everyone loves all animals, but we still go for animal rights. [/quote] Giving rights because of liking seems incredibly shallow to me. People might as well assign rights to their cars and their waifu pillows. Personal preference doesn't matter at all. [quote] Well, that's the thing: what's the point in it having rights if it can be turned on and off at will? Say there are two beings in danger and only one of them can be saved. One is a human, one is a sentient robot. The human has more of a right to be saved than the robot, because in the event that the robot is destroyed, it can be repaired and back online in no time. Either that, or it could be replaced. No one would be any the wiser, because it would be the exact same thing. [/quote] Firstly, the right to life isn't the only right, by far. Secondly, choosing a human over a robot in a save-one situation doesn't mean that the robot shouldn't have the right to seek help if saving it didn't mean sacrificing someone else. [quote] I wouldn't exactly say a lot of animals are intelligent, but I still think they have a right to live. And there are a bunch of intelligent programs running on my PC right now that I couldn't care less about, because if they die, they can be fixed and restarted.[/QUOTE] You are talking about a completely different kind of intelligence. What's more important is self-awareness.
If it gets to the point where we can simulate a human personality, then yes, without a doubt. The last thing I want is to end up living in Neo Arcadia. [sp] anyone recognize my avatar? Oh the irony[/sp]
[QUOTE=asteroidrules;32429528]If it gets to the point where we can simulate a human personality, then yes, without a doubt. The last thing I want is to end up living in Neo Arcadia. [sp] anyone recognize my avatar? Oh the irony[/sp][/QUOTE] Keyword there: 'simulate'. Why would you need to give something with a simulated consciousness rights?