Should an advanced Artificial Intelligence be given the same rights as a human being?
93 replies
[img]http://img820.imageshack.us/img820/8423/advncedai.png[/img]
I brought this up on Facebook, and thought it was a good topic for the new forum. I'm green
[highlight](User was banned for this post ("Starting a debate thread with a Facebook chatlog is a terrible idea" - Craptasket))[/highlight]
I would love the day when machines are accepted as fellow human beings.
Think of the Terminators and the Machines from the Matrix.
Imagine the devastating ideas. Although if machines were smart... wouldn't it be a precaution to make the machine bodies a little weaker than a human's? So at least they couldn't kill a human so fast.
I don't believe Artificial Intelligence should be created in the first place.
[QUOTE=Soviet Bread;32422428]I don't believe Artificial Intelligence should be created in the first place.[/QUOTE]
I don't believe that Justin Bieber should ever have been born. Doesn't erase his sorry ass from existence.
[QUOTE=Mr. Scorpio;32422461]I don't believe that Justin Bieber should ever have been born. Doesn't erase his sorry ass from existence.[/QUOTE]
Advanced AI worthy of equal rights doesn't exist yet. I don't get your point.
Annnnd, it's kind of cruel to wish someone was never born. Even if you don't like their music or their being a corporate puppet.
The term "artificial intelligence" implies the machine is self-conscious and capable of making decisions.
So why not? Also, denying it and its fellow robots the rights that other beings have could result in machine rights movements, rebellions, and maybe wars.
[QUOTE=Soviet Bread;32422468]Advanced AI worthy of equal rights doesn't exist yet. I don't get your point.
Annnnd, it's kind of cruel to wish someone was never born. Even if you don't like their music or their being a corporate puppet.[/QUOTE]I think he means it's going to happen anyway.
I believe that an AI as intelligent and responsible as a human, or more so, should have the same rights and privileges as we do, with some changes, since they wouldn't have the same needs as us.
[editline]22nd September 2011[/editline]
Also, if you're talking about computers, "electronic sentience" is a more accurate term; an engineered organism is artificial too and could be sentient.
Well, yes, they should. If they can reason, imagine, and think like a man (or like most men, at least), then they deserve the same rights.
An update:
[quote]Red Ohh shut up... hahaha but technically they do. hahaha The difference being that AI is not an organism as such.
30 minutes ago · Like
Green Since when did parents have the right to do what they like with you? An organism is just a complex machine
29 minutes ago · Like
Red To be classified as an organism, it must be able to reproduce, respire, consume (basically eat), excrete (usually in some way or another) and move (only some though). Only then can you consider human rights for an AI. Think of this: we kill plants and animals, which don't have the right to protest their death, etc.
23 minutes ago · Like
Blue u bring up a strong point
17 minutes ago · Like
Green But a plant is not sentient; an advanced AI is. The ability to have sex doesn't define your rights.
An AI can do those things anyway: it reproduces through copying and eats electricity, though it doesn't excrete
17 minutes ago · Like
Green I'm not sure shitting defines your rights
13 minutes ago · Like
Red Electricity isn't physical matter. It's energy. Copying what? Can it reproduce machines like itself without the use of other, majorly different machines, and through growth, etc.? No, shitting doesn't, but it defines that you are a living organism, which is what defines your rights.
12 minutes ago · Like
Green E=mc^2; energy is matter.
9 minutes ago · Like
Green Still, I don't think your definition of life is correct
8 minutes ago · Like
Red Search it up. It is. Anyway.. is this for an assignment or something?
8 minutes ago · Like
Green Nope, my personal curiosity
7 minutes ago · Like · 1 person
Green "Life (cf. biota) is a characteristic that distinguishes objects that have signaling and self-sustaining processes"
7 minutes ago · Like
Red Yeah, now search what makes something a living organism.
6 minutes ago · Like
Green I just did: "Life (cf. biota) is a characteristic that distinguishes objects that have signaling and self-sustaining processes"
6 minutes ago · Like
Green Another idea; if a human mind was uploaded onto a computer, are they still human and do they still deserve human rights?
5 minutes ago · Like
Red Yeah but that's only a definition. Search up the characteristics that make it a living organism. HAHA that's a good one. Umm if the person dies but you have their memory electronically... then no. If they are still alive.. you should ask the person who holds the original brain.
4 minutes ago · Like
Green So a body defines your rights?
about a minute ago · Like
Green How about an advanced AI uploaded into a braindead human
about a minute ago · Like
Red Everything defines your rights: body, soul and mind. If the AI is capable of emotion and everything else, and adapts to a human body like you said... i guess you need to grant it human rights.[/quote]
I think that it's interesting that a braindead human has more rights than the most advanced AI
Personally I see it as a moot issue, because we're hard pressed to make AIs that imitate even the smallest parts of us, let alone one that does it all by itself. They're likely never to gain intelligence beyond humans; they might be faster and more reliable, but intelligence is about more than that.
If one were to become intelligent, though, I would give it all the rights it deserves to make things fair, rather than copy-pasting our rights. I certainly wouldn't try to restrict it from killing me, lest everyone's Frankenstein complex become a reality, but I doubt I would give it the right to life and the pursuit of happiness, since those concepts are completely different for an AI.
I think an AI should be given [I]more[/I] rights than humans.
Humans are imperfect. However, AIs are not constrained by physical or genetic imperfections. Since I believe psychology stems from experiences, which are ultimately dictated by differences in ability, I propose that the removal of physical constraints effectively creates a "pure" point of view. A point of view that we shouldn't taint by imposing sub-par citizenship on it, or else it might bite us in the ass.
AI should be given rights just as much as teaching a giant metal robot anger.
No wait.
Implying that humans have magical "souls" and that we're something more than organic machines is just completely fucking stupid.
[QUOTE=SuppliesAttack;32422580]I think an AI should be given [I]more[/I] rights than humans.
Humans are imperfect. However, AIs are not constrained by physical or genetic imperfections. Since I believe psychology stems from experiences, which are ultimately dictated by differences in ability, I propose that the removal of physical constraints effectively creates a "pure" point of view. A point of view that we shouldn't taint by imposing sub-par citizenship on it, or else it might bite us in the ass.[/QUOTE]
I agree. While we're at it we should rid ourselves of our own imperfections through augmentations.
[QUOTE=SuppliesAttack;32422580]I think an AI should be given [I]more[/I] rights than humans.
Humans are imperfect. However, AIs are not constrained by physical or genetic imperfections. Since I believe psychology stems from experiences, which are ultimately dictated by differences in ability, I propose that the removal of physical constraints effectively creates a "pure" point of view. A point of view that we shouldn't taint by imposing sub-par citizenship on it, or else it might bite us in the ass.[/QUOTE]
I don't understand your logic; are you saying that since we're imperfect, AI should treat us like second-class citizens?
[editline]22nd September 2011[/editline]
[QUOTE=DeEz;32422635]Implying that humans have magical "souls" and that we're something more than organic machines is just completely fucking stupid.
I agree. While we're at it we should rid ourselves of our own imperfections through augmentations.[/QUOTE]
At what point do we stop 'augmenting' ourselves? Anyone who supports needless augmentation is pretty insecure about their own humanity and/or played a little too much Deus Ex.
[QUOTE=Soviet Bread;32422802]I don't understand your logic; are you saying that since we're imperfect, AI should treat us like second-class citizens?[/QUOTE]
No, integrate AI within our governments to make a system without the weaknesses humans bring to the table (racism, self-interest, greed, etc.).
So you want a dictatorship of beings that are more advanced than us "imperfect beings". This sounds like a misanthropic pipe dream a 12-year-old would come up with. Do you not understand the concept of representation? If an AI has no ability to feel greed, racism or whatever, which is good, it's not exactly a being with free thought.
[editline]22nd September 2011[/editline]
Humans have the ability to show altruism and benevolence; I'd rather have that. I'd rather have someone like Jens Stoltenberg, a very altruistic person and a good leader, ruling over a country than some robot incapable of understanding emotions.
[QUOTE=Soviet Bread;32422868]So you want a dictatorship of beings that are more advanced than us "imperfect beings". This sounds like a misanthropic pipe dream a 12-year-old would come up with. Do you not understand the concept of representation? If an AI has no ability to feel greed, racism or whatever, which is good, it's not exactly a being with free thought.
[editline]22nd September 2011[/editline]
Humans have the ability to show altruism and benevolence; I'd rather have that. I'd rather have someone like Jens Stoltenberg, a very altruistic person and a good leader, ruling over a country than some robot incapable of understanding emotions.[/QUOTE]
Thank you for the insult. Glad to see we can have a civil debate.
Oh, I don't doubt that there are good people. However, it's inevitable that there will be people who prize their own desires and ambitions at the expense of others. Therefore, in order for the perfect society to exist, you must do one of two things:
1: You must eradicate those who would disturb the peace (unrealistic, brutal, and inefficient)
2: You must prevent them from seeking to do evil
However, is evil not an ambiguous concept? Who's to say what's right and wrong? Our perceptions of justice stem from our own personal experiences: this is why we humans disagree on matters. An AI isn't guided by such matters to the extent humans are, since our perceptions are defined by our experiences, which are defined through our inherent differences.
However, how do you define perfect? I believe that nobody, with all our inherent flaws and subconscious psychological needs (derived from our physical imperfections), is in a position to determine the fate of humanity. Therefore, I believe, in order to create the perfect government, we must be governed by a perfect leader. It does not matter whether this leader is an AI, a God, or anything, really.
This is why I believe we should put more trust in AI than other humans. AI doesn't have our flaws, therefore it can make pure decisions. By using a true, unbiased medium for institutions such as law and enforcement, our society benefits.
I believe that AI should have rights equal to humans', although that also means they can't be forced to do things a normal human would be repulsed to do, such as being literal servants for humanity. For servitude, use robots with intelligence less than a human's, around the brain power of an ape. So when we finally do become something non-human, and our consciousness merges into a global network of information where our bodies are left behind and beyond feeling, we can be safe knowing that robots won't take revenge for the pain and suffering we made them endure. Although an A.I. doesn't have to be a physical manifestation; we can create ones similar to GLaDOS, living entirely in an interconnected network of information and other A.I.s. We can make them the tech support guys.
[QUOTE=SuppliesAttack;32423006]However, how do you define perfect? I believe that nobody, with all our inherent flaws and subconscious psychological needs (derived from our physical imperfections), is in a position to determine the fate of humanity. Therefore, I believe, in order to create the perfect government, we must be governed by a perfect leader. It does not matter whether this leader is an AI, a God, or anything, really.[/QUOTE]
Wanna know why there's no perfect human? Because there's no such thing as perfect. Reality is subjective, and it is absolutely impossible to create something perfect for a goal as ambiguous as governance. There is also no absolute morality, barely even any good or evil that is universal; how can we expect to create anything that is perfect in any way?
This is not the role of an AI and it is not even what an AI is.
[QUOTE=SuppliesAttack;32423006]Thank you for the insult. Glad to see we can have a civil debate.
Oh, I don't doubt that there are good people. However, it's inevitable that there will be people who prize their own desires and ambitions at the expense of others. Therefore, in order for the perfect society to exist, you must do one of two things:
1: You must eradicate those who would disturb the peace (unrealistic, brutal, and inefficient)
2: You must prevent them from seeking to do evil
However, is evil not an ambiguous concept? Who's to say what's right and wrong? Our perceptions of justice stem from our own personal experiences: this is why we humans disagree on matters. An AI isn't guided by such matters to the extent humans are, since our perceptions are defined by our experiences, which are defined through our inherent differences.
However, how do you define perfect? I believe that nobody, with all our inherent flaws and subconscious psychological needs (derived from our physical imperfections), is in a position to determine the fate of humanity. Therefore, I believe, in order to create the perfect government, we must be governed by a perfect leader. It does not matter whether this leader is an AI, a God, or anything, really.
This is why I believe we should put more trust in AI than other humans. AI doesn't have our flaws, therefore it can make pure decisions. By using a true, unbiased medium for institutions such as law and enforcement, our society benefits.[/QUOTE]
I don't define perfect, because perfect isn't real. There IS no such thing as perfect, which basically makes your entire point about developing a 'perfect society' moot. You can make a good society, a great society, an amazing society, but not a perfect society.
You're completely removing free thought to regulate free thought; how the hell is this not authoritarian? Just because you tack "AI" onto it doesn't make it any less so.
AI without free thought sees things as black and white. There is no grey. Again, this is all a misanthropic pipe dream.
I subscribe to Iain M. Banks's views on AI, so a definitive yes from me.
[QUOTE=Soviet Bread;32422802]
At what point do we stop 'augmenting' ourselves?[/QUOTE]
If we become a race of machines we would then be in control of our own evolution. At that point I don't think it matters.
[QUOTE=Soviet Bread;32422802]
Anyone who supports needless augmentation is pretty insecure about their own humanity and/or played a little too much Deus Ex.[/QUOTE]
Probably. Or maybe I am just a greedy bastard who wouldn't mind an "upgrade" that makes me better (see, there's another human imperfection that could use some fixing).
Why the hell would you make an AI then give them the same rights? You'd have to give them emotion first, and that'd be super retarded to do
how about we let robots then take over hmm?
[QUOTE=J!NX;32424960]Why the hell would you make an AI then give them the same rights? You'd have to give them emotion first, and that'd be super retarded to do
how about we let robots then take over hmm?[/QUOTE]
What sound reasoning; I think you may be onto something
The thing is, we can only program them to imitate consciousness and self-awareness, and maybe give them an ability to learn, but that does not mean they're really self-aware, etc.
[QUOTE=Maucer;32424979]The thing is, we can only program them to imitate consciousness and self-awareness, and maybe give them an ability to learn, but that does not mean they're really self-aware, etc.[/QUOTE]
At what point does an imitation become the real thing? I mean, if it thinks and believes it is self-aware, and acts like it is self-aware, how in any way is that different from the real thing?
[QUOTE=Soviet Bread;32423064]You can make a good society, a great society, an amazing society, but not a perfect society.[/QUOTE]
All of those things are also subjective. An amazing society to one person isn't necessarily an amazing society to another person. It's alright though because in his amazing society the AI will "eradicate" those people.
Perfect society at last!
I think an advanced AI (if programmed to be human-like) would see itself as an elite in comparison to humans.
That's not a good thing.
I was actually thinking about posting this topic. I also think there was a discussion on it a long time ago.
The thing is, I think it really depends on what the AI thinks of itself/themselves. Assuming that they are fully sentient, capable of creating thoughts we hadn't programmed them to have, then we would have to know whether they realise that they were created to serve us and are wholly expendable.
Maybe an AI wouldn't even be created to serve us (although generally any current AI research is done to make our lives easier). But assuming some crazy evil scientist just wanted to create some self-replicating synthetic lifeforms, that are fully capable of sentient thought, then I think that not only should they have AI rights (basically human rights, but obviously they're not human), but we should probably try and stay separate from them and let them develop along their own path, or try and coerce them into creating their civilization on another planet that is uninhabitable to us, but not to them.
There is already enough conflict between humans of different skin colours; introducing a new race that's able to communicate with us would probably only lead to a lot of bloodshed.
Really though, this entire subject is hard to debate, since sentient AI doesn't exist and I'm not even sure that people have any idea of how they might go about developing it (please post links if there are any).
Current AI obviously isn't sentient, it's just complex software that can do jobs much faster than we can, or much more efficiently, or even jobs that we can't do at all, or a combination of those. That kind of AI doesn't need rights similar to us.
[editline]22nd September 2011[/editline]
I think a possibly more important question to ask is: if we could develop sentient AI, should we give them emotions? Emotions could make them irrational, and cause them to do things that don't benefit us or even hinder us. But at the same time, would they really be that sentient if they didn't have emotions?
[editline]22nd September 2011[/editline]
[QUOTE=J!NX;32424960]Why the hell would you make an AI then give them the same rights? You'd have to give them emotion first, and that'd be super retarded to do
[/QUOTE]
Bam.