• Should a self-aware/sentient AI have rights?
[QUOTE=matsta;34487953]Rights are supposed to be founded on moral principles.[/QUOTE] So?
I have a very strange view of things when it comes to Artificial Intelligence. If a form of artificial intelligence displays traits associated with semi-sentience or full sentience, then I say let them have rights. Besides, we don't know what self-aware AI will be like in the future; we don't even know if they will have malevolent or benevolent attitudes towards us humans. One thing we also have to consider is what we humans will be like in the future, as it may impact their views of us.
[QUOTE=matsta;34485623]You should also give two tugs of a dead dog's cock about rights in a moral sense. If you are talking about rights in a 'practical' sense then you're not talking about rights. If you think morally, we should do things for a reason. If we do have a purpose (if we were designed in a kind of 'masterplan' instead of being just an accident) what we SHOULD do is to fulfil that purpose.[/QUOTE] purpose is an entirely artificial concept, existing only in minds. there aren't little particles with "purpose" charge on them, interacting through some kind of "purpose" field. [editline]1st February 2012[/editline] [QUOTE=matsta;34487953]Rights are supposed to be founded on moral principles.[/QUOTE] they aren't "supposed" to do anything. they're a human invention, we can ascribe whatever purpose to them as we desire.
[QUOTE=mobrockers2;34481248]Do you not understand what hypothetically means?[/QUOTE] If you mean pointless babble mostly, yes.
Anything sentient deserves rights. If a computer is made that is sophisticated enough to qualify as being "alive" in the sense that it thinks by its own accord and has feelings (not necessarily in the same way as humans), it deserves to be treated with respect. After all, as humans, we are nothing more than complicated biological machines; our brains are merely vastly complex biological circuits.
The mighty question of "is this thing showing complex self-awareness, that is not us, allowed to mingle within our established base of human rights?" is being boiled down. The only reason this might be objected to in any way is because, unlike another biological creature meeting that same criteria (say we gave an alien these rights), the AI is a hand-made being existing without any doubt of where it came from, which we cannot relate to ourselves. The boiled-down question remaining for the opposing party, I would think, is "It has no soul, does it matter if it has a choice?" We are used to giving animals this excuse: they don't have the same sentience as us, so we decided they have no soul, which we are projecting onto this new thing that we have made. I am fairly confident in this assumption, despite not standing in that position.
[QUOTE=Gekkosan;34511022]If you mean pointless babble mostly, yes.[/QUOTE] If you don't want to contribute to the debate, don't come in and post. In my opinion, if an AI has been programmed to think exactly like a human, then it should be considered a person and given the rights a person deserves. [editline]5th February 2012[/editline] [QUOTE=matsta;34487953]Rights are supposed to be founded on moral principles.[/QUOTE] What moral principles restrict a new form of life receiving rights?
Sentient AI is a pretty retarded idea on its own.
[QUOTE=DainBramageStudios;34495082]purpose is an entirely artificial concept, existing only in minds. there aren't little particles with "purpose" charge on them, interacting through some kind of "purpose" field. [editline]1st February 2012[/editline] they aren't "supposed" to do anything. they're a human invention, we can ascribe whatever purpose to them as we desire.[/QUOTE] I don't get why it's wrong that rights or purposes are concepts. And, of course, if you don't have moral principles nothing is "supposed to do anything". But the question of the thread is "SHOULD a self-aware/sentient AI have rights". [editline]6th February 2012[/editline] Following your criteria, sentience would also be an entirely artificial concept. You won't find 'sentience' particles either. Actually, you won't find anything regarding 'sentience' directly out there.
Why should they have rights? They were created by us, and therefore serve us. They are not living. Also, because they are programmed, their "thoughts" could be altered somewhat easily, making them not really a sentient being.
[QUOTE=GetOutOfBox;34522726]Anything sentient deserves rights. If a computer is made that is sophisticated enough to qualify as being "alive" in the sense that it thinks by its own accord and has feelings (not necessarily in the same way as humans), it deserves to be treated with respect. After all, as humans, we are nothing more than complicated biological machines; our brains are merely vastly complex biological circuits.[/QUOTE] 'Sentient AI' doesn't mean 'living AI'. Also, that a machine is sentient doesn't mean it has feelings. The discussion here is about sentient machines, but not necessarily 'living' machines.
This is one of those questions where we will never know until it happens.
[QUOTE=squids_eye;34556530]What moral principles restrict a new form of life receiving rights?[/QUOTE] What moral principle says that anything sentient deserves rights? Also, as I said before, we're not talking about something 'living' here.
If an AI is fully self-aware, it has life. If it knows it's there for scientific/whatever reasons, it would have minimal rights, like the ability to have conversations and so on. If it fully believes it is a human, it should have human rights, as you have basically created life and should treat it as a living person. Though that's just my opinion. Also, never teach them how to use guns; that right is nulled as soon as they are given network control.
[QUOTE=Crimor;34461489]This is of course a theoretical question since there hasn't been one yet, but I feel it should have just as many rights as a human, seeing as we are quite similar, having a relationship between mind and brain is similar (if not identical) to the relationship between a running program and a computer. My main fear about this though, is that religious people will most likely try to destroy it due to it being "blasphemy" If you haven't, you should really watch the first ghost in the shell movie, it tries to handle this specific question.[/QUOTE] Cleverbot is blasphemy [IMG]http://www.topnews.in/files/pope_3.jpg[/IMG]
[QUOTE=matsta;34580679]Following your criteria, sentience would also be an entirely artificial concept. You won't find 'sentience' particles either. Actually, you won't find anything regarding 'sentience' directly out there.[/QUOTE] yep [editline]7th February 2012[/editline] [QUOTE=The Kakistocrat;34580765]Also, because they are programmed, their "thoughts" could be altered somewhat easily, making them not really a sentient being.[/QUOTE] our thoughts can be altered easily
Before we determine whether or not an AI deserves rights, we need to figure out if it's actually conscious. As far as I know, scientists only have theories on what consciousness/self-awareness actually is, so we can't know for sure the difference between a truly self-aware being and a very clever, very convincing, yet not actually conscious machine.
[QUOTE=sgtshock;34594350]we can't know for sure the difference between a truly self-aware being and a very clever, very convincing, yet not actually conscious machine.[/QUOTE] you need to demonstrate that there actually is a difference first
If we ever make very complex AIs that are self aware and have qualia/experiences, they would have to have rights. If we had humanoid robots walking around that were not aware, there [i]would[/i] be some complicated laws regarding how to treat them properly, since they are someone's property, and that needs to be respected.
[QUOTE=DainBramageStudios;34590144]yep[/QUOTE] Oh. I get it. You are one of those guys who believes their entire life is an illusion. What do you even have to say in a debate like this? [editline]7th February 2012[/editline] [QUOTE=Splarg!;34595385]If we ever make very complex AIs that are self aware and have qualia/experiences, they would have to have rights. If we had humanoid robots walking around that were not aware, there [i]would[/i] be some complicated laws regarding how to treat them properly, since they are someone's property, and that needs to be respected.[/QUOTE] Yes, but how would you know if the AI has qualia or not? If we think about the development of a sentient machine, there would be a moment when we seriously would not know if the machine is sentient or not. And if we did know that the machine experiences qualia, we wouldn't know how the machine feels if it isn't human. How do you attempt to know a particular feeling you have never felt? Rights can't be made based on completely uncertain suppositions.
[QUOTE=matsta;34598386]Yes, but how would you know if the AI has qualia or not? If we think on the development of a sentient machine, there would be a moment when we seriously would not know if the machine is sentient or not. And if we did know that the machine experiences qualia we wouldn't know how does the machine feel if it isn't human. How do you attempt to know a particular feeling you have never felt? Rights can't be made based on completely uncertain suppositions.[/QUOTE] I didn't say if or how we'd be able to, but that seems like where you'd draw the line. Also, I'm not quite sure what you're saying in the second bit.
[QUOTE=Splarg!;34599293]I didn't say if or how we'd be able to, but that seems like where you'd draw the line. Also, I'm not quite sure what you're saying in the second bit.[/QUOTE] I'm saying that we can't know how some A.I. robot experiences qualia ("what do they feel") if its brain and body structure is different from ours.
If something is capable of begging for its life, not for the sake of functionality, but for the sake of being alive, then I consider it sentient and to be protected similar to humans.
[QUOTE=ShadowSocks8;34599469]If something is capable of begging for its life, not for the sake of functionality, but for the sake of being alive, then I consider it sentient and to be protected similar to humans.[/QUOTE] Then a whole lot of living beings would have to be protected in a similar way to how a human is, yet they aren't, for a reason. This isn't going anywhere. Most of the arguments I see here presuppose that if something is sentient then it deserves rights, but give little definition of what they understand as sentience. One could hardly think of some animals as not being sentient, yet we do not treat them as humans. And, even if they interact with society in many ways and have rights (in some countries), they are seen as 'something a human could use for their purposes' and not seen as an end. I already argued about why I think human rights exist. There IS a difference between humans and other sentient beings (until now): humans are purposeless. That means that humans don't act one way or another or define themselves (just) according to a pre-established nature; they don't "have to". Like every other living being, they are born, they reproduce, they eat, they (normally) preserve themselves, etc., but they are not defined by that (as every other living being, including sentient ones, is).
[QUOTE=matsta;34598386]Oh. I get it. You are one of those guys who believes their entire life is an illusion. What do you even have to say in a debate like this?[/QUOTE] no no no I meant that consciousness isn't a fundamental property, not that it doesn't exist.
you have no idea how hard it is not to pepper this thread with HAL 9000 dialogue clips. Personally, assuming that they [I]do[/I] have what we would define as full or partial sentience, then yes, artificial intelligences should be afforded rights to some degree/the same degree as humans depending on the situation, but then again... [video=youtube;7qnd-hdmgfk]http://www.youtube.com/watch?v=7qnd-hdmgfk[/video] it's a complicated issue I doubt we'll be able to answer until the advent of what we're talking about.
[QUOTE=DainBramageStudios;34605375]no no no I meant that consciousness isn't a fundamental property, not that it doesn't exist.[/QUOTE] A fundamental property of what?
[QUOTE=matsta;34613235]A fundamental property of what?[/QUOTE] the Universe; e.g. quarks are fundamental things while atoms are not
It really depends on what level of intelligence they have. You could give the two smartest people on the planet more rights than your average slobby teen and have no problems. I mean, would you let the fry cook from Burger King run the mission to Mars?
[QUOTE=mustachio;34622278]It really depends on what level of intelligence they have. You could give the two smartest people on the planet more rights than your average slobby teen and have no problems. I mean, would you let the fry cook from Burger King run the mission to Mars?[/QUOTE] A fry cook is perfect for a mission to Mars, because he'd die a fiery death that tells the smart people what to do better.