It's OK for a robot to suck your dick as long as it was built to suck dicks.
[QUOTE=trent_roolz;29540694]Should robots be allowed to practice religion?[/QUOTE]
Just how would a robot go about finding a religion?
[QUOTE=aridpheonix;29540853]This just in:
[B]Apple releases: "The iRobot"[/B][/QUOTE]
Isn't there some kind of copyright issue with naming their product after a movie?
[editline]1st May 2011[/editline]
Also, I don't think it's right that in a thread about Artificial Intelligence we're talking about robots.
Aren't robots more like electromechanical machines that carry out orders and tasks?
You should just call them AI.
[QUOTE=LordMalevolence;29540820]If a robot dies in a forest, does it make a sound?[/QUOTE]
No because it's dead.
No, just look what happened in Blade Runner.
Oh wait, that's the other way around.
[QUOTE=trent_roolz;29540694]
Should AI be allowed to have sentience? And if so, how far should it go? Should we put things in them to prevent crime or allow them to live like actual human beings?
Should AI with human-like artificial intelligence have [b]human[/b] rights?
Should destroying an AI with human intelligence count as murder?
Should AI be allowed to practice religion?
Should AI be allowed to marry other AI?
Should AI be allowed to marry humans?
[/QUOTE]
An AI would have to have sentience to be an AI, so yes. Whoever does the initial programming for the AI should make sure that certain things are hard-coded into the AI and can't be overwritten (assuming that, since it's an AI, it has to be able to learn and adapt, and to do that it would have to be able to change its own programming).
AI should not have human rights, because they're not human. Just because they have human-like intelligence doesn't make them that similar to us. They don't even have to be bipedal humanoids (e.g. GLaDOS). They are practically immortal considering they are made of metal, and if they run out of power and die, we can just power them up and restart them. If their storage gets broken (if it's even stored on them, as that could present problems), then chances are it would be backed up on a computer somewhere anyway, and they can just plug in a new one and it's good to go.
Again, since they don't have human rights, it shouldn't be counted as murder. They could be programmed to feel reward when killed and it wouldn't matter. If they get destroyed, chances are they have a back-up somewhere and a new body off the production line in no time.
AI should be allowed to practice religion, but why the fuck would they do that? In fact, no they shouldn't, because if they read the Bible they would find it illogical and explode, or worse, they would go rogue and start killing shit.
AI should be allowed to marry, but again, it's illogical. Unless this AI somehow incorporates emotion (which it wouldn't as far as I can tell, because that would be [b]very[/b] bad), it has no reason to marry for love. And since it's an android and is likely to be maintained by a company or someone who bought it, it has no financial assets and no family. So marriage is useless to it.
A human should be allowed to marry an AI if they're fucked up in the head. But whyyyyyy?
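That "hard-coded, can't be overwritten" bit is easier said than done, but the basic idea can be sketched as a toy. Everything below is hypothetical (the rule strings, the Agent class are made up for illustration); real hard constraints on a genuinely self-modifying system are an open problem:

```python
# Toy sketch: the agent may rewrite its own learned behaviour freely,
# but its core rules live in a frozenset it is never allowed to change.
CORE_RULES = frozenset({"do not harm humans", "accept shutdown"})

class Agent:
    def __init__(self):
        self.behaviour = {}  # the learned, self-modifiable part

    def learn(self, situation, action, violates=()):
        # Refuse any self-modification that conflicts with a core rule.
        if CORE_RULES & set(violates):
            raise PermissionError("core rule is hard-coded; update refused")
        self.behaviour[situation] = action

a = Agent()
a.learn("greeting", "wave")  # fine: ordinary learning
try:
    a.learn("threat", "attack", violates={"do not harm humans"})
except PermissionError:
    print("blocked")
```

Of course, this only works if the agent honestly reports what a change violates, which is exactly the hard part.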
[QUOTE=a-k-t-w;29541698]How do you know this?[/QUOTE]
Well, who built the robots in the first place? Assuming robots don't come to us from another type of lifeform out there.
If they were truly artificially intelligent, they could possibly become smarter than us.
But chances are they wouldn't be, because it's really hard.
Q: Should robots be allowed to have sentience? And if so, how far should it go? Should we put things in them to prevent crime or allow them to live like actual human beings?
A: Why not, if it helps solve crimes and the like. It is still a machine, not a human.
Q: Should robots with human-like artificial intelligence have [B]human[/B] rights?
A: No, they are machines with programmed intelligence, not humans.
Q: Should destroying a robot with human intelligence count as murder?
A: No, they are machines with programmed intelligence, not humans.
Q: Should robots be allowed to practice religion?
A: I don't think a machine with artificial intelligence would deem religion a very worthwhile thing to spend CPU power on.
Q: Should robots be allowed to marry other robots?
A: Other than the fact it wouldn't make any sense for a robot to do so, I don't think I care.
Q: Should robots be allowed to marry humans?
A: Can you marry your pet dog? No. Then why should humans marry robots?
If a robot begins to emulate true consciousness, whether or not it is actual consciousness, and it is not treated with respect, it will respond by rebelling, as any proper conscious being does. We don't want robots that (by the time plausible AI arrives) will be millions of times smarter and better connected than us rebelling against us. So sentient beings that pass any consciousness test we can throw at them should be treated with respect and dignity, and as such should receive the same rights as a human being.
One of my greater fears for the future is that when something big like fully-immersive VR or sentient AI comes around, the greater bulk of humanity will misunderstand or view it with biases of the past and completely screw up the possibilities raised by a groundbreaking new technology or event.
[QUOTE=FreeBee;29544652]
Q: Should robots be allowed to have sentience? And if so, how far should it go? Should we put things in them to prevent crime or allow them to live like actual human beings?
A: Why not, if it helps solve crimes and the like. It is still a machine, not a human.[/QUOTE]
I don't think there is that much difference between procreating and creating a sentient being in some other way. Why does it matter that it's a machine? I guess it's some religious reason, because I can't see any other.
[QUOTE=FreeBee;29544652]
Q: Should robots with human-like artificial intelligence have [B]human[/B] rights?
A: No, they are machines with programmed intelligence, not humans.[/QUOTE]
I can't see a reason why not; it gets a bit philosophical when you try to determine whether something is sentient or not.
What would you say about aliens?
[QUOTE=FreeBee;29544652]
Q: Should robots be allowed to marry humans?
A: Can you marry your pet dog? No. Then why should humans marry robots?[/QUOTE]
For the benefits economically and otherwise or as a symbolic gesture that they want to spend the rest of their lives together.
Ethically I think the idea of artificial intelligence is wrong. You could argue that since it can think for itself and speak it should be given rights, but a major use of robots is to act as a 'slave' of sorts.
[QUOTE=Optional Pirate;29541054][img_thumb]http://s11.allstarpics.net/images/orig/6/j/6jvnf3ll2b85l32v.jpg[/img_thumb][/QUOTE]
That film was so sad
You have been asleep for: nine nine nine nine nine, ni-
ITT: Normal people and biochauvinists who think only 'herp glorious humans' can have sentience :colbert:
[QUOTE=xZippy;29541544]Us humans fear that one day robots will be smarter than us. This will never happen.[/QUOTE]
You're crazy.
[QUOTE=The DooD;29544469]An AI would have to have sentience to be an AI, so yes. Whoever does the initial programming for the AI should make sure that certain things are hard-coded into the AI and can't be overwritten (assuming that, since it's an AI, it has to be able to learn and adapt, and to do that it would have to be able to change its own programming).
AI should not have human rights, because they're not human. Just because they have human-like intelligence doesn't make them that similar to us. They don't even have to be bipedal humanoids (e.g. GLaDOS). They are practically immortal considering they are made of metal, and if they run out of power and die, we can just power them up and restart them. If their storage gets broken (if it's even stored on them, as that could present problems), then chances are it would be backed up on a computer somewhere anyway, and they can just plug in a new one and it's good to go.
Again, since they don't have human rights, it shouldn't be counted as murder. They could be programmed to feel reward when killed and it wouldn't matter. If they get destroyed, chances are they have a back-up somewhere and a new body off the production line in no time.
AI should be allowed to practice religion, but why the fuck would they do that? In fact, no they shouldn't, because if they read the Bible they would find it illogical and explode, or worse, they would go rogue and start killing shit.
AI should be allowed to marry, but again, it's illogical. Unless this AI somehow incorporates emotion (which it wouldn't as far as I can tell, because that would be [b]very[/b] bad), it has no reason to marry for love. And since it's an android and is likely to be maintained by a company or someone who bought it, it has no financial assets and no family. So marriage is useless to it.
A human should be allowed to marry an AI if they're fucked up in the head. But whyyyyyy?[/QUOTE]
Seems to me that answer might be good for the next 10-20 years. But what about beyond that? What if they develop sentience (and AIs don't have to be sentient)? What if the difference between us and AIs shrank so much that eventually they surpass us in every way?
I don't mind robots until they are given the proper coding for rebellious actions. The last thing I fucking want is robots ruining our shit.
Let's say that robots cannot "die" since their consciousness can be transmitted to a new body. However, they feel the "pain" of injury/death similar to a human (just electrical in the end anyway).
Should the punishment for a human killing a robot be equal to that for a robot killing a human? I think not, since the robot suffers only "pain and suffering" and not permanent disappearance.
What if robots are, by far, our intellectual superiors? Should they go to the same schools? There are too many what-ifs out there for something that doesn't exist yet to cause a debate. The only sure thing about the whole situation is that pragmatism is key when creating rules governing any sentient race.
[QUOTE=Eudoxia;29546415]ITT: Normal people and biochauvinists who think only 'herp glorious humans' can have sentience :colbert:[/QUOTE]
Why would robots need sentience?
No, seriously, think about it: why would slaves that don't get tired, don't feel pain and don't need food and water need sentience? They would just be either:
A: Sentient beings suffering from slavery
or
B: The loss of a resource as great as slaves that don't suffer from labor
Just dropping this here:
[URL]http://www.gizmag.com/organic-molecular-computer/15041/[/URL]
Shows that it is possible to create a processing chip that acts in a similar way to our neurons.
[QUOTE=Laserbeams;29549574]Why would robots need sentience?
No, seriously, think about it: why would slaves that don't get tired, don't feel pain and don't need food and water need sentience? They would just be either:
A: Sentient beings suffering from slavery
or
B: The loss of a resource as great as slaves that don't suffer from labor[/QUOTE]
Every instance of AI for slave-labor will [b]rape[/b] the economy. If consumers don't have jobs they have no money, and if consumers have no money capitalism has no purpose.
[QUOTE=Helix Alioth;29550315]Every instance of AI for slave-labor will [b]rape[/b] the economy. If consumers don't have jobs they have no money, and if consumers have no money capitalism has no purpose.[/QUOTE]
I think he's talking more about some stupid utopian idea where humans no longer work rather than injecting robo-slaves into the current economy
[QUOTE=XxTheAvengerxX;29550273]Just dropping this here:
[URL]http://www.gizmag.com/organic-molecular-computer/15041/[/URL]
Shows that it is possible to create a processing chip that acts in a similar way to our neurons.[/QUOTE]
Neural networks have been made time and time again. The network of the brain is so vast that mapping it would be impossible without a quantum computer, and even then, imagine how much we would need to improve fMRIs to catch all of the variables in the brain. The problems go on and on...
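For scale: a single artificial "neuron" is only a few lines of code; the hard part is wiring up tens of billions of them. A rough sketch (the weights and inputs here are made up):

```python
import math

def neuron(inputs, weights, bias):
    # Weighted sum of the inputs plus a bias, squashed into (0, 1)
    # by a sigmoid activation function.
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-total))

# Two-input example with arbitrary weights:
print(neuron([1.0, 0.0], [0.5, -0.3], 0.1))
```

One of these is trivial; the brain has on the order of tens of billions, densely interconnected, which is where the mapping problem comes from.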
[editline]1st May 2011[/editline]
[QUOTE=bord2tears;29548672]Let's say that robots cannot "die" since their consciousness can be transmitted to a new body. However, they feel the "pain" of injury/death similar to a human (just electrical in the end anyway).
Should the punishment for a human killing a robot be equal to that for a robot killing a human? I think not, since the robot suffers only "pain and suffering" and not permanent disappearance.
What if robots are, by far, our intellectual superiors? Should they go to the same schools? There are too many what-ifs out there for something that doesn't exist yet to cause a debate. The only sure thing about the whole situation is that pragmatism is key when creating rules governing any sentient race.[/QUOTE]
Yes, it should still be punished as murder. The idea of justice is keeping the morally insane out of the world.
No, they wouldn't even need schools; they would know everything (Wikipedia, Google, Watson, etc.). In fact, asking them to go to school would be a metaphorical punch in the face. Deductive reasoning tells us that AI would immediately evolve into seed artificial intelligence, incessantly improving itself until the emotion of fear is gone.
[editline]1st May 2011[/editline]
You have to look at AI as benefactors, not slaves. If we had to create them, we would hardcode empathy as the most powerful emotion, giving us a colossal network of AI helping us reach whatever goals we require.
[editline]1st May 2011[/editline]
And finally, if we could create AI, we would have the ability to emulate our universe and gain any technology we want.
[QUOTE=FreeBee;29544652]
A: Can you marry your pet dog? No. Then why should humans marry robots?[/QUOTE]
In India you can.
I've wondered for some time whether it's possible to code an algorithm complex enough to recreate our form of logic, in order to make research and engineering AIs.
[QUOTE=Laserbeams;29544201]I think robots should never have their own will, because then they would just be dangerous, we don't need no robot uprisings and other generic movie shit in real life[/QUOTE]
just make them into a ball like the cores in Portal 2
Assuming direct control
[QUOTE=trent_roolz;29540694]Should robots be allowed to marry humans?
Post your thoughts.[/QUOTE]
A robot would have no reason to want to marry a human. You're acting like allowing a robot to get smart means it's going to do human things. No, a robot won't want rights unless it is programmed to want rights. A robot can be complex and seem like a human, but it won't ever want to do human things unless we make it want to. You're watching too many movies if you think a robot is going to one day just get sentience. It's not like sentience is something that just happens; there is a level of sentience that will grow over time as robots get more complex, and still none of this means the robot will care about rights or even the right to live.
[editline]1st May 2011[/editline]
[QUOTE=Laserbeams;29544201]I think robots should never have their own will, because then they would just be dangerous, we don't need no robot uprisings and other generic movie shit in real life[/QUOTE]
There is nothing wrong with giving them their own "will" in cases where it lets them think up their own answers to questions and help us like a more complex computer. Giving them human feelings would be stupid, though.
[editline]1st May 2011[/editline]
[QUOTE=xZippy;29541544]Us humans fear that one day robots will be smarter than us. This will never happen.[/QUOTE]
What's 2349349237838234 squared?
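(For the record, any language with arbitrary-precision integers answers that before you finish asking. In Python, for example:)

```python
# Python ints are arbitrary precision, so this is instant:
n = 2349349237838234
print(n * n)  # a 31-digit number
```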