• Autonomous technology "requires debate"
Why the fuck would robots want emotions and empathy if they are working 24/7 doing some of the most boring ass jobs known to man? They'd probably terminate themselves.
The only time we should worry about robots and their rights is if they actually complain about it. As far as I know, they barely know how to walk around corners.
[QUOTE=Chezplokker;16874200]Why the fuck would robots want emotions and empathy if they are working 24/7 doing some of the most boring ass jobs known to man? They'd probably terminate themselves.[/QUOTE] Which is my point: they shouldn't be treated like tools or slaves. It's no different from us being treated in an identical manner. [editline]03:24AM[/editline] [QUOTE=Xystus234;16874228]The only time we should worry about robots and their rights, is if they actually complain about it. As far as I know, they barely know how to walk around corners.[/QUOTE] We are talking about robo-ethics at scale though, particularly fully sentient robots, which many scientists and engineers believe will appear quite soon, perhaps in only a few decades. You have to look at both the long-term and short-term aspects. I mean, realistically, it's comparable to the civil rights issues of the mid-nineteenth century and earlier.
[QUOTE=Zeddy;16874235]Which is my point, they shouldn't be treated like tools or slaves. Its no different from us being treated in an identical manner.[/QUOTE] My dishwasher is angry because I've kicked it a few times for not working so now it will sue me for damages.
With sentient robots comes robosexuals.
[QUOTE=Chezplokker;16874502]My dishwasher is angry because I've kicked it a few times for not working so now it will sue me for damages.[/QUOTE] Your dishwasher isn't capable of feeling emotions or any form of complex thought. A sentient robot, however, is. It's like a human brain being put in a robot body: identical thought and emotional capabilities, just a different body. [editline]03:48AM[/editline] [QUOTE=Billiam;16874617]With sentient robots comes robosexuals.[/QUOTE] You mean homosexual robots, or people who have sex with robots? Neither is wrong, though the latter might be notably uncomfortable in early generations if they aren't given a realistic flesh imitation. The first isn't wrong in any way.
[QUOTE=Zeddy;16874641]Your dishwasher isn't capable of feeling emotions or any form of complex thought. A sentient robot however is capable. Its like a human brain being put in a robot body. Its identical thought and emotional capabilities, just a different body. [editline]03:48AM[/editline] You mean homosexual robots, or people who have sex with robots? Neither is wrong, though the latter might be notably uncomfortable in early generations if they aren't given a realistic flesh imitation. The first though isn't wrong in anyway.[/QUOTE] Find me an instance where a sentient robot is a useful addition to us, please. I only see it being useful for downloading people's brains into robots so they can never truly die.
[QUOTE=Chezplokker;16874696]Find me an instance where a sentient robot is a useful addition to us please, I only see it being useful if used to download peoples brains into a robot so they can never truly die.[/QUOTE] How are you useful?
I hope you are trolling.
Have you read any of the damned thread? You're acting like a bigger troll than damn near anyone, and certainly a bigger one than myself. How are you any more useful than a sentient robot?
Because I can do jobs shittier than they can, I cost more money than they do and I have a limited lifespan - is that how you want me to answer it? Now tell me why they are so useful. And don't give me another question.
Because they can do jobs better than you can, they cost less money than you do and they last as long as they keep repairing themselves.
[QUOTE=Mr. Mcguffin;16868326]So if I had a chainsaw implanted into my arm, I would no longer be subject to human rights?[/QUOTE] Yes because you would be put in the loony bin.
[QUOTE=Zeddy;16874151]Even non-pack animals do these things. And our emotions towards subjects such as empathy and subjection to authority are learned. Quit having so much faith in humanity. We're not that special. And the robot would feel something, based on its experiences and learning. We do the same thing.[/QUOTE] No, they aren't. The ability to empathize is a strictly pack animal trait, and it's an incredibly complicated reaction. You don't just learn it. And subjection to authority isn't learned. Mob psychology is an inherent part of all humans. This isn't a question of faith, it's a question of facts. A robot is a compiling machine, something that records and stores data. It cannot develop the ability to empathize any more than it can develop the ability to become illogical. No, it wouldn't. Not unless it was somehow intentionally programmed to. And we do that because we're programmed to. We're rather hard-wired in quite a few respects. [editline]10:46AM[/editline] [QUOTE=Zeddy;16874235]Which is my point, they shouldn't be treated like tools or slaves. Its no different from us being treated in an identical manner. [editline]03:24AM[/editline] We are talking about robo-ethics in scale though, particularly of fully sentient robots, which more scientists and engineers believe will be appearing quite soon, perhaps in only a few decades. You have to look at the long term and short term aspects. I mean, realistically, its comparative to the civil rights issues from the mid-nineteenth century back.[/QUOTE] The difference being robots don't have concepts of boredom or depression. Nor do they feel hate, love, or, yes, empathy. That's because a robot is an entirely logical construct. Emotions are illogical when they have no physical benefit. Not quite. A "human" robot that is built to emulate a human might fall under these "robo-ethics". A normal construction robot, however, would be a simple automaton, with no concepts of right, wrong, love, or hate.
You might as well campaign for the rights of horses, or chickens. [editline]10:49AM[/editline] [QUOTE=Zeddy;16874641]Your dishwasher isn't capable of feeling emotions or any form of complex thought. A sentient robot however is capable. Its like a human brain being put in a robot body. Its identical thought and emotional capabilities, just a different body. [editline]03:48AM[/editline] You mean homosexual robots, or people who have sex with robots? Neither is wrong, though the latter might be notably uncomfortable in early generations if they aren't given a realistic flesh imitation. The first though isn't wrong in anyway.[/QUOTE] No, emotions are the product of chemical reactions. Really, do you have [I]any[/I] idea of what you're talking about? Do you even know what an emotion is or how it occurs? Do you have even a basic understanding of neurology? Robots will have no sexuality. Why would they? What benefit would it entail for something that can't reproduce? [editline]10:50AM[/editline] [QUOTE=lazyguy;16875584]Yes because you would be put in the loony bin.[/QUOTE] What if my arm was lost in the war, and the only job left to me was logging with my pop in Alaska, and I couldn't hold the chainsaw because I only had one hand? [editline]10:57AM[/editline] [QUOTE=Zeddy;16874721]Have you read any of the damned thread? You're acting like a bigger troll than damn near anyone, and certainly a bigger one than myself. How are you any more useful than a sentient robot?[/QUOTE] It's not a question of use so much as it is of ethics. He's more important because he can feel pain and the robot can't. A sentient robot, for all intents and purposes, is a fact compiler. That's all you need to be sentient: the ability to analyze the world and compile facts. Robots can't change their own structuring, however, so they can't really give themselves emotions unless they were programmed with such an intent in mind. A solely logical being with no succeeding generations has no need for emotions.
Anger wouldn't give any enhanced physical ability, sadness wouldn't exist because it would simply impair efficiency, love would be useless because there is no procreation between robots, jealousy simply wouldn't exist because all tasks are allocated logically to the most opportune times and are never forgotten, and trust would be a mechanical and mathematical process.
[QUOTE=Mr. Mcguffin;16868272]Er, you realize that the instinct to live is in fact an instinct, and a robot won't have that unless we give it to them? They might systematically evaluate their own worth, but if a new and better model came out, wouldn't they notice the unimportance of their existence and simply give up anyway? [B]Remember, robots aren't people, their incredibly efficient thinking machines.[/B][/QUOTE] Yeah, I would think so too. We might have thought we saw the worst of "the strongest defeat everything" with the Nazis in the 1930s, but with robots, they could commit mass suicides just so the stronger could live on.
What you guys don't realize is that if this happens, there would be no need for money, jobs, or hard labour, and people would be able to pursue their dreams, and space travel would be a hell of a lot easier!! The thing we need to worry about is nanotech, as it is a lot harder to program it to have sentience etc., and it would act like a hivemind, evolving and learning by itself. Read the story "Prey" by Michael Crichton; that's what could have happened in a worst-case scenario. But who says that the robot is going to be like us? It is a completely different thing to a human.
[QUOTE=Mr. Mcguffin;16875620]No, it wouldn't. Not unless it was somehow programmed to intentionally.[/QUOTE] I'm gonna clip your entire post of bullshit and go straight to the problem. [highlight]YOU DIDN'T READ ANYTHING. READ THE GOD DAMNED THREAD AGAIN, THEN TALK.[/highlight] Maybe if you'd actually read it, you'd understand what I've been saying. [editline]11:48AM[/editline] [QUOTE=skifer;16878333]What you guys don't realize is that if this happens they would be no need for money,jobs and hard labour and people will be able to pursue their dreams and space travle will be a hell of alot easier!! The thing we need to worry about is nanotech as it is alot harder to program it to have sentience etc... and it would act like a hivemind evolving and learning itself read the story "Prey" by Michael Crichton that what could of happened in a worst case scenario but who says that the robot is going to be like us? it is a completely different thing to a human.[/QUOTE] That wasn't really given true sentience. It could learn, but only in a strict manner. It was designed to be a cloud of hunter nanites based off an anti-virus program. The problem was that they basically fucked up and it misinterpreted everything and went nuts. Love the reference though, pretty obscure, even being a Michael Crichton work. I read the thing back when it came out. Good book.
[QUOTE=Chezplokker;16874200]Why the fuck would robots want emotions and empathy if they are working 24/7 doing some of the most boring ass jobs known to man? They'd probably terminate themselves.[/QUOTE] "I will do it. I will power down." "No, 037-J. You have so much to function for."
[QUOTE=Zeddy;16880128]I'm gonna clip your entire post of bullshit and go straight to the problem. [highlight]YOU DIDN'T READ ANYTHING. READ THE GOD DAMNED THREAD AGAIN, THEN TALK.[/highlight] Maybe if you'd actually read it, you'd understand what I've been saying. [editline]11:48AM[/editline] That wasn't really given true sentience. It could learn, but only in a strict manner. It was designed to be a cloud of hunter nanites based off an anti-virus program. The problem was that they basically fucked up and it misinterpreted everything and went nuts. Love the reference though, pretty obscure, even being a Michael Crichton work. I read the thing back when it came out. Good book.[/QUOTE] I've read the thread. I fail to see how the ability to recognize one's own individuality also gives one the ability to gain emotions. You've also failed to explain this. Either do so, or stop being such a whiny bitch about everything I say. [editline]07:15PM[/editline] [QUOTE=JohnnyMo1;16880485]"I will do it. I will power down." "No, 037-J. You have so much to function for."[/QUOTE] "I'm sorry, 024-N. I enjoy the bonus to work efficiency you bring when we are in close proximity. I've left my deactivation file on a portable hard drive in the deactivation file receptacle."
Flaws in programming create viruses. At one point, people said viruses were computers creating AI. I can't remember if I was told this or I learned it somewhere. I don't know what to think, because do you remember those little robots where some of them developed programming that was the complete opposite of their original programming? Or is this a different situation...
[QUOTE=thedekoykid;16890698]Flaws in programming create viruses, At one point people said viruses were computers creating AI. I can't remember if I was told this, or I learned this somewhere. I don't know what to think, because you remember those little robots, where some of them developed programming that was the complete opposite of their original programming? Or is this a different situation...[/QUOTE] lol, uh, no. Flaws in programming are called bugs; viruses are completely intentionally programmed.
[QUOTE=Chezplokker;16874696]Find me an instance where a sentient robot is a useful addition to us please, I only see it being useful if used to download peoples brains into a robot so they can never truly die.[/QUOTE] Here's an idea you should think about. There is a British neuroscientist who studies neurons and brain-computer interfaces; I've seen him on TV a couple of times. He's the guy who did the robot with a rat brain. In one clip of the show, he placed a computer chip in his arm and a device that would send electric current down his hand, which would set neurons firing down the arm, along with a sensation. It occurred to me while watching it that when the neurons themselves fired, that actually CAUSED sensation. The neurons firing WAS the sensation. To upload your mind would give you a completely different experience.