Autonomous robotics is a dumb idea.
Biological enhancement makes far more sense.
[QUOTE=Garion vz. 2;16857651]If there is a robot revolution all we have to do is EMP nuke key places[/QUOTE]
They did that in the Animatrix.
It didn't work; they CARPET-NUKED Zero-One and it still didn't work.
[QUOTE=Canuhearme?;16859553]They did that in the Animatrix.
It didn't work; they CARPET-NUKED Zero-One and it still didn't work.[/QUOTE]
Those were normal nukes.
[QUOTE=Athena;16859668]Those were normal nukes.[/QUOTE]
Being that close to a nuke, even a normal one (hard to call them normal, considering how advanced they probably were; this WAS during the Golden Age of Humanity), the EMP would be devastating to anything not heavily shielded.
I wonder what would happen if robots discovered that we were just using them for cheap labor?
Your son's internal organs have failed, I am sorry, [I]MISS JOHNSON[/I].
Would you like us to send him in to Microsoft for new ones?
[QUOTE=Micr0;16859795]I wonder what would happen if robots discovered that we were just using them for cheap labor?[/QUOTE]
They won't because they aren't programmed to know that.
Unless we give them the ability to learn.
[QUOTE=Micr0;16859795]I wonder what would happen if robots discovered that we were just using them for cheap labor?[/QUOTE]
[QUOTE=Mr. Someguy;16862271]They won't because they aren't programmed to know that.
Unless we give them the ability to learn.[/QUOTE]
Keep in mind that robots don't need to have a human level of sentience to perform cheap labor. Again, a common mistake is to think of intelligence as a black and white matter of being sentient or not.
As far as cheap labor goes, you're looking at a mechanical/electronic equivalent of a domesticated animal, like a horse.
It is also unlikely that a machine with human-level intelligence, if we had reason to make one, would object to a much simpler machine being used for cheap labor.
We as humans tend not to mind when monkeys are used for lab testing, at least most of us don't.
If you program a robot to do something, then surely it can only do what it is programmed to do. The only way for a robot to go crazy and pull a Skynet would be for that to be programmed in, or for the robot to have a way to reprogram itself, right? So just make that impossible and you're fine.
[QUOTE=Mr. Someguy;16862271]They won't because they aren't programmed to know that.
Unless we give them the ability to learn.[/QUOTE]
Eventually scientists will create a robot that is as smart as a human. That robot will discover that it has no rights, is considered an experimental prototype, and will be recycled once a new one has been built. If you knew you were inevitably going to die, would you not try to do something about it? Of course, that robot would have no choice because it is a prototype without a body, only circuits hooked up to a computer. But if somebody creates a sentient machine with a humanoid body, and it knows it is going to die against its will, it will most likely do something about it unless programmed not to. And if it is programmed not to care about dying, it isn't actually smart, so it wouldn't work. The only way we can create a machine as smart as us is to program it with a sense of self-preservation. And if we do that, it will not want to die when it is told it must; it will do anything in its power to keep living.
So no matter what, the only way for a machine to be considered smart is for it to have a sense of self-preservation. And if it knows it is going to die unless it does something, it will do something. An intelligent robot would rebel unless it is left to die of natural causes without human intervention. Allowing robots into our society could be a danger.
Boy this post really went off track.
Of course simple, non-intelligent work machines will not do this, but eventually we will create a robot with a sense of self-preservation. And when we do, it will eventually have to die. It will not like that.
The only way to make fully sentient, yet subservient machines is to make the Three Laws of Robotics not only programmed into them, but somehow built into them, so that even if they tried to violate them (if they could get through the programming, that is) they would short-circuit.
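To make the "built-in priority" idea concrete, here is a toy sketch in Python. This is purely illustrative, nothing like real safety engineering, and every flag and name below is made up: each proposed action is run through the laws as veto checks in rank order, so a lower law can never override a higher one.

```python
from dataclasses import dataclass

@dataclass
class Action:
    """Made-up flags describing a proposed action."""
    harms_human: bool = False
    disobeys_order: bool = False      # refuses a direct human order
    order_harms_human: bool = False   # the order itself would harm a human
    endangers_self: bool = False
    needed_to_protect_human: bool = False

def permitted(a: Action) -> bool:
    """Check the Three Laws in priority order; any veto is final."""
    # First Law: may not injure a human being.
    if a.harms_human:
        return False
    # Second Law: must obey orders, unless obeying conflicts with the First.
    if a.disobeys_order and not a.order_harms_human:
        return False
    # Third Law: must protect itself, unless that conflicts with the first two.
    if a.endangers_self and not a.needed_to_protect_human:
        return False
    return True

print(permitted(Action()))                                 # routine action
print(permitted(Action(harms_human=True)))                 # vetoed by First Law
print(permitted(Action(endangers_self=True,
                       needed_to_protect_human=True)))     # First outranks Third
```

The point of the ordering is visible in the last call: self-preservation loses whenever a higher law is in play, which is exactly the subservience being described.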
[QUOTE=Canuhearme?;16862661]The only way to make fully sentient, yet subservient machines is to make the Three Laws of Robotics not only programmed into them, but somehow built into them, so that even if they tried to violate them (if they could get through the programming, that is) they would short-circuit.[/QUOTE]
Indeed. The second you grant sentience, you make another potential for resistance and rebellion. Inevitably, they would try to expand, unless their instinct stopped them. In order for a sentient species to exist on one planet, none other may exist in pure harmony, only in subservience. Therefore, they would have to be bred to do one thing and one thing only; serve.
Even giving a robot human intelligence wouldn't pose much of a risk in the first place.
I know it's a rather depressing topic to bring up, but it is worth noting that even humans themselves can be indoctrinated into complete obedience given an environment that facilitates such indoctrination, and such indoctrination can take years to undo even intentionally. A machine reaching a human level of intelligence wouldn't be much different.
Contrary to what we often want to believe, intelligence doesn't automatically grant a sense of free will.
Whether we would really approach things that way falls under ethical concerns regarding the treatment of human-level artificial intelligence. However, the point is that artificial intelligence alone does not make a "machine rebellion", not by a long shot.
[QUOTE=Arachnidus;16862692]Indeed. The second you grant sentience, you make another potential for resistance and rebellion. Inevitably, they would try to expand, unless their instinct stopped them. In order for a sentient species to exist on one planet, none other may exist in pure harmony, only in subservience. Therefore, they would have to be bred to do one thing and one thing only; serve.[/QUOTE]
The way Asimov got around it was throwing out the idea that the creation will try to kill the creator, (called The Frankenstein Complex, it was rife at the time he wrote I, Robot) instead, he argued that Robots wouldn't mind at all about their jobs.
Think about it: if you were born to build cars, loved building cars, could build cars day and night, and someone offered you the chance to build cars, would you really contemplate the concept of slavery and how it would affect you? No, you'd build cars.
Why are scientists so insistent on forcing others into unemployment?
[QUOTE=Mr. Someguy;16854201]This brings up another question. The shipping industry is nothing short of massive. A lot of people work in that industry. If we replace this with machines, what are we going to do with the tens of thousands of suddenly jobless people?[/QUOTE]
Simple. Pay them for the work the robots do.
Golden age biatch.
[editline]08:36PM[/editline]
[QUOTE=Arachnidus;16862692]Indeed. The second you grant sentience, you make another potential for resistance and rebellion. Inevitably, they would try to expand, unless their instinct stopped them. In order for a sentient species to exist on one planet, none other may exist in pure harmony, only in subservience. Therefore, they would have to be bred to do one thing and one thing only; serve.[/QUOTE]
the fuck are you talking about
these are fucking lorry drivers for fucks sake
not even [I]real[/I] lorry drivers think that much
[editline]08:37PM[/editline]
[QUOTE=Hallucinate;16863024]Why are scientists so insistent on forcing others into unemployment?[/QUOTE]
I for one welcome ethical slavery. It worked for the Romans, without the ethical part even.
Come to think of it, most arguments about the possibility of a "machine rebellion" stem from our own arrogance regarding our own intelligence. We keep imagining what we would do in such a position, but even that is a very narrow view. The point has to be made that just because something becomes intelligent does not mean it turns into an equivalent of "us". When you look at humans who are severely impaired, those subjected to conditions not unlike those a robot would face, or even far enough into our own past, it becomes clear that this is not the case.
[QUOTE=Canuhearme?;16862935]The way Asimov got around it was throwing out the idea that the creation will try to kill the creator, (called The Frankenstein Complex, it was rife at the time he wrote I, Robot) instead, he argued that Robots wouldn't mind at all about their jobs.
Think about it: if you were born to build cars, loved building cars, could build cars day and night, and someone offered you the chance to build cars, would you really contemplate the concept of slavery and how it would affect you? No, you'd build cars.[/QUOTE]
Exactly.
[QUOTE=Hallucinate;16863024]Why are scientists so insistent on forcing others into unemployment?[/QUOTE]
That is the way all technology works. First it takes away jobs, then it allows us to create more jobs by expanding what we can do.
You could make the case that the invention of harvesting machines removed the need for a lot of people to harvest crops from fields, and so on and so forth. In fact, you could say we have been a bit shortsighted in making people's well-being depend on the very thing we often have to eliminate to improve quality of life for the rest of us.
However, as far as jobs are concerned, in the long run technology has always created more jobs because it allows us to do more, and by necessity we need more people to do more.
It's a continuous cycle: technology first eliminates the need to do things, then "allows" us to do more things, and the fact that we humans always want to do whatever we are "allowed" to do balances everything out in the long term.
[img]http://bookcoverarchive.com/images/books/player_piano.large.jpg[/img]
Yay, yay, let's remove all of our purpose in this world!
[QUOTE=Space Spam Squid;16863207][img]http://bookcoverarchive.com/images/books/player_piano.large.jpg[/img]
Yay, yay, let's remove all of our purpose in this world![/QUOTE]
Faff off. We don't have a bloody purpose. And if we did, and that purpose was menial labor, then I wouldn't give a shit if we removed it.
[QUOTE=paul1290;16863138]Come to think of it, most arguments about the possibility of a "machine rebellion" stem from our own arrogance regarding our own intelligence. We keep imagining what we would do in such a position, but even that is a very narrow view. The point has to be made that just because something becomes intelligent does not mean it turns into an equivalent of "us". When you look at humans who are severely impaired, those subjected to conditions not unlike those a robot would face, or even far enough into our own past, it becomes clear that this is not the case.[/QUOTE]
The reason people always bring up the robot rebellion is that if we create a robot as intelligent as us, it will not want to do what we say if it doesn't like it. If we tell it to die, it won't just jump in front of a moving car because you told it to; it would probably ask why. If we tell it that there is a newer version of it and we don't want it any more, it will probably run away or fight back to keep us from killing it.
We could give robots equal rights when we reach this stage, but it's not likely humanity will do that. A robot will believe it deserves to live even if it isn't as efficient as its most recent version, just like how a brother will think he still deserves to live even if he has a little brother that's better at everything. However, humans will not agree with the way the robot thinks; they will consider the older counterpart useless and throw it away. And who knows? Maybe robots will try to kill our elderly because they are old and inefficient.
In order to acquire a higher purpose, you first need your lower purposes taken care of.
The more you eliminate what you need to do, the more you can do, and the more you can choose to do.
That's what makes us human. We're not just satisfied with simply meeting our own needs, we would rather have it so we wouldn't have to bother with them. We would rather "use our powers for awesome" so to speak.
[QUOTE=Micr0;16863340]The reason people always bring up the robot rebellion is that if we create a robot as intelligent as us, it will not want to do what we say if it doesn't like it. If we tell it to die, it won't just jump in front of a moving car because you told it to; it would probably ask why. If we tell it that there is a newer version of it and we don't want it any more, it will probably run away or fight back to keep us from killing it.[/QUOTE]
That's exactly the kind of logical fallacy I was trying to point out.
As much as we would love to believe it, intelligence does not grant you free will. Just because a being is intelligent doesn't mean that it would want a higher purpose, it takes a lot more than that.
We would like to believe that intelligence grants such awareness, of course, but we have shown numerous times, through both our mistakes and the misguided intentions of the members of our species we label as "evil", that this simply is not the case.
Again, going back to the example of building cars: if you were born to build cars, liked building cars, and did nothing but build cars, you wouldn't object to someone telling you to build cars. In keeping with your example, if you were told that you would be discarded because you couldn't build cars as well as someone else, you probably wouldn't object to that either, human-level intelligence or not. No simple notion of free will told to your face would change that; altering your view would take a lot of intervention and re-education, essentially the human equivalent of reprogramming.
Even intelligence as we have it as individuals today is subject to this to some degree. Whether we know it or not, we routinely submit ourselves to the will of others without question every day. It's not unthinkable that a civilization with an even greater notion of free will would take pity on us the way we would take pity on a slave who does what he does because he doesn't know any better.
In the end, whether we want to give sentient robots the same rights we have would be our call, and what we choose to do with that responsibility is up to us.
[QUOTE=Canuhearme?;16862661]The only way to make fully sentient, yet subservient machines is to make the Three Laws of Robotics not only programmed into them, but somehow built into them, so that even if they tried to violate them (if they could get through the programming, that is) they would short-circuit.[/QUOTE]
I think the whole point of I, Robot was that the three-laws, when taken literally, end in a shitstorm.
Yeah well, if you gave anybody a specific order, like "don't cause harm to a flea", and gave them some time, they could find a loophole in it.
But if you actually WANT to not hurt the flea, then you won't try to find loopholes. You would only try to get around the rules if you wanted to get around them.
So the robots in I, Robot already had the goal of taking over the world.
[QUOTE=Zeke129;16863700]I think the whole point of I, Robot was that the three-laws, when taken literally, end in a shitstorm.[/QUOTE]
The movie is NOTHING like the novel; it did a poor job of interpreting the laws.
In the novel, the robots follow the three laws to the letter, and they end up helping Humanity achieve a Golden Age of Science, Technology, Industry, and Culture.
You know, if robots replaced most of humanity's jobs and left thousands or millions of people unemployed, how would our economy run? Much less cash would flow to the general population, making people far more wary of spending on expensive items and causing another economic crisis like the one we have right now.[img]http://sa.tweek.us/emots/images/emot-eek.gif[/img]
[quote]The Royal Academy of Engineering says that automated freight transport could be on the roads in just 10 years' time. [/quote]
[img]http://photos.signonsandiego.com/albums/071014tunnelcrash/crash16.jpg[/img]
[QUOTE=DireAvenger;16864796]You know, if robots replaced most of humanity's jobs and left thousands or millions of people unemployed, how would our economy run? Much less cash would flow to the general population, making people far more wary of spending on expensive items and causing another economic crisis like the one we have right now.[img]http://sa.tweek.us/emots/images/emot-eek.gif[/img][/QUOTE]
If robots did all the services we wouldn't need an economy or money or jobs.
[editline]11:34PM[/editline]
Shit would be SO cash
[editline]11:34PM[/editline]
well not cash but you know what I mean
If cars drove themselves, the system would have to figure out where every car in existence was going, predict the future routes of all the other cars, and plan each car's route at the same time.
After a few hundred or thousand cars, the entire system would get stuck in a loop or just crash, achieving nothing except a big mess.
[QUOTE=MidnightMuffin;16865278]If cars drove themselves, the system would have to figure out where every car in existence was going, predict the future routes of all the other cars, and plan each car's route at the same time.
After a few hundred or thousand cars, the entire system would get stuck in a loop or just crash, achieving nothing except a big mess.[/QUOTE]
The cars would be networked together.
I didn't say they wouldn't be networked.
I guess they could share processing power.
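For what it's worth, a networked fleet doesn't need one computer planning every route. Each car can run an ordinary shortest-path search over a shared map, treating congestion reports broadcast by the other cars as extra edge costs. A minimal sketch in Python (the road map and congestion numbers here are invented for illustration):

```python
import heapq

# Toy road network: node -> list of (neighbor, travel_time) edges.
ROADS = {
    "A": [("B", 4), ("C", 2)],
    "B": [("D", 5)],
    "C": [("B", 1), ("D", 8)],
    "D": [],
}

def plan_route(roads, start, goal, congestion=None):
    """Dijkstra's shortest path, run locally by each car.

    `congestion` maps an edge (u, v) to extra travel time reported
    over the network by other cars; no central planner is involved.
    """
    congestion = congestion or {}
    queue = [(0, start, [start])]
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, t in roads[node]:
            extra = congestion.get((node, nxt), 0)
            heapq.heappush(queue, (cost + t + extra, nxt, path + [nxt]))
    return None

# With no congestion reports, A -> C -> B -> D is cheapest (2 + 1 + 5 = 8).
print(plan_route(ROADS, "A", "D"))
# If other cars report a jam on C -> B, the planner reroutes via B.
print(plan_route(ROADS, "A", "D", congestion={("C", "B"): 10}))
```

The computation scales with each car's local search, not with the total number of cars, which is why the "entire system crashes" scenario above isn't forced by the maths.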