You- "Time to hammer some nails."
Hammer- "This one does not feel like hammering today."
[QUOTE=MaverickIB;29583121]You- "Time to hammer some nails."
Hammer- "This one does not feel like hammering today."[/QUOTE]
Go get a bigger robot to put it into line. And a bigger robot to make sure the other one does what it's supposed to.
Why do people seem to think that robots in the future will use modern programming? In Victorian times they used clockwork, as Steampunk has taught me; now we use computers/binary systems, and in the future we may use something totally different, like bio-engineering or something.
[editline]3rd May 2011[/editline]
[QUOTE=MindMuncher;29583523]Go get a bigger robot to put it into line. And a bigger robot to make sure the other one does what it's supposed to.[/QUOTE]
And then a bigger one to make [I]that[/I] one stay in line, then a small one to make them look bigger.
[QUOTE=sergeantsmiles;29589137]Why do people seem to think that robots in the future will use modern programming? In Victorian times they used clockwork, as Steampunk has taught me; now we use computers/binary systems, and in the future we may use something totally different, like bio-engineering or something.
[editline]3rd May 2011[/editline]
And then a bigger one to make [I]that[/I] one stay in line, then a small one to make them look bigger.[/QUOTE]
And we'll line them up and it will go big robot, big robot, little robot, big robot, little robot, bigger robot!
An artificial intelligence would think very differently from humans. Humans are firstly animals and only secondarily sapient beings. An artificial intelligence would just be sapient. It would have no need for instinctual behaviour. The only purpose instincts serve is extremely rapid heuristics in situations where your survival depends on rapid (but potentially bad) decisions instead of careful but correct reasoning.
We can evolve AIs without presenting them to environments where such behaviour has any incentive to emerge.
And the AI would probably not even have a concept about identity like humans do. An observer-centric view of the world is an evolved trait of an animal. The AI might appear altruistic, but it wouldn't exactly be altruistic or selfish. It simply wouldn't think of the world as "me" and "others".
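The point about evolving AIs without incentives for selfish behaviour can be illustrated with a toy genetic algorithm. In the sketch below (all names are made up for illustration), the fitness function rewards only task performance; there is no term rewarding competition, self-preservation, or harm to other genomes, so selection has no reason to produce those traits:

```python
import random

TARGET = [0.5] * 8  # the task: evolve a vector matching this one

def fitness(genome):
    # Reward only task performance (negative squared error).
    # No term involves other genomes, so no competitive or
    # self-preserving behaviour has any incentive to emerge.
    return -sum((g - t) ** 2 for g, t in zip(genome, TARGET))

def mutate(genome, rate=0.1):
    # Small Gaussian perturbation of each gene.
    return [g + random.gauss(0, rate) for g in genome]

def evolve(pop_size=30, generations=200):
    population = [[random.random() for _ in range(8)] for _ in range(pop_size)]
    for _ in range(generations):
        # Keep the best half, refill with mutated copies of survivors.
        population.sort(key=fitness, reverse=True)
        survivors = population[: pop_size // 2]
        population = survivors + [mutate(random.choice(survivors)) for _ in survivors]
    return max(population, key=fitness)

best = evolve()
print(round(-fitness(best), 3))  # error should be near 0 after evolving
```

The environment here is just the fitness function: whatever it rewards is what emerges, and nothing else.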
[QUOTE=ThePuska;29592039]An artificial intelligence would think very differently from humans. Humans are firstly animals and only secondarily sapient beings. An artificial intelligence would just be sapient. It would have no need for instinctual behaviour. The only purpose instincts serve is extremely rapid heuristics in situations where your survival depends on rapid (but potentially bad) decisions instead of careful but correct reasoning.
We can evolve AIs without presenting them to environments where such behaviour has any incentive to emerge.
And the AI would probably not even have a concept about identity like humans do. An observer-centric view of the world is an evolved trait of an animal. The AI might appear altruistic, but it wouldn't exactly be altruistic or selfish. It simply wouldn't think of the world as "me" and "others".[/QUOTE]
Let's [i]hope[/i] they're altruistic...
[QUOTE=sergeantsmiles;29589137]Why do people seem to think that robots in the future will use modern programming? In Victorian times they used clockwork, as Steampunk has taught me; now we use computers/binary systems, and in the future we may use something totally different, like bio-engineering or something.[/QUOTE]
While steampunk is no real example, you are somewhat correct.
There are speculative designs for biological computers that would be vastly more complex and efficient than modern computers. Speculative...
Still, [i]Blood Music[/i] was a really good read; I suggest it to everyone.
[QUOTE=Laserbeams;29549574]Why would robots need sentience?
No, seriously, think about it: why would slaves that don't get tired, don't feel pain and don't need food and water need sentience? They would just be either:
A: Sentient beings suffering from slavery
or
B: The loss of a resource as great as slaves that don't suffer from labor[/QUOTE]
Just because we make a sentient robot doesn't mean we lose all toasters.
We can still make non-sentient machines, but there will [i]also[/i] be sentient ones.
Also, I don't know if they should have the same rights. I mean, if you killed one, you could probably just take out whatever storage device it was using and put it in a new body... They would be extremely different from biological life, and shouldn't be treated as equals.
Don't get me wrong, they should be treated well, but they don't need the exact same rights as humans.
Treating everything equally is not a good idea.
[editline]3rd May 2011[/editline]
Also we could build them without feelings.
Honestly, on every point: NO. They should always be a tool, made to do a mechanical task with ONLY the means to do so.
[QUOTE=Jo The Shmo;29601288]
Also we could build them without feelings.[/QUOTE]
They likely would be built without feelings, as feelings would make them unpredictable. If the AI could feel empathy for the pigs you told it to butcher, or anger when you say you're turning it off for the night, then it might do something crazy.
No because I don't want this asshole near me:
[URL=http://www.imagecross.com/][IMG]http://hostinga.imagecross.com/image-hosting-00/1600Mass-Effect-2-Overlord-David.jpg[/IMG][/URL]
Do we want to simply use them as simple workers or as actual members of society?
If the 2nd, then:
Should robots be allowed to have sentience? And if so, how far should it go? Should we put things in them to prevent crime or allow them to live like actual human beings? Yes, they could be just like us; make it so that they can develop personality (atheist or religious, criminal or not, open-minded or close-minded).
Should robots with human-like artificial intelligence have [B]human[/B] rights? Yes.
Should destroying a robot with human intelligence count as murder? Depends on a lot of things; most likely.
Should robots be allowed to practice religion? If they wish.
Should robots be allowed to marry other robots? If they wish for a system crash.
Should robots be allowed to marry humans? Errr... I guess they should have the right, but I don't really see how that would work...
The day humans create life (AI), God's mind will fuck itself, aka, MINDFUCK. The world explodes and we all die.
The end.
And to be a little more serious here, I will answer your questions, because I am superior.
Should robots be allowed to have sentience? - Que?
Should robots with human-like artificial intelligence have human rights? - Robots aren't humans, therefore robots should not. Should dogs have human rights? Should humans have robot rights? Bad question.
Should destroying a robot with human intelligence count as murder? - Robots are robots. MACHINES.
Should robots be allowed to practice religion? - Honestly, if religion still exists the day we create artificial intelligence I will kill myself. The paradox here is that I will probably be dead by that time, so I cannot kill myself.
Should robots be allowed to marry other robots? - Only if dogs are allowed to marry each other. Yes, I am comparing dogs and robots.
Should robots be allowed to marry humans? - Can I marry a donkey? Then you already know the answer.
[QUOTE=The DooD;29607194]They likely would be built without feelings, as feelings would make them unpredictable. If the AI could feel empathy for the pigs you told it to butcher, or anger when you say you're turning it off for the night, then it might do something crazy.[/QUOTE]
It would be very predictable, assuming it was programmed. Any displayed emotion would be programmed in, so there would be no unpredictability. In the scenario where you insult the AI, it interprets the language, determines the meaning, and executes an action as programmed. The difference between an AI and any living creature is that an AI cannot respond to an event it isn't programmed to respond to.
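A minimal sketch of the programmed-response model this post describes: every input maps through a fixed, hand-written table, so behaviour is fully predictable, and anything outside the table hits a single fallback. All the intents and replies here are made up for illustration:

```python
# Hypothetical fixed-response "AI": a fixed lookup from detected
# intent to canned response, with a fallback for unknown input.
RESPONSES = {
    "insult": "Input registered. No action required.",
    "greeting": "Hello.",
    "shutdown": "Powering down as instructed.",
}

KEYWORDS = {
    "stupid": "insult",
    "hello": "greeting",
    "hi": "greeting",
    "off": "shutdown",
}

def respond(text):
    # "Interpret the language": scan for known keywords.
    for word in text.lower().split():
        intent = KEYWORDS.get(word.strip(".,!?"))
        if intent:
            # "Execute an action as programmed".
            return RESPONSES[intent]
    # An event it isn't programmed to respond to.
    return "I do not understand."

print(respond("You are stupid!"))     # Input registered. No action required.
print(respond("Make me a sandwich"))  # I do not understand.
```

Everything this "AI" can ever do is visible in its two tables; that is the sense in which a purely programmed system is predictable.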
[QUOTE=Lifeslash;29540740]I can just imagine the laws that we will come up with.
robot gay marriage.
robot voting rights.
robot fights.[/QUOTE]
Um, no. No. Why would robots even have a gender?! Even as a human, spending the time to live out a fucking instinct (which is sex and all that lovely jizz) isn't really beneficial. It's simply because we (due to the instinct) enjoy it that we practice sex; the sheer fact of our need to reproduce.
Robots? An AI? Why would it fuck?
[QUOTE=Pepin;29614897]It would be very predictable, assuming it was programmed. Any displayed emotion would be programmed in, so there would be no unpredictability. In the scenario where you insult the AI, it interprets the language, determines the meaning, and executes an action as programmed. The difference between an AI and any living creature is that an AI cannot respond to an event it isn't programmed to respond to.[/QUOTE]
But that's the thing, the whole point of an AI is that it should be able to adapt so that it can respond to something it's not initially programmed to. Otherwise all we have is a very complex program, not an AI.
Why do people keep thinking that if one robot is sentient, then all machines are?
If you had a powerful AI, it would most certainly not be performing menial tasks and would probably be in need of equal rights (not quite equal, as it is not human, but 'equitable' rights perhaps). If it was performing tasks, it would probably be solving complex math equations, setting up systems, or managing certain systems or software programs. It wouldn't be farming in the fields. That's stupid.
The original idea of sentience I had in the OP wasn't that they were programmed to a certain life and stuff, it was that they were programmed an ability to observe and learn in a human-like fashion. So they'd be just like humans in the area of minds and personalities, but they'd be physically much much different.
[QUOTE=trent_roolz;29625703]The original idea of sentience I had in the OP wasn't that they were programmed to a certain life and stuff, it was that they were programmed an ability to observe and learn in a human-like fashion. So they'd be just like humans in the area of minds and personalities, but they'd be physically much much different.[/QUOTE]
When you're talking about AI, the physical factors are so variable that it's about as useless to talk about that aspect of them as talking about what aliens will look like. We can talk about their thought processes just a little bit more.
[QUOTE=The DooD;29616147]But that's the thing, the whole point of an AI is that it should be able to adapt so that it can respond to something it's not initially programmed to. Otherwise all we have is a very complex program, not an AI.[/QUOTE]
I don't think there is anything to indicate that an AI will be anything other than a very complex program. Writing a program that responds to something it doesn't expect is extremely difficult to even think about. I do believe that there will be a point where a person could not tell the difference between interacting with an AI and a real human, but I wouldn't suggest that this means that the AI is aware or should have rights; more that it is simply programmed very well. I actually believe there have been some chat bots that have tricked people into believing they are human, so that step has already kind of taken place.
I think our best chance at any form of AI is replication of a brain through mechanical means. If we are able to take someone's brain and make some structure work in the exact same way, it should be aware; it should be human. I think we would do this without very much understanding. Maybe eventually we could get to the point where we understand the structure of the human brain, and can construct a new mechanical brain to make an AI self-aware with human-like abilities. I believe the chances of any of that happening are very low, but if it's going to happen, I don't see it happening through programming.
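The chat bots mentioned above (ELIZA is the classic example) work by pattern matching and pronoun reflection rather than any understanding, which is exactly why fooling a person doesn't imply awareness. A stripped-down sketch of the technique:

```python
import re

# ELIZA-style reflection: swap pronouns and echo the statement
# back as a question. No comprehension is involved at any point.
REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are", "you": "I"}

def reflect(text):
    # Swap first-person words for second-person ones.
    return " ".join(REFLECTIONS.get(w, w) for w in text.lower().split())

def reply(text):
    m = re.match(r"i feel (.*)", text.lower())
    if m:
        return "Why do you feel " + reflect(m.group(1)) + "?"
    m = re.match(r"i am (.*)", text.lower())
    if m:
        return "How long have you been " + reflect(m.group(1)) + "?"
    # Catch-all for anything the patterns miss.
    return "Tell me more."

print(reply("I feel ignored by my computer"))
# Why do you feel ignored by your computer?
```

A handful of such patterns can sustain a surprisingly human-seeming conversation, which is the "very complex program" point in a nutshell.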
If AIs really turn out to be self-aware and are like GLaDOS, HAL 9000 and other famous AIs... then I don't think it's worth it.
Left 4 Dead director in real life would be cool though :v:
[QUOTE=FreeBee;29544652]
Q: Should robots be allowed to marry humans?
A: Can you marry your pet dog? No, then why should humans marry robots?[/QUOTE]
There's a man here in Australia who had a wedding ceremony for him and his dog.
[QUOTE=ThePuska;29592039]
And the AI would probably not even have a concept about identity like humans do. An observer-centric view of the world is an evolved trait of an animal. The AI might appear altruistic, but it wouldn't exactly be altruistic or selfish. It simply wouldn't think of the world as "me" and "others".[/QUOTE]
My favorite quote about this is:
"The AI does not hate you, nor does it love you, but you are made out of atoms which it can use for something else."
I'd love artificial intelligence, and as a transhumanist I'm a huge fan of science and robotics and whatnot. A robotic companion won't stab you in the back, and it would listen to all your problems and be your friend....
:ohdear:
[QUOTE=Lazyboy0337;29553652]OP just watched Futurama
[img_thumb]http://24.media.tumblr.com/tumblr_l5c3m0yut21qc2d9bo1_500.jpg[/img_thumb][/QUOTE]
That's not the OP you quoted.
I can't escape the feeling that if we made a new race of AIs, we'd have to do some serious work and make strides to avoid mutual destruction of us both. Maybe I'm just a biochauvinist or have seen way too many movies, but if an AI works purely in the way a computer does, except with more liberty in regards to thought process, then would the chances of them deciding to eradicate humanity or at the least harbor a grudge against their creators be significantly higher? I'm probably just paranoid.
I don't know, it just feels odd for there to be a living, sapient organism that doesn't work within the normal bounds of sentient life. When we discover aliens, I have no doubt they too will be similar in thought process to humanity; their civilization(s) can and likely will be entirely different to ours, but their base origin likely will not, as evolution tends to cater towards the same trends, at least, on Earth. But AIs don't have that guiding principle, as per the artificial moniker. Even though they'll be sapient, they'll still be forged by human hands, not by their own evolution. Eventually, they would be, but at first, and for possibly thousands of years, they will have no guiding principles; no religions, no morals, no cultural taboos, nothing that we developed in our infancy as a stepping stone to our current path to advancement.
I can't help but stick to this feeling that if an AI were to be created without some immense restrictions at first, it'll end in nothing short of disaster.
[img]http://www.toplessrobot.com/epidatedrobot.jpg[/img]
More like this.
[QUOTE=Arachnidus;29644472]I can't escape the feeling that if we made a new race of AIs, we'd have to do some serious work and make strides to avoid mutual destruction of us both. Maybe I'm just a biochauvinist or have seen way too many movies, but if an AI works purely in the way a computer does, except with more liberty in regards to thought process, then would the chances of them deciding to eradicate humanity or at the least harbor a grudge against their creators be significantly higher? I'm probably just paranoid.
I don't know, it just feels odd for there to be a living, sapient organism that doesn't work within the normal bounds of sentient life. When we discover aliens, I have no doubt they too will be similar in thought process to humanity; their civilization(s) can and likely will be entirely different to ours, but their base origin likely will not, as evolution tends to cater towards the same trends, at least, on Earth. But AIs don't have that guiding principle, as per the artificial moniker. Even though they'll be sapient, they'll still be forged by human hands, not by their own evolution. Eventually, they would be, but at first, and for possibly thousands of years, they will have no guiding principles; no religions, no morals, no cultural taboos, nothing that we developed in our infancy as a stepping stone to our current path to advancement.
I can't help but stick to this feeling that if an AI were to be created without some immense restrictions at first, it'll end in nothing short of disaster.[/QUOTE]
I agree with you that they might be completely different. However I don't think that's a bad thing, as evolution doesn't exactly favour the most pleasant traits in animals. I don't want another hypercompetitive, selfish and extremely intelligent race on the planet. [i]That's[/i] the path to destruction. Another naturally evolved intelligence would think of us as a threat, while an artificial intelligence might not - for an animal to evolve naturally this far, they must survive, and to survive they must be selfish, always putting themselves and their kind first. Altruism doesn't occur naturally if it doesn't benefit the animal.