• Robot passes self-awareness test
    149 replies
I liken it to 'emergent' behavior in games. Let's say you're playing an RPG and you stumble across a scene of wildlife fighting NPCs. This is a scene no other player has ever seen, because it's not scripted. You could play this game for a million hours and never see this again. It's STILL the product of the game developers. That wildlife was created by the game developers, as were the NPCs. The rules each operate under were also pre-written. The locations where they can appear, the damage they can do, their animations...and on and on. All pre-written by humans. Same with all robots so far. The robot may shuffle the commands around, may mix inputs and outputs in new ways, but it's all sourced from humans. No robot is doing anything unless somewhere deep in its code some human described the steps needed to take that action.
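The "authored but unscripted" point can be sketched in a few lines of Python. This is a toy of my own, not from any real game: every rule below is hand-written, yet which encounters actually occur depends on the dice.

```python
import random

random.seed(42)  # a different seed gives a different "never seen before" scene

# Every rule here is human-written: where creatures can spawn, and when
# they fight. Which fights break out is emergent, yet still entirely a
# product of these authored rules.
wolves = [random.randint(0, 9) for _ in range(3)]  # wolf spawn positions
guards = [random.randint(0, 9) for _ in range(3)]  # NPC guard positions

# Hand-written rule: a fight happens when a wolf and a guard are adjacent.
fights = [(w, g) for w in wolves for g in guards if abs(w - g) <= 1]
print(f"{len(fights)} unscripted fight(s): {fights}")
```

The developers never scripted any particular fight, but no fight can happen that the adjacency rule doesn't allow.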
[QUOTE=Deathtrooper2;48245272]I think the scariest part about this test was that they didn't program it to recognize itself. So in other words, the robot recognized itself on its own... creepy.[/QUOTE] Unless they created a sentient being capable of learning at 10 times the rate of humans, yes, they did program it to recognize itself.
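A test like this can be scripted end to end. Here's a hypothetical sketch (my own names and structure, not the researchers' actual code) of how the "dumbing pill" self-recognition puzzle could work: the "insight" is an inference a human wrote out in advance.

```python
# Hypothetical sketch, not the real robot code: two robots are "given the
# pill" (muted), one is not, and all are asked which pill they got.
class Robot:
    def __init__(self, name, muted):
        self.name = name
        self.muted = muted  # "dumbing pill" = speaker disabled

    def try_to_say(self, phrase):
        # A muted robot makes no sound; an unmuted one hears its own voice.
        return None if self.muted else phrase

    def answer(self):
        heard = self.try_to_say("I don't know")
        if heard is not None:
            # Hearing its own voice lets it rule out having been muted --
            # a pre-programmed inference step, not spontaneous insight.
            return f"{self.name}: I was not given the dumbing pill"
        return None  # muted robots stay silent

robots = [Robot("R1", True), Robot("R2", True), Robot("R3", False)]
print([r.answer() for r in robots if r.answer() is not None])
```

Only the unmuted robot "passes", and every step it takes was described by a human beforehand.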
[QUOTE=cecilbdemodded;48254469]I liken it to 'emergent' behavior in games. Let's say you're playing an RPG and you stumble across a scene of wildlife fighting NPCs. This is a scene no other player has ever seen, because it's not scripted. You could play this game for a million hours and never see this again. It's STILL the product of the game developers. That wildlife was created by the game developers, as were the NPCs. The rules each operate under were also pre-written. The locations where they can appear, the damage they can do, their animations...and on and on. All pre-written by humans. Same with all robots so far. The robot may shuffle the commands around, may mix inputs and outputs in new ways, but it's all sourced from humans. No robot is doing anything unless somewhere deep in its code some human described the steps needed to take that action.[/QUOTE] And that's the way it's always going to be. We're never going to have spontaneous intelligence appearing inside or between computers that isn't in some way connected to humans.
[QUOTE=Swebonny;48254739]And that's the way it's always going to be. We're never going to have spontaneous intelligence appearing inside or between computers that in some way is not connected to humans.[/QUOTE] What about self-modifying code? And genetic algorithms? [editline]21st July 2015[/editline] [QUOTE=Bordellimies;48252723]I've always wanted to be a parent to a robot, teaching it how things work and such. Hope this would be possible in my lifetime.[/QUOTE] Just get a kid.
[QUOTE=Fourier;48255719]What about the self modifying code? And genetic algorithm? [/QUOTE] Then we could just say that we programmed the genetic algorithm, and it'll never act outside the boundaries the algorithm sets. And so on. I personally don't agree with cecilbdemodded.
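The point about genetic algorithms staying inside their authored boundaries is easy to demonstrate with a minimal evolutionary sketch (my own toy example). Nobody types in the final string, but the search can only ever find what the hand-written fitness function rewards.

```python
import random

random.seed(0)
TARGET = "HELLO"
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "

def fitness(s):
    # Human-written goal: count characters matching the target.
    return sum(a == b for a, b in zip(s, TARGET))

def mutate(s):
    # Human-written variation rule: change one random character.
    i = random.randrange(len(s))
    return s[:i] + random.choice(ALPHABET) + s[i + 1:]

# (1+1) evolution: keep the child whenever it's at least as fit.
best = "".join(random.choice(ALPHABET) for _ in TARGET)
for _ in range(5000):
    child = mutate(best)
    if fitness(child) >= fitness(best):
        best = child
print(best)  # converges to "HELLO"
```

The result is "novel" in the sense that no one wrote it down, yet it's fully bounded by the fitness function and mutation rule a human chose.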
What if someone created a robot brain and linked it to the internet? The initial programming, to get that brain functional and able to connect online - that's human programming. What if that brain learned what 'games' are by doing its own research online? What if, once it learned what games are, it taught itself how to play chess? That wouldn't be human programming; that would be AI. That's because nowhere in the human-written code does the concept of games exist; nowhere in the human-written code is there any mention of chess or its rules. Technically you could argue that it couldn't learn to play chess without the initial code to boot up and connect online, but that's not what I'm talking about. I'm talking about the robot doing something it completely made up on its own, something no human had any input in determining.
I'm sure we'll eventually come up with something that has learning capabilities similar to our own.
[QUOTE=cecilbdemodded;48256483]What if someone created a robot brain, and linked it to the internet. The initial programming, to get that brain functional and with the ability to connect online- that's human programming. What if that brain learned what 'games' are by doing its own research online? What if, once it learned what games are, it taught itself how to play chess? That's not human programming, that would be AI. That's because nowhere in the human written code does the concept of games exist, nowhere in the human written code is there mention of chess or the rules of chess. Technically you could argue that it couldn't learn to play chess without the initial code to boot up and connect online, but that's not what I'm talking about. I'm talking about the robot doing something the robot itself completely made up, something no human had any input in determining.[/QUOTE] I think that's exactly what is so special about this case as well
[QUOTE=Swebonny;48254739]We're never going to have spontaneous intelligence appearing inside or between computers that in some way is not connected to humans.[/QUOTE] Why is this such a revelation? The human brain can't act in ways that aren't predefined relative to its material makeup and genetic instinct, so why would robotic behavior be any different? The human brain might be a very impressive natural aberration, but it's still just an incidental organic computer and is not without its own bounds or limitations. We've been programmed by evolution and trial and error, but we're programmed nonetheless; in overarching concept, robotic learning behavior would be no different - just programmed intentionally instead of accidentally.
I don't agree with the organic computer description of humans, that we are just fancy robots. Animals don't have the self awareness we have, and they have in some cases longer evolutionary histories than us. If it was all about life being an organic computer then we'd be seeing more than just humans with this capability. I think it's obvious there is something beyond our programming that sets us apart. Animals are organic robots that don't have AI. We have AI. We have something beyond programming. So I believe we will one day create AI. Then, and only then, is when we'll have self aware robots. Until then what we have are robots following what are essentially fancy IF-THEN loops.
[QUOTE=Fourier;48255719] Just get a kid[/QUOTE] Kids shit themselves, robots don't.
[QUOTE=cecilbdemodded;48258319]I don't agree with the organic computer description of humans, that we are just fancy robots. Animals don't have the self awareness we have, and they have in some cases longer evolutionary histories than us. If it was all about life being an organic computer then we'd be seeing more than just humans with this capability. I think it's obvious there is something beyond our programming that sets us apart. Animals are organic robots that don't have AI. We have AI. We have something beyond programming. So I believe we will one day create AI. Then, and only then, is when we'll have self aware robots. Until then what we have are robots following what are essentially fancy IF-THEN loops.[/QUOTE] I disagree; the main difference between our brains and the brains of other animals is that we have way more sophisticated language processing and generation. We can use that to build on the knowledge of previous generations, whereas for other animals the knowledge acquired by an individual is almost entirely lost when they die. We can also use it as a tool for chaining together concepts when performing abstract reasoning.
[QUOTE=cecilbdemodded;48258319]Animals don't have the self awareness we have, and they have in some cases longer evolutionary histories than us.[/QUOTE] Depending on your view, we all came from the same primordial soup. Some have learned faster, but we've all been evolving for the same amount of time.
[QUOTE=hypno-toad;48256767]Why is this such a revelation? The human brain can't act out in ways that aren't predefined relative to it's material makeup and genetic instinct, why would robotic behavior be any different? The human brain might be a very impressive natural aberration, but it's still just an incidental organic computer and is not without it's own bound or limitations. We've been programmed by evolutionary and trial and error, but we're still programmed nonetheless, in overarching concept robotic learning behavior would be no different; just programmed intentionally instead of accidentally.[/QUOTE] It's not, I'm just bringing it up in reply to cecil. [editline]21st July 2015[/editline] [QUOTE=cecilbdemodded;48256483]What if someone created a robot brain, and linked it to the internet. The initial programming, to get that brain functional and with the ability to connect online- that's human programming. What if that brain learned what 'games' are by doing its own research online? What if, once it learned what games are, it taught itself how to play chess? That's not human programming, that would be AI. That's because nowhere in the human written code does the concept of games exist, nowhere in the human written code is there mention of chess or the rules of chess. Technically you could argue that it couldn't learn to play chess without the initial code to boot up and connect online, but that's not what I'm talking about. I'm talking about the robot doing something the robot itself completely made up, something no human had any input in determining.[/QUOTE] It sounds like you're talking about machine learning, which already exists and can produce very interesting results. But those systems in turn also involve a lot of "human programming", even though the results such algorithms produce can be completely unknown to us.
I guess you could make some kind of general AI system that manages to deduce and learn things more independently but that probably won't happen until far into the future.
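To make the machine-learning point concrete, here's a toy sketch of my own (not the system discussed in the thread): nowhere below is the rule for AND written down; the program derives working weights from labelled examples instead.

```python
# Tiny perceptron learning the AND function from examples.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w = [0, 0]  # weights, learned from data
b = 0       # bias, learned from data

def predict(x):
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

for _ in range(20):              # classic perceptron learning rule
    for x, target in data:
        err = target - predict(x)
        w[0] += err * x[0]
        w[1] += err * x[1]
        b += err

print([predict(x) for x, _ in data])  # [0, 0, 0, 1] -- it learned AND
```

The training loop is "human programming", but the final weights are found, not written, which is exactly why the results of bigger systems like this can surprise their authors.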
[QUOTE=itisjuly;48245334]I find its voice really annoying. Why couldn't it be something robotic instead of that of a child?[/QUOTE] Because humans don't empathise with *bleep bloop bleep* MY SENSORS SHOW A POPTART ON THE FLOOR - 50% POP - 50% TART - ANALYSIS COMPLETE *bleep bloop*. You only hear and see that in really bad sci-fi movies, and it's never going to be how robots actually communicate.
[QUOTE=cecilbdemodded;48258319]I don't agree with the organic computer description of humans, that we are just fancy robots. Animals don't have the self awareness we have, and they have in some cases longer evolutionary histories than us. If it was all about life being an organic computer then we'd be seeing more than just humans with this capability. I think it's obvious there is something beyond our programming that sets us apart. Animals are organic robots that don't have AI. We have AI. We have something beyond programming. So I believe we will one day create AI. Then, and only then, is when we'll have self aware robots. Until then what we have are robots following what are essentially fancy IF-THEN loops.[/QUOTE] This is a fundamental misunderstanding of evolution. Intelligence is an extremely costly and complex system. Yes, after many, many tens of thousands of years of investment it has paid off well for us, but there are many cases where a stupider yet more efficient animal would be just as if not more fit for survival. Evolution isn't a linear process that leads towards the ultimate lifeform, that being us. It's a system that describes how the organisms most fit for survival are naturally selected, resulting in gradual adaptation to their environment. We are in many ways similar to animals. Many of our processes are practically identical, and are intrinsic to our functioning. We just have a few extra capabilities that let us get ahead. [editline]21st July 2015[/editline] [QUOTE=Swebonny;48258888]It sounds like you're talking about machine learning, which already exists and can produce very interesting results. But they in turn also have a lot of "human programming" in it, although the results produced by such algorithms can be completely unknown to us. 
I guess you could make some kind of general AI system that manages to deduce and learn things more independently but that probably won't happen until far into the future.[/QUOTE] I don't think any system can accomplish anything if it isn't given a task. And what is the brain doing when it rewards us with dopamine, if not assigning us a task? What is boredom, if not the brain telling us to find something of interest?
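The "dopamine as task assignment" idea maps fairly directly onto the reward signal in reinforcement learning. A deterministic toy of my own (the numbers are made up): the agent is never told which option to prefer; the reward signal alone steers its behavior.

```python
# Two options with different hidden average "dopamine" payouts.
payout = [0.2, 0.8]   # hidden average reward per option
value = [0.0, 0.0]    # the agent's learned estimates

for t in range(200):
    # Sample both options early on, then follow whichever feels best.
    arm = t % 2 if t < 20 else value.index(max(value))
    reward = payout[arm]                 # deterministic stand-in for a noisy reward
    value[arm] += 0.1 * (reward - value[arm])  # nudge estimate toward reward

print(value.index(max(value)))  # prints 1: the better-rewarded option wins out
```

No line of code says "prefer option 1"; the preference emerges from the reward signal, much like the poster's framing of dopamine handing out tasks.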
[QUOTE=catbarf;48252504]I implicitly agree with this argument, but I think it's stupid when people extrapolate to essentially say that if you program a robot to say 'ouch' when struck, it actually feels pain and should be treated like a person. It's straightforward to design a machine to emulate human behavior, and people seem to evaluate personhood on the basis of superficial human-like characteristics. Just look at this thread- designing a robot to recognize its own synthetic voice is not a huge feat of artificial intelligence, but it's being called 'self consciousness'. IBM's Watson has incredible emergent complexity, information association, and problem-solving skill, but people recognize it as just a computer. Give a far dumber robot a face and artificial personality and suddenly people think it's a person.[/QUOTE] Because human beings relate with other humans, and things that have human qualities. For instance, the SAINT robots from Short Circuit are somewhat humanoid in appearance, but it's Johnny 5's facial expressions that cause us to relate and sympathize with him. In contrast, a friend of mine watched Automata with me some time ago, and he said the part of the film that he didn't like was that he couldn't relate to the robots. They have human-like faces (particularly so with Cleo), they can speak like us (well, like a Speak-and-Spell, mostly), but they don't have the ability to use body-language via facial expressions. Another example would be The Brave Little Toaster which, while not featuring robots, basically slaps facial expressions and human needs/desires into things that normally people wouldn't relate to (because it's a toaster/vacuum/whatever), and they end up relating to the characters to a certain degree. NAO's facial expressions are lacking, but its voice makes up for it with various expressive tones.
[QUOTE=Jitterz;48245242]Man I fucking want one of these. [video=youtube;YdaEUJLArKs]https://www.youtube.com/watch?v=YdaEUJLArKs[/video][/QUOTE] This is fucking adorable. I would gladly accept these as part of society.
Yeah, I won't believe a thing about their "consciousness" until we can ask them why they won't help a tortoise that's been flipped onto its back in the desert, and get an interesting answer.
[QUOTE=Ziks;48258349]I disagree, the main difference between our brains and the brains of other animals is we have way more sophisticated language processing and generation. We can use that to build on the knowledge of previous generations, whereas for other animals the knowledge acquired by an individual is almost entirely lost when they die. We can also use it as a tool for chaining together concepts when performing abstract reasoning.[/QUOTE] You bring up an interesting point. What if having fingers, to manipulate and build things, and being able to speak and form language is what allowed us to be self aware? That would apply to AI. If we created an AI brain, but it had no ability to see, no ability to taste, or smell, would that limit its intelligence? It could be that without certain key sensory abilities having been included in the design from the start, NO machine intelligence can ever be created beyond the ones we have now. Maybe AI researchers should leave brain development for last and recreate a functioning human body first.
Even if this sounds nerdy, [video=youtube;vjuQRCG_sUw]http://www.youtube.com/watch?v=vjuQRCG_sUw[/video] this should be the guide to knowing whether an AI is sentient.
[QUOTE=cecilbdemodded;48261579]You bring up an interesting point. What if having fingers, to manipulate and build things, and being able to speak and form language is what allowed us to be self aware? That would apply to AI. If we created an AI brain, but it had no ability to see, no ability to taste, or smell, would that limit its intelligence? It could be that without certain key sensory abilities having been included in the design from the start, NO machine intelligence can ever be created beyond the ones we have now. Maybe AI researchers should leave brain development for last and recreate a functioning human body first.[/QUOTE] I think you're right, although I suspect the minimum you would need is a sense that can be used to receive language, and a complementary way to emit language. But yeah, the scope of its intelligence would be limited to the senses you give it. It would probably never be able to understand what an image is if it can't experience one, in the same way that we can't understand what echolocation feels like.
It's running an engine to interpret logical commands. That, or they specifically programmed it for this test. Neither implies consciousness. I'd define consciousness as having enough layers of logical inter-connectivity, connected to its previous experiences, that it starts to provide its own motives and value its own goals over the pre-imposed ones. This is with the constraint of having been provided no structured way of implicitly doing so, but while still allowing it access to the scripts that it uses to process information, fencing off only the bare minimum for the systems that operate the scripts.
put the same stuff in fully armed drones and see what'll happen
[QUOTE=Trixil;48264346]put the same stuff in fully armed drones and see what'll happen[/QUOTE] It'll answer a question then do nothing else. :v:
[QUOTE=chimitos;48264163]It's running an engine to interpret logical commands. That or they specifically programmed it for this test. Neither implies consciousness. I'd define consciousness as having enough layers of logical inter-connectivity connected to it's previous experiences that it starts to provide it's own motives and value it's own goals over the pre-imposed ones. This is with the constraint of having been provided no structured way of implicitly doing so, but while still allowing it access to the scripts that it uses to process infromation. Only fencing off the bare minimum for the systems that operate the scripts.[/QUOTE] You're right, these guys probably have no idea what they're talking about, you should explain it to them