Warning over armies developing 'Terminator'-style killer robots
[QUOTE=latin_geek;38515827]I can't really get behind the idea of completely autonomous robots, I still want someone pulling the trigger even if he's two countries away.
I mean have you seen those robot packmules the army uses? Imagine if they covered it in kevlar and ceramic plating, slapped a couple LMGs on the sides, and there was no human on the saddle. Shit's scary.[/QUOTE]
You know we already have some surgical tasks performed autonomously? Nobody dies, the robot never goes berserk or "forgets" something. They're also [I]better[/I] than humans at identifying things inside tissue.
People are far more fallible than a well-built machine. There's nothing but lower risks to be had from automating tasks, combat is no exception.
[QUOTE=FreakyMe;38517626]But when its weapon systems are active and it is 'clearing' an area of hostiles, it won't be able to make snap judgments about whether the movement in an alley is civilians or insurgents.[/QUOTE]
If it can't identify whether a target is friend, foe, or civilian, it would obviously be programmed not to shoot, otherwise it would defeat the point of even having that system.
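The "don't shoot what you can't identify" rule above is basically a default-deny policy. A toy sketch of what that might look like — every name, class, and threshold here is invented for illustration, not any real system:

```python
from enum import Enum

class Target(Enum):
    FRIEND = "friend"
    FOE = "foe"
    CIVILIAN = "civilian"
    UNKNOWN = "unknown"

def may_engage(classification: Target, confidence: float, threshold: float = 0.99) -> bool:
    """Default-deny rule of engagement: fire only on a confidently identified foe.

    Anything ambiguous -- friend, civilian, or an unknown/low-confidence
    classification -- falls through to "do not shoot".
    """
    return classification is Target.FOE and confidence >= threshold

# Ambiguity never results in engagement:
assert may_engage(Target.FOE, 0.995) is True
assert may_engage(Target.FOE, 0.90) is False      # not confident enough
assert may_engage(Target.UNKNOWN, 1.0) is False   # can't identify -> hold fire
assert may_engage(Target.CIVILIAN, 1.0) is False
```

The point is structural: the only path to `True` requires both a positive foe classification and high confidence, so failure modes default to holding fire.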
[QUOTE=Xenocidebot;38518930]You know we already have some surgical tasks performed autonomously? Nobody dies, the robot never goes berserk or "forgets" something. They're also [I]better[/I] than humans at identifying things inside tissue.
People are far more fallible than a well-built machine. There's nothing but lower risks to be had from automating tasks, combat is no exception.[/QUOTE]
Are they really doing automated surgery in practice, though? We have proof-of-feasibility machines that can perform very specific surgical tasks automatically (such as locating shrapnel in tissue), but to my knowledge all practical robotic surgery is merely [I]robotically assisted[/I] (i.e. there's a surgeon behind the controls), because it isn't reliable enough.
[QUOTE=Awesomecaek;38515656]Wow, never heard of that!
Got any more about that? I am heavily interested![/QUOTE]
I've been having some problems finding a direct source to be honest. I remember seeing one about a year ago, but can't seem to find it anywhere atm.
Found a reference post on Spacebattles so take it as you will.
[quote]
Robert_Utumno
I've been a strong proponent of the Merkava for quite some time due to the very impressive combat history it has had.
But recently I've come to the conclusion that much of said history is due to pretty low quality of the opponents it has faced.
Now the French Leclerc, to my knowledge, takes better advantage of modern technology than any other contender here. Its turret is basically built around its autoloader, which gives it a much higher rate of fire than any other contender, and its automation will, for example, trade retaliatory fire with any targets it has recognized even after the crew is dead. (I actually read about this in a case relating to biometric monitoring (heart rate etc.). The biometric monitoring system had failed in an exercise; thinking the crew was dead, the tank unloaded the whole magazine downrange into the preprogrammed target, almost killing the crew by suffocation.)
[/quote]
[QUOTE=FreakyMe;38517626]But when its weapon systems are active and it is 'clearing' an area of hostiles, it won't be able to make snap judgments about whether the movement in an alley is civilians or insurgents.[/QUOTE]
Except that automated systems can generally sort people into threat/no-threat categories faster than actual humans, as long as you program the necessary markers into them.
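A toy illustration of the "programmed markers" idea: score whatever markers are observed and bucket the contact as threat or no-threat. The marker names, weights, and threshold are all made up for the example:

```python
# Hypothetical marker weights -- invented for illustration only.
THREAT_MARKERS = {
    "carrying_weapon": 0.6,
    "aiming_at_unit": 0.9,
    "hostile_uniform": 0.5,
}

def classify(observed: set, threshold: float = 0.8) -> str:
    """Sum the weights of observed markers; at or above threshold -> threat."""
    score = sum(THREAT_MARKERS.get(m, 0.0) for m in observed)
    return "threat" if score >= threshold else "no-threat"

assert classify({"aiming_at_unit"}) == "threat"
assert classify({"carrying_weapon"}) == "no-threat"   # 0.6 < 0.8
assert classify(set()) == "no-threat"                 # nothing observed -> no threat
```

The speed claim follows from the shape of the computation: it's a fixed lookup-and-sum, so the latency is constant, which a human under stress can't match.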
Why are people rating this winner
[QUOTE=Im Crimson;38519017]Are they really doing automated surgery in practice, though? We have proof-of-feasibility machines that can perform very specific surgical tasks automatically (such as locating shrapnel in tissue), but to my knowledge all practical robotic surgery is merely [I]robotically assisted[/I] (i.e. there's a surgeon behind the controls), because it isn't reliable enough.[/QUOTE]
I said surgical tasks, and yes, that tech already exists. A surgeon behind the controls doesn't necessarily mean they're doing the work; it means the surgery is being watched.
[QUOTE=Speich & Rosen, Medical Robotics]Similar to industrial robotics, the tool path of a surgical robot operating in a semi-autonomous mode (class i) is predefined based on a visual representation of the anatomy acquired by an imaging device (e.g., CT, MRI) and preoperative planning. Once the path is defined, the relative locations of the anatomical structure and the robot are registered, and [B]the robot executes the task using position commands without any further intervention on behalf of the surgeon.[/B] For obvious safety reasons, the surgeon can stop the action, but altering the path requires replanning. Semi-autonomous robotic systems are suitable for orthopedic or neurological surgical procedures with well-constrained anatomical structures such as hard tissues and bones or with soft tissue such as the brain, confined by the skull.[/QUOTE]
They're only getting better with sloppier situations. Unfortunately, paranoia has put the focus on "synergistic" robotics, where the human and robot assist each other, even in situations where a robot could handle the procedure solo. Bone cutting's a good example: there are automated bone drillers, but most people prefer to have a human doing it, even though they need a robot's arm holding theirs and guiding them into the correct position.
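The "class i" workflow the quote describes — plan preoperatively from imaging, register anatomy to robot, execute position commands with the surgeon able only to stop — can be sketched roughly like this (all names and geometry are illustrative, not any real surgical robot's API):

```python
from dataclasses import dataclass

@dataclass
class Pose:
    x: float
    y: float
    z: float

def register(path, offset: Pose):
    """Map the preoperatively planned path into the robot's coordinate frame."""
    return [Pose(p.x + offset.x, p.y + offset.y, p.z + offset.z) for p in path]

def execute(path, stop_requested):
    """Issue position commands until done or the surgeon hits stop.

    Altering the path mid-run isn't possible; that requires replanning.
    """
    executed = []
    for pose in path:
        if stop_requested():
            break
        executed.append(pose)  # stand-in for sending one position command
    return executed

plan = [Pose(0, 0, float(z)) for z in range(5)]   # path from CT/MRI planning
registered = register(plan, Pose(10, 0, 0))        # anatomy-to-robot registration
done = execute(registered, stop_requested=lambda: False)
assert len(done) == 5
```

Note the division of authority: the surgeon's only live input is the `stop_requested` callback — exactly the "can stop, can't steer" property the quote emphasizes.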
[QUOTE=laserguided;38519288]Why are people rating this winner[/QUOTE]
Fully autonomous doesn't mean sentient death machine intent on wiping out the human race (unless they were rating winner because of the warning issued :v:)
[editline]19th November 2012[/editline]
Plus, technology is cool
As far as I remember, the principal of the military school Bart graduated from specifically stated that we'll be the engineers for the robots that go into the actual battles, so we'll be fine.
[QUOTE=Valdor;38520049]Fully autonomous doesn't mean sentient death machine intent on wiping out the human race (unless they were rating winner because of the warning issued :v:)
[editline]19th November 2012[/editline]
Plus, technology is cool[/QUOTE]
Something about a computer deciding who lives and who dies is fucked up.
[QUOTE=laserguided;38520240]Something about a computer deciding who lives and who dies is fucked up.[/QUOTE]
Not at all, in fact, there's one inside your very head at this very moment.
Can't stop progress.
I'm also okay with the military owning railguns that can fire 600-ton projectiles at 30 km/sec, but that's just me.
Holy shit i just watched the second renaissance a couple hours ago
[QUOTE=Sobotnik;38520248]Not at all, in fact, there's one inside your very head at this very moment.[/QUOTE]
Brains != man made electronics.
Which scenario would you like? The Black Ops 2 scenario or the Terminator Salvation scenario?
[QUOTE=laserguided;38520327]Brains != man made electronics.[/QUOTE]
Not at all, your brain is very much like a computer.
Why should a human decide who lives or dies, but not a piece of machinery?
In fact, why should a human decide who gets to live or die in the first place?
[QUOTE=laserguided;38520240]Something about a computer deciding who lives and who dies is fucked up.[/QUOTE]
Assuming it has the correct parameters and hasn't encountered any form of error, a computer deciding who to fire on in a battle is no more fucked up than a human doing the same thing. The main difference is that one is panicky, fighting for their own life as well as following orders, while the other is just following procedures pre-programmed into it to assess a situation. Both are liable to error; only one of the two can be turned off remotely without major loss.
[QUOTE=hexpunK;38520361]Assuming it has the correct parameters and hasn't encountered any form of error, a computer deciding who to fire on in a battle is no more fucked up than a human doing the same thing. The main difference is that one is panicky, fighting for their own life as well as following orders, while the other is just following procedures pre-programmed into it to assess a situation. Both are liable to error; only one of the two can be turned off remotely without major loss.[/QUOTE]
This is assuming the commanders of said robot are correct in their actions. Humans have emotions; that is the difference. Humans are also capable of a lot more in regards to assessing the situation. A good example is a soldier disregarding an order from his superiors to, for example, execute prisoners.
[QUOTE=laserguided;38520403]This is assuming the commanders of said robot are correct in their actions. Humans have emotions; that is the difference. Humans are also capable of a lot more in regards to assessing the situation.[/QUOTE]
People can let emotions cloud their thinking.
[QUOTE=Sobotnik;38520419]People can let emotions cloud their thinking.[/QUOTE]
Exactly, and that's a good thing. If we had no emotion we would be fighting over land/territorial gains today. On a side note, were you not the one who was spamming gifs of people getting their heads sawed off earlier, or am I mistaken?
[QUOTE=laserguided;38520403]This is assuming the commanders of said robot are correct in their actions. Humans have emotions; that is the difference. Humans are also capable of a lot more in regards to assessing the situation.[/QUOTE]
While that is true, emotions are also a major hindrance. A soldier who loses squad-mates in a battle isn't going to act particularly rationally. A soldier who has been under fire for a long period of time is probably going to be stressed and make mistakes. Yes, emotions will stop a soldier from firing on people who aren't acting aggressively towards them, which is a good thing, and as long as they are in the right frame of mind they will rarely fuck up. But that doesn't mean a robot is inevitably going to murder the shit out of everything.
Obviously the technology required to make a robot identify weapons, work out where fire is coming from, and work out whether an enemy can be fired upon, or whether they are surrendering or retreating, etc., is not here yet, and probably won't be for a while. But robots in combat aren't all bad. Service tasks like carrying equipment, heavy lifting, evac, etc. are all possible right now.
Autonomous combatants that save the lives of soldiers outweigh the comparatively minuscule risk of killing a wayward civilian or two before they can be shut down or redirected.
Remorselessness and lack of pity are a pro rather than a con of these machines, I think. They're not susceptible to mental trauma and won't ever be the subject of a "Robert Bales" situation.
All of the identification problems can be fixed with chips and killswitches, ignoring the issue of the failed biometrics test.
[QUOTE=laserguided;38520450]Exactly, and that's a good thing.[/QUOTE]
No it isn't. It leads to situations like "I am going to do X in revenge".
An army robot's main concern would be winning the war; revenge wouldn't come into it.
[QUOTE=laserguided;38520450]Exactly, and that's a good thing.[/QUOTE]
It really isn't. Being angry at something makes you want to get rid of it normally. Being upset makes you not want to partake in things normally. Not all emotions are positive, and letting them rule your actions isn't a good thing. Empathy and other such things are good for a soldier to have. Fuck, good for anyone to have. But that doesn't mean all emotions are good to have in combat.
[editline]20th November 2012[/editline]
Plus, as I said, you can shut down a rampaging machine remotely with ease. You can't really do that with a soldier until they're shot, calm down, or are otherwise subdued.
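The remote-shutdown argument amounts to a kill switch that the control loop consults on every cycle. A minimal sketch of that pattern — all names here are illustrative, not any real system's interface:

```python
import threading

class KillSwitch:
    """Thread-safe flag a remote operator can set to halt the machine."""
    def __init__(self):
        self._halted = threading.Event()

    def trigger(self):          # called by the remote operator
        self._halted.set()

    def active(self) -> bool:
        return self._halted.is_set()

def control_loop(switch: KillSwitch, max_steps: int = 1000) -> int:
    """Run autonomous cycles, checking the kill switch before every action."""
    steps = 0
    for _ in range(max_steps):
        if switch.active():     # checked before every action is taken
            break
        steps += 1              # stand-in for one cycle of autonomous behaviour
        if steps == 3:
            switch.trigger()    # simulate the operator pulling the plug
    return steps

# The loop stops on the very next check after the switch is triggered:
assert control_loop(KillSwitch()) == 3
```

The design point is that the check happens inside the loop, before each action, so the worst-case overrun after an operator triggers the switch is a single cycle — the "turned off remotely without major loss" property claimed above.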
[QUOTE=laserguided;38520450]Exactly, and that's a good thing. If we had no emotion we would be fighting over land/territorial gains today. On a side note, were you not the one who was spamming gifs of people getting their heads sawed off earlier, or am I mistaken?[/QUOTE]
Soldiers on the ground don't decide who they're fighting, they don't need emotions getting in the way and making them do things they'd otherwise find unacceptable. It's not like we're replacing our leaders with robots.
But robots don't have the power to say no or disobey. Thus, there's no choice involved at all unless a soldier has given the order to engage.
[QUOTE=laserguided;38520914]But robots don't have the power to say no or disobey. Thus, there's no choice involved at all unless a soldier has given the order to engage.[/QUOTE]Robots also leave good records of who told them to do what. And it's not like people have ever committed atrocities under orders, right?
I don't understand why people are rating this winner.
[QUOTE=laserguided;38520914]But robots don't have the power to say no or disobey. Thus, there's no choice involved at all unless a soldier has given the order to engage.[/QUOTE]
Soldiers tend to follow orders, that's why they keep their job. Soldiers that don't tend to suffer from rather nasty problems.
There's also: [url]http://en.wikipedia.org/wiki/Milgram_experiment[/url]
[QUOTE=Valdor;38515778]If humans are required to pull the trigger on a drone wouldn't that mean it's a human's fault?
I don't see how autonomy would increase it tenfold either, anything to back that up?[/QUOTE]
[url]http://www.dailymail.co.uk/news/article-2208307/Americas-deadly-double-tap-drone-attacks-killing-49-people-known-terrorist-Pakistan.html[/url]
Humans don't 'pull the trigger'. They authorize shit. It's not a videogame where some guy in camo in an office is flying a drone with a joystick, hand-targeting and shooting missiles like he's playing Ace Combat. These things are virtually autonomous already, and all a human does is authorize steps and strike plans.
Imagine the shit going down should these things become fully autonomous.
Humans do it better; leave the robots to work in trash depots where they belong.