• Slaughterbots
    77 replies
[video=youtube;9CO6M2HsoIA]https://www.youtube.com/watch?v=9CO6M2HsoIA[/video]
Got a very 'Black Mirror' vibe from this
I saw this the other night, and it honestly gave me quite a bit of anxiety over how popular drones are becoming and how easily they could be mass-produced and weaponized. I truly hope we can avoid this.
Dang. P. Scary.
Simple, just disconnect from social media and release smaller drones to kill the bigger ones.
Exploding drones. Great. Now we gotta upgrade countermeasures to exploding eagles.
[img]http://www.steves-digicams.com/assets_c/2016/09/DJI_Mavic_Pro-800px-thumb-450x204-30086.jpg[/img] This ad popped up on Facebook after watching this video, then blipped away after I looked directly at it, despite the page being open for a while. We're already dead. [sp]I own a Mavic and want to build racing drones, they're fun[/sp]

The sentiment that we need to avoid completely automating weapons and removing the human element has been very strong for a long time now (man, I'm sure nobody's made a Skynet joke yet!). Even if we have the technology to operate fast and precisely, I feel like we're gonna need a Geneva robotics convention stating that remote weapons must remain human-operated, so a human decision has to be made in-action, or whatever good qualifiers would be applicable.
commit suicide by sharing a hashtag. god i cant wait for the future
[QUOTE=Davoc;52900836]commit suicide by sharing a hashtag. god i cant wait for the future[/QUOTE] [img]https://vignette4.wikia.nocookie.net/en.futurama/images/7/75/Lynn.png/revision/latest?cb=20111029002559[/img]
This is why I think hard sci-fi will always be cooler than soft sci-fi. Stories building off of existing technology and realism will always have a harder impact than lightsabers, power armor, aliens and mechs. [QUOTE=duckmaster;52900701]release smaller drones to kill the bigger ones.[/QUOTE] Imagine this but in the air. [video=youtube;lUpUQf16qzQ]http://www.youtube.com/watch?v=lUpUQf16qzQ[/video]
Would an EMP gun stop those do you guys think?
Why not just hang a bunch of string/wires from the ceiling and hope to tangle them? Or butterfly net?
Isn't the one big problem with weapons technology that it inherently advances far faster than defenses can keep up? So, say you develop a killbot on this scale: someone eventually develops a counter, then someone develops a killbot that ignores that counter, etc, etc.
[QUOTE=TheNerdPest14;52900957]Would an EMP gun stop those do you guys think?[/QUOTE] You would honestly be better off fashioning an entire outfit that emits a strong electromagnetic field around your entire body.
This is a neat Sci-Fi concept and it really sucks that it's being used as propaganda and fear mongering.
[QUOTE=Spacewizard;52901150]This is a neat Sci-Fi concept and it really sucks that it's being used as propaganda and fear mongering.[/QUOTE] Little drone with the express purpose of quickly and accurately killing people. Man, that's some crazy fear mongering
[QUOTE=Spacewizard;52901150]This is a neat Sci-Fi concept and it really sucks that it's being used as propaganda and fear mongering.[/QUOTE] So warning against abusing artificial intelligence to create advanced kill bots capable of widespread, uninhibited, intelligent killing with no human element is fear mongering? This isn't shitting on artificial intelligence as a concept; it's shitting on the fact that we may very well weaponize this extremely useful technology and come to regret it.
[QUOTE=SunsetTable;52901175]Little drone with the express purpose of quickly and accurately killing people. Man, that's some crazy fear mongering[/QUOTE] You're assuming that these would be readily available to the public with military-grade AI and hardware capable of accurately targeting and firing deadly weaponry. Not only does this technology not exist, but it would be impossible to build even one of these without enormous funds and some of the best AI and robotics experts in the world.

Edit: And as long as we're on the topic, these things aren't even remotely as scary as chemical weapons attacks. If our military or a foreign military really wanted to kill massive amounts of people with no friendly casualties and no regard for ethics, they would just drop whatever deadly chemical they wanted over a heavily populated area. We're specifically developing technology like this because it's more ethical and safer than what we have access to right now.
They can be made now :what: [editline]17th November 2017[/editline] Let's ignore the decorated AI researcher giving us a warning because it's actually fear mongering, guys. Progress, no restrictions.
[QUOTE=Spacewizard;52901217]You're assuming that these would be readily available to the public with military grade AI and hardware capable of accurately targeting and firing deadly weaponry. Not only does this technology not exist, but it would be impossible to build even one of these without enormous funds and some of the best AI and robotics experts in the world. Edit: And as long as we're on the topic, these things aren't even remotely as scary as chemical weapons attacks. If our military or a foreign military really wanted to kill massive amounts of people with no friendly casualties and no regard for ethics they would just drop whatever deadly chemical they wanted over a heavily populated area. We're specifically developing technology like this because it's more ethical and safer than what we have access to right now.[/QUOTE] At the start of the video it's handwaved by a quick news report that the military tech was hacked/reverse engineered and released on the internet. Combine this with a future of 3D printing and even cheaper, more powerful electronics, and the only thing unfeasible about it is people acquiring explosives. But a real terrorist cell would have an easier time of that.

It's possible even now to rig up a handgun to a drone and fire it remotely; imagine if you had an even smaller drone with one bullet and a small firing mechanism attached to it, if you couldn't get hold of explosives. There are a lot of possibilities here.

And the video also shows that these start as small, targeted attacks. Chemical weapons aren't efficient if you want to assassinate a person or a room full of people. I think you're missing a lot of the nuance and implications of the video.
What kind of explosive would you even need to do something like this? I don't think it's something that your average terrorist would be able to cook up. And a bullet probably wouldn't work without a long enough barrel; you'd have to considerably size up the drones.
[QUOTE=honestfam;52901287]What kind of explosive would you even need to do something like this? I don't think it's something that your average terrorist would be able to cook up. And a bullet probably wouldn't work without a long enough barrel, you'd have to considerably size up the drones.[/QUOTE] The magic one-shot-kill, 100% accurate explosive gun isn't the hard part of this to replicate, despite how ridiculous that concept is. Like you said, if it's not feasible then you can just up the size of the drone and replace the magic gun with a realistic gun.

The AI is the hard part. The AI necessary to accurately identify and shoot targets would be ridiculously difficult to produce or reverse engineer. We have technology similar to this, literally the most advanced mass-produced combat AI in the world, in an American aircraft right now, and that AI hasn't been leaked or hacked, specifically because this kind of tech is as secure as anything can get. It's not something a terrorist cell or a similarly sized group could get from hacking into the US military.

The ethical implications of this kind of AI are interesting to discuss, but the idea that we're going to completely halt progress in this field of AI because it could somehow fall into the wrong hands is absurd. The group that made this video aren't interested in discussion; they just want to scare people into getting this kind of technology banned.
Think dirtier, light someone on fire, acid attacks. Terrorists would rather maim than kill. He went soft with it.
[QUOTE=honestfam;52901287]What kind of explosive would you even need to do something like this? I don't think it's something that your average terrorist would be able to cook up. And a bullet probably wouldn't work without a long enough barrel, you'd have to considerably size up the drones.[/QUOTE] A bullet could very well work. A smaller pistol round with, like, 1-2 inches of barrel should be plenty to penetrate a skull. edit: Not to mention that current drones can already hold regular guns.
That would work great until you realize that a countermeasure would have access to the same tech and the benefit of being on the defense.
[QUOTE=TheMrFailz;52901344]A bullet could very well work. A smaller pistol round with like, 1-2 inches of barrel should be plenty enough to penetrate a skull. edit: Not to mention that current drones can already hold regular guns.[/QUOTE] Wouldn't need a gun, a ricin-tipped needle could do the job just as well
[QUOTE=TheMrFailz;52901344]edit: Not to mention that current drones can already hold regular guns.[/QUOTE] A drone with a gun nowadays is gonna sound like a swarm of bees from a mile away, given the size of the platform you'd need to carry that weight and ammunition. Plus I'd think firing off a bullet is probably gonna be too much kick for the buggers to handle multiple quick shots, especially if there isn't ample room for the recoil.

So the point of these ones is: make them fast, tiny, accurate, cheap, and disposable, so you can just dump a shitton of them out to do the job with a quick pop. No need for reusability, as they could perform actions that would damage themselves if it means getting the job done. It's more an evolution of grenades than guns.

We've already got 'tiny whoop' platforms capable of fast, tight maneuvers; imagine these things with omniscient AI, maybe swarm networking controlling them. [quote][media]https://www.youtube.com/watch?v=KMR0BeXYZBE[/media][/quote] and even [url=https://youtu.be/GzPloaaJ6-c?t=7m16s]human pilots capable of doing those breach charge maneuvers[/url] from the end of the vid, with bulkier units [t]https://i.imgur.com/GT4zvA9.png[/t][t]https://i.imgur.com/Xua29gw.png[/t]

But in the end the problem is dissociating human action from that of the robots. Autonomy can quickly remove liability. Imagine a military having this level of broad power at nigh-zero self-risk.
[QUOTE=Spacewizard;52901334]The magic one-shot kill 100% accurate explosive gun isn't the hard part of this to replicate despite how ridiculous that concept is. Like you said, if it's not feasible then you can just up the size of the drone and replace the magic gun with a realistic gun. The AI is the hard part. Producing and replicating the AI necessary to accurately identify and shoot targets would be ridiculously difficult to produce or reverse engineer. We have technology similar to this, literally the most advanced mass produced combat AI in the world, in an American aircraft right now and that AI hasn't been leaked or hacked specifically because this kind of tech is as secure as anything can get. It's not something a terrorist cell or a similarly sized group could get from hacking into the US military. The ethical implications of this kind of AI is interesting to discuss, but the idea that we're going to completely halt progress in this field of AI because it could somehow fall into the wrong hands is absurd. The group that made this video aren't interested in discussion, they just want to scare people into getting this kind of technology banned.[/QUOTE] The feeling I got from the video is that the entire mass-killing thing was a blatant false-flag using a bullshit "them turrists done stole our trillion-dollar AI right from under our noses" news story to justify mysterious mass-killings of specifically-aligned senators and political activists.
[QUOTE=eatdembeanz;52901454]The feeling I got from the video is that the entire mass-killing thing was a blatant false-flag using a bullshit "them turrists done stole our trillion-dollar AI right from under our noses" news story to justify mysterious mass-killings of specifically-aligned senators and political activists.[/QUOTE] Furthermore, it vaguely implied (and the creator later said it was a major point) that automating this kind of stuff could become far too heavy-handed and blow back in our faces over shit like student protesters being targeted for using a hashtag, and that it may actually have been beyond the direct control of the people it belonged to, with the AI learning what threatens its side's power and acting on that as well.
That scene in the classroom was really chilling.