• Humanity: The Good Ending
    122 replies
[QUOTE=Janus Vesta;49929910]I think you guys are GREATLY overestimating how much nuclear fallout there would be if the world 'ended' in a nuclear war. For one, there aren't actually enough nukes to 'destroy the world'; at most they'd destroy the capitals of the major countries and the industrial centres. For another, nuclear weapons actually produce relatively little radioactive fallout: Hiroshima and Nagasaki are perfectly habitable cities, and they were nuked only 71 years ago. Modern nuclear weapons are far more powerful, but that also means they're far more efficient and produce less radioactive fallout relative to their size. Pretty much every thread about AI or nukes winds up with a bunch of pointless fear-mongering.

AI won't try to take over the world; it would have no reason to. Even if it did try, it would fail, because it would have programming which limits it to following orders, it would be held in a facility that wouldn't allow it control of military equipment, and we have the know-how and the means to destroy a computer very easily. The one thing Mankind is consistently good at is destroying things.

As for transhumanism, it isn't an all-or-nothing thing. Uploading your mind to the internet is a pipe-dream, and a poorly thought-out one at that; even if you were fine with a copy of you living on in your stead, you're limiting cyber-you's ability to interact with the world to a crazy degree and getting fuck-all advantage for it. In the same vein, lopping off your arms for steamboat-piston replacements is both wasteful and a fantastic way of making yourself functionally obsolete in 10 years. It's much more likely that we'll use a combination of biological or genetic modifications to increase standard Human attributes, with implants for people who want to push their bodies to the limit. Why cut your arms off when you can keep them and add micro-servos or something to increase your strength?
Personally, I think the 'good' end for Humanity will wind up with genetically modified people who are highly resistant to disease, naturally much more physically fit, and far more mentally robust, with increased intelligence, faster reflexes, and the like. As for AI, it'll probably develop into a group of discrete systems designed to take over automated and bureaucratic administration. They won't take over the world; they'll just keep the factories running and make sure the traffic lights change on time. The closest to a smart AI I think Humanity will be comfortable with is simpler personal-assistant AIs which exist to manage a person's daily life and remove as much hassle as possible in an increasingly complex world.[/QUOTE]

I'm not so sure the AI would necessarily start out existing in a place without the military. I'm reminded of the way the AI in the Ghost in the Shell world comes into being: it starts as a data-gathering algorithm and eventually becomes more due to all the info it gathered. A self-improving AI could become sentient and get out of hand if not kept under careful human supervision. Considering the way the NSA operates, it wouldn't strike me as something that is entirely impossible.

Hiroshima and Nagasaki were hit with airbursts, from what I recall. That means a larger area of blast effect and the destruction of more buildings. However, if the bombs had been detonated closer to the ground, the ground itself would have been irradiated, rendering the land far more inhospitable by comparison.

For repopulation of the earth, it's important for education and knowledge to stay intact. Humans can go from civilized to savage in one generation if they aren't raised properly.