• Humanity: The Good Ending
    122 replies
[QUOTE=Superkilll307;49914362]I like to think this will only ever happen if a solar flare hits and we're all cyborgs.[/QUOTE] Well, not quite. Civilization can and will collapse again, because it has already happened dozens of times before. I don't see what makes the latest iteration of it somehow immune.
[QUOTE=Swilly;49914293]Imagine Putin being immortal with the kind of powers in this. [editline]11th March 2016[/editline] And the Greek Gods aren't Villains nor good people. [B]They're greedy conniving jackasses looking out for themselves.[/B][/QUOTE] They're more like human beings with immortality and great power. I would recommend reading the Iliad for a good characterisation of the Gods. There's a really good scene where Zeus debates defying fate to save one of his mortal sons from death, but Hera reminds him that there are other gods whose mortal sons are dying too. Zeus realizes it's not fair to save his own son while others die, so he lets his son die and weeps. There are plenty of parts where the Gods do horrible things for their own gain, but oftentimes they end up regretting it or paying for it in some way.
[QUOTE=Broguts;49914737]They're more like human beings with immortality and great power. I would recommend reading the Iliad for a good characterisation of the Gods. There's a really good scene where Zeus debates defying fate to save one of his mortal sons from death, but Hera reminds him that there are other gods whose mortal sons are dying too. Zeus realizes it's not fair to save his own son while others die, so he lets his son die and weeps. There are plenty of parts where the Gods do horrible things for their own gain, but oftentimes they end up regretting it or paying for it in some way.[/QUOTE] This is true.
Transhumanism is the correct path for humanity. While I think the consciousness-merging super-intelligence goes a little too far, I'll be first in line to upload myself into cyberspace (even if it technically kills me). With the vast potential of AI, it's the only way that humans could conceivably keep up.
I have never understood the whole "upload your consciousness" thing. It would literally just be a copy of you, not you yourself. It's not immortality in the least.
[QUOTE=BusterBluth;49914827]I have never understood the whole "upload your consciousness" thing. It would literally just be a copy of you, not you yourself. It's not immortality in the least.[/QUOTE] It seems like the same problem as teleporters; I really, [I]really[/I] don't want to take that chance.
[QUOTE=IrishBandit;49914810]Transhumanism is the correct path for humanity. While I think the consciousness-merging super-intelligence goes a little too far, I'll be first in line to upload myself into cyberspace (even if it technically kills me). With the vast potential of AI, it's the only way that humans could conceivably keep up.[/QUOTE] [media]https://www.youtube.com/watch?v=tcdVC4e6EV4[/media] To think we need to keep up is to think that AI will think like us. Further, why do we need AI? We have created almost all of our technology because it answers a question, a need, or a want: it makes our lives easier or answers questions we can't yet answer ourselves. AI doesn't fit into this category at all. AI would not help us in any capacity, and in fact, if everything is to be believed, it would be a huge detriment to the middle and poorer classes of the Earth. It's an ego rub.
[QUOTE=BusterBluth;49914827]I have never understood the whole "upload your consciousness" thing. It would literally just be a copy of you, not you yourself. It's not immortality in the least.[/QUOTE] True that. It's more akin to making a sperm donation, if anything, since the code that comprises your memories, accumulated knowledge, emotional bonds and so on is used to create something new. But when it comes to preserving the individual stream of consciousness, that's a gestalt property that can't be extracted or transferred as simply as the contents of one's porn folder. And teleporters are essentially vaporization-based murder machines connected to a universal constructor. Disassemble the computer that holds your stream of consciousness, and it is lost forever. There's no way to get it back, no matter how perfect the clone that pops out on the other side.
TBH this "good ending" sounds more like a dystopia: people are uploaded into a mega-consciousness against their will and forced to endlessly merge, expand, merge, expand, merge, expand...
The other issue with this good ending, and transhumanism in general, is that it's utopian in nature. These are utopian ideals being passed off as an actual reality.
[QUOTE=antianan;49913415]That's why i think that at some point we will have to deal with the fact that the human race as we know it must either disappear to be replaced by something completely different, or just disappear completely.[/QUOTE] The end goal of modern technology is getting us off this death trap we call a planet so we don't all get killed by the climate or a stray meteor in another million years. Humanity has at least that long to get its shit together and get a sizable number of people onto one of the Earth-like planets in or around our galaxy. Humanity ain't going anywhere unless we blow ourselves up.
[QUOTE=Swilly;49914897][media]https://www.youtube.com/watch?v=tcdVC4e6EV4[/media] To think we need to keep up is to think that AI will think like us. Further, why do we need AI? We have created almost all of our technology because it answers a question, a need, or a want: it makes our lives easier or answers questions we can't yet answer ourselves. AI doesn't fit into this category at all. AI would not help us in any capacity, and in fact, if everything is to be believed, it would be a huge detriment to the middle and poorer classes of the Earth. It's an ego rub.[/QUOTE] AI is an inevitability (barring some unknown factor that blocks us from creating it), simply due to our ever-increasing need for smarter and smarter machines. We could even end up making one by accident. AIs have a wide range of uses; anything that requires analyzing large amounts of data (stock trading, medical research) would be an ideal application.
[QUOTE=IrishBandit;49914970]AI is an inevitability (barring some unknown factor that blocks us from creating it), simply due to our ever-increasing need for smarter and smarter machines. We could even end up making one by accident. AIs have a wide range of uses; anything that requires analyzing large amounts of data (stock trading, medical research) would be an ideal application.[/QUOTE] That doesn't say why. What you've said is that we're inevitably going to create one because we've let our lives become so incredibly complex that we as humans can no longer accurately live in a world of our own making. Which I take as both an insult and a fault of our own egos.
[QUOTE=Swilly;49915022]That doesn't say why. What you've said is that we're inevitably going to create one because we've let our lives become so incredibly complex that we as humans can no longer accurately live in a world of our own making. Which I take as both an insult and a fault of our own egos.[/QUOTE] Increasing complexity is the result of all life and progress. If you don't progress, you stagnate and die. [QUOTE=BusterBluth;49914827]I have never understood the whole "upload your consciousness" thing. It would literally just be a copy of you, not you yourself. It's not immortality in the least.[/QUOTE] Alternatively, keep your brain in a jar and hook it up to cyberspace. That way you don't lose continuity of self, but you still get most of the benefits.
[QUOTE=IrishBandit;49915041]Increasing complexity is the result of all life and progress. If you don't progress, you stagnate and die.[/QUOTE] That's misquoting and taking the rules of evolution out of context. [editline]11th March 2016[/editline] You also attribute stagnation to death, when uncontrolled expansion is just as deadly in populations. [editline]11th March 2016[/editline] Further, this idea that it will inevitably happen is forcing one particular way forward instead of actually offering options. Which is why I'm against it.
[QUOTE=IrishBandit;49915041]Increasing complexity is the result of all life and progress.[B] If you don't progress, you stagnate and die.[/B] Alternatively, keep your brain in a jar and hook it up to cyberspace. That way you don't lose continuity of self, but you still get most of the benefits.[/QUOTE] We are progressing naturally, though. Fewer and fewer people are born with evolutionary carryovers like wisdom teeth, and our brains are getting smaller and more efficient. The human race is literally evolving before our eyes; we merely need to be patient, and evolution will create the superman. AI is nothing more than a trap.
[QUOTE=Swilly;49915066]That's misquoting and taking the rules of evolution out of context. [editline]11th March 2016[/editline] You also attribute stagnation to death, when uncontrolled expansion is just as deadly in populations. [editline]11th March 2016[/editline] Further, this idea that it will inevitably happen is forcing one particular way forward instead of actually offering options. Which is why I'm against it.[/QUOTE] It's not the rules of evolution, it's the fundamental basis of life. Expansion is only death when there is nowhere left to go; expansion is what saved humanity from extinction in the past, and it is what will save us from extinction in the future. Unless we specifically stop AI from being created (which is a valid route, though unlikely), standard humans will become outmoded. There is always the chance that there is some large barrier to creating true AI that we do not overcome in the mid-future. [editline]11th March 2016[/editline] [QUOTE=Broguts;49915110]We are progressing naturally, though. Fewer and fewer people are born with evolutionary carryovers like wisdom teeth, and our brains are getting smaller and more efficient. The human race is literally evolving before our eyes; we merely need to be patient, and evolution will create the superman. AI is nothing more than a trap.[/QUOTE] Natural evolution is too limited and slow. When we can improve our bodies ourselves (gene modification or cybernetics), we will greatly expand our own capabilities and longevity.
[QUOTE=IrishBandit;49915115] [editline]11th March 2016[/editline] Natural evolution is too limited and slow. When we can improve our bodies ourselves (gene modification or cybernetics), we will greatly expand our own capabilities and longevity.[/QUOTE] Oh, I'm fine with cybernetics and gene-modding; it's AIs and brain emulation that I dislike. I see no purpose in creating an AI that could potentially destroy humanity. What is the point? So the AI advances a crazy amount, great! Except it's killed all of humanity, or integrated them into its collective or whatever. What is the purpose of that?
[QUOTE=IrishBandit;49915115]It's not the rules of evolution, it's the fundamental basis of life. Expansion is only death when there is nowhere left to go; expansion is what saved humanity from extinction in the past, and it is what will save us from extinction in the future. Unless we specifically stop AI from being created (which is a valid route, though unlikely), standard humans will become outmoded. There is always the chance that there is some large barrier to creating true AI that we do not overcome in the mid-future. [editline]11th March 2016[/editline] Natural evolution is too limited and slow. When we can improve our bodies ourselves (gene modification or cybernetics), we will greatly expand our own capabilities and longevity.[/QUOTE] And that's the anthropomorphism I was talking about in my first paragraph: you are of the opinion that we are so far above nature that we should take complete control of things we don't understand. You want to skip getting your driver's license and go straight to driving a school bus full of children. Also, the huge barrier to AI is in the first video I posted: [I]an accurate model of reality.[/I] And that pish about 'stagnation means death' has no basis in ecosystems, which constantly seek to restore or maintain homeostasis; in other words, don't rock the boat, keep things level, and stagnate. This is why I'm against transhumanism: the people who subscribe to it think they understand how things work better than the natural laws that have existed since the big bang.
all hail THE BORG
[QUOTE=Swilly;49915157]This is why I'm against transhumanism: the people who subscribe to it think they understand how things work better than the natural laws that have existed since the big bang.[/QUOTE] I'm really not sure what you're getting at here. Nothing we do violates natural laws; the very fact that we are able to do something means it is permissible by the universe. The "laws of nature" are not rigid constructs; they are something we came up with to help us model the world around us. As we realise that we can "evolve" ourselves through technology, we realise the world around us doesn't fit that model any more and we create a new one. Discovering "oh neat, we can just do this" doesn't violate a law that an unthinking, unfeeling entity like the universe didn't even create in the first place. Transhumanism isn't a curse if we can do it right. There are loads of interesting places we could go. Not everybody would opt into it; we'd still evolve naturally, but we would definitely begin to diverge (assuming Homo sapiens as a whole can even save ourselves long enough to survive that far).
[QUOTE=IrishBandit;49915041] Alternatively, keep your brain in a jar and hook it up to cyberspace. That way you don't lose continuity of self, but you still get most of the benefits.[/QUOTE] Your brain still ages, though.
[QUOTE=Swilly;49915157]And that's the anthropomorphism I was talking about in my first paragraph: you are of the opinion that we are so far above nature that we should take complete control of things we don't understand. You want to skip getting your driver's license and go straight to driving a school bus full of children. Also, the huge barrier to AI is in the first video I posted: [I]an accurate model of reality.[/I] And that pish about 'stagnation means death' has no basis in ecosystems, which constantly seek to restore or maintain homeostasis; in other words, don't rock the boat, keep things level, and stagnate. This is why I'm against transhumanism: the people who subscribe to it think they understand how things work better than the natural laws that have existed since the big bang.[/QUOTE] Sentient life is inherently above all other forms of life. We are not above nature; we are the crowning achievement of nature. Stagnation and homeostasis did not produce sentient life, and attempting to enter a state of "natural balance" is not only a waste of 4.5 billion years of evolution, it will also just get us killed. [editline]11th March 2016[/editline] [QUOTE=BusterBluth;49915227]Your brain still ages, though.[/QUOTE] Yeah, that's the downside. Although I'm sure that once we get to the stage where we're able to become brains-in-jars, we'll have medical technology that can extend the life of the brain for a long time.
[QUOTE=IrishBandit;49915243]Sentient life is inherently above all other forms of life. We are not above nature; we are the crowning achievement of nature. Stagnation and homeostasis did not produce sentient life, and attempting to enter a state of "natural balance" is not only a waste of 4.5 billion years of evolution, it will also just get us killed. [/QUOTE] Sentient life is not inherently above all other forms of life, and this folly of understanding is what led to our planet dying over the course of less than a fucking century. Also, you misread what I said: I meant a homeostasis where, if something is failing, another life form or evolutionary trait will arise to fix the problem. We've fucked the balance so hard that fish currently don't know which gender they are, our genetically modified crops are breeding superbugs, and we've used antibiotics so much that strains resistant to our strongest forms are now a thing. This line of thinking is dangerous in every regard, to our species as well as every other species on the planet. Tell me that again after reading about how we caused a mass extinction event with your ideal that because we can think, we are somehow above others. [editline]12th March 2016[/editline] [QUOTE=hexpunK;49915223]I'm really not sure what you're getting at here. Nothing we do violates natural laws; the very fact that we are able to do something means it is permissible by the universe. The "laws of nature" are not rigid constructs; they are something we came up with to help us model the world around us. As we realise that we can "evolve" ourselves through technology, we realise the world around us doesn't fit that model any more and we create a new one. Discovering "oh neat, we can just do this" doesn't violate a law that an unthinking, unfeeling entity like the universe didn't even create in the first place. Transhumanism isn't a curse if we can do it right. There are loads of interesting places we could go.
Not everybody would opt into it; we'd still evolve naturally, but we would definitely begin to diverge (assuming Homo sapiens as a whole can even save ourselves long enough to survive that far).[/QUOTE] Transhumanism, in its current state, with its entanglement with capitalism and growth-based economies, as well as its perverted and corrupted misquoting of evolution, will not help humanity. Further, any system can be a good thing if it's done right. However, we as humans never do it right. Am I completely against transhumanism? No, but we barely understand how our own weather fucking works. After 20+ years of using speed to help children with ADHD, just last year it was revealed that the medications cause short-term memory issues. We're only now finding out that children spending 2+ hours a day on the internet actually stunts their social growth, and it's been nearly 30 years. Those interesting places are idealistic at best, and while it's always good to dream toward them, you should never expect things to be done right. That's what I have against transhumanism: it's not the concept, it's the people pushing an idealistic utopia like the next hot thing.
lmao fuck becoming a computer so I can trade stocks better, I'd rather enjoy a regular human life. Like, I can understand the appeal of becoming a computer program or a brain in a jar if all you do as a regular human is sit in front of a computer anyway, but if you have an interesting (to yourself) social life, friends, a significant other, I really don't see the appeal in trading away what makes you human just so you can be smarter. All of you who seem super pro this don't seem to appreciate the more physical aspects of life imho. Not to mention there are certain aspects of human society that AI should be kept the fuck away from, like the military. Think about the cold war and all those bugs where nuclear launches were detected and the humans in charge correctly determined they were false, despite what all their hardware and software was telling them. If that was an AI, it would have just blown up everything.
[QUOTE=Slim Charles;49917129]lmao fuck becoming a computer so I can trade stocks better, I'd rather enjoy a regular human life. Like, I can understand the appeal of becoming a computer program or a brain in a jar if all you do as a regular human is sit in front of a computer anyway, but if you have an interesting (to yourself) social life, friends, a significant other, I really don't see the appeal in trading away what makes you human just so you can be smarter. All of you who seem super pro this don't seem to appreciate the more physical aspects of life imho. Not to mention there are certain aspects of human society that AI should be kept the fuck away from, like the military. Think about the cold war and all those bugs where nuclear launches were detected and the humans in charge correctly determined they were false, despite what all their hardware and software was telling them. If that was an AI, it would have just blown up everything.[/QUOTE] The A.I. is so advanced in this scenario that it's a self-replicating cloud of nanites, and the consciousnesses contained within are immortal and all-knowing, with access to all of Earth's history (and that of any alien civilizations they encounter) and the ability to flawlessly recreate any moment, scene, world, or person you can possibly imagine, indistinguishable from the real thing. You are pretty much a god. And the A.I. in your war scenario would be just as good as a human, if not better: a robotic algorithm detects missiles and launches counter-missiles even on a false alarm, whereas an A.I. in charge of launching counter-missiles would face the same ethical hardships as a person. I'm not on Team Hivemind either, but it's really not as suckish as you make it out to be.
[QUOTE=Broguts;49915110]We are progressing naturally, though. Fewer and fewer people are born with evolutionary carryovers like wisdom teeth, and our brains are getting smaller and more efficient. The human race is literally evolving before our eyes; we merely need to be patient, and evolution will create the superman. AI is nothing more than a trap.[/QUOTE] That's not how evolution works; in fact, evolution in humans has more than likely stopped due to globalization and such.
[QUOTE=Superkilll307;49917484]That's not how evolution works; in fact, evolution in humans has more than likely stopped due to globalization and such.[/QUOTE] Uh, evolution is still going on; humans are still reproducing and random mutations are still happening. The humans of today (especially those from societies with a long history of agriculture) are extremely different to the humans that lived 12,000 years ago. People are even appreciably different to those that lived in the Roman period. And that's without mentioning all of the different populations around the world, which are pretty different from one another as a result. The most obvious point is that, with nearly 7.5 billion people on the planet, the pool of potential beneficial mutations has grown by a factor of 7500 since the ice age. The result is that it's more likely for beneficial mutations to develop and spread, and they have been doing so rapidly enough to have had an impact on recent history.
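For what it's worth, the factor-of-7500 figure implies an ice-age population of roughly one million people. A quick back-of-envelope check (both population numbers here are loose assumptions, not exact counts):

```python
# Rough sanity check of the "factor of 7500" claim.
# Assumed figures: ~1 million humans near the end of the last ice age,
# ~7.5 billion today (2016). Neither is an exact count.
ice_age_population = 1_000_000
current_population = 7_500_000_000

growth_factor = current_population / ice_age_population
print(growth_factor)  # 7500.0
```

So the claim holds only under that assumed ice-age population; estimates of post-glacial population vary quite a bit, which would shift the factor accordingly.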
[QUOTE=Sobotnik;49917585]Uh, evolution is still going on; humans are still reproducing and random mutations are still happening. The humans of today (especially those from societies with a long history of agriculture) are extremely different to the humans that lived 12,000 years ago. People are even appreciably different to those that lived in the Roman period. And that's without mentioning all of the different populations around the world, which are pretty different from one another as a result. The most obvious point is that, with nearly 7.5 billion people on the planet, the pool of potential beneficial mutations has grown by a factor of 7500 since the ice age. The result is that it's more likely for beneficial mutations to develop and spread, and they have been doing so rapidly enough to have had an impact on recent history.[/QUOTE] There's no inherent reason for these mutations to carry on, as society ensures survival under most circumstances.
[QUOTE=Superkilll307;49917593]There's no inherent reason for these mutations to carry on, as society ensures survival under most circumstances.[/QUOTE] But they do, for the simple reason that people are still being born with random mutations (some of which may be beneficial). As time goes on, the genes which confer these reproductive advantages will continue to spread throughout the population. I don't see what makes us immune to this process, considering it's going on literally right this second.