• Stephen Hawking: Humanity won't last longer than 1000 years on Earth.
[QUOTE=Emperor Scorpious II;51402112]To be human is to not give a shit about the "objective meaningless" of life, else why not save space and off oneself?[/QUOTE] Well, that would be your definition of being human, not mine. And why not off myself? Because I'm having fun. Does it accomplish anything? No, but I enjoy it so I keep at it.
Would you object to someone enjoying themselves at your expense?
[QUOTE=HumanAbyss;51402027]there's no way your arguments will be well received though as you're being incredibly blithe and arrogant and dismissive of other people and their desires so why would anyone even remotely respect your desires[/QUOTE] He's just living up to his username. Ignore him.
[QUOTE=HumanAbyss;51402153]Would you object to someone enjoying themselves at your expense?[/QUOTE] If I knew that it would make economic sense for me to object, then I would. If it was a useless endeavor or was too taxing to be worth the possible profit, then no, I wouldn't. [editline]20th November 2016[/editline] [QUOTE=Govna;51402154]He's just living up to his username. Ignore him.[/QUOTE] I picked my username because I was contemplating the nature of human-produced BS and why the world is so full of it, so it was the username that came to mind. It has nothing to do with the content of my posts.
[QUOTE=RampantBS;51402173]If I knew that it would make economic sense for me to object, then I would. If it was a useless endeavor or was too taxing to be worth the possible profit, then no, I wouldn't.[/QUOTE] That sounds like a quote from every major oil tycoon in the past century.
[QUOTE=Emperor Scorpious II;51402194]That sounds like a quote from every major oil tycoon in the past century.[/QUOTE] The majority of oil tycoons are very unreasonable and superstitious people. I think that's just your perception of oil tycoons, not the reality of it.
[QUOTE=RampantBS;51402173] I picked my username because I was contemplating the nature of human-produced BS and why the world is so full of it, so it was the username that came to mind. It has nothing to do with the content of my posts.[/QUOTE] Sounds a lot like bullshit to me.
[QUOTE=Emperor Scorpious II;51402205]Sounds a lot like bullshit to me.[/QUOTE] It was mostly an active subject on my mind because of the US election, the populist vote, and eventually religion, spirituality, and the re-emerging mainstream Occult.
[QUOTE=RampantBS;51402215]re-emerging mainstream Occult.[/QUOTE]the whaaaaat?
[QUOTE=Joazzz;51402262]the whaaaaat?[/QUOTE] Recent bullshit like meme magic and all its fake evidence, Occult performance-art news in every tabloid, the South Korean president being controlled by a shaman, and other such events have led to a resurgence of people interested in the Occult, thinking it's a ticket to the top, not to delusion-land.
[QUOTE=_Axel;51397885]Why do people always talk about sentient AI as if they know what it will be like? We're the ones who will design it in the first place. How it behaves will be entirely dependent on how we build it.[/QUOTE] Before addressing this point, consider why many leading experts in Computer Science/physics are pointing this out as a concern, and whether you are a leading expert. Now, on to the point: it is completely possible to program something that has unpredictable behavior/is far more complex than we can understand. Hell, we already have programs like that, such as neural networks. In more traditional paradigms, we can analyze what the programs do and get an idea of how they work. Neural networks are extremely difficult to "look into". They seemingly "just work". The idea of a technological singularity is that if it's possible to make a system that improves itself, it will likely exponentially increase its capabilities and improve far beyond our ability to understand it. As another example of low intelligence systems creating higher intelligence beings, evolution does exactly that. I don't think it's paranoid at all to consider that we humans can do something similar.
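(As an aside, the "neural networks just work" point above is easy to demonstrate. Below is a minimal sketch, not anyone's actual method: a tiny 2-2-1 network trained on XOR with plain NumPy. The architecture, seed, and learning rate are all illustrative choices. The point is that after training, the network's behavior is entirely encoded in a handful of floats in W1/W2, and nothing in those numbers reads like a human rule.)

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR truth table: the target function the network must learn
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Random initial weights for a 2-2-1 network
W1 = rng.normal(size=(2, 2)); b1 = np.zeros(2)
W2 = rng.normal(size=(2, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(X):
    h = np.tanh(X @ W1 + b1)          # hidden layer
    return h, sigmoid(h @ W2 + b2)    # output layer

def mse(pred):
    return float(np.mean((pred - y) ** 2))

_, pred = forward(X)
initial_loss = mse(pred)

lr = 0.5
for _ in range(5000):
    h, pred = forward(X)
    # backprop: mean-squared-error gradient through sigmoid, then tanh
    d_out = (pred - y) * pred * (1 - pred) * (2 / len(X))
    d_h = (d_out @ W2.T) * (1 - h ** 2)
    W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_h);   b1 -= lr * d_h.sum(axis=0)

_, pred = forward(X)
final_loss = mse(pred)

# The trained weights are the entire "explanation" of the behavior --
# a few opaque floats with no human-readable meaning.
print("loss:", initial_loss, "->", final_loss)
print("W1:\n", W1, "\nW2:\n", W2)
```

Even for a toy like this, "why does it work" has no answer beyond "those particular numbers make the error small"; scale that up to billions of weights and the opacity problem the post describes follows.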
[QUOTE=Megadave;51385419]It will be interesting to see what lifeforms will form after humanity has long gone, because I assume at this point if humans go so will 90% of the ecosystem too, that leaves room for the next great being to conquer the earth.[/QUOTE] Only nobody will get to see it.
[QUOTE=DoctorSalt;51422021]Before addressing this point, consider why many leading experts in Computer Science/physics are pointing this out as a concern, and whether you are a leading expert. Now, on to the point: it is completely possible to program something that has unpredictable behavior/is far more complex than we can understand. Hell, we already have programs like that, such as neural networks. In more traditional paradigms, we can analyze what the programs do and get an idea of how they work. Neural networks are extremely difficult to "look into". They seemingly "just work". The idea of a technological singularity is that if it's possible to make a system that improves itself, it will likely exponentially increase its capabilities and improve far beyond our ability to understand it. As another example of low intelligence systems creating higher intelligence beings, evolution does exactly that. I don't think it's paranoid at all to consider that we humans can do something similar.[/QUOTE] And I never said anything against that? There's a difference between an AI's behavior depending on how we build it and us being able to predict how it will evolve. Maybe you should consider what I'm saying and what I'm replying to before you assume what I mean. [QUOTE=_Axel;51397885][QUOTE=Firetornado;51389435]Our emotions of morality and love are a result of the need to breed, the social nature of our species (which we needed to survive), and chemicals in our brain. AI would not have this; it's not gonna feel bad for killing us, it would just make logical sense to wipe us out in the interest of self-preservation.
In a way, morality is a silly remnant of our caveman times, but our laws and society are all based around it today. Then again, I myself feel bad for my family and friends, but AI would exhibit no such behavior, as this behavior is somewhat pointless and illogical in the grand scheme of things.[/QUOTE] Why do people always talk about sentient AI as if they know what it will be like? We're the ones who will design it in the first place. How it behaves will be entirely dependent on how we build it.[/QUOTE] I'm replying to someone who claims AI wouldn't have any sense of morality or empathy. If self-improving AI is so unpredictable, how can we even know that? It's just the typical Spock interpretation of empathy and emotions being opposed to rationality, which completely ignores the rational backing behind emotions. You yourself talk of low intelligence systems creating higher intelligence beings; emotions and empathy are just that. Without them, we wouldn't have grouped up and formed civilization in the first place, which as a collective union of humans [I]is[/I] a form of higher intelligence being. I think an AI more intelligent than us would be able to understand that as well. Not to mention, even if we consider self-preservation the sole goal of an AI, wiping out humanity would be incredibly dumb considering humans are required for any form of maintenance. Emotions and empathy are a product of the evolutionary process; self-improving AI being another form of evolutionary process, I don't see why it wouldn't develop its own morality system. You can't just say that self-improving AI is unpredictable and then in the same breath baselessly predict that it would be a cold-blooded genocidal maniac. [editline]24th November 2016[/editline] The idea of morality being "a silly remnant of our caveman times", "silly", and "illogical" is just stupid, especially when you admit that all our laws are based on it. What happens when we remove those laws? Existential transcendence?
More like anarchy. Believing that higher intelligence is characterized by a total lack of empathy and complete selfishness when all evidence points towards it leading to a worse scenario for everyone involved is something you usually grow out of after your teenage years.
[QUOTE=Turnips5;51385379]I'm gonna say 50[/QUOTE] Agreed, bud. I genuinely believe I'll witness the end of the world in my lifetime, and I'm only twenty-three.
Someone make Stephen Hawking shut up; he won't stop saying that humanity is doomed when we already know it :v:
[QUOTE=The Rifleman;51385367]At the rate we're going I'd be amazed if we made it 500[/QUOTE] I'd be surprised if we made it through 2017 at this rate, really.