Are you really ready for your new overlords? Should science end humanity? NPR talks post-humanism
78 replies
[quote]Horn's main argument was that, in the near future, we will build machines surpassing us in intelligence. What the machines — those machines — then build will surpass their own creator's intelligence. This process will rapidly continue until, very soon, it yields a new force on the planet — superintelligence. This runaway process is often called the "singularity" — and Horn's main job was to argue that, given current trends in technology, something more or less like it is coming.[/quote]
[url=http://www.npr.org/blogs/13.7/2014/11/17/364619831/should-science-end-humankind?utm_source=facebook.com&utm_medium=social&utm_campaign=npr&utm_term=nprnews&utm_content=2042]Read more at NPR[/url]
I, for one, believe science will in fact go too far on this one, if only based on this quote alone.
[quote]Imagine the loveliest face you have ever seen. Now replace it with a thin fish-head topped with high ribbed fin. Perhaps that configuration — better for displacing heat from super-charged brains — will be the trans-humanist ideal of beauty.[/quote]
[quote]Imagine the loveliest set of abs you have ever seen. Now replace it with a pouch similar to a kangaroo's but less slimy. Perhaps that configuration — better for holding your super-sized wallet — will be the trans-humanist ideal of beauty.[/quote]
[quote]fish shit[/quote]
This is already someone's idea of beauty, and their deviantart is undoubtedly full of flawless examples of highly conductive perfection.
Honestly? The idea of this kind of thing excites me.
It'll be weird as fuck transitioning to it, but we'll be capable of so much more if a singularity happens
[QUOTE=BANNED USER;46516934][url=http://www.npr.org/blogs/13.7/2014/11/17/364619831/should-science-end-humankind?utm_source=facebook.com&utm_medium=social&utm_campaign=npr&utm_term=nprnews&utm_content=2042]Read more at NPR[/url]
I, for one, believe science will in fact go too far on this one, if only based on this quote alone.[/QUOTE]
Let it. That sort of Frankenstein creature sounds awesome.
I get that we're scared little animals who often don't know our asses from a hole in the ground, but these kinds of concepts are fascinating and absolutely should be explored, given the astonishing potential they hold for improving our species and understanding it: what our limits are, where the boundaries of possibility lie, and so on. We're not "playing God", we are becoming gods. Hopefully smarter and gentler gods as we pass through the years.
I'm completely fine with replacing/improving my body with tech, not so keen on the hivemind idea though
Hope more articles like this start popping up.
I am sick to death of the drawn-out "fear the future" message that has been coming from the media for the last several decades.
To truly survive and thrive as a species we NEED to change.
Honestly, this is relevant.
[media]http://www.youtube.com/watch?v=1FgSmdfRUus[/media]
We're by far the smartest creatures on Earth, but all the intelligence in the world can only take you so far. We might just get too far ahead of ourselves one day, if we haven't already.
Lately I've been wondering if you could get a computer implant after a hemispherectomy — one better, of course, than the "me" that was cut out.
Anyway, I'd love a benevolent robot ruler. After all, humanity can't seem to get it right, even in this day and age.
Transhumanism is fucking awesome. Anyone against it is against forward progress.
So who exactly is working toward creating a singularity? How exactly is one supposed to come about? Why does Horn think we will soon be able to make machines smarter than us?
[QUOTE=Solomon;46517509]Transhumanism is fucking awesome. Anyone against it is against forward progress.[/QUOTE]
You just want us to buy your Neuropozyne, you bastard.
[QUOTE=HWECQI;46516958]Honestly? The idea of this kind of thing excites me.
It'll be weird as fuck transitioning to it, but we'll be capable of so much more if a singularity happens[/QUOTE]
Such as?
I'm fine with transhumanism as long as they don't force it on people. It's a blatant breach of freedom to just chop someone's limbs off without permission and say "Hey dude! You're all sciencey now! Isn't that awesome? ;)".
[QUOTE=Solomon;46517509]Transhumanism is fucking awesome. Anyone against it is against forward progress.[/QUOTE]
agent of the machines! you'll betray us all when the singularity hits.
There is no way the singularity is going to happen in the "near future". We can't even replicate some of the simplest things our brains are capable of doing. There is a very long way to go before we have a machine capable of inventing anything.
I don't think any of us is going to experience the singularity, and for me that is most definitely not the "near future".
Well yes and also the fact that we understand so little of the brain anyway.
[QUOTE=EpicRandomnes;46517624]I'm fine with transhumanism as long as they don't enforce it on people. Blatant breech of freedom to just chop someone's limbs off without permission and say "Hey dude! You're all sciencey now! Isn't that awesome? ;)" .[/QUOTE]
Who implied it'd be forced, though? I haven't heard of a single person advocating forced transhumanism.
Most people don't like not being the dominant force.
[editline]18th November 2014[/editline]
It's like being jealous of your child becoming a billionaire.
[QUOTE=Mingebox;46517555]Such as?[/QUOTE]
Getting rid of world hunger and financial inequality would become remarkably easier, to name two.
[QUOTE=mochisushi;46517899]It's like being jealous of your child becoming a billionaire.[/QUOTE]
Well if your child isn't a complete douche, you also gain something from that.
[QUOTE=Govna;46517013]Let it. That sort of Frankenstein creature sounds awesome.[/QUOTE]
Fuck that shit, I don't even like tattoos, why would I want to live in a world full of fish-headed freaks?
I may be for a lot of crazy shit, but cyborgs are where I draw the line. There are so many environmental hazards, the deadliest being a solar flare. If they figure out a way to do this without using any electrical components, it could be doable.
[QUOTE=Solomon;46517509]Transhumanism is fucking awesome. Anyone against it is against forward progress.[/QUOTE]nut
Even if superintelligence became possible, less than 0.000001% of the population is going to have it.
My biggest problem with this whole transhumanist shebang is that it would undoubtedly cost money.
So who could afford their own artificial betterment? Rich people. They would literally be able to buy superiority. Would this new, mentally and physically superior class of society, which already held power, then willingly give up its own superiority to bestow the benefits of transhumanism upon those beneath it? Most probably not.
[editline]18th November 2014[/editline]
Or if they did, it would be expensive and slow, and they would keep the controls in their hands. Want that company promotion? You'd better acquire X brain implant to help you multitask better. Oh, you can't afford it? The company can cover the cost... for a price. Perhaps some of your salary. Perhaps contracts that bind you to them. And of course, if you want your children to get ahead in any way, shape or form, you'd better start augmenting them at an early age! Oh, you can't afford that? Then your children will forever be inferior, fodder unfit for important roles. People at large would lose control of their lives and their importance in society, and in order to regain that control they would be dependent on those who supply artificial superiority.
The socioeconomic repercussions of a competitive "forced evolution" are not something everyone, myself included, is going to want. Imagine the conditions for people who couldn't afford augmentations and are thus obsolete in an ever more competitive job market. Those who can afford it will increase exponentially in capability, while the rest just rot.
If there's some kind of way to distribute this sort of thing with everyone in mind, particularly the currently disabled, then I'm all for it. Giving the rich godlike augmentations is asking for everyone to get fucked.
[QUOTE=Krinkels;46517546]So who exactly is working toward creating a singularity? How exactly is one supposed to come about? Why does Horn think we will soon be able to make machines smarter than us?[/QUOTE]
Well, everybody. The concept of the singularity is that it's not an invention; it's the point in time at which we tip the scales of science and technology, where machine intelligence begins to outpace our own and triggers an unstoppable chain reaction of increasingly rapid progress. No one person would be responsible for it, because it refers to the entire scientific field, not a specific invention or set of them. It is very likely that nobody will even know we've reached that point until we're already well past it.
I don't know whether I buy the cyberpunk vision of the future, though, full of intentional self-mutilation in favor of robot arms and the like. Real life isn't a video game: super strength and gun fingers just don't help enough in everyday life to make robot arms attractive enough for the average Joe to lop his limbs off without it being medically necessary.
More realistically, we'll see a series of smaller, everyday improvements that will work their way into our lives the same way smartphones and tablets did. Handheld and wearable devices sound much less scary to people than surgically implanted computers and full limb replacements.