[PHILOSOPHY] Against Transhumanism


Transhumanism is basically the idea that biotechnology and information technology will merge and eventually improve the human body and mind. More intelligence. Cancer-eating nanobots in the blood vessels. Increased longevity and perhaps even eternal life. Simply everything we could wish for - after we've lost the ability to ask for anything other than what the techno-commercial masters claim to be able to give us, that is. But will we still be human if we do all this to ourselves? Transhumanists reply: "No! And the sooner the better! We must evolve beyond the human."

Transhumanism and post-humanism are about learning, in advance, to accept a hypothetical technology whose main function is to be a retroactive compulsion. A technology that will force us to say: "This technology is now a reality whether we like it or not. We have become dependent on it, so let's just adapt." This is the bitter little triumph that transhumanism has to offer, and its representatives are already celebrating. We are supposed to invest so much time and so many unfortunate fantasies in this biotechnological concept of the future that we have no options left - so that we have to actualize it. We're not there yet, but we are made to believe that it's too late to change course toward any other possible future.

You say you want to make man better. What do you mean by "better"? Better for what purpose? After a few such questions, peeling off layer after layer of ill-conceived notions, this "better" will prove to be completely hollow.

If you want to make a transhumanist miserable, you should respond to every future improvement he lists with Socratic questions:

Transhumanist: "We will be able to live 150 years - at least!" 

You: "Yes, but why is it important?"

Soon, you'll both discover that transhumanism has nothing to offer other than acceleration. An acceleration that quickly becomes a desperate end in itself - and to drown out the looming realization that the acceleration never arrives anywhere, the pace is turned up faster and faster.

Behind these visions of a hyper-advanced technology lies a handful of simplistic values. This is especially apparent in Ray Kurzweil, one of the futurists who has gone furthest in this direction. He is hoping for something called "the singularity" - a technological big bang expected to occur when accelerating artificial intelligence takes a leap beyond anything any human can imagine. An exciting thought, but what does he mean by intelligence? His conception of intelligence resembles nothing so much as a five-year-old's idea of horsepower: a million horsepower must be better than 250! All Kurzweil wishes for is more.

People who sell these kinds of visions of the future are remarkably fond of biological metaphors. Transhumanism, artificial intelligence and "the singularity" are described as a continuation of the 'explosion' of life from unicellular organisms to humans. The biological metaphors make their vision of the future seem natural and inevitable. By implication: saying no to their future is just as stupid as if a bacterium three billion years ago had refused to evolve. Another implication is that evolution somehow stagnated once it reached humanity - and that a hard technological kick in the ass is what it needs to get to the next level.

Transhumanism is a fantasy of power. Just as much, it is a fantasy of our total submission to technology. Therein lies the double allure that makes it so appealing. Man is as strongly attracted to submission as he is to power. We so badly want to believe that "development" is a power we have to submit to, just as a stone-age farmer was a slave to the weather. But development is a tapestry of human decisions - economic, political, everyday and scientific. Development is therefore our responsibility. The qualities we should be practicing for the future are not submission and acceptance, but judgment and a sense of responsibility.

The greatest challenge now is to get people to lose interest in transhumanism. How will that come about? If we want to find something that attracts us more than transhumanism, we must drop the idea that development and evolution form a narrow line along which we can only move in two directions, forward or backward.

From the point where we are now, an incalculable number of possible futures radiates in all directions of the compass. "Forward" is not a direction; forward is 360 degrees. The proponents of transhumanism want to make it appear as the only way forward. All the other possible futures that await in the remaining 359 directions we have not even begun to imagine.


Such negativity. Have a Snickers bar.

Cancer cures are already abundant in nature. I think it's very hard to have a choice in this, since geoengineering is an implicit part of this agenda, and so are Monsanto and the bioengineers. We already have artificial semi-biological creatures being created. I personally feel that the bees are being purposely killed off by Monsanto, since they cannot wait to release their bee bots to take over pollinating all crops; their dream of owning and controlling humanity by controlling the world's food will then be complete, along with their new aluminum-resistant seeds. Their goal is to patent all life, and it's not just human life they want to own. Many civilians are already walking lab rats, having breathed in nano-microchips and smart dust that self-assemble into nanobots once inside the human body; the published journal articles on this number in the hundreds. Then there is the plasma field used to pulse electromagnetic waves in an attempt to control human thought, the corporations that can now beam subliminal messages into the human mind when we are out and about, Geordie Rose and his D-Wave computers that he describes as an altar to an alien god, and Elon Musk saying that AI is the biggest threat to humanity. And do you ever feel like your computer is already interacting with your unspoken thoughts? It's all real and happening; the scientific papers are out there confirming it all. We are in far deeper than we know, sadly.

[PHILOSOPHY] Against Luddite Caricatures of Transhumanism

First of all, this post is full of the obvious bullshit that Luddites say - pathetic arguments not worth wasting another breath on, all of them totally dispelled in the "Responses to Critics" section of Kurzweil's "The Singularity Is Near" and fully addressed in his earlier book "The Age of Spiritual Machines."

Anyone who agrees with this author owes it to themselves to simply read Kurzweil's refutation of the Unabomber manifesto (and the arguments of his other short-sighted, confused, and/or unphilosophical critics).

Transhumanist: "We will be able to live 150 years - at least!"

Idiot: "Yes, but why is it important?"

You: "If you don't value your own life, why don't you end it? Life is always optional. If you lack values and desires sufficient to make you desire more life, why don't you kill yourself? Before you get to 150 years old, that's always an option."

Death will certainly remain an option into the future, even as Transhumanists expand the possibilities in billions of imaginative ways. (All of steemswede's "other possible futures that await in the remaining 359 directions we have not even begun to imagine" are impossible if one is dead.)

And if Transhumanists are narrow-minded and short-sighted, and crony capitalists? ...Then a better answer is to simply be one who isn't, and benefit from the amazing technology they give you to do ____ in the future.

Technology expands options, it doesn't reduce them. Intelligent people, transhumanists foremost among them, want more options, not just "more speed."

Second of all, the only thing that's dumb about transhumanists is the name they chose. It literally couldn't have been more calculated to engender fear and confusion among the idiot masses. Even "post-human" (which was being used in the 1700s as an ideal) is better. Most of the ideas in that domain space are sound though, at least at the level of thinkers like Kurzweil, et al.

I was just drawing breath to respond, but you did it before me. I wouldn't have been so belligerent, but since you already posted the impolite version, I'll support it.

The irony here is that Kurzweil is a guarded optimist, who, like K. Eric Drexler and Robert Freitas, also imagines that Transhumanism may make things far worse for humanity. The promise is intertwined with peril. Both are far greater (far better and far worse) than the current possibilities. Tech is just an amplifier, often of majority tendencies toward survival (which is why everyone has a refrigerator, but not everyone has a nuclear weapon).

I'm not like Truman, I don't give my letters to my secretary to hold for 24 hours before sending them out. LOL

I always find the claim that transhumanism is the next step of human evolution quite absurd.

It was the same line of thinking that got us into GMOs - the idea that we have the technology, so we should tinker with DNA as if we owned it. No consideration for the unintended outcomes, no debate as to the consequences. Just do it because we can. I don't buy that whole transhumanism pitch, no way. Great post. Upvoted.

Thank you for reading, appreciate it

Transhumanism might be valuable when sending humans to distant planets. Living 150 years might make such journeys feasible.

I nominated this for Project Curie over in the Curie chat channel. I'm happy to see it got you some votes!

Appreciate it!

As long as you're not blowing up wetware labs, and they're not holding you down and modifying you, I think we'll all get along fine.

If there is a tech bubble and it pops, transhumanism will dive in popularity.
I hadn't thought of it as a limiting philosophy, but of course it is. I appreciate your thinking and writing on this topic.

Most of what I've read here goes way over my head. I'm a simple man, and all this dialogue seems to be going in circles. I can't tell who's for and who's against. I personally believe that there could be a great advantage to merging biotechnology and cybernetics. However, just like everything in life, there can be too much of a good thing. Using technology to cure disease and help the handicapped is great. But using technology just to "improve" the human body (what God created) is terrifying. Resistance is not futile; I will not be assimilated.
