The Spock effect


Are the most intelligent the ones who have the most, know the most, laugh the most? I do not know, but it is something to contemplate.

I like to dabble in conversations about better worlds and utopias and the more likely dystopias we will face before they come about. In my opinion, artificial intelligence will play a large role in either the destruction of our species as we know it or its salvation, but it is something outside of us, an external event.

I was having a coffee with one of my new sign-ups the other day, helping him understand a little of the basics behind Steemit. As he and I often do, we soon turned the conversation to the future and the possibilities and threats we see coming.

We like to lay blame as a species, we like to find the causes of our miseries and find avenues of escape, anything to stop us looking at our own role in the situation. But even when we find a scapegoat for our emotions, if we look closely, what do we find?

This morning, my daughter, a book-loving person, tore a page out of one of her 'Spot the Dog' books and was subsequently told off. But what is the issue here: did she do it on purpose to destroy, was it accidental, or did she simply not understand what the future experience with one of her favourite books, now torn, would look like?

When we look at the world, the errors, the failures, the damage and violence we do to each other and ourselves, is it because we are terrible, evil? Or is it because we do not understand ourselves well enough, or the part we play in the future ramifications of our actions? For me, education is always key but, up until this point in time, we have been far too stupid to realise it and create the solutions.

When speaking with my friend over coffee, I described a scenario where a small group of people were given a pill and granted super intelligence, and asked what might come about. He thinks that with such power, they would battle it out and quickly dominate the rest of the species, essentially enslaving them.

I disagree. This is the thinking of us now, the uneducated, the inexperienced in such matters, the ones who are potentially yet to understand what 'for our own good' truly means.

When I look at the most intelligent among us from history and the present, there seems to be a trend that runs through nearly all, even though they were incapable of achieving it, unable to think far enough: Peace, Unity, Community.

They may have different ideas about how this looks but, they are almost unanimous that this is the goal. Einstein reportedly had an IQ of around 160; what if we doubled it, tripled it, what would he have discovered? Like @tonyr's piece from a few hours ago, would he have thought of the 'right words' to bring us all together?

Unlikely, for we no doubt all need different words to take us to the understanding he may have reached. What this means is that we cannot wait to be saved, we must save ourselves. Waiting for an external event to save us is foolish but, searching for that event need not be. Perhaps this is what Einstein and others like him were doing and, even though he failed, it moved humanity further along the path. Whether they realised it or not does not matter, as virtue is virtuous even without acknowledgement.

But, what of the group that has access to the pill of super intelligence? What will they discover, what conclusions will they reach? Will they mercilessly enslave the rest or, will they now have the capacity to understand that the solution is simple: it is understanding itself?

With an intellectual awakening like none ever experienced by the species before, will they realise that there is only one way: to awaken all? Peace, Unity and Community. With their supercharged minds holding unforeseen visions and abilities, how long until they give the pill to others, how long until they can flip the switch and illuminate humanity?

Most cannot imagine this world, as most, if granted such power, would use it for personal gain and assume others would too. And, they are most likely right but, they are thinking from an unenlightened position; their thoughts aren't from the mind of the supercharged. This makes it impossible to understand what lies in the darkness that may become blindingly obvious when the light goes on.

We tend to think the worst of people, for we ourselves are those people. We may operate at different levels, it may present in various ways but, we see what we think we know and that is what we fear in ourselves. It takes one to know one, as they say, but a supercharged mental capacity at this level is, so far, unknowable to us.

Our thinking limits us, our thoughts place ceilings on how far we can go, how much we can do as a species, but as our understanding develops, the roof gets shifted upward. At what point is it high enough that we recognise ourselves in the mirror and find the solution to our problems?

It is inching upward though, a day, a year, a decade and a century at a time. With the advancements in AI and gene-splicing technology, how long until that switch is found and enlightenment is not only within reach, but superimposed or genetically implanted? How far can we go if we are unlocked and free at such levels, with the understanding that we are all in this together?

I do not know, I am too limited of mind and knowledge to see what that may look like. You may see it differently to me but, in this scenario, you are likely wrong also, no matter how you feel about it. It is just too different from how we think, what we can imagine and who we are now.

However, we are on the path and tomorrow, we will be slightly different, tomorrow we may have taken another small step and at some point, one of us may uncover the switch. From that point on, logic and reasoning could lead decisions whilst the heart holds us united. Perhaps only then, as a species, can we live long and prosper.

All of us.

Taraz
[ a Steemit original ]



I too find myself pondering humanity's eventuality, and I do think that AI will be a big part of our evolution. Whether that is a good thing, as you pointed out, is yet to be seen.

Most of our "great minds" of today are very passionate about their intuition that AI itself could be the downfall of humanity, like Mr. Hawking and Elon Musk. Maybe they feel that way because having that sort of intellectual computing power might somehow turn off the compassionate side of our human instinct.

It is well known that people who are intellectual are often socially stunted in some way. My daughter was a "gifted & talented (GAT) student" (for those not in the USA, that means smart with high standardized test scores) and her GAT teachers would send home tips to the parents on how to cope with a socially awkward child who is intelligent - sort of as an intervention.

So many questions, most of which will probably not be answered in our lifetimes... but maybe they will. I believe the world is far more technologically advanced than we laypersons know of, and who knows when full disclosure will occur.

And, indeed - Live long and prosper, fellow Steemians! (making Vulcan salute - haha)

I am quite sure that humanity will screw up any benefits before they can be realised due to impatience and yes, AI is likely to destroy us but, I don't see its destructive power in the same way. AI is already manipulating feeds and content and turning back the clock on human creativity. Perhaps eventually, intellectually, we will come to a standstill where there aren't enough creative minds to drive us forward. When something stops moving, it is dead.

I definitely see your point - and, yes, it is already occurring... especially with the younger generations who simply cannot function without their devices, and they are our future. And lack of creativity equals lack of innovation. Thanks for giving me something of substance to think about over the weekend!

When speaking with my friend over coffee, I described a scenario where a small group of people were given a pill and granted super intelligence, and asked what might come about. He thinks that with such power, they would battle it out and quickly dominate the rest of the species, essentially enslaving them.

That's the problem with the unknown. And if we can't know the unknown, can we scope its impact in known terms? And even if they knew ways to peacefully accomplish their goals, with or without enslaving people, you would still have to consider their personal motivations and their morals.

You could try putting it into terms of altruism/selfishness. They could be acting with the best interests of everyone if they are altruistic, or they could be acting in their own interests if they are selfish.

But then you have to consider that you might have a mix of reactions amongst this super-intelligent group. And what's the split of altruism versus selfishness? Maybe they'd see each other as the bigger threat to accomplishing their goals. They could engage in mutually assured destruction. Or maybe they divide the world up, and you end up with some utopias for all in their dominion, and some utopias from just their perspective.

But maybe what's in the best interest of all is actually to be enslaved, as a cold and intelligently calculated outcome. Maybe there's too much unknown in this problem and we just can't conceive of the scenario.

Perhaps, as you explored, it's just unknowable because we didn't get that pill. But what if they know all of these things and choose to do nothing? Are they immoral if they don't do anything?

Fun experiment.

And what's the split of altruism versus selfishness?

It is also possible that both of these views drop away and there is just what is necessary, and what is not.

are they immoral if they don't do anything?

Perhaps that would be better.

There are a lot of reasons why most people merely hope for great things from themselves or for reaching a certain level of success. One of the main reasons is that they have let where they are right now in life dictate their ways of thinking.
Moving forward, increase your expectations in every area of your life.

One of the main reasons is that they have let where they are right now in life dictate their ways of thinking.

Yes, this makes imagining something vastly different quite impossible.

There are many terrible movies about artificial intelligence. If one day artificial intelligence evolves too far, these films could inspire it. We should first accept that the technology has improved, and we must take our precautions. Maybe artificial intelligence is not just robots; it may be a modification to improve people.
@tarazkp

Good story. I think it's quite hard to forecast the effect of being super-intelligent.

But my general point of view is that super-intelligent people tend to focus on a specific topic without considering that this may change the whole world (at least from a short-term perspective). Intelligence needs a platform to grow - and this is best found when focusing on a specific topic. Einstein was not good at school because the system was not right for him, so it was not the right platform for his intelligence :)

What the future may bring, though, could be far above Einstein's level. What platforms will they build? :)

Yeah, the cycles of innovation have shortened. About the platforms - I don't know, but I expect them to be connected to a single place. It's more about where intelligence/innovation/improvement is needed....

Very truly so. I agree with your friend that taking the "intelligence" pill while leaving all other instincts as they are would most probably break out into acts of intelligent violence we have never seen before. But if there were a "no greed" pill, then we might end up living in an amazing world of utopia...

This is such a beautiful piece, it's almost uncanny. I remember having this kind of conversation with a friend once. He was skeptical too. There are a lot of cynics out there, and the worst part is one can't even blame them, so I understand how he might have thought, like you said, that humans with 'super intelligence' might enslave the rest. But your optimism is remarkable, and I share it. Who knows what heights we might reach! Maybe that mantra of peace, unity and community would be realized after all.

Perhaps if we drill it long enough, they might remember it as they shackle us.

On Earth, the idea of enlightenment has been intellectually dead for 200 years, but here on Steemit you guys still believe in it, I see. I wonder where this leads us!

Why do you think this? Or are you conflating enlightenment with religion?

The whole idea of enlightenment is that by illuminating the dark corners of our world with the lamps of science, we will find a hidden meaning that will guide us through our lives. Darwin, Nietzsche and Einstein were, in my opinion, the strongest lights that we ever had, but all they showed us were relativity and nihilism. We are beasts that will to power in a universe that isn't even solid. So how can guidance be expected from lights when there are no directions to be shown at all?

Good article!

When we look at the world, the errors, the failures, the damage and violence we do to each other and ourselves, is it because we are terrible, evil? Or is it because we do not understand ourselves well enough, or the part we play in the future ramifications of our actions? For me, education is always key but, up until this point in time, we have been far too stupid to realise it and create the solutions.

I think we need moral discipline when it comes to civility.

Reading through your post made me realize a few things. First, the enlightened intellect and true intelligence might be the same thing after all. Like your friend, my instinct was to see intelligence as being synonymous with "wit", "tact" or the ability to find your way through difficult situations. This is an amoral attribute and will likely be used by those who are endowed with it to oppress others and exploit systems. However, true intelligence, as has been shown in some of the most intelligent specimens of our species, entails the ability to see things from a broader and higher perspective, to realize the future effect of every action regardless of the gratification of the present moment.
I also want to believe that the greater the intelligence of the human species, the closer we get to some form of utopia; our current level of intelligence only pushes us to exploit and dominate each other in most cases.

Speaking of AI being either the end of the human species or the way to utopia, it reminds me of I, Robot (the movie), where the robots believed that to save humanity, they had to stop humans from being in control. It's fictional but true: our collective intelligence at the moment is at a state where exploitation of resources is the main goal.

I wrote a piece a ways back about whether people would accept the directions of a super intelligent AI that was outside of political circles and had all information at its fingertips. I wonder what plans for humanity it may suggest and whether we would willingly follow or rebel if it made absolute sense to us.

That's interesting. Personally, I want to think we humans will accept the directions of a non-political, super-intelligent AI as long as it is not anthropomorphous, but I believe our ego won't allow us to have a human-like AI overlord. As long as it functions like an impersonal decision-making system (something like Jarvis from Iron Man or a GPS guide), we will likely see its decisions in a good light. For example, if an AI would tell us the best way to live on Mars, end hunger or stop climate change, we wouldn't mind following the suggestions, but this will be different if the AI has a distinct personality and is made to appear human.

That is interesting too but, what if it has already begun, recognised the problem and instead is nudging us into a group through internet content algorithms? :P

Well, that would be a very dangerous AI; that level of stealth or manipulation should never be trusted. :-)

I would absolutely not agree to having ANY kind of AI making the decisions, like governments are trying to do now, even if it were to be moral and beneficial in some ways. Mostly because that goes against what life is, so to speak. Are we not meant to live, as opposed to merely exist? If most decisions about how to live, settling issues with others and exploration are made for us, and we are given what we need, then what is the challenge to overcome? Living includes exploration, failing, succeeding, trial and error. This is how we grow and develop wisdom.
This is why the nanny state is ultimately not beneficial.
