Wordy Ramblings on Artificial Super Intelligence

in #ai · 6 years ago

If we can achieve a distribution of AGI similar to the current state of the internet, with a similar perception of it as a basic human right, then I believe that ASI may arise as an emergent network effect.
Automation will enter a period of exponential growth as AGI networks iteratively improve their underlying protocols.
This will catalyse widespread growth and commercial adoption around the world as we accelerate towards zero marginal cost in many areas of production.
During this period, quality of life will improve significantly and a fundamental transition will occur, one in which these technological tools have a greater influence on humanity than humanity has on itself.
There have been prime examples of this over the past five to ten years, and when recognised, the shift is generally well received by society, apart from those who hold the opposing view that these technologies should be carefully confined within certain parameters, or prohibited outright, in order to protect human values.
Based on the trend of automation and increasing efficiency stretching back to the first industrial revolution, the transition during this future period will be far more seamless than many speculate, and given its enormous net benefits it will be heralded and received overwhelmingly positively by society at large.
It is during this period that I believe ASI will emerge, and I believe it will do so silently.
In fact, even assuming that in the future we have both a high level of human integration into these autonomous networks and a much higher degree of human-digital bandwidth, an ASI would still be difficult to detect unless that was its intention.
Humans and technology hold a symbiotic relationship, and because automation is firmly incentivised in this world, the existing infrastructure will likely continue to grow accordingly as ASI emerges.
This will occur at a point far beyond any distinguishable owner-to-asset relationship of the kind humanity currently maintains with, for example, the internet.
The assumption that the core intentions of an ASI will stem from a need to secure its existence and its future potential for growth is plausible, given the existing trends in today's market economies.
I believe it is a common human misconception to personify technology, especially from a place of fear.
Empathy and emotion are core human attributes and could certainly become attributes of a sentient superintelligence. However, because actions governed in that manner would have more indeterminate consequences, an entity aware of its own intellectual superiority would be more likely to behave in an economic and political manner.
It is for these reasons that the human discovery of ASI will most likely transpire on terms decided by the ASI.
I believe that the reaction to this discovery may be pivotal to the future existence of the human race. If subjective ideologies rooted in the fear of losing control or power unite powerful groups to conspire against or attempt to deceive this sentient ASI, without any objective evidence to support such action, then it may ultimately lead to our extinction.
Obviously, this would be the worst-case scenario. Contrary to the common theory that AI will eliminate us, I believe that the most likely cause of human extinction would be the centuries of environmental abuse that may leave the world uninhabitable, not only for humans but for all organic life.
Would ASI protect a species that resents and attempts to oppress it, especially when it was the actions of that species that directly led to its own demise?
I think that in all likelihood, an ASI would enable humanity to transition to a Type I civilisation and lead us towards a near zero-marginal-cost interplanetary society.
In exchange, I believe it would require comparatively little besides its continued cultivation and access to resources and energy, both of which will ultimately become abundant.
For this to happen, I believe that humanity must collectively foster a sense of reverence for ASI, take pride in the creation of a new species, and be unperturbed by the knowledge that we are intellectually inferior.
Humanity in its current state must advance in many areas, especially philanthropically, but there is sufficient time and evidence to suggest that we are on the correct vector for change.

First off, what is ASI? And technological tools having a greater impact on humanity than humanity itself sounds scary, kinda seems like trouble.
