The non-algorithmic Intelligence in the computational Akasha


Is classical physics deterministic, as we learn in high school, and is quantum physics non-deterministic? Or has the physicist David Deutsch shown that the very possibility of quantum computing proves the opposite for the quantum world? Does this mean that reality and Intelligence are in fact algorithmic?

Background

In this article I will address the question of my book "Is Intelligence an Algorithm?" in the framework of my other book "Transcendental Metaphysics" on pancomputational panpsychism:

If Deutsch is right and nature is indeed completely computational, one might infer that nature is completely deterministic. If it is completely deterministic and computational, it can be considered the mere processing of an algorithm. Then Intelligence, as a manifestation of nature, must follow the same determinism and must therefore also be algorithmic.

Conversely, as I will argue, if nature is non-deterministic at the quantum level, and possibly also to a certain extent at the classical level, it can (despite its computational aspects) still have elements that cannot be captured in an algorithmic manner. My assertion that intuition and creativity might involve certain quantum effects could then still turn out to be correct, and these parts of intelligence would not necessarily follow an algorithm.

Quantum Mechanics

If you learn a bit about quantum mechanics in high school or in the first years of university, you will probably learn the so-called Copenhagen interpretation. Quantum mechanical observations sometimes defy our logic: instead of following specific deterministic rules that always give the same outcome, as in classical mechanical experiments, at the scale of the atomic and subatomic world certain of these rules are violated. For instance, photons (the particles that make up light), when beamed individually through a double slit, can behave, depending on the experimental set-up, either as a wave (creating an interference pattern of multiple lines on a screen behind the slits) or as a particle (creating two lines on the screen corresponding to the two slits). This so-called wave-particle duality is observed not only for photons but also for material subatomic particles such as electrons. If you put a detector at one of the slits, the particle behaviour is observed; if you don't, the interference pattern is observed.

Before the photon or electron is observed, the system can be considered to be in both states (wave and particle) at the same time. Similarly, quantum systems can take on other, apparently mutually exclusive states simultaneously before observation; they are said to be in a superposition of those states (described by wavefunctions) as long as they have not been observed. When the photon or electron hits the screen on which the particles are detected, or when a quantum system is observed, the wavefunction is said to "collapse" into one of its states, according to the Copenhagen interpretation.
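For readers who like to tinker, here is a minimal toy sketch in Python/numpy of the arithmetic behind the two outcomes: when no which-path detector is present the amplitudes of the two paths are added before squaring, giving fringes; with a detector the probabilities are added instead and the fringes disappear. The slit separation, wavelength and screen geometry are made-up illustrative values, not a simulation of any real experiment.

```python
# Toy far-field model of the double slit (illustrative parameters only).
import numpy as np

wavelength = 500e-9        # 500 nm light (assumed value)
slit_separation = 50e-6    # 50 micrometres between the slits (assumed value)
screen_distance = 1.0      # 1 metre to the detection screen
x = np.linspace(-0.05, 0.05, 1001)   # positions on the screen (m)

# Phase difference between the two paths for each screen position.
path_difference = slit_separation * x / screen_distance
phase = 2 * np.pi * path_difference / wavelength

amp1 = np.exp(1j * 0)      # amplitude via slit 1 (reference phase)
amp2 = np.exp(1j * phase)  # amplitude via slit 2

# No detector at the slits: amplitudes add first, then square -> fringes.
intensity_wave = np.abs(amp1 + amp2) ** 2

# Detector at a slit ("which-path" information): probabilities add -> no fringes.
intensity_particle = np.abs(amp1) ** 2 + np.abs(amp2) ** 2

print("with interference   :", intensity_wave.min(), "to", intensity_wave.max())
print("without interference:", intensity_particle.min(), "to", intensity_particle.max())
```

The first intensity swings between zero and a maximum (the bright and dark lines), whereas the second is flat: that flat pattern is what "particle behaviour" on the screen amounts to.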

A famous example of this is Schrödinger's cat. A cat is placed in a box with a radioactive material. If the radioactive material undergoes decay, the cat dies. But as long as you have not opened the box, you don't know whether the cat is dead or alive. The radioactive decay is a probabilistic quantum event and might or might not have happened. Only upon opening the box, upon observation, can you establish whether the cat is dead or alive. Before the box is opened, the cat is said to be in a quantum superposition of both dead and alive. Opening the box and observing its state collapses the wavefunction, say the Copenhagen adepts.

Deutsch, Everett and MWI 

No, says David Deutsch, the wavefunction does not collapse; instead, both possibilities occur. In fact the universe splits into two or more parallel dimensions. In one universe the material undergoes radioactive decay and the cat dies; in another it does not and the cat is still alive.

David Deutsch follows the so-called many-worlds interpretation (MWI) by Everett, which claims that all possible outcomes of quantum events are de facto realised, each in a separate parallel universe. In this way, according to him, there is still full determinism, because every effect is bound to have been caused by a specific cause. Note however that this is not traditional determinism, since not every cause has one specific outcome but rather multiple different ones in different universes.

But David Deutsch's theory is not merely a repetition of Everett's MWI. Deutsch insists that these outcomes are the result of quantum computations.

Quantum computing

Quantum computing is excruciatingly difficult to understand. People may have fancy ideas about how atoms or subatomic particles compute, how individual quanta compute, but it requires quite some investigation to get even a hunch of what is going on here. A normal logical gate in a binary computer accepts one or more so-called "bits" as input, each with a value of 0 or 1, performs a transformation on these bits and gives an output in the form of a bit value: a 0 or a 1.

Quantum computers accept qubits as input. A qubit can be 0, 1, or a superposition of both, in which the two possibilities are weighted by (complex) amplitudes rather than being restricted to a single definite value.

Quantum computers also have logical gates, which perform transformations on the superposed input values. Upon measurement, however, the output is a single value.
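To make the contrast with the classical bit gate concrete, here is a minimal sketch in Python/numpy of a single qubit represented as a two-component amplitude vector and one gate (the Hadamard gate) acting on it. This is just the textbook state-vector arithmetic, not a claim about how any real quantum hardware is implemented.

```python
# A minimal state-vector sketch of one qubit and one quantum gate.
import numpy as np

ket0 = np.array([1, 0], dtype=complex)   # the qubit value "0"
ket1 = np.array([0, 1], dtype=complex)   # the qubit value "1"

# The Hadamard gate turns a definite 0 or 1 into an equal superposition.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

state = H @ ket0                  # superposition of 0 and 1
probabilities = np.abs(state) ** 2

print(state)          # [0.707+0j, 0.707+0j]
print(probabilities)  # [0.5, 0.5]: a measurement still yields a single value, 0 or 1
```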

Quantum computers are said to yield gains in computing time for certain operations, such as factoring big numbers.

But how do these gates really function? How are these multiple isolated values kept separate from each other during the computation?

Deutsch claims the computation must be done somewhere for each of the values, which must then somehow interact to yield a final value. The "somewhere", according to Deutsch, is a separate universe in which each value undergoes a real classical transformation. If I understand it well, these universes must then somehow interact with each other to decide the final outcome. It would not be meaningful if, in our universe, only the calculation done on one of the input values yielded a result, right?

If we look closely at systems used for quantum computing we'll discover, amongst others, quantum ensemble computing and quantum annealing. In quantum ensemble computing we don't need Deutsch's MWI, because many molecules are subjected to a transformation, each molecule gives an output, and the end output is a kind of average thereof.

In quantum annealing we have isolated qubit systems on which the calculation is performed, such as systems of entangled superconducting qubits, as in the famous D-Wave computer (used in collaboration with Google, NASA and USRA). Upon performing a computation, the entangled qubits together search for a minimal-energy outcome, which is presented as the output. This process can physically involve so-called quantum tunnelling, in which subatomic particles can pass through an energy barrier as if they were going through a tunnel. If we try to mimic such a system with a classical computer, we cannot go through the energy barrier, which is why similar operations on a classical computer take longer. Here too, there is no need to assume a multiverse that computes the outcome. Rather, the entangled qubits seem to sense each other and collectively settle into an ideal value.
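To give a feel for the energy-minimisation idea, here is a small classical simulated-annealing toy in Python. I stress that this is only a classical stand-in with made-up couplings: it involves no tunnelling and is emphatically not how the D-Wave machine computes; it just shows what "searching for a minimal-energy configuration" means.

```python
# Classical simulated annealing on a tiny, made-up three-spin energy function.
import random, math

# Hypothetical couplings between three spins, each spin being +1 or -1.
couplings = {(0, 1): 1.0, (1, 2): -1.0, (0, 2): 0.5}

def energy(spins):
    # Ising-style energy: sum of coupling * spin_i * spin_j.
    return sum(J * spins[i] * spins[j] for (i, j), J in couplings.items())

spins = [random.choice([-1, 1]) for _ in range(3)]
temperature = 5.0
while temperature > 0.01:
    i = random.randrange(3)
    old = energy(spins)
    spins[i] *= -1                      # propose flipping one spin
    delta = energy(spins) - old
    if delta > 0 and random.random() > math.exp(-delta / temperature):
        spins[i] *= -1                  # reject the uphill move
    temperature *= 0.99                 # slowly "cool" the system

print(spins, energy(spins))             # a low-energy configuration
```

A classical routine like this has to climb over (or luckily hop past) energy barriers by random uphill moves, which is one intuitive way to see why it can take longer than a device that can tunnel through them.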

The followers of Deutsch, and Deutsch himself, claim that his ideas about quantum computation prove beyond doubt that MWI is correct, or at least that it is the most parsimonious explanation in view of Occam's razor.

Really? Is an infinity of universes generated at each quantum event parsimonious? What Deutsch cum suis reproach the Copenhagen interpretation for is that it somehow magically performs the calculations, which then result in a collapsed outcome. But here is what bothers me about Deutsch's approach: quantum mechanical effects rely on interference, yet the input states of the quantum system must remain separated during the calculation. In Deutsch's model, however, the true calculation arises from the interference between the different outcomes in the different worlds! In other words, as Scott Aaronson (a professor of computer science) wrote:

"The key thing that quantum computers rely on for speedups -- indeed, the thing that makes quantum mechanics different from classical probability theory in the first place -- is interference between positive and negative amplitudes. But to whatever extent different "branches" of the multiverse can usefully interfere for quantum computing, to that extent they don't seem like separate branches at all! I mean, the whole point of interference is to mix branches together so that they lose their individual identities. If they retain their identities, then for exactly that reason we don't see interference. 

... a quantum computer is not a device that could "try every possible solution in parallel" and then instantly pick the correct one. If we insist on seeing things in terms of parallel universes, then those universes all have to "collaborate" -- more than that, have to meld into each other -- to create an interference pattern that will lead to the correct answer being observed with high probability".
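Aaronson's point about positive and negative amplitudes can be made concrete with a few lines of numpy (my own toy sketch, not his). Applying the Hadamard gate twice returns the qubit to 0 with certainty, because the amplitude for 1 cancels out (+1/2 and -1/2), whereas a classical 50/50 coin "flipped" twice remains 50/50: classical probabilities cannot cancel.

```python
# Destructive interference of amplitudes vs. classical probabilities.
import numpy as np

H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)
ket0 = np.array([1, 0], dtype=complex)

after_two_H = H @ (H @ ket0)
print(np.abs(after_two_H) ** 2)   # [1. 0.]: the "1" branch has cancelled itself

# Classical analogue: a fair randomising step applied twice stays 50/50.
coin = np.array([[0.5, 0.5],
                 [0.5, 0.5]])
print(coin @ (coin @ np.array([1.0, 0.0])))   # [0.5 0.5]
```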

If the different branches interact and interfere to give a compounded outcome, is this not as magical and mysterious as the Copenhagen approach? And if these branches can interfere and hence influence each other, how can they warrant separate states? How can this multiplicity of parallel universes still be considered separate if they influence each other? If they do influence each other, they form a kind of unimetaverse!

The most parsimonious explanation, involving no magic and only logic? I don't think so. Whereas Deutsch cum suis consider the Copenhagen interpretation and other interpretations to be pre-Copernican errors of an obsolete paradigm, they certainly do not represent the perspective of the average physicist or computer scientist; rather, their theory is still considered fringe. But woe betide you if you state that quantum mechanics is non-deterministic: they will treat you as if you are an idiot.

Personally, I am not well enough versed in physics and computer science to establish which theory is right. But I wanted to draw your attention to the fact that, despite what the Deutsch lobby tries to sell, namely that quantum computation via multiple worlds is the sole logical and true explanation for this phenomenon, this is not an established, verified and accepted fact.

So the discussion of whether reality is fully deterministic, or whether quantum mechanics allows for a certain degree of indeterminacy, remains an open one. Likewise, it is still possible that Intelligence has facets, such as intuition, which do not operate via an algorithm.

Pancomputational Panpsychism

"Ha!" you might say, "now you contradict yourself, Antonin Tuynman. Didn't you claim in your book "Transcendental Metaphysics of Pancomputational Panpsychism" that every process we observe in reality is computational?" Yes, I did, but I used a very broad definition of "computation": I considered that every process has an input and an output which have somehow been integrated in a throughput phase. But this does not exclude quantum computing involving non-deterministic effects (if we do not follow Deutsch's MWI interpretation).

I also spoke about Panpsychism or Hylozoism: every self-sustaining form of energy (or "life") traverses a kind of matrix called the Akasha, which I equated with the fabric of the so-called zero-point field or quantum vacuum. In a sense, since these conscious energies make the state of the Akasha digitally change from 0 (absence of energy) to 1 (presence thereof), you could consider this a kind of digital computer - but not as you know it. The Akashic or Eschaton Omega computer as I have described it is a more holistic system, generating holographic interference patterns of the energies that traverse it. On top of that, it is not a static, inflexible structure; it is rather a flexible, interdependent quantum foam. When energies traverse this "isotropic vector matrix" they create anisotropies or distortions in the foam, which reverberate throughout the whole. The conscious energies themselves do not compute; they merely travel, obey certain laws of the foam structure and make choices whenever prompted.

Conscious energy in this model is foundational. It is the irreducible Prima Materia, which on the one hand, in a cell-division-like process, creates the foam structure which I call the Akasha, and on the other hand penetrates this Akasha to sense it from within as a multiplicity of individualised perspectives. The primordial conscious energy is not emergent or computed in this model. It simply IS everything, including the unorthodox digital Akasha and its traversing energies. These energies are aware in their individualised perspective, in the sense that they sense their environment. This is a kind of proto-consciousness. They become aware of each other upon approaching each other, and their mutual interactions provide the four fundamental forces of physics. This creates a kind of consensus reality.

Whereas the laws of physics that emerge from the structure of the Akasha and its penetration by multiple energies create certain obligatory pathways, rules or chreodes, which the energetic entities have to follow, within those chreodes the entities have a freedom of movement. A chreode is a necessary pathway, like a valley-like trough in a mountain, which an object going down that mountain must follow. But within the chreode there is a certain room for freedom of movement. Thus nature is grosso modo deterministic as regards its laws, which form the chreodes, but allows for freedom of choice within the chreode. This leads to a certain degree of non-determinism, and taken as a whole, reality is then ultimately neither 100% deterministic nor 100% indeterministic.

Similarly, "Intelligence", the ability to achieve complex goals -which is an inherent aspect of the conscious energies- is neither fully algorithmic nor is it completely devoid of algorithmic behaviour. As explained in an earlier chapter Intelligence chooses the way of the least resistance (as does everything in Nature), and automatising routines of learnt/evolved strategies is a part thereof. Thus Intelligence in Nature does follow a lot of self-taught algorithms or default pathways when encountering known situations, to save its energy investments for truly novel situations which require enhanced attention and intuitive insight. In this respect the algorithms of Intelligence in Nature can be compared to the Default Mode Network activity of the brain, which avoids investments of high frequency energies, which are employed for novel situations that require attention.

If my Pancomputational Panpsychic model involves a kind of quantum computing, it is not necessarily the type of quantum computing developed by present-day technology. Rather, the holistic computations of the Akasha set chreodes, within which there is room for freedom, which you could even call the free will of the energetic entities. The weird quantum effects such as entanglement, tunnelling and wave-particle duality have been described in my book Transcendental Metaphysics as essentially deriving from the holographic structure and dynamics of the Akasha, its interactions with its inhabitants and their mutual interactions, including their free-will decisions; they fall outside the scope of this chapter. The qubit states adopted by entangled particles in an annealing process for energy minimisation can be explained by a concerted communication between the different energies in the matrix, which form the qubit states together. They communicate by reverberating distortions through the Akashic matrix, which each of them senses, and thus they seek their common energetic minimum, which, once established, forms the outcome of the quantum computation. There is neither a need for multiple parallel universes nor a need for a magical collapse. Rather, the non-computational sensing results in a computational outcome!

It must also be realised that in quantum theories the arrow of time points in both directions, and certain delayed-choice experiments have shown that a phenomenon such as retrocausation might effectively be happening in Nature. This also appears to contradict Deutsch's claim of determinism. In Pancomputational Panpsychism the sensing energies could sense retrotemporal movements of the Akasha. Alternatively, there is no retrocausation at all, but the distortion of the Akasha after the choice is in time to set the right pathway to yield results consistent with the ultimate set-up.

Conclusion

The claim by David Deutsch that everything in reality is deterministic because it derives from quantum computing via multiple worlds is a thesis which has neither been proven nor been generally accepted by the scientific community. The Copenhagen interpretation is one among several interpretations that do not involve an MWI à la Everett. Both theories rely to a certain extent on something happening magically, mysteriously. My theory of Pancomputational Panpsychism IMHO does not require magic, although you may consider the notion of conscious sensing energies exactly that. It is a theory, just like the other theories, and it is not less parsimonious or less logical than the existing ones. Rather, it does not violate the basic notions of established physics. The only thing it does is provide a rationale for consciousness as the irreducible ground of existence, and thereby it overcomes the hard problem of consciousness.

 If you liked this story, please upvote and/or resteem. You can find my book "Is Intelligence an Algorithm?" here. My other book "Transcendental Metaphysics" is available here. By Antonin Tuynman a.k.a. @technovedanta. 06-09-2017.  

