
Entropy is not disorder; see http://www.science20.com/train_thought/blog/entropy_not_disorder-75081. On the relation between order and entropy, see http://www.informationphilosopher.com/solutions/scientists/layzer/. It is not even necessary to start from an actually low-entropy state; see http://www.science20.com/the_hammock_physicist/immortal_unbounded_universe-134704. As for knowing the current state at the subatomic level, you can't currently; see https://en.wikipedia.org/wiki/Mutual_information. BUT information is indestructible, so you could unscramble any past state if you lived long enough and grew to a sufficient Shannon scale; see https://en.wikipedia.org/wiki/Information_theory. A thing does not need to be a 'who' to 'observe' and 'measure'; measurement is interaction. Information is a verb, not a noun :)

Too much pseudo-science in there for my liking!

I challenge you to point out an example of that here.

I strongly disagree with the piece on entropy. True, entropy is not disorder per se, but it is strongly related to it. I don't like the way the author makes broad, sweeping statements like "The temperature and entropy of a system is only well defined for systems that are homogeneous and in thermal equilibrium," which I would consider untrue. He also never really defines entropy, at least not beyond the "heat" explanation at the beginning, which is incomplete and/or irrelevant to his later discussion. His end summary is missing a LOT of substance.
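To make that concrete: the definition the article skips is the statistical (Gibbs/Shannon) one, where entropy is a property of a probability distribution over microstates, not of "heat" or "disorder" directly. Here is a minimal sketch in Python, my own illustration rather than anything from the linked piece:

```python
import math

def shannon_entropy(probs):
    """Entropy in bits of a discrete distribution: sum of -p * log2(p)."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

# A fair coin's distribution carries 1 bit of entropy; a certain outcome carries 0.
# Neither number depends on any intuitive notion of 'disorder'.
print(shannon_entropy([0.5, 0.5]))  # 1.0
print(shannon_entropy([1.0]))       # 0.0
```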

As for "information is a verb, not noun", I disagree wholeheartedly. The act of measuring something can affect it, especially at the subatomic level, but that does not imply that all information is dynamic in nature. All matter and energy is dynamic; information is not. For example, my statement that "all matter and energy is dynamic" is immutable: it's a fixed piece of information, a noun.

Okay, I won't argue about "Did you check Layzer?" etc. Let's compact it. On your first paragraph I'll ask you: name at least one closed system. On your second paragraph: do you not measure streams? Did your noun-sentence occur whole and instantly? Please name one thing that is not matter, form, and process simultaneously. Information is a verb, not a noun, exactly "in-form"... Forget about quantum collapses caused by looking at something :) that is not in the formulae. But the way things limit each other's entropy is measurement, interaction. And entropy is nothing but degrees of freedom.
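On "the way things limit each other's entropy is measurement": information theory already has a number for exactly that, the mutual information I(X;Y) = H(X) + H(Y) - H(X,Y), the amount by which knowing one variable reduces the remaining entropy of the other. A hedged sketch in Python with a made-up toy distribution, just to show the bookkeeping:

```python
import math

def H(dist):
    """Shannon entropy in bits of a dict mapping outcomes to probabilities."""
    return sum(-p * math.log2(p) for p in dist.values() if p > 0)

# Toy joint distribution over two correlated binary variables X and Y.
joint = {('0', '0'): 0.4, ('0', '1'): 0.1,
         ('1', '0'): 0.1, ('1', '1'): 0.4}

# Marginals (each one sums the joint over the other variable).
px = {'0': 0.5, '1': 0.5}
py = {'0': 0.5, '1': 0.5}

# Mutual information: how much either variable limits the other's entropy.
mi = H(px) + H(py) - H(joint)
print(round(mi, 3))  # about 0.278 bits
```

If X and Y were independent, the joint would factor and the result would be 0 bits; the interaction between them is exactly what this quantity measures.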
