Information Entropy

Information entropy used to be treated as a purely mathematical abstraction. But it is more than that: we live in an information universe. That entropy grows over time means that the amount of information is constantly increasing. When our teachers told us that a desk keeps getting messier in order to explain the second law of thermodynamics, they always ignored the most important phenomenon in that picture: the person who not only keeps tidying the desk, but built it in the first place, along with everything that lies on it.
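
For orientation, the purely mathematical model referred to above is Shannon's entropy of an information source. The following is just the textbook definition, stated here for reference; it is not taken from this page:

```latex
% Shannon entropy of a discrete random variable X
% with outcome probabilities p(x):
H(X) = -\sum_{x} p(x) \log_2 p(x)
```

Entropy is highest when all outcomes are equally likely, which is the sense in which a growing, less predictable universe carries an ever larger amount of information.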

At the particle level, life looks like an inexplicable loss of entropy. Where is that loss compensated? The answer is simple: at the electromagnetic level. From the standpoint of entropy, it is far more effective to organize particles into brains and to drive electromagnetic information processes through mental activity, which accelerates the growth of entropy exponentially, than to scatter those same particles senselessly through space, as the thermodynamicists of the 19th century and the physics teachers of the 20th imagined when they spoke of entropy's constant growth.

Everything is information, and the amount of information is constantly increasing. Everything that resists the growth of information is swept away. Anyone who wants to build a successful system today had better build one that increases information entropy. How do you increase information entropy? Build dialectical infrastructures and let Dialectical Emergence do the rest.

Information entropy: Historically, the entropy of information grows constantly. Social computing systems that manage to accelerate that growth for their users, relative to the general increase, occupy a fish-shaped region of value creation until they become corrupt and lose their value. Historically, MySpace was such a case, and we are currently watching the same happen to Facebook.