Sublime
An inspiration engine for ideas
What Is Entropy? A Measure of Just How Little We Really Know. | Quanta Magazine
Zack Savitsky • quantamagazine.org
One shouldn’t necessarily think of information in terms of meaning. Rather, one might think of it in terms of its ability to resolve uncertainty. Information provided a recipient with something that was not previously known, was not predictable, was not redundant. “We take the essence of information as the irreducible, fundamental underlying uncert…
Jon Gertner • The Idea Factory: Bell Labs and the Great Age of American Innovation
Compressibility and Shannon’s entropy provide a baseline measure of the intrinsic information content of data. But the health of that data must also relate to how robust it is; how well encoded it is to withstand noise and corruption. And when an organism’s data contains mutual, survival-related information about the organism and its environment, t…
Caleb Scharf • The Ascent of Information: Books, Bits, Genes, Machines, and Life's Unending Algorithm
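The link Scharf draws between compressibility and Shannon entropy can be seen directly: redundant (low-entropy) data shrinks dramatically under a general-purpose compressor, while random (high-entropy) data barely compresses at all. A minimal Python sketch, not from the book, using the standard-library `zlib` compressor as a rough entropy probe:

```python
import os
import zlib

# Highly redundant data: low Shannon entropy, very compressible.
repetitive = b"ABAB" * 2500            # 10,000 bytes of pattern
# OS randomness: near-maximal entropy, essentially incompressible.
random_ish = os.urandom(10000)         # 10,000 bytes

ratio_rep = len(zlib.compress(repetitive)) / len(repetitive)
ratio_rnd = len(zlib.compress(random_ish)) / len(random_ish)

# The redundant stream compresses to a tiny fraction of its size;
# the random stream stays close to (or slightly above) its original size.
print(ratio_rep, ratio_rnd)
```

Compressed size is only an upper-bound proxy for entropy, but the gap between the two ratios illustrates the "baseline measure of intrinsic information content" the quote refers to.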
Well, the mathematical equation for Shannon’s uncertainty, which measures the amount of ignorance one has concerning the information in a message, is the same expression used to calculate Boltzmann’s statistical entropy, a measure of configurational disorder.
Bobby Azarian • The Romance of Reality: How the Universe Organizes Itself to Create Life, Consciousness, and Cosmic Complexity
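The identity Azarian describes can be checked numerically: Shannon's uncertainty H = −Σ pᵢ log₂ pᵢ and the Gibbs form of Boltzmann's statistical entropy S = −k_B Σ pᵢ ln pᵢ differ only by the constant factor k_B ln 2. A short Python sketch (not from the book; the example distribution is arbitrary):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def shannon_entropy(probs):
    """Shannon uncertainty H = -sum(p * log2 p), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def gibbs_entropy(probs):
    """Gibbs form of Boltzmann's statistical entropy: S = -k_B * sum(p * ln p)."""
    return -K_B * sum(p * math.log(p) for p in probs if p > 0)

probs = [0.5, 0.25, 0.125, 0.125]
H = shannon_entropy(probs)          # 1.75 bits
S = gibbs_entropy(probs)

# Same expression up to units: S = k_B * ln(2) * H
assert abs(S - K_B * math.log(2) * H) < 1e-30
```

The two quantities share one functional form; only the choice of logarithm base and the physical constant out front distinguish "ignorance about a message" from "configurational disorder."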
the buildup of information with input energy in ecosystems is actually quite slow and sublinear (b_S < 1).
Luis M. A. Bettencourt • Introduction to Urban Science: Evidence and Theory of Cities as Complex Systems
the increase in disorder is also associated with a loss of information