Sublime
An inspiration engine for ideas
What Is Entropy? A Measure of Just How Little We Really Know. | Quanta Magazine
Zack Savitsky • quantamagazine.org
This region of the dynamic spectrum, where outdated order dissolves into a creative and responsive chaos from which novel order can emerge, is often referred to as “the edge of chaos.” Stuart Kauffman suggested: “The best place for a system to be, in order to respond appropriately to a constantly changing world, is at the edge of chaos.” He explains…
Entropy is the supreme law of the Universe, which illustrates that systems tend toward disorder, chaos, and destruction on all planes of existence unless energy is put into the system. In other words, nothing is permanent.
Dan Koe • The Art of Focus: Find Meaning, Reinvent Yourself and Create Your Ideal Future
statistical entropy, a measure of configurational disorder.
Bobby Azarian • The Romance of Reality: How the Universe Organizes Itself to Create Life, Consciousness, and Cosmic Complexity
One shouldn’t necessarily think of information in terms of meaning. Rather, one might think of it in terms of its ability to resolve uncertainty. Information provided a recipient with something that was not previously known, was not predictable, was not redundant. “We take the essence of information as the irreducible, fundamental underlying uncert…
Jon Gertner • The Idea Factory: Bell Labs and the Great Age of American Innovation
Compressibility and Shannon’s entropy provide a baseline measure of the intrinsic information content of data. But the health of that data must also relate to how robust it is; how well encoded it is to withstand noise and corruption. And when an organism’s data contains mutual, survival-related information about the organism and its environment, t…
Caleb Scharf • The Ascent of Information: Books, Bits, Genes, Machines, and Life's Unending Algorithm
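The “baseline measure” idea in the Scharf highlight can be made concrete: Shannon entropy, computed from symbol frequencies, bounds how far data can be losslessly compressed per symbol. A minimal Python sketch (the two byte strings are illustrative examples, not from the source):

```python
import math
import zlib
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Shannon entropy in bits per byte: H = sum(p_i * log2(1/p_i))."""
    n = len(data)
    return sum((c / n) * math.log2(n / c) for c in Counter(data).values())

ordered = b"a" * 1024            # one repeated symbol: no uncertainty at all
uniform = bytes(range(256)) * 4  # all 256 byte values equally frequent

print(shannon_entropy(ordered))  # 0.0 bits/byte
print(shannon_entropy(uniform))  # 8.0 bits/byte (the maximum for bytes)

# A compressor tracks that baseline: the low-entropy data shrinks dramatically.
print(len(zlib.compress(ordered)) < len(ordered))  # True
```

Note that per-symbol entropy is only a baseline, as the passage says: `uniform` scores the maximum 8 bits/byte yet still compresses somewhat, because zlib also exploits the repetition of the 256-byte pattern across the string.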
Well, the mathematical equation for Shannon’s uncertainty, which measures the amount of ignorance one has concerning the information in a message, is the same expression used to calculate Boltzmann’s statistical entropy, a measure of configurational disorder.
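The identity this highlight points to is between Shannon’s uncertainty, H = −Σ pᵢ log₂ pᵢ, and the Gibbs form of Boltzmann’s statistical entropy, S = −k_B Σ pᵢ ln pᵢ: the same sum over a probability distribution, differing only in the logarithm base and the constant k_B. A quick numerical check (the three-outcome distribution is an arbitrary example):

```python
import math

def shannon_bits(p):
    """Shannon uncertainty H = sum(p_i * log2(1/p_i)), in bits."""
    return sum(pi * math.log2(1 / pi) for pi in p if pi > 0)

def gibbs_over_kB(p):
    """Statistical (Gibbs/Boltzmann) entropy divided by k_B: sum(p_i * ln(1/p_i))."""
    return sum(pi * math.log(1 / pi) for pi in p if pi > 0)

p = [0.5, 0.25, 0.25]   # arbitrary example distribution
print(shannon_bits(p))  # 1.5 bits

# Same expression up to the log base: S / k_B = H * ln(2)
print(abs(gibbs_over_kB(p) - shannon_bits(p) * math.log(2)) < 1e-12)  # True
```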