Sublime
An inspiration engine for ideas

Chaos theory is about limitations on predictability in classical physics…
David Deutsch • The Fabric of Reality
…Engressia, is memorably told in Phil Lapsley’s 2013 book, Exploding the Phone, and Joybubbles is the subject of a new documentary film.)
Andrew Leland • The Country of the Blind: A Memoir at the End of Sight

The second law of thermodynamics states that the entropy of closed physical systems always tends to increase, meaning that systems march from order to disorder. Think of dropping a dash of ink into a glass of clear water. The initial state, the one in which the drop of ink is localized in a gorgeous swirl, is information-rich. There are few ways…
Cesar Hidalgo • Why Information Grows: The Evolution of Order, from Atoms to Economies
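The “few ways” here is countable. A minimal sketch (not from the book, assuming a toy model in which k ink particles occupy n available cells): the localized drop confines the particles to a small region, so far fewer arrangements, and hence far less entropy, are compatible with it than with the fully mixed state.

```python
from math import comb, log

def boltzmann_entropy(cells: int, particles: int) -> float:
    """Entropy in units of k_B: S = ln W, where W = C(cells, particles)
    counts the microstates compatible with the macrostate."""
    return log(comb(cells, particles))

ink = 50
localized = boltzmann_entropy(cells=100, particles=ink)     # drop confined to a small region
mixed = boltzmann_entropy(cells=10_000, particles=ink)      # ink spread through the glass

print(f"localized swirl: S ≈ {localized:.1f} k_B")  # few microstates -> low entropy
print(f"fully mixed:     S ≈ {mixed:.1f} k_B")      # vastly more microstates -> high entropy
```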
Wiener was as worldly as Shannon was reticent. He was well traveled and polyglot, ambitious and socially aware; he took science personally and passionately. His expression of the second law of thermodynamics, for example, was a cry of the heart: We are swimming upstream against a great torrent of disorganization, which tends to reduce everything to…
James Gleick • The Information: A History, a Theory, a Flood
The key was control, or self-regulation. To analyze it properly he borrowed an obscure term from electrical engineering: “feed-back,” the return of energy from a circuit’s output back to its input. When feedback is positive, as when the sound from loudspeakers is re-amplified through a microphone, it grows wildly out of control. But when feedback…
James Gleick • The Information: A History, a Theory, a Flood
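Wiener’s distinction is easy to simulate. A hedged sketch (not from the book): in positive feedback each pass re-amplifies the last output, as with the loudspeaker and microphone; in negative feedback the correction opposes the error, so the system settles toward a set point, thermostat-style.

```python
def squeal(gain: float = 0.5, x: float = 1.0, steps: int = 8) -> list[float]:
    """Positive feedback: a fraction of the output is added back to the input."""
    out = []
    for _ in range(steps):
        x += gain * x            # re-amplification -> exponential growth
        out.append(round(x, 2))
    return out

def thermostat(setpoint: float = 20.0, temp: float = 5.0,
               gain: float = 0.5, steps: int = 8) -> list[float]:
    """Negative feedback: the correction is proportional to, and opposes, the error."""
    out = []
    for _ in range(steps):
        temp += gain * (setpoint - temp)   # error shrinks each pass
        out.append(round(temp, 2))
    return out

print(squeal())      # [1.5, 2.25, 3.38, ...] grows without bound
print(thermostat())  # [12.5, 16.25, 18.12, ...] converges on 20.0
```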
But immediately with those first written words came the problems of storage, indexing, and accessing: Where should the writing be stored so that it (and the information it contains) won’t get lost? If the written message is itself a reminder, a kind of Stone Age “To Do” list, the writer needs to remember to look at it and where she put it.
Daniel Levitin • The Organized Mind
Shannon gave the average information rate of a code a name. He called it entropy.
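In symbols (standard Shannon entropy, not a quotation): for a source emitting symbol i with probability p_i, H = -Σ p_i log2(p_i) is the average number of bits per symbol, the floor below which no code’s average length can go. A minimal sketch:

```python
from math import log2

def entropy(probs: list[float]) -> float:
    """Shannon entropy H = -sum(p * log2(p)), in bits per symbol."""
    return -sum(p * log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))   # 1.0   bit: a fair coin is maximally unpredictable
print(entropy([0.9, 0.1]))   # ~0.47 bits: a biased source is easier to compress
print(entropy([1.0]))        # 0.0   bits: certainty carries no information
```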