The Ascent of Information: Books, Bits, Genes, Machines, and Life's Unending Algorithm
Caleb Scharf
Homo sapiens got lucky and kept on going, coevolving with its external data.
You can think about Shannon’s entropy as a way to measure the size of those instructions, and therefore as a measure of the thermodynamic conditions they describe. Informational entropy and physical entropy are two inextricably linked sides of the same story.
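As a concrete illustration of the idea above, here is a minimal sketch of Shannon entropy computed from a message's empirical symbol frequencies; the function name and the example strings are illustrative choices, not anything from the book:

```python
import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Shannon entropy in bits per symbol, from empirical symbol frequencies."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A uniformly used alphabet maximizes entropy; pure repetition minimizes it.
print(shannon_entropy("abcd"))  # 2.0 bits per symbol: four equally likely symbols
print(shannon_entropy("aaaa"))  # 0.0 bits: the message carries no surprise
```

The "size of the instructions" intuition shows up directly: a source with higher entropy needs more bits per symbol to describe, on average, than a highly predictable one.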
In the history of information theory, and science in general, one of the most influential research papers of the twentieth century is Claude Shannon’s “A Mathematical Theory of Communication,”
An answer might be that more complex life can do so much better at decision making, at parsing environmental information to its advantage (overseen by the selective demons), that it outweighs the burden.
When I say that an organism “has information,” what I mean is that it encodes something about the external world in itself.
A clever illustration: a group game of twenty questions that he called “negative twenty questions.” In this variant of the usual play, the guesser asking yes/no questions believes the group being interrogated has a single item in mind that they’ve all agreed on in the guesser’s absence. In actuality, each person can start with whatever they want in mind.
Shannon’s insight was that we can, with care, make the encoding more robust, at the cost of some extra data.
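The simplest version of that trade, robustness bought with extra data, is a repetition code. This is a generic textbook sketch, not Shannon's own construction; the function names are illustrative:

```python
def encode(bits, r=3):
    """Repeat each bit r times: the message grows r-fold."""
    return [b for b in bits for _ in range(r)]

def decode(coded, r=3):
    """Majority vote within each r-bit block recovers the original bit
    as long as fewer than half the copies in a block are corrupted."""
    return [1 if sum(coded[i:i + r]) > r // 2 else 0
            for i in range(0, len(coded), r)]

msg = [1, 0, 1, 1]
sent = encode(msg)          # three times as long as the original
sent[4] ^= 1                # a single-bit transmission error
assert decode(sent) == msg  # the majority vote still recovers the message
```

Tripling the message is a crude price to pay; Shannon's point was that far more efficient codes exist, but all of them spend some extra data to buy reliability.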
The answer to whether or not robots have DNA appears to be that they have something that accomplishes most, if not all, of the same function in the world. But it’s differently implemented. Their core, heritable information does not need to be held in individuals, or even within a given species. That information is dispersible, although often localized.
Here, meaningful information is information that influences the processes of natural selection.