Sublime
An inspiration engine for ideas
In the early eighteenth century it was discovered that in many situations the amount of information in a set of data was only proportional to the square root of the number n of observations, not the number n itself.
Stephen M. Stigler • The Seven Pillars of Statistical Wisdom
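A quick way to see the root-n rule Stigler describes: the standard error of a sample mean falls off like σ/√n, so quadrupling the number of observations only halves the uncertainty. The sketch below is a simulation to check this empirically (my own illustration, not anything from the book).

```python
import numpy as np

rng = np.random.default_rng(0)

# Standard error of the sample mean scales as sigma / sqrt(n):
# quadrupling the number of observations only halves the uncertainty.
sigma = 1.0
for n in [100, 400, 1600, 6400]:
    # Draw many samples of size n and measure how much their means spread.
    means = rng.normal(0.0, sigma, size=(10_000, n)).mean(axis=1)
    print(f"n = {n:5d}   empirical SE = {means.std():.4f}   "
          f"sigma/sqrt(n) = {sigma / np.sqrt(n):.4f}")
```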
In order to make the case for evolution by natural selection, it was essential to establish that there was sufficient within-species heritable variability:
Stephen M. Stigler • The Seven Pillars of Statistical Wisdom
In even approximate equilibrium, the variability Darwin both required and demonstrated existed was in conflict with the observed short-term stability in populations.
Stephen M. Stigler • The Seven Pillars of Statistical Wisdom
Keeping Up with the Quants: Your Guide to Understanding and Using Analytics
amazon.com
According to the central limit theorem, proven in 1810 by Pierre-Simon Laplace, any such random process—one that amounts to a sum of a large number of coin flips—will lead to the same probability distribution, called the normal distribution (or bell-shaped curve). The Galton board is simply a visual demonstration of Laplace’s theorem.
Dana Mackenzie • The Book of Why: The New Science of Cause and Effect
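As a rough illustration of the Galton board described above (not code from the book): each ball's final bin is the sum of independent left/right "coin flips," and the resulting histogram is already visibly bell-shaped after a dozen rows, just as Laplace's theorem predicts.

```python
import numpy as np

rng = np.random.default_rng(0)

# Each ball hits `rows` pegs and bounces left (0) or right (1) at random;
# its final bin is simply the sum of those coin flips.
rows, balls = 12, 100_000
bins = rng.integers(0, 2, size=(balls, rows)).sum(axis=1)

# Crude text histogram: the counts trace out a bell-shaped curve,
# as the central limit theorem predicts for sums of independent flips.
counts = np.bincount(bins, minlength=rows + 1)
for k, c in enumerate(counts):
    print(f"{k:2d} | {'#' * (c * 60 // counts.max())}")
```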
Laplace continued his research throughout France’s political upheavals. In 1810 he announced the central limit theorem, one of the great scientific and statistical discoveries of all time. It asserts that, with some exceptions, any average of a large number of similar terms will have a normal, bell-shaped distribution.
Sharon Bertsch McGrayne • The Theory That Would Not Die: How Bayes' Rule Cracked the Enigma Code, Hunted Down Russian Submarines, and Emerged Triumphant from Two Centuries of Controversy
Galton was, in essence, able to establish some of the practical consequences of Mendelian genetics without the benefit of knowing any genetics, nearly two decades before Mendel’s work was rediscovered.
Stephen M. Stigler • The Seven Pillars of Statistical Wisdom
Everybody Lies: Big Data, New Data, and What the Internet Can Tell Us About Who We Really Are
amazon.com
Over the course of several summers in the late 1960s, Baum and Lloyd Welch, an information theorist working down the hall, developed an algorithm to analyze Markov chains, which are sequences of events in which the probability of what happens next depends only on the current state, not past events.
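To make the defining property concrete, here is a minimal, hypothetical two-state Markov chain (a toy weather model, not the Baum-Welch algorithm itself): each transition is drawn using only the current state, never the earlier history.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-state weather model: the next state depends only on the
# current state (the Markov property), not on past events.
states = ["sunny", "rainy"]
P = np.array([
    [0.8, 0.2],   # transition probabilities when current state is sunny
    [0.4, 0.6],   # transition probabilities when current state is rainy
])

def simulate(steps, start=0):
    path, state = [states[start]], start
    for _ in range(steps):
        state = rng.choice(2, p=P[state])  # uses the current state only
        path.append(states[state])
    return path

print(" -> ".join(simulate(10)))
```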