The Master Algorithm: How the Quest for the Ultimate Learning Machine Will Remake Our World
If you want to be tomorrow’s authority, ride the data, don’t fight it.
My “customer” neurons, downstream in the network, will tell me how well I’m doing in the next round.
Despite the popularity of decision trees, inverse deduction is the better starting point for the Master Algorithm. It has the crucial property that incorporating knowledge into it is easy—and we know Hume’s problem makes that essential. Also, sets of rules are an exponentially more compact way to represent most concepts than decision trees. Convert…
The way to combine the two is to use genetic search to find the structure of the model and let gradient descent fill in its parameters. This is what nature does: evolution creates brain structures, and individual experience modulates them.
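A minimal sketch of that division of labor, not the book's own code: an outer genetic search mutates a network's structure (here just the hidden-layer size) while an inner gradient-descent loop fills in its weights. The data, population size, and learning rate are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))                    # toy inputs
y = np.sin(X @ np.array([1.0, -2.0, 0.5]))       # toy regression target

def fit_weights(hidden, epochs=200, lr=0.05):
    """Gradient descent on a one-hidden-layer net with `hidden` units."""
    W1 = rng.normal(scale=0.5, size=(X.shape[1], hidden))
    W2 = rng.normal(scale=0.5, size=(hidden, 1))
    for _ in range(epochs):
        H = np.tanh(X @ W1)                      # forward pass
        pred = (H @ W2).ravel()
        err = pred - y
        # backprop: gradients of the mean squared error
        gW2 = H.T @ err[:, None] / len(y)
        gH = err[:, None] @ W2.T * (1 - H**2)
        gW1 = X.T @ gH / len(y)
        W1 -= lr * gW1
        W2 -= lr * gW2
    return np.mean(err**2)                       # fitness = final training error

# Genetic search over structure: mutate hidden-layer sizes, keep the fittest.
population = [rng.integers(1, 16) for _ in range(6)]
for generation in range(5):
    scored = sorted(population, key=fit_weights)
    parents = scored[:3]
    children = [max(1, p + rng.integers(-2, 3)) for p in parents]
    population = parents + children

print("best hidden size found:", sorted(population, key=fit_weights)[0])
```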
Just start by guessing a class for each object any way you want—even at random—and you’re off to the races. From those classes and the data, you can learn the class models; based on these models you can reinfer the classes, and so on.
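A minimal sketch of that guess, learn, reinfer loop in its simplest k-means-style form, on made-up two-dimensional data (every name and number below is illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
# Toy data: two blobs of 50 points each in the plane.
points = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(5, 1, (50, 2))])
k = 2

# 1. Start by guessing a class for each object at random.
classes = rng.integers(k, size=len(points))

for _ in range(10):
    # 2. From the classes and the data, learn the class models
    #    (here, just each class's mean; assumes no class goes empty).
    centers = np.array([points[classes == c].mean(axis=0) for c in range(k)])
    # 3. Based on these models, reinfer the classes, and repeat.
    dists = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
    classes = dists.argmin(axis=1)

print("final class sizes:", np.bincount(classes, minlength=k))
```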
Nearest-neighbor is prone to overfitting: if we have the wrong class for a data point, it spreads to its entire metro area.
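A minimal 1-NN sketch of that failure mode (mine, not the book's), on toy data: one deliberately flipped label claims every query point that happens to sit closest to it.

```python
import numpy as np

rng = np.random.default_rng(2)
train_x = rng.uniform(0, 10, size=(20, 2))
train_y = (train_x[:, 0] > 5).astype(int)   # the true rule: class depends only on x > 5
train_y[0] = 1 - train_y[0]                 # one deliberately mislabeled point

def predict_1nn(query):
    """Label a query with the class of its single nearest training point."""
    nearest = np.linalg.norm(train_x - query, axis=1).argmin()
    return train_y[nearest]

# Every query whose nearest neighbor is the flipped point inherits its error,
# so the one bad label spreads across that point's whole region of the space.
grid = np.array([[a, b] for a in np.linspace(0, 10, 50)
                        for b in np.linspace(0, 10, 50)])
claimed = [q for q in grid
           if np.linalg.norm(train_x - q, axis=1).argmin() == 0]
print(len(claimed), "of", len(grid), "grid queries fall in the bad point's region")
print("and all of them get its label:",
      all(predict_1nn(q) == train_y[0] for q in claimed))
```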
The Master Algorithm is neither genetic programming nor backprop, but it has to include the key elements of both: structure learning and weight learning.
Hebb’s rule, as it has come to be known, is the cornerstone of connectionism. Indeed, the field derives its name from the belief that knowledge is stored in the connections between neurons. Donald Hebb, a Canadian psychologist, stated it this way in his 1949 book The Organization of Behavior: “When an axon of cell A is near enough to excite a cell B and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place in one or both cells such that A’s efficiency, as one of the cells firing B, is increased.”
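A minimal sketch of Hebb's rule as an update equation, on made-up binary patterns (the learning rate, patterns, and sizes are illustrative): a connection strengthens only when its input and output are active together.

```python
import numpy as np

rng = np.random.default_rng(3)
learning_rate = 0.1
weights = np.zeros(4)                       # connections from four "A" neurons to one "B"

# Two made-up input patterns shown repeatedly; neuron B fires (1) for the
# first pattern and stays silent (0) for the second.
patterns = [(np.array([1, 1, 0, 0]), 1),
            (np.array([0, 0, 1, 1]), 0)]

for _ in range(20):
    x, fired = patterns[rng.integers(len(patterns))]
    # Hebb's rule: strengthen a connection when its two ends fire together.
    weights += learning_rate * x * fired

print("learned weights:", weights)          # only co-active connections have grown
```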