
🧠AI’s $100bn question: The scaling ceiling

Huge “foundation models” are turbo-charging AI progress
economist.com
“There was a long period of time where the right thing for [Isaac] Newton to do was to read more math textbooks, and talk to professors and practice problems ... that’s what our current models do,” said Altman, using an example a colleague had previously used.
But he added that Newton was never going to invent calculus by simply reading about geometry.
Madhumita Murgia • OpenAI CEO Sam Altman wants to build AI “superintelligence”
I don't think that the scaling hypothesis gets recognized enough for how radical it is. For decades, AI sought some kind of master algorithm of intelligence. The scaling hypothesis says that there is none: intelligence is the ability to use more compute on more data.
Joscha Bach • Tweet

