LLMs are much worse than humans at learning from experience

The Illusion of Thinking: Understanding the Strengths and Limitations of Reasoning Models via the Lens of Problem Complexity

machinelearning.apple.com

Unbundling AI

Benedict Evans

🧠 AI’s $100bn question: The scaling ceiling

Azeem Azhar

LLMs are mirrors of operator skill

Geoffrey Huntley, ghuntley.com

A non-anthropomorphized view of LLMs

addxorrol.blogspot.com