Deep Learning Is Hitting a Wall
In 2020, Jared Kaplan and his collaborators at OpenAI suggested that there was a set of “scaling laws” for neural network models of language; they found that the more data they fed into their neural networks, the better those networks performed. The implication was that we could build better and better AI if we gathered more data and applied deep learning at ever larger scales.
Gary Marcus • Deep Learning Is Hitting a Wall
Manipulating symbols has been essential to computer science since the beginning, at least since the pioneering papers of Alan Turing and John von Neumann, and is still the fundamental staple of virtually all software engineering—yet it is treated as a dirty word in deep learning. To think that we can simply abandon symbol-manipulation is to suspend disbelief.
In time we will see that deep learning was only a tiny part of what we need to build if we’re ever going to get trustworthy AI.
For at least four reasons, hybrid AI, not deep learning alone (nor symbols alone), seems the best way forward:
- So much of the world’s knowledge, from recipes to history to technology, is currently available mainly or only in symbolic form.
- Deep learning on its own continues to struggle even in domains as orderly as arithmetic. A hybrid system may have more power than either system on its own.
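The arithmetic point above can be made concrete with a toy sketch (my own illustration, not from the essay): a purely example-driven “learner” that memorizes training pairs and answers by nearest-neighbor lookup does fine near its training data but fails on novel magnitudes, while the symbolic route is exact everywhere by construction.

```python
def make_memorizer(examples):
    """Return a predictor that answers with the stored sum of the closest
    training pair -- a stand-in for pure pattern matching over seen data."""
    def predict(a, b):
        nearest = min(examples, key=lambda p: abs(p[0] - a) + abs(p[1] - b))
        return nearest[0] + nearest[1]
    return predict

def symbolic_add(a, b):
    """The symbol-manipulating route: exact for any inputs."""
    return a + b

# Training pairs drawn only from a small range.
train = [(a, b) for a in range(0, 100, 7) for b in range(0, 100, 7)]
memorized = make_memorizer(train)

print(memorized(50, 50))        # near training data: 98 (close to the true 100)
print(memorized(5000, 5000))    # novel magnitude: 196 (stuck near the training range)
print(symbolic_add(5000, 5000)) # 10000, exact
```

The memorizer interpolates plausibly inside the region it has seen but cannot extrapolate the rule, which is the gap a hybrid system aims to close.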
There are serious holes in the scaling argument. To begin with, the measures that have scaled have not captured what we desperately need to improve: genuine comprehension.
But symbols on their own have had problems; pure symbolic systems can sometimes be clunky to work with, and they have done a poor job on tasks like image recognition and speech recognition; the Big Data regime has never been their forte. As a result, there’s long been a hunger for something else.
Belittling unfashionable ideas that haven’t yet been fully explored is not the right way to go.
Deep-learning systems are outstanding at interpolating between specific examples they have seen before, but frequently stumble when confronted with novelty.