Deep Learning Is Hitting a Wall
What does “manipulating symbols” really mean? Ultimately, it means two things: having sets of symbols (essentially just patterns that stand for things) to represent information, and processing (manipulating) those symbols in a specific way, using something like algebra (or logic, or computer programs) to operate over those symbols.
Gary Marcus • Deep Learning Is Hitting a Wall
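To make the two ingredients concrete, here is a minimal sketch (not from the article; all names are illustrative): a handful of symbolic facts, plus one rule that operates over them purely by their form, the way algebra operates over variables.

```python
# Minimal sketch of symbol manipulation (illustrative names, assumed example).
# Ingredient 1: symbols that stand for things -- a tiny set of facts.
facts = {("parent", "alice", "bob"), ("parent", "bob", "carol")}

# Ingredient 2: a rule applied over the symbols by form alone:
#   parent(X, Y) and parent(Y, Z)  =>  grandparent(X, Z)
def derive_grandparents(facts):
    derived = set()
    for (_, x, y1) in facts:
        for (_, y2, z) in facts:
            if y1 == y2:  # the rule matches on symbol identity, not meaning
                derived.add(("grandparent", x, z))
    return derived

print(derive_grandparents(facts))
# {('grandparent', 'alice', 'carol')}
```

The rule never needs to know what "alice" or "parent" denote; it manipulates the patterns themselves, which is the sense of "manipulating symbols" at stake here.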
"There is no one way the mind works, because the mind is not one thing. Instead, the mind has parts, and the different parts of the mind operate in different ways: Seeing a color works differently than planning a vacation, which works differently than understanding a sentence, moving a limb, remembering a fact, or feeling an emotion.” Trying to squ... See more
Gary Marcus • Deep Learning Is Hitting a Wall
When the stakes are higher, though, as in radiology or driverless cars, we need to be much more cautious about adopting deep learning. When a single error can cost a life, it’s just not good enough. Deep-learning systems are particularly problematic when it comes to “outliers” that differ substantially from the things on which they are trained.
Gary Marcus • Deep Learning Is Hitting a Wall
With all the challenges in ethics and computation, and the knowledge needed from fields like linguistics, psychology, anthropology, and neuroscience, and not just mathematics and computer science, it will take a village to raise an AI.
Gary Marcus • Deep Learning Is Hitting a Wall
For at least four reasons, hybrid AI, not deep learning alone (nor symbols alone), seems the best way forward:
- So much of the world’s knowledge, from recipes to history to technology, is currently available mainly or only in symbolic form.
- Deep learning on its own continues to struggle even in domains as orderly as arithmetic. A hybrid system may have… (see the sketch below)
Gary Marcus • Deep Learning Is Hitting a Wall
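A minimal sketch of the hybrid idea in the arithmetic bullet, under stated assumptions: the learned, perceptual half of the pipeline is stubbed out with a regex (a real system would use a neural parser), and the symbols it extracts are handed to an exact symbolic calculator.

```python
import re

def extract_expression(text: str):
    """Stand-in for a learned parser: pull (number, op, number) out of text.
    In a real hybrid system this stage would be a neural network."""
    m = re.search(r"(\d+)\s*(plus|minus|times)\s*(\d+)", text)
    if m is None:
        raise ValueError("no arithmetic expression found")
    a, op, b = m.groups()
    return int(a), op, int(b)

def symbolic_eval(a: int, op: str, b: int) -> int:
    """Exact symbolic step: ordinary integer arithmetic, correct at any magnitude."""
    return {"plus": a + b, "minus": a - b, "times": a * b}[op]

question = "What is 123456789 times 987654321?"
print(symbolic_eval(*extract_expression(question)))  # 121932631112635269
```

The division of labor is the point: a pure deep-learning model approximates arithmetic and degrades on long operands, while the symbolic half is exact no matter how large the numbers grow.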
But symbols on their own have had problems; pure symbolic systems can sometimes be clunky to work with, and have done a poor job on tasks like image recognition and speech recognition; the Big Data regime has never been their forte. As a result, there’s long been a hunger for something else.
Gary Marcus • Deep Learning Is Hitting a Wall
In 2020, Jared Kaplan and his collaborators at OpenAI suggested that there was a set of “scaling laws” for neural network models of language; they found that the more data they fed into their neural networks, the better those networks performed [10]. The implication was that we could do better and better AI if we gather more data and apply deep learning…
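The scaling laws in question take the form of a power law: predicted test loss falls smoothly as a power of dataset size. Below is a sketch of that shape, assuming the rough data exponent Kaplan et al. report (α_D ≈ 0.095); the constant D_C is illustrative, not a fitted value.

```python
# Power-law scaling curve in the style of Kaplan et al. (2020):
#   loss(D) = (D_C / D) ** ALPHA_D
ALPHA_D = 0.095   # approximate data-scaling exponent from the paper
D_C = 5.4e13      # scale constant, assumed here purely for illustration

def predicted_loss(tokens: float) -> float:
    return (D_C / tokens) ** ALPHA_D

for d in (1e8, 1e9, 1e10, 1e11):
    print(f"{d:.0e} tokens -> predicted loss {predicted_loss(d):.2f}")
```

Each tenfold increase in data divides the loss by a constant factor of about 10**0.095 ≈ 1.24, which matches the pattern the excerpt describes: steadily better with more data, though with diminishing returns.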