Why large language models struggle with long contexts
What can LLMs never do?
strangeloopcanon.com
Why Chat With PDF Is Hard And How ChatLLM Gets It Right
Chatting with long documents is hard because most LLMs, aside from Gemini, lack a large context window.
However, even with Gemini's 1M-token context length, in-context learning is hard: if you stuff the whole document into the context, the model doesn't do a good job.
Bindu Reddy
x.com