They don’t know how to solve the hard problems of LLMs: hallucinations, unreliability, jailbreaking, prompt injections, fallibility. The same problems that existed five years ago exist today; none have been solved. If anything, hallucination rates have gone up lately. What’s going on? Do they just not care enough? Do they not know how to solve these problems?