Generative AI
"A key challenge of (LLMs) is that they do not come with a manual! They come with a “Twitter influencer manual” instead, where lots of people online loudly boast about the things they can do with a very low accuracy rate, which is really frustrating..."
Simon Willison, attempting to explain LLMs
Ethan Mollick • Which AI should I use? Superpowers and the State of Play
Writing code from scratch is gone. That was the most exciting part of the profession. Now what remains is debugging and fixing. The same goes for graphic design (original character design in games is gone), writing music (new melodies can be generated by clicking "refresh" and then improved), and literature. You never start from scratch anymore. Which i…
“The fact that these things model language is probably one of the biggest discoveries in history. That you (LLM) can learn language by just predicting the next word … — that’s just shocking to me.”
- Mikhail Belkin, computer scientist at the University of California
Over the next year or two, I expect GPT-4 and its successors to become a copilot for the mind: a digital research assistant that will bring to bear the sum total of everything you’ve read, everything you’ve thought, and everything you’ve forgotten every time you touch a keyboard.
It will solve some of the perennial problems in produc…
Dan Shipper • GPT-4: A Copilot for the Mind
“We don’t know what capabilities GPT-5 will have until we train it and test it. It might be a medium-size problem right now, but it will become a really big problem in the future as models become more powerful.”
Ethan Mollick • Which AI should I use? Superpowers and the State of Play
Meta AI released LLaMA ... and they included a paper which described exactly what it was trained on. It was 5TB of data.
2/3 of it came from Common Crawl. It also had content from GitHub, Wikipedia, ArXiv, StackExchange and something called “Books”.
What’s Books? 4.5% of the training data was books. Part of this was Project Gutenberg, which is public domain…
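The data mixture described above can be laid out as a small table. A minimal sketch in Python, assuming the approximate sampling percentages reported in the LLaMA paper's data-mixture table (the exact figures are an assumption here and should be checked against the paper itself):

```python
# Approximate sampling proportions (% of training data) per source,
# as reported in the LLaMA paper; treat the exact values as an
# assumption to verify against the paper's own table.
mixture = {
    "CommonCrawl": 67.0,
    "C4": 15.0,
    "GitHub": 4.5,
    "Wikipedia": 4.5,
    "Books": 4.5,
    "ArXiv": 2.5,
    "StackExchange": 2.0,
}

# Sanity checks against the claims in the text: the proportions sum
# to 100%, Common Crawl alone is roughly 2/3 of the data, and
# "Books" is the 4.5% slice discussed above.
assert sum(mixture.values()) == 100.0
assert abs(mixture["CommonCrawl"] / 100 - 2 / 3) < 0.01
assert mixture["Books"] == 4.5
```

(C4 is itself derived from Common Crawl, so the Common Crawl-derived share is even larger than the 67% line item alone.)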