Generative AI
"A key challenge of LLMs is that they do not come with a manual! They come with a “Twitter influencer manual” instead, where lots of people online loudly boast about the things they can do with a very low accuracy rate, which is really frustrating..."
- Simon Willison, attempting to explain LLMs
“The fact that these things model language is probably one of the biggest discoveries in history. That you (an LLM) can learn language by just predicting the next word … — that’s just shocking to me.”
- Mikhail Belkin, computer scientist at the University of California
Four questions for organizations [about using AI]:
- What did you do that was valuable that's no longer valuable?
- What impossible things can you now do that you could not before?
- What can you democratize and bring downmarket?
- What can you do upmarket so you have new ways of competing?
Ethan Mollick, https://www.forbes.com/sites/jenamcgregor/2024/
Ethan Mollick • Which AI should I use? Superpowers and the State of Play
Meta AI released LLaMA ... and they included a paper which described exactly what it was trained on. It was 5TB of data.
2/3 of it was from Common Crawl. It had content from GitHub, Wikipedia, ArXiv, StackExchange and something called “Books”.
What’s Books? 4.5% of the training data was books. Part of this was Project Gutenberg, which is public domain.
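The mix described above can be tallied with a quick sketch. Only the two fractions actually stated here (roughly two-thirds Common Crawl, 4.5% books) are used; the remaining sources (GitHub, Wikipedia, ArXiv, StackExchange, ...) are lumped into a single "other" bucket rather than guessed:

```python
# Rough tally of the LLaMA training mix as described in the text.
# Fractions for GitHub, Wikipedia, ArXiv, StackExchange etc. are
# not stated here, so they are combined into "other", not guessed.
TOTAL_TB = 5.0  # "It was 5TB of data"

mix = {
    "common_crawl": 2 / 3,   # "2/3 of it was from Common Crawl"
    "books": 0.045,          # "4.5% of the training data was books"
}
mix["other"] = 1.0 - sum(mix.values())

for name, frac in mix.items():
    # Convert each fraction into an approximate share of the 5TB corpus.
    print(f"{name}: {frac:.1%} ~= {TOTAL_TB * frac:.2f} TB")
```

By this arithmetic, Common Crawl alone accounts for roughly 3.3 of the 5 TB.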