In 2019, OpenAI announced GPT-2 with this post:
https://t.co/jjP8IXmu8D
Today (~5 years later) you can train your own for ~$672, running on one 8XH100 GPU node for 24 hours. Our latest llm.c post gives the walkthrough in some...
Really like how accessible but also thoughtful this book is.
Instead of jumping straight into the mechanics of ChatGPT (which it does get to), the first part of the book covers the evolution of neural nets (and deep learning), their limitations, and the concept of computational irreducibility. https://t.co/gPIvaz7wcf
Great little write-up on Copilot Workspace that really differentiates it from the rest of the market: https://t.co/bfvbYjpEa0
> Among all the talk of “agents”, this need to design for this repeated AI-human divergence and alignment can be lost. If the AI is going to take steps towards a solution, it must “take the human...