Sublime
An inspiration engine for ideas
Elliott Cost
elliott.computer

What do DeepSeek R1 & V3 mean for LLM data?
Contrary to some lazy takes I’ve seen, DeepSeek R1 was trained on a shit ton of human-generated data—in fact, the DeepSeek models are setting records for the disclosed amount of post-training data for open-source models:
- 600,000 reasoning samples [1]
- 200,000 non-reasoning SFT samples [2]
- human preference ...
Justin Tejada
justintejada.com
Alexander Obenauer
alexanderobenauer.com
First-time founders are obsessed with product.
Second-time founders are obsessed with distribution.