Sublime
An inspiration engine for ideas

Great read from @sarahookr and @seb_ruder on Steering Synthetic Data Generation to Target Non-Differentiable Objectives.
Active Inheritance with a Mixture of Agents is a powerful tool to increase diversity, leveraging the collective strengths of various models and creating an "ensemble" effect that fosters a richer and...
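A rough sketch of the recipe as I read it (the function names, the generator interface, and the type-token-ratio objective below are my own illustrative choices, not the paper's code): sample candidate completions from a pool of models, score each one with the non-differentiable objective you care about, and keep only the winners as synthetic training data.

```python
# Hypothetical sketch of "active inheritance" with a mixture of agents:
# sample candidates from several generator models, score them with a
# non-differentiable objective, and keep the winner as synthetic training
# data. The objective and the generator interface are illustrative only.

from typing import Callable, Dict, List

Generator = Callable[[str], str]    # prompt -> completion
Objective = Callable[[str], float]  # completion -> score (need not be differentiable)

def lexical_diversity(text: str) -> float:
    """Example proxy objective: type-token ratio of the completion."""
    tokens = text.split()
    return len(set(tokens)) / max(len(tokens), 1)

def active_inheritance(
    prompts: List[str],
    model_pool: List[Generator],
    objective: Objective = lexical_diversity,
) -> List[Dict[str, str]]:
    """For each prompt, keep the pool's highest-scoring completion."""
    dataset = []
    for prompt in prompts:
        candidates = [gen(prompt) for gen in model_pool]
        best = max(candidates, key=objective)
        dataset.append({"prompt": prompt, "completion": best})
    return dataset
```

The selected prompt/completion pairs would then be used to fine-tune the student, nudging it toward whatever property the objective measures.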
Guérot was targeted by powerful forces, including elements of the German state, because she was an influential public intellectual who represented a threat to the status quo. For this, she needed not just to be cancelled but to be destroyed.
My latest:
https://t.co/iThC8YQl0K
Thomas Fazi
x.com
Keynote: Rewiring How We Learn: The Power of an Experimental Mindset | SXSW EDU 2025
youtube.com
regen is the final meta
(my talk from #consensusaustin2023) https://t.co/gr5oZo4IzY
Kev.Ξth brewin’ GG24 🍲✨
x.com
Silicon Valley is an extraordinary theatre of sociology right now. Apropos of nothing, my observations about a few people I'm paying attention to:
- Why Sam Altman does interviews at all is beyond me. He's so verbally slithering it's an absolute insult to the public. He thinks you are stupid or too cowardly to say it...
Justin Murphy
x.com
Many recent frontier LLMs like Grok-3 and DeepSeek-R1 use a Mixture-of-Experts (MoE) architecture. To understand how it works, let’s pretrain an MoE-based LLM from scratch in PyTorch…
nanoMoE is a simple (~500 lines of code) but functional implementation of a mid-sized MoE model that can be pretrained on commodity...
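The heart of an MoE block is a learned router that sends each token to a small subset of expert MLPs and mixes their outputs. Here is a rough PyTorch sketch of that idea; the class name, the sizes, the top-2 routing, and the absence of a load-balancing loss are simplifications of my own, not nanoMoE's actual code.

```python
# Minimal sketch of a top-k Mixture-of-Experts feed-forward layer.
# Simplified for clarity (no load-balancing loss, no capacity limits);
# not the actual nanoMoE implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class MoEFeedForward(nn.Module):
    def __init__(self, d_model: int, d_hidden: int, n_experts: int = 8, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(d_model, n_experts)  # token -> expert logits
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_hidden), nn.GELU(), nn.Linear(d_hidden, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq, d_model) -> flatten to one row per token
        tokens = x.reshape(-1, x.size(-1))
        logits = self.router(tokens)                        # (n_tokens, n_experts)
        weights, indices = logits.topk(self.top_k, dim=-1)  # top-k experts per token
        weights = F.softmax(weights, dim=-1)                # normalize over the chosen experts

        out = torch.zeros_like(tokens)
        for e, expert in enumerate(self.experts):
            mask = (indices == e)                           # which tokens chose expert e, and in which slot
            token_ids, slot = mask.nonzero(as_tuple=True)
            if token_ids.numel() == 0:
                continue
            out[token_ids] += weights[token_ids, slot].unsqueeze(-1) * expert(tokens[token_ids])
        return out.reshape_as(x)
```

A toy forward pass like `MoEFeedForward(256, 1024)(torch.randn(2, 16, 256))` routes each of the 32 tokens through 2 of the 8 experts, so only a fraction of the layer's parameters are active for any given token.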
Second, continuing with what they are doing is going to be increasingly costly. Already Facebook recently promised to hire another ten thousand moderators; YouTube has also promised to hire “thousands” of moderators. Hiring all those people is going to be an increasing cost on these companies as well. Switching to a protocols-based system would...

