[1] Examples of this kind of positive transfer: (1) pre-training on language helps speech generation, (2) pre-training on code helps in broader reasoning tasks, (3) LLMs improve on math and coding after multimodal training. [2] scGPT is an early example of an approach that can potentially scale much further. [3] In a recent paper, a group of biologis...
DeepMind’s initial pitch was to “solve intelligence and then use it to solve everything else.” This has been the guiding philosophy of AGI labs for the past decade. I believe that rather than solving intelligence, we should be focusing on solving the “everything else” part first. That is, we should be bu...
The more I meet people who've gone deep into generating AI media, the more I realize we're all reaching the same conclusion: this is a new medium. Not an evolution of something else. Something entirely new, the way photography and film were new.
To understand any medium, you need to look beyond its surface to its core. Some ...