"We hope that such tools may help us to gain novel insight into the psychology of an understudied pool of humans—namely, the dead"
Overview of work on HLLMs - language models trained on historical texts to simulate historical attitudes and perspectives. https://t.co/joGjm7brgs https://t.co/lg59eRS1So
Cody now has a mechanism for pulling in context from *outside* the codebase!
Introducing OpenCtx, a protocol for providing relevant technical context to humans and AI. This builds on Sourcegraph's foundation as the world's best code search and connects our code graph to entities like issues, designs, technical docs, ...
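For a concrete sense of what "providing context via a protocol" could look like, here is a minimal, hypothetical sketch of a context provider in TypeScript. The interface names, parameters, and issue-matching logic below are illustrative assumptions of mine, not the actual OpenCtx spec.

```typescript
// Illustrative only: a hypothetical context-provider shape, NOT the real
// OpenCtx API. The idea: a provider maps a piece of code to external context
// items (issues, designs, docs) that an editor, code host, or AI can consume.

interface ContextItem {
  title: string   // e.g. "ISSUE-42: Fix race in cache invalidation"
  url?: string    // link back to the external system
  body?: string   // text a human or LLM can read inline
}

interface ContextProvider {
  // Return context items relevant to the given file contents
  // (and, optionally, the current selection).
  items(params: { content: string; selection?: string }): Promise<ContextItem[]>
}

// Hypothetical provider that surfaces issue-tracker references found in code.
const issueProvider: ContextProvider = {
  async items({ content, selection }) {
    const text = selection ?? content
    const ids = [...text.matchAll(/ISSUE-(\d+)/g)].map(m => m[1])
    return ids.map(id => ({
      title: `ISSUE-${id}`,
      url: `https://issues.example.com/ISSUE-${id}`, // placeholder URL
    }))
  },
}

// Demo: any client speaking the same protocol could render or feed these items to an AI.
issueProvider.items({ content: "// fixes ISSUE-42" }).then(items => console.log(items))
```

The appeal of a protocol here is the decoupling: any external system (issue tracker, design tool, docs site) can expose itself as a provider once, and any client (editor, code host, AI assistant) can consume the same items.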
🎥 New talk: "How Might We Learn?"
A (proto-?)vision talk of sorts—a first attempt at a broader picture of the future of learning I want to create, particularly given developments in AI.
Thanks to @HaijunXia and @ProfHollan for hosting me! 🙇♂️
(YT link in thread...)
Instead of treating AGI as a binary threshold, I prefer to treat it as a continuous spectrum defined by comparison to time-limited humans.
I call a system a t-AGI if, on most cognitive tasks, it beats most human experts who are given time t to perform the task.
More details:
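One way to write the definition down as a formula (the notation is mine, not from the thread): read "most" as "more than half" in both places, let 𝒯 be a distribution over cognitive tasks, ℰ the pool of human experts, and score(h, τ; t) an expert's performance on task τ with time budget t.

```latex
% One possible formalization of the t-AGI definition (notation is mine):
% S is a t-AGI iff, on more than half of cognitive tasks, it outscores
% more than half of human experts who get time t for the task.
S \text{ is a } t\text{-AGI} \;\iff\;
\Pr_{\tau \sim \mathcal{T}}\!\left[
  \mathrm{score}(S,\tau) \;>\;
  \mathrm{median}_{h \in \mathcal{E}}\, \mathrm{score}(h,\tau;\, t)
\right] > \tfrac{1}{2}
```

A consequence of parameterizing by t: a system can be, say, a one-second AGI (beating experts at snap judgments) long before it is a one-month AGI (beating them on extended projects).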
A single, fairly unknown Dutch company makes perhaps the most expensive and complex non-military device there is ($200M), one that builds on 40 years of physics, and holds a monopoly responsible for all of today's AI advancement.
Here's the story of ASML, the company powering Moore's Law...
1/9
In 2019, OpenAI announced GPT-2 with this post:
https://t.co/jjP8IXmu8D
Today (~5 years later) you can train your own for ~$672, running on one 8×H100 GPU node for 24 hours. Our latest llm.c post gives the walkthrough in some detail.
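Back-of-envelope on what those figures imply (my arithmetic, assuming the $672 covers exactly the 24 node-hours and nothing else):

```typescript
// Derived from the numbers in the post: $672 for 24 hours on one 8×H100 node.
const totalCostUsd = 672
const hours = 24
const gpusPerNode = 8

const nodeRate = totalCostUsd / hours    // ≈ $28 per node-hour
const gpuRate = nodeRate / gpusPerNode   // ≈ $3.50 per H100-hour
const gpuHours = hours * gpusPerNode     // 192 H100-hours total

console.log({ nodeRate, gpuRate, gpuHours })
```

In other words, the quoted cost corresponds to roughly $3.50 per H100-hour and about 192 H100-hours of compute; actual cloud pricing will vary.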