Google just released a Personal Health Large Language Model (PH-LLM), a version of Gemini fine-tuned for personal health and wellness.
When I raced triathlons at a competitive level, I was collecting a huge amount of data: sleep data, workout metrics, professional examinations, behaviour tracking, etc.
"We hope that such tools may help us to gain novel insight into the psychology of an understudied pool of humans—namely, the dead"
An overview of work on HLLMs: language models trained on historical texts to simulate historical attitudes and perspectives. https://t.co/joGjm7brgs https://t.co/lg59eRS1So
Don't use Sci-Hub — it's a "controversial" website with 84M+ research papers freely available.
We should try to make billion-dollar academic publishers richer.
Here's an updated thread on integrating Sci-Hub with Zotero to get free papers.
Please don't do this😉
I wanted to see if AI could code me a complex app.
Not a crappy little one-off script. A real program.
Just one little problem: I mostly suck at coding.
So can AI make magic for someone like me?
Yeah. But... it's...
Cody now has a mechanism for pulling in context from *outside* the codebase!
Introducing OpenCtx, a protocol for providing relevant technical context to humans and AI. This builds on Sourcegraph's foundation as the world's best code search and connects our code graph to entities like issues, designs, technical docs, ...
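To make the idea concrete, here is a minimal sketch of what a context provider in the spirit of OpenCtx might look like. The interface and names below (`meta`, `items`, the issue data) are illustrative assumptions for this post, not the published OpenCtx API — the real protocol lives in Sourcegraph's OpenCtx docs.

```python
# Hypothetical sketch of an OpenCtx-style context provider.
# A provider describes itself (meta) and returns context items
# (e.g. matching issue-tracker entries) for a given query.
# All names and data here are illustrative, not the real OpenCtx API.

class ToyIssueProvider:
    def meta(self):
        return {"name": "toy-issues"}

    def items(self, query=None):
        issues = [
            {"title": "BUG-101: login flow hangs", "url": "https://example.com/BUG-101"},
            {"title": "FEAT-7: dark mode", "url": "https://example.com/FEAT-7"},
        ]
        if query is None:
            return issues
        return [i for i in issues if query.lower() in i["title"].lower()]

provider = ToyIssueProvider()
# An editor or AI assistant would call items() to pull external context
# (issues, designs, docs) alongside code search results.
print([i["title"] for i in provider.items("bug")])
```

The point of the protocol shape is that the editor/AI side only needs to know `meta` and `items`; any external system (issue tracker, design tool, docs site) can be wrapped as a provider.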
In 2019, OpenAI announced GPT-2 with this post:
https://t.co/jjP8IXmu8D
Today (~5 years later) you can train your own for ~$672, running on one 8XH100 GPU node for 24 hours. Our latest llm.c post gives the walkthrough in some...
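The ~$672 figure is a straightforward product of node size, time, and rental price. The per-GPU-hour rate below is an assumption on my part (a plausible cloud price for an H100), not something stated in the post:

```python
# Back-of-the-envelope check of the ~$672 training cost.
gpus = 8                   # one 8xH100 node
hours = 24                 # stated training time
rate_per_gpu_hour = 3.50   # ASSUMED cloud rate in USD per H100-hour

total = gpus * hours * rate_per_gpu_hour
print(f"${total:.0f}")  # → $672
```

At a different rental rate the total scales linearly, so the headline number is really a statement about GPU prices as much as about llm.c.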