Open-source LLMs are getting really good.
They’re not as powerful as GPT-4, but they’re improving quickly and worth experimenting with.
If you want to run AI models like Mistral-7B on your laptop, this is the easiest way to do it. https://t.co/MxFaA8ZNGC
Mckay Wrigley, via x.com (linking to Ollama, ollama.com)
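The tweet above points at Ollama as the easiest way to run a model like Mistral-7B locally. As a rough sketch of what that looks like in practice, the snippet below queries a locally running Ollama server (default port 11434) through its /api/generate endpoint. It assumes Ollama is installed and running and that the mistral model has already been pulled with `ollama pull mistral`; the prompt text is just a placeholder.

```python
# Minimal sketch: query a local Ollama server over its REST API.
# Assumes Ollama is installed and running, and `ollama pull mistral` has been done.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

payload = {
    "model": "mistral",  # any locally pulled model tag works here
    "prompt": "Explain what a 7B-parameter model is in one sentence.",
    "stream": False,     # return one JSON object instead of a token stream
}

req = urllib.request.Request(
    OLLAMA_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    body = json.load(resp)

print(body["response"])  # the model's completion text
```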
Is @ollama still the best way to run LLAMA 3 locally? How can I get it to run in a ChatGPT style UX locally?
Peter Yang, via x.com
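On the ChatGPT-style question: Ollama also exposes a /api/chat endpoint that accepts a running list of role-tagged messages, which is enough to build a bare-bones local chat loop in a terminal. A minimal sketch, assuming a local Ollama server with llama3 already pulled (`ollama pull llama3`):

```python
# Bare-bones local chat loop against Ollama's /api/chat endpoint.
# Assumes Ollama is running locally and `ollama pull llama3` has been done.
import json
import urllib.request

CHAT_URL = "http://localhost:11434/api/chat"  # Ollama's default local endpoint

def chat(messages):
    """Send the conversation so far and return the assistant's reply text."""
    payload = {"model": "llama3", "messages": messages, "stream": False}
    req = urllib.request.Request(
        CHAT_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["message"]["content"]

history = []
while True:
    user_text = input("you> ")
    if not user_text:
        break
    history.append({"role": "user", "content": user_text})
    reply = chat(history)
    history.append({"role": "assistant", "content": reply})
    print("llama3>", reply)
```

A full graphical UI is out of scope here; the point is only that the chat-style interaction the tweet asks about boils down to keeping a message history and replaying it to the local API on each turn.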