r/LocalLLaMA - Reddit
Open-source LLMs are getting really good.
They’re not as powerful as GPT-4, but they’re improving quickly and worth experimenting with.
If you want to run AI models like Mistral-7B on your laptop, this is the easiest way to do it. https://t.co/MxFaA8ZNGC
— Mckay Wrigley (x.com)

I have successfully compiled and run GLM-130b on a local machine! It's now running in `int4` quantization mode and answering my queries.
I'll explain the installation below; if you have any questions, feel free to ask!
https://t.co/EjYPBTbq1J
— Alex J. Champandard 🌻 (x.com)
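
The `int4` mode mentioned above stores weights as 4-bit integers plus a scale factor, roughly quartering memory versus fp16. A minimal sketch of symmetric 4-bit round-trip quantization (illustrative only; the function names and per-tensor scaling are assumptions, not GLM-130b's actual kernels):

```python
def quantize_int4(weights):
    """Map floats to integers in [-8, 7] using a per-tensor scale."""
    scale = max(abs(w) for w in weights) / 7.0 or 1.0
    q = [max(-8, min(7, round(w / scale))) for w in weights]
    return q, scale

def dequantize_int4(q, scale):
    """Recover approximate float weights from int4 codes."""
    return [v * scale for v in q]

weights = [0.42, -1.3, 0.07, 2.1, -0.55]
q, scale = quantize_int4(weights)
approx = dequantize_int4(q, scale)
# each reconstructed value lies within half a quantization step of the original
assert all(abs(a - w) <= scale / 2 + 1e-9 for a, w in zip(approx, weights))
```

Real implementations quantize per channel or per block rather than per tensor, which keeps outlier weights from inflating the scale for everything else.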