Fine Tune DeepSeek R1 | Build a Medical Chatbot
Here’s a video showing how to fine-tune Mixtral, Mistral's 8x7B Mixture-of-Experts (MoE) model, which outperforms Llama 2 70B!
The video walkthrough is easy to follow and uses QLoRA, so you don’t need A100s.
YT link below 🤙 https://t.co/Z6vWAQvk6z
— Harper Carroll (x.com)
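
To give a sense of why QLoRA removes the need for A100-class hardware, here is a minimal sketch using the Hugging Face transformers, peft, and bitsandbytes libraries. The video's exact setup is not reproduced here; the model ID, target modules, and hyperparameters below are illustrative assumptions, not the settings from the walkthrough.

```python
# Minimal QLoRA sketch (assumptions: hyperparameters and target modules
# are illustrative, not the exact settings used in the video).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model

MODEL_ID = "mistralai/Mixtral-8x7B-v0.1"  # base model named in the tweet

# Load the frozen base model in 4-bit NF4 precision (the "Q" in QLoRA),
# so the weights fit on a single prosumer GPU instead of requiring A100s.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
    bnb_4bit_use_double_quant=True,
)

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    quantization_config=bnb_config,
    device_map="auto",
)

# Attach small trainable low-rank adapters (the "LoRA" part). Only these
# adapters are updated during training; the 4-bit base weights stay frozen,
# which is what keeps the memory footprint low.
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # typically well under 1% of total params
```

From here, the wrapped model can be passed to any standard causal-LM training loop or trainer; only the adapter weights accumulate gradients, so optimizer state stays small as well.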