My recommendation for anyone looking to do ML work, especially training LLMs, is to use a cloud service like Lambda Labs. You'll spend less time training, and your own machine stays free for coding while the job runs.
The 36GB RAM is dynamically shared between your system and your GPU. If you're planning to run containers, an IDE, and a browser alongside...
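For a rough sense of how tight that gets in practice, here's a minimal sketch (assuming Python with `torch` and `psutil` installed, neither of which the comment above mentions) that reports the unified memory pool and whether PyTorch's MPS backend can use it:

```python
import psutil
import torch


def unified_memory_report():
    """Print total unified memory and what's currently free.

    On Apple Silicon the GPU has no dedicated VRAM, so this one
    pool has to cover the OS, your IDE, browser, containers, and
    the model's weights and activations.
    """
    vm = psutil.virtual_memory()
    print(f"Total unified memory: {vm.total / 2**30:.1f} GiB")
    print(f"Currently available:  {vm.available / 2**30:.1f} GiB")

    # MPS is PyTorch's Metal backend for Apple GPUs; any tensors it
    # allocates come out of the same shared pool reported above.
    if torch.backends.mps.is_available():
        print("MPS backend available: GPU work draws from this pool.")
    else:
        print("MPS backend not available: falling back to CPU.")


if __name__ == "__main__":
    unified_memory_report()
```

As a back-of-the-envelope check, a 7B-parameter model in fp16 needs roughly 14 GB for the weights alone, before activations, optimizer state, or anything else you have open, so 36 GB of shared memory fills up quickly once the system and other apps take their share.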