Guess who has DeepSeek-R1 running in their new AI Homelab? I really need to build a better rig. The integrated GPU I’m running is fine for smaller models, but this isn’t going to be much fun “unless you’ve got more power!”
Do you have any photos of this new experiment you've got running?
Soon
I wonder if these sorts of USB devices are still useful for models of this kind and size? https://coral.ai/products/accelerator/
How are you accessing it? Home Assistant?
I'm so excited to get started with this. I... need a GPU first.
@Justin Garrison had a good video on doing this with Talos: https://www.youtube.com/watch?v=HiDWGs1PYhc
I run Talos across a few Raspberry Pis. I'm going to either build a tower with a GPU to act as another node that can run AI, or try to get a Jetson.
I've had good success running DeepSeek with Ollama on my Mac M4. Not really a home lab, since I don't have my (work) laptop running 24/7, but I just wanted to point out that a Mac Mini for local AI is a real option. Depending on the use case, even running on a CPU is fine if the task isn't time sensitive (like some agentic/reasoning workflows).
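If you want to script against a local Ollama instance instead of using the CLI, here's a minimal sketch using Ollama's `/api/generate` HTTP endpoint. The model tag (`deepseek-r1:7b`) and default port (11434) are assumptions; adjust for whatever you've pulled.

```python
# Sketch: query a local Ollama server running DeepSeek-R1.
# Assumes Ollama is listening on its default port (11434) and that
# you've already pulled the model, e.g. `ollama pull deepseek-r1:7b`.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(prompt, model="deepseek-r1:7b"):
    """Build the JSON payload for Ollama's /api/generate endpoint.

    stream=False asks for one complete JSON response instead of a
    stream of partial chunks.
    """
    return {"model": model, "prompt": prompt, "stream": False}

def ask(prompt, model="deepseek-r1:7b"):
    """POST the prompt to the local Ollama server and return the reply text."""
    data = json.dumps(build_request(prompt, model)).encode()
    req = urllib.request.Request(
        OLLAMA_URL,
        data=data,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(ask("Why is the sky blue?"))
```

Same idea works against a Mac Mini sitting on your LAN; just swap `localhost` for its hostname.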
Last updated: Apr 03 2025 at 23:38 UTC