Stream: general

Topic: AI Homelab


view this post on Zulip Adam Stacoviak (Feb 01 2025 at 20:33):

Guess who has DeepSeek-R1 running in their new AI Homelab? I really need to build a better rig. The integrated GPU I’m running is fine for smaller models, but this isn’t going to be much fun “unless you’ve got more power!”

view this post on Zulip Matt Johnson (Feb 01 2025 at 20:35):

Do you have any photos of this new experiment you've got running?

view this post on Zulip Adam Stacoviak (Feb 01 2025 at 20:35):

Soon

view this post on Zulip Ron Waldon-Howe (Feb 01 2025 at 23:11):

I wonder if these sorts of USB devices are still useful for these kinds of models and sizes of models? https://coral.ai/products/accelerator/

view this post on Zulip Sukhdeep Brar (Feb 03 2025 at 01:08):

How are you accessing it? Home Assistant?

view this post on Zulip Thomas Eckert (Feb 15 2025 at 22:56):

I'm so excited to get started with this. I... need a GPU first.

@Justin Garrison had a good video on doing this with Talos: https://www.youtube.com/watch?v=HiDWGs1PYhc

I run Talos across a few Raspberry Pis. I'm going to either build a tower with a GPU to act as another node that can run AI workloads, or try to get a Jetson.

view this post on Zulip Sukhdeep Brar (Feb 17 2025 at 01:00):

I've had good success running DeepSeek with Ollama on my Mac M4. Not really a home lab, since I don't have my (work) laptop running 24/7, but I just wanted to point out that a Mac Mini for local AI is a real option. Depending on the use case, even running on a CPU is fine if the task isn't time-sensitive (like some agentic/reasoning workflows).
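For anyone wanting to try the same setup, here's a minimal sketch of running a distilled DeepSeek-R1 model with Ollama from the terminal. The model tag `deepseek-r1:7b` is an example; which distillations are available depends on the Ollama model library, so check `ollama list` / the library page first.

```shell
# Sketch: local DeepSeek-R1 via Ollama (assumes Ollama is installed and running)
# Pull an example distilled model -- tag is an assumption, verify it exists first.
ollama pull deepseek-r1:7b

# Interactive chat session with the model
ollama run deepseek-r1:7b

# One-shot prompt instead of an interactive session
ollama run deepseek-r1:7b "Explain what a homelab is in one sentence."
```

Ollama also exposes a local HTTP API (by default on port 11434), which is how tools like Home Assistant or other frontends can talk to it.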


Last updated: Apr 03 2025 at 23:38 UTC