Techno Tim joins Adam to catch up on the state of Homelab for 2025, the state of AI at home and on-prem (AI Homelab) and where that's heading, building a creator PC, choosing the parts for your build, GPU availability, Windows being user hostile, and why Tim is happy to be using Windows, Mac AND Linux. 🔗 https://changelog.am/79
Ch | Start | Title | Runs |
---|---|---|---|
01 | 00:00 | Let's talk! | 00:38 |
02 | 00:38 | Sponsor: Retool | 02:45 |
03 | 03:23 | Skipping breakfast | 02:14 |
04 | 05:37 | Tim snacks | 01:00 |
05 | 06:36 | PEE CAN or Pecan? | 02:05 |
06 | 08:41 | Berdoll Pecans FTW | 01:13 |
07 | 09:53 | Let's talk Homelab | 02:04 |
08 | 11:57 | How Homelab began | 01:17 |
09 | 13:14 | Smaller and less power is the way | 03:43 |
10 | 16:57 | What motivates Tim? | 03:29 |
11 | 20:26 | Script it OR wing it? | 02:44 |
12 | 23:10 | Cameras for filming | 05:04 |
13 | 28:13 | Content practices and workflows | 04:16 |
14 | 32:30 | Changelog core beliefs | 03:43 |
15 | 36:12 | Homelab stuff that's resonating with Adam | 02:21 |
16 | 38:34 | AI Homelab | 04:51 |
17 | 43:25 | On-prem AI is the future | 01:45 |
18 | 45:12 | Sponsor: Temporal | 02:02 |
19 | 47:14 | Self-hosting AI | 08:33 |
20 | 55:47 | AI at home is the next frontier | 00:32 |
21 | 56:19 | GPUs on eBay | 01:42 |
22 | 58:02 | Local AI and Ollama are the way | 06:14 |
23 | 1:04:16 | Make Ollama first-class integration | 02:10 |
24 | 1:06:26 | AI as a Service on my LAN (AIaaS) | 01:54 |
25 | 1:08:20 | GPT is the first draft word calculator | 06:28 |
26 | 1:14:49 | AI builds from Tim | 00:31 |
27 | 1:15:20 | Building a creator PC | 05:14 |
28 | 1:20:34 | BUT Windows is user hostile | 06:35 |
29 | 1:27:09 | Paying the Apple tax | 02:51 |
30 | 1:30:00 | Why not BOTH Windows and Mac? | 01:23 |
31 | 1:31:25 | Sponsor: DeleteMe | 01:52 |
32 | 1:33:17 | Adam's creator PC build | 05:18 |
33 | 1:38:35 | Understanding the various GPUs | 02:44 |
34 | 1:41:19 | Planning for PCIe lanes | 04:09 |
35 | 1:45:28 | Gamers pushed the innovation | 04:07 |
36 | 1:49:35 | The GPU bottleneck | 02:58 |
37 | 1:52:33 | Tim's Linux Workstation | 04:03 |
38 | 1:56:35 | The hard drive conspiracy! | 02:59 |
39 | 1:59:35 | Should we do this more often? | 02:54 |
40 | 2:02:29 | Wrapping up | 00:46 |
41 | 2:03:16 | Closing thoughts and stuff | 01:58 |
I feel the pain on the PCIe lanes - I built my first PC in many years last week and that was the deciding factor on motherboards: trying to find one that would support some flexibility. (And workstation CPUs were out of budget!)
My next challenge is the GPU. I'm curious how people feel about whether 7B parameter models are any good for document work (summarizing, reviewing). I'm currently thinking of a 16GB Radeon, although I may need to work with CUDA in the future, and getting 16GB on an NVIDIA card is not cheap!
I've been using 5GB models on AMD and NVIDIA GPUs and they're fine (code completion, idea generation, basic chat).
A human with subject-matter expertise has to review and edit the output of any LLM all the time anyway.
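For anyone wondering what that kind of document work looks like in practice, here's a rough Python sketch of a summarization call against a local Ollama instance. The qwen2.5:7b tag and the notes.txt path are just placeholder examples; 11434 is Ollama's default port.

```python
# Rough sketch: ask a locally hosted 7B model (via Ollama's REST API) to
# summarize a document. Model tag and file path are placeholders.
import requests

OLLAMA_URL = "http://localhost:11434/api/generate"

def summarize(path: str, model: str = "qwen2.5:7b") -> str:
    with open(path, encoding="utf-8") as f:
        document = f.read()

    payload = {
        "model": model,
        "prompt": f"Summarize the following document in a few bullet points:\n\n{document}",
        "stream": False,  # return a single JSON object instead of a token stream
    }
    resp = requests.post(OLLAMA_URL, json=payload, timeout=300)
    resp.raise_for_status()
    return resp.json()["response"]

if __name__ == "__main__":
    print(summarize("notes.txt"))
```

Whether a 7B model's summary is good enough for review work is exactly the open question here, but trying it costs nothing beyond pulling the model.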
Magical episode - the start with @Adam Stacoviak and Tim talking about pecans for 10 min. Awesomeness
Very fun episode to listen to. Glad to see I'm not the only one who struggled with the creator PC build. I also loathe Windows, but I do miss the ability to build my own system when using macOS. Linux is great for development, but content creation gets more difficult.
I'm in the process of changing my opinion on Windows due to the level of integration with WSL2. I'm installing Windows 11 Pro this weekend, so maybe I'll have more to share soon.
I love building machines and there's enough de-bloat work out there for Windows for me to give it a true attempt.
For folks who won't abandon Windows: https://atlasos.net/
I tried to like the whole WSL2 model but it wasn't for me. That was a few years ago now, though, so it may be worth giving it another go. No Ghostty on Windows yet (I think you can build it from source, maybe?).
Let us know your experience and share some de-bloat resources if you can.
Ron Waldon-Howe said:
> For folks who won't abandon Windows: https://atlasos.net/

Oh this is such a debloat resource. Hmmmm. Maybe I'll try this.
https://github.com/ChrisTitusTech/winutil
This is what I’m looking at when I do the install this weekend.
I went down this road lately and was even attempting to use Windows + WSL for my day job, which involved a lot of Docker/Linux VM work. I don't know what it was specifically, but WSL just started to feel really laggy, to the point where I ended up enabling Hyper-V and spinning up a regular Ubuntu VM to work in instead. It's definitely not as easy to get started with as WSL, but it felt a lot nicer as a daily driver. Windows Terminal felt great at the beginning too, but in day-to-day use the experience got worse and worse the more I used it, and I replaced it with Alacritty. I think a future story of native Ghostty + Zed Windows releases with first-class WSL integration could be pretty nice, but it's just not there yet.
WezTerm is pretty great and runs on Windows, for what that's worth. I switched back to it from Ghostty.
Yep, I'm back on WezTerm, after briefly trying Ghostty and then getting back into Alacritty :)
Ghostty and Alacritty choosing OpenGL when Vulkan is sitting right there is a real head scratcher now :)
Hello from my AI Homelab powered by Windows 11 Pro. WSL2 is cool and seems to be super fast for me. I'm running Ollama and Open WebUI in Docker. Tailscale runs the same here as on macOS or Linux, so connecting to Ollama as a service from anywhere in the world is too easy. Now I'm testing out Plex Media Server, because why let this RTX 3090 GPU go to waste.
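For the "Ollama as a service anywhere" part, this is roughly what it looks like from another machine on the same tailnet. The homelab hostname is a made-up Tailscale MagicDNS name; /api/tags is the endpoint Ollama exposes for listing locally available models.

```python
# Minimal sketch: list the models available on a home Ollama box from any
# machine on the tailnet. "homelab" is a placeholder MagicDNS hostname;
# 11434 is Ollama's default port.
import requests

OLLAMA_HOST = "http://homelab:11434"

resp = requests.get(f"{OLLAMA_HOST}/api/tags", timeout=10)
resp.raise_for_status()

for model in resp.json().get("models", []):
    print(model["name"])
```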
Post those hardware specs!
I have a TOML file that maps package names across different Linux distributions, and Ollama + Qwen2.5 actually suggested the correct name for a Debian package after I typed in the name of the same package on Arch Linux.
It had incorrect punctuation around it, of course, but it was otherwise surprisingly correct.
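Out of curiosity, here's roughly how that lookup could be wired together, assuming a hypothetical packages.toml layout and key names (and a Qwen2.5 7B tag as an example). The model's suggestion is only a first draft; it still needs checking against the real Debian archive.

```python
# Sketch of the package-name lookup described above. The packages.toml layout,
# key names, and example package are hypothetical; the model's answer must be
# verified by a human before use.
import tomllib  # stdlib in Python 3.11+

import requests

OLLAMA_URL = "http://localhost:11434/api/generate"

with open("packages.toml", "rb") as f:
    mapping = tomllib.load(f)

arch_name = "python-requests"           # example Arch Linux package name
entry = mapping.get(arch_name, {})
debian_name = entry.get("debian")

if debian_name is None:
    payload = {
        "model": "qwen2.5:7b",
        "prompt": f"What is the Debian package name for the Arch Linux package "
                  f"'{arch_name}'? Answer with the package name only.",
        "stream": False,
    }
    resp = requests.post(OLLAMA_URL, json=payload, timeout=120)
    resp.raise_for_status()
    # Strip the stray quotes/backticks the model tends to wrap the name in.
    debian_name = resp.json()["response"].strip().strip("`'\".")

print(debian_name)
```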