In this video, we’ll take you through the build of an 11-GPU home server for AI with a massive 228GB of VRAM! If you’re considering building a powerful setup to handle large AI models and workloads, this project shows you exactly what it takes.
Our server is loaded with:
3x RTX 4060 Ti (16GB VRAM each)
1x RTX 3060 (12GB VRAM)
1x RTX 3090 (24GB VRAM)
6x NVIDIA Tesla P40 (24GB VRAM each)
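As a quick sanity check, the per-card numbers above really do add up to the headline figure. A minimal Python sketch of the arithmetic (card names and counts taken straight from the list above):

```python
# Tally total VRAM across the 11 cards in this build.
gpus = [
    # (model, VRAM in GB, count)
    ("RTX 4060 Ti", 16, 3),
    ("RTX 3060",    12, 1),
    ("RTX 3090",    24, 1),
    ("Tesla P40",   24, 6),
]

total_cards = sum(count for _, _, count in gpus)
total_vram_gb = sum(vram * count for _, vram, count in gpus)

print(f"{total_cards} GPUs, {total_vram_gb}GB VRAM")  # → 11 GPUs, 228GB VRAM
```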
With a total of 228GB of VRAM, this build can handle even the most demanding inference tasks – but today, we’re focusing on the build itself. From choosing the right motherboard and power supplies to managing cooling and PCIe bifurcation, we’ll cover all the key considerations for a complex DIY server build.
This is more than just a budget setup; it's a learning experience in power management, airflow, and hardware customization. For those looking to push the limits of AI hardware without breaking the bank, this video is for you!