Advice on NAS and local LLM server combo

Option 1: Low-Spec NAS + RTX 3090 eGPU
Pros:
Cost-effective for storage and AI compute if you already own the laptop and GPU.
Easy to scale storage separately from compute.
Cons:
Thunderbolt caps the eGPU at roughly PCIe 3.0 x4 bandwidth (about 3 GB/s in practice), which slows model loading and anything that streams data to the card; a rough comparison is sketched after this option.
The laptop's CPU, RAM, and thermals may also bottleneck the RTX 3090 for LLM workloads.
Verdict: Viable if cost and flexibility are top priorities, but performance for AI tasks may be constrained.
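
To put the bandwidth point in perspective, here is a minimal back-of-the-envelope sketch comparing how long it takes to push quantized model weights to the GPU over a Thunderbolt eGPU link versus a desktop PCIe 4.0 x16 slot. The throughput and model-size figures are illustrative assumptions, not measurements of any specific enclosure or board:

```python
# Rough sketch: time to copy model weights to the GPU over different links.
# Bandwidth values are assumed real-world effective throughput (GB/s),
# and model sizes are approximate 4-bit quantized weight files.

MODEL_SIZES_GB = {
    "7B @ 4-bit":  4,
    "13B @ 4-bit": 8,
    "34B @ 4-bit": 19,
}

LINK_BW_GB_PER_S = {
    "Thunderbolt 3/4 (PCIe 3.0 x4)": 3.0,   # assumed effective throughput
    "Desktop PCIe 4.0 x16":          25.0,  # assumed effective throughput
}

for model, size_gb in MODEL_SIZES_GB.items():
    for link, bw in LINK_BW_GB_PER_S.items():
        print(f"{model:12s} over {link:30s}: ~{size_gb / bw:5.1f} s to load")
```

Once the weights are resident in the 3090's 24GB of VRAM, token generation is bound mostly by the GPU's own memory bandwidth, so the Thunderbolt penalty is felt mainly at load time and when swapping between models.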
Option 2: Build a Home Server (Recommended)
Pros:
Combines high-performance local AI compute and NAS functionality.
Fully customizable—can include a high-performance GPU (e.g., RTX 3090) and robust storage options (SATA + NVMe).
Future-proof with upgrade potential for AI and storage.
Cons:
Initial setup complexity and potential noise/power considerations.
Verdict: Best balance of performance and scalability for your AI and NAS needs. A Jonsbo N5 or similar chassis is ideal for this setup.
Option 3: Buy a NAS + Mini PC with eGPU
Pros:
Dedicated devices for storage and compute.
Easier maintenance and management for NAS.
Cons:
Higher cost due to separate systems.
Limited upgrade flexibility compared to a combined server build.
Verdict: Good for simplicity and modularity but less cost-efficient for high-performance use.
Option 4: Buy Zettlab D6 Ultra
Pros:
AI-focused features out of the box.
Simplifies setup with an integrated solution.
Cons:
Limited local AI customization compared to an RTX 3090-based solution.
Expensive relative to performance for LLM workloads.
Verdict: Ideal if you want a plug-and-play AI NAS, but it lacks the flexibility of a custom build.
Recommended Build (Option 2)
Chassis: Jonsbo N5 or similar for compact yet spacious design.
CPU/Mobo: AMD Ryzen 7000 series or Intel 13th Gen with an ITX/ATX board supporting high PCIe bandwidth for GPUs.
GPU: RTX 3090 (or better if budget allows) for LLM tasks.
RAM: 64GB+ for AI workloads and for CPU offload of models too large for VRAM (a sizing sketch follows this section).
Storage:
NVMe: 1–2TB for OS and AI datasets.
SATA HDDs: 16–24TB for NAS storage in RAID 1 or RAID 5 (usable-capacity comparison sketched after this section).
PSU: 750W+ for stability with high-performance components.
This setup ensures high performance for local AI tasks while providing ample storage and future-proofing for both NAS and LLM use.
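
As a sanity check on the GPU and RAM choices, here is a minimal sizing sketch. It assumes roughly 0.5 bytes per parameter at 4-bit quantization, 1 byte at 8-bit, and a flat 20% overhead for the KV cache and runtime buffers; the exact numbers vary by runtime and context length:

```python
# Rough VRAM-fit estimate for common model sizes on a 24 GB RTX 3090.
# bytes_per_param and the 20% overhead factor are assumptions for illustration.

GPU_VRAM_GB = 24      # RTX 3090
OVERHEAD = 1.2        # assumed headroom for KV cache / activations

def fits(params_billion, bytes_per_param):
    est_gb = params_billion * bytes_per_param * OVERHEAD
    verdict = "fits in VRAM" if est_gb <= GPU_VRAM_GB else "needs CPU offload"
    return f"~{est_gb:5.1f} GB ({verdict})"

for b in (7, 13, 34, 70):
    print(f"{b:>3}B params  4-bit: {fits(b, 0.5)}   8-bit: {fits(b, 1.0)}")
```

Models that do not fit in the 3090's 24GB can still run with partial CPU offload, which is where the 64GB+ of system RAM earns its keep.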
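
And for the HDD pool, a quick usable-capacity comparison between RAID 1 and RAID 5. The drive counts and sizes below are illustrative assumptions; pick whichever combination lands in the 16–24TB usable range you want:

```python
# Usable capacity for the suggested HDD pool under RAID 1 vs RAID 5.
# Drive counts and sizes are illustrative assumptions.

def raid1_usable(drive_tb):
    # mirrored pair: usable capacity equals a single drive
    return drive_tb

def raid5_usable(n_drives, drive_tb):
    # one drive's worth of capacity is lost to parity
    return (n_drives - 1) * drive_tb

print(f"2 x 12 TB in RAID 1: {raid1_usable(12):.0f} TB usable")
print(f"3 x 12 TB in RAID 5: {raid5_usable(3, 12):.0f} TB usable")
print(f"4 x  8 TB in RAID 5: {raid5_usable(4, 8):.0f} TB usable")
```

RAID 1 is simpler and tolerates a single drive failure with only two disks; RAID 5 extracts more usable capacity from three or more disks, at the cost of slower rebuilds on large drives.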