
NAS LOCAL LLM

#1
So I’ve been teaching myself as much as I can, as fast as I can, lately. With the help of AI I have created a generative AI architecture where I use the Hugging Face API, and I keep "MoM" as I call it (MY OWN MANUS) stored locally! I can deploy different models for different tasks, but ultimately MoM stays with me and keeps our projects, inventions, and ideas local. Problem is, I’m almost done and I don’t know what to purchase to store MoM. I don’t want to throw my money down the toilet only to have to buy total upgrades in the short-term future. I would like to use what I purchase now and be able to expand later, without having to re-buy the same parts, just better, in the near future. I’d also like to be able to access it from different sources (web, phone) with easy setup, but I also have to worry about security. $1,000 is the top of my budget right now, but if I have to, I’ll figure something out to come up with more. I don’t like the idea of storing it in the cloud. Is all that doable with my budget??
#2
First off — you’ve done an impressive job getting this far. Architecting and running your own LLM stack locally, with HuggingFace integration and your own “MoM” setup, is exactly the kind of forward-thinking self-hosting we love to see. Well done.

Now, onto your question: if you want to make sure your hardware investment is future-proof, allows for expansion later, and supports secure, local-first access from all your devices — while staying close to your $1,000 budget — here’s what I’d recommend.

Why You’ll Want Expansion

Because you’re running generative AI workloads, you will eventually benefit from a GPU upgrade. LLMs, even when running locally and trimmed down, scale much better with GPU acceleration. So you don’t want to get stuck with a platform that has no PCIe slot or way to attach an external GPU later.
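To make that concrete, here's a minimal sketch of the kind of check your stack could run before loading a model, so the same code works today on CPU and automatically uses a GPU once you add one. The function name and the 20% headroom figure are my own illustrative assumptions, not from any particular library:

```python
# Illustrative helper: decide whether a local LLM run should target a GPU.
# The 1.2x factor leaves rough headroom for activations and KV cache;
# tune it for your actual models.

def pick_device(cuda_available: bool, vram_gb: float, model_size_gb: float) -> str:
    """Return "cuda" only when a GPU is present and the model fits in VRAM
    with some headroom; otherwise fall back to CPU."""
    if cuda_available and vram_gb >= model_size_gb * 1.2:
        return "cuda"
    return "cpu"

# A 7B model quantized to 4 bits is roughly 4 GB of weights:
print(pick_device(cuda_available=True, vram_gb=8.0, model_size_gb=4.0))   # cuda
print(pick_device(cuda_available=False, vram_gb=0.0, model_size_gb=4.0))  # cpu
```

The point is that your software side stays the same; only the hardware answer to "is there a usable GPU?" changes when you upgrade.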

3 Strong Options

1️⃣ QNAP TVS-h474 / h674 (or 74 Series)
• Around $1,000–$1,200 for the 4-bay h474.
• Comes with a PCIe Gen4 slot so you can add a GPU later.
• Supports ZFS via QuTS hero OS, which is ideal for data integrity on sensitive models and projects.
• Well-tested platform with excellent remote access features and decent security tools out of the box.
• Quiet and small enough for home/office use.

2️⃣ Aoostar WTR Max NAS
• Similar pricing ($950–$1,100 depending on where you source it).
• Ryzen 7 Pro CPU, tons of NVMe slots for fast SSD storage.
• No internal PCIe slot, but it has an OCuLink port, which lets you attach an external GPU enclosure later on. So you still have an upgrade path.
• Runs TrueNAS or Proxmox well if you want to stay closer to open-source.
• More power and flexibility than most commercial NAS units.

3️⃣ Minisforum N5 Pro
• ~$1,100–$1,200.
• AMD Ryzen 9, very capable CPU for AI workloads.
• Multiple NVMe slots for fast SSD storage.
• No PCIe slot inside, but does support OCuLink external PCIe expansion like the WTR Max.
• Very compact and quiet, but more DIY-friendly than turnkey.

What I’d Choose

If you want something more plug-and-play with strong software support (and don’t mind QNAP’s quirks), the QNAP h474 is a solid choice.
If you prefer a more open, DIY-friendly, Linux/ZFS stack with tons of compute headroom, and don’t mind setting up your own services, the WTR Max is probably your best match.
The N5 Pro sits in between: super powerful, but a bit less flexible than the WTR in the long run.
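Whichever box you pick, for the "access from web and phone without the cloud" part, a common pattern is to run a local model server like Ollama on the NAS and reach it over a VPN (e.g. Tailscale) instead of exposing it to the internet. A rough docker-compose sketch of that idea, where the service layout and host paths are illustrative assumptions (Ollama's default port 11434 and its `/root/.ollama` model directory are real):

```yaml
# Sketch only: serve a local LLM with Ollama, reachable from your other
# devices over a VPN rather than the open internet.
services:
  ollama:
    image: ollama/ollama          # official Ollama image
    volumes:
      - ./models:/root/.ollama    # keep model weights on the NAS storage pool
    ports:
      - "127.0.0.1:11434:11434"   # bind to localhost; reach it via VPN or reverse proxy
    restart: unless-stopped
```

Binding to localhost and tunneling in via VPN covers the security requirement without the cloud storage you want to avoid.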

