Enthusiast arms 12-slot NVMe NAS with an Nvidia RTX GPU to run local ChatGPT

StorageReview's image of an RTX A4000 in QNAP NAS.
(Image credit: StorageReview.com)

News outlet StorageReview is running an AI chatbot on a NAS powered by Nvidia's RTX A4000 graphics card, putting ChatGPT capabilities on a local server. YouTube computer hardware creators typically have a penchant for the unusual and silly, but StorageReview has created a solution that's matched in its uniqueness by its usefulness.

As seen in the embedded YouTube video above, Jordan from StorageReview accomplished this task without much rigorous DIY. The base of the project is a QNAP TS-h1290FX, a prebuilt NAS with 12 NVMe slots, an AMD Epyc processor, and 256 GB of RAM out of the box. The choice of this specific NAS is twofold: it has room for an internal GPU, and it supports QNAP's hypervisor, Virtualization Station, for creating virtual machines (VMs). Into this shell goes the Nvidia RTX A4000, an Ampere GPU and one of Nvidia's most powerful single-slot graphics cards.

With a NAS full of terabytes of data and an RTX workhorse ready to go, Jordan could turn to "Chat with RTX," a new Nvidia program that runs a ChatGPT-style chatbot locally on your graphics card and can draw on your own files for its answers.
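The core idea behind a tool like Chat with RTX is retrieval-augmented generation: find the files most relevant to a question, then feed them to a local language model as context. The sketch below is purely illustrative and not Nvidia's actual implementation — real tools rank documents with GPU-computed embeddings, while this toy version scores by simple word overlap — but the shape of the pipeline is the same.

```python
# Illustrative sketch of local retrieval-augmented chat (not Nvidia's
# actual code). A real tool like Chat with RTX embeds documents with a
# GPU model; here we approximate relevance with word-overlap scoring.

def score(query, text):
    """Count words the query and document share (toy relevance metric)."""
    return len(set(query.lower().split()) & set(text.lower().split()))

def retrieve(query, documents, top_k=1):
    """Return the top_k documents most relevant to the query."""
    ranked = sorted(documents, key=lambda d: score(query, d), reverse=True)
    return ranked[:top_k]

def build_prompt(query, documents):
    """Assemble the prompt a local LLM would complete: context + question."""
    context = "\n".join(retrieve(query, documents))
    return f"Context:\n{context}\n\nQuestion: {query}"

# Hypothetical files sitting on a NAS share:
docs = [
    "Quarterly NAS throughput peaked at 12 GB/s during backups.",
    "The office coffee machine needs descaling every month.",
]
print(build_prompt("What was the NAS throughput?", docs))
```

Because retrieval and generation both happen on the local machine, the documents never leave the network — which is the whole appeal of running this on a NAS.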

Having a NAS with AI inside may seem impractical, but in an increasingly AI-reliant world, this solution makes a lot of sense. Enterprise use of AI includes analyzing and optimizing data and performance, work that often touches sensitive data no one wants on the open internet. The online ChatGPT service can use conversations for further training, which justifiably scares away users worried their sensitive data could leak into an AI training set. A local AI chatbot, disconnected from the open internet but plugged into your own data, is potentially a game changer for security-focused companies, and putting that tool on a NAS lets everyone on the network access it.

This project is refreshingly useful in an increasingly novelty-focused YouTube tech environment. Adding ChatGPT-like capabilities to a NAS that is already a powerhouse in its own right (15 TB NVMe drives are becoming more common in enterprise settings, and this QNAP beast fits 12 of them) is a no-brainer for companies that want to use AI to stay competitive in a fast-evolving world. Some common concerns about AI language models are sidestepped here, too: a fully off-the-grid model keeps your conversations out of anyone else's training data, and pointing the bot at your own files means its answers draw only on data you already own once you download the LLM that powers the chat app.

For a deeper dive into the project and how to replicate it yourself, check out the article on StorageReview. For help starting your own AI NAS project, you'll want the best SSDs and HDDs as reviewed by us. And if you want to go really DIY, consider 3D printing your own NAS case.

Dallin Grimm
Contributing Writer

Dallin Grimm is a contributing writer for Tom's Hardware. He has been building and breaking computers since 2017, serving as the resident youngster at Tom's. From APUs to RGB, Dallin has a handle on all the latest tech news. 

  • ezst036
    Admin said:
    YouTube computer hardware creators typically have a penchant for the unusual and silly

    I'm guilty of that. I once created a multi-seat using a bitcoin adapter to add in a third GPU to create a third seat where otherwise there could only be two.
    Reply
  • Li Ken-un
    To this shell is added the Nvidia RTX A4000, an Ampere GPU that is Nvidia's most powerful single-slot graphics card.
    That can’t be. The A4000 Ampere is more powerful than its successor 4000 Ada?
    Reply
  • Notton
    "YouTube computer hardware creators typically have a penchant for the unusual and silly"

    It wasn't always this way.
    Creators are forced to make content to keep the algorithm overlords happy, and the current PC hardware sector is high and dry for interesting products.
    Reply
  • palladin9479
    While this is kinda outrageous for now, this shows that it will be possible for us to run our own non-cloud non-spying local AI tools. This can be very useful in the future for having a local system you can use to sort and categorize data then query against.
    Reply