Nvidia's DGX Spark AI mini-PC goes up for sale October 15 — 1 petaFLOP developer platform was originally slated for May
Potent, pint-size platform also got a $1000 price hike between announcement and launch

Nvidia's DGX Spark AI mini-PC got its first star turn at CES this year and was penciled in for a May launch date at the show, but the platform has since experienced delays on its road to market. Whatever wrinkles were preventing a launch have apparently been ironed out. Nvidia has announced that DGX Spark systems will be available to buy starting October 15, both from Nvidia itself and from partners including Dell, Asus, MSI, and HP.
As a refresher, the DGX Spark is a Grace Blackwell GB10-powered mini-PC platform that's custom-tailored to the needs of local AI inference and development. Running inference on many of today's state-of-the-art AI models requires far, far more GPU-local memory than even the 32GB that an RTX 5090 can provide. (The RTX Pro 6000 Blackwell offers up to 96GB of GPU-local memory, but that's an $8,000-plus product before you add the cost of a host server or workstation.)
The DGX Spark (formerly known as Project DIGITS) includes a unified, coherent pool of 128GB of LPDDR5X memory that's shared between a 20-core Arm-based Nvidia Grace CPU and a Blackwell GPU that Nvidia says delivers up to 1 petaFLOP of AI inference performance (a figure that assumes FP4 precision with sparsity).
The company says a single DGX Spark can run models with up to 200 billion parameters locally (again assuming FP4 quantization). If one Spark isn't enough, two units can be linked using the built-in Nvidia ConnectX-7 NIC to double up on memory and compute resources.
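Some quick arithmetic shows why the FP4 caveat matters. The sketch below is an illustrative estimate only, not an Nvidia figure: it counts model weights alone, ignores KV cache, activations, and quantization metadata, and uses decimal gigabytes to match the marketing-style 128GB capacity.

```python
# Rough weight-memory estimate for a large model at different precisions.
# Illustrative only: ignores KV cache, activations, runtime overhead,
# and quantization metadata such as per-block scale factors.

GB = 1e9  # decimal gigabytes, to match the "128GB" marketing figure

def weight_footprint_gb(params: float, bits_per_param: int) -> float:
    """Approximate memory needed just to store the weights, in GB."""
    return params * bits_per_param / 8 / GB

for name, bits in [("FP16", 16), ("FP8", 8), ("FP4", 4)]:
    gb = weight_footprint_gb(200e9, bits)
    verdict = "fits" if gb <= 128 else "does not fit"
    print(f"200B parameters @ {name}: ~{gb:.0f} GB -> {verdict} in 128GB")
```

At FP16 a 200-billion-parameter model needs roughly 400GB for its weights alone; at FP4 that drops to about 100GB, the only one of the three that squeezes into the Spark's 128GB pool with room left over for everything else the system has to do.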
Until now, mini-PCs and laptops built around AMD's Ryzen AI Max+ 395 SoC (aka Strix Halo) have had the market of "relatively affordable chip with a massive memory pool and enough compute for reasonable inference performance" all to themselves. Strix Halo supports up to 112GB of GPU memory (out of a possible 128GB of onboard RAM), but those systems don't natively support Nvidia's widespread CUDA stack, which creates hurdles for developers and enthusiasts who want to get their AI projects up and running.
The DGX Spark, by contrast, runs Nvidia's own DGX OS (a fork of Ubuntu) and supports the all-important CUDA software stack for AI developers. Unlike Strix Halo, which has found a niche as a (costly) gaming chip in devices as diverse as handhelds and mini-PCs, the DGX Spark's Arm- and Linux-first nature makes it less appealing as a turn-key gaming platform, though curious enthusiasts can probably get their gaming fix on it with some work.
For its part, Nvidia says it's been working with a wide range of software partners to ensure that their tools work well with DGX Spark, including Anaconda, Cadence, ComfyUI, Docker, Google, Hugging Face, JetBrains, LM Studio, Meta, Microsoft, Ollama, and Roboflow. If you have an LLM you want to run locally, then, the DGX Spark should be a solid foundation.
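As a hypothetical first smoke test (not an Nvidia-documented procedure, and how the unified memory pool is reported may differ on this hardware), a developer might confirm that the CUDA stack is visible from Python before pulling a model down through Ollama or Hugging Face:

```python
# Generic check that a CUDA device is visible from PyTorch.
# Assumes a CUDA-enabled PyTorch build is installed; not DGX Spark-specific.
import torch

if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    print(f"GPU: {props.name}")
    print(f"Reported device memory: {props.total_memory / 1e9:.0f} GB")
    print(f"CUDA runtime version: {torch.version.cuda}")
else:
    print("CUDA not available -- check the driver and PyTorch build")
```

If a check like that passes, tools such as Ollama or LM Studio should be able to pick up the GPU without much further ceremony, though the exact out-of-the-box experience will depend on what Nvidia and its partners ship in DGX OS.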
Nvidia originally said DGX Spark systems would start at $3,000 back in January, but the first-party DGX Spark, at least, will now retail for $3,999. Even at that price, its tiny size, relatively modest 240W power envelope, and complete turn-key support for the CUDA stack are likely to win it a lot of fans in the burgeoning AI space. We'll have to see whether its long time in the oven has been a liability in a market where everything can still change in the space of hours or days.

As the Senior Analyst, Graphics at Tom's Hardware, Jeff Kampman covers everything to do with GPUs, gaming performance, and more. From integrated graphics processors to discrete graphics cards to the hyperscale installations powering our AI future, if it's got a GPU in it, Jeff is on it.
Notton: I like how there's a $1,000 price hike with no explanation as to why. It just is. A Ryzen 395 mini-PC with the 128GB/2TB config is $2,000, half the price.

DougMcC: And it performs the same? 1 petaflop of FP4 CUDA throughput?

Notton: I doubt the Ryzen 395 has 1 petaflop of FP4 CUDA, but the point I was trying to make, before my ADHD kicked in, was that there's clearly an Nvidia tax.