TinyBox AI accelerator now on sale starting at $15,000, available in AMD Radeon RX 7900XTX and Nvidia GeForce RTX 4090 variants
The TinyBox is now on sale after all the drama.
The TinyBox AI accelerator is now on sale on the tinygrad website starting at $15,000. The 12U unit, just 16.25 inches deep, can sit freestanding or in a rack and sports six AMD Radeon RX 7900XTX or Nvidia GeForce RTX 4090 GPUs aimed at deep learning and AI acceleration workloads. While the machine is far more expensive than a consumer gaming desktop PC, it's much more affordable than a single Nvidia H100 or AMD MI300X accelerator, which are priced between $10,000 and $40,000 apiece.
"18 months in to the company, tinyboxes finally have a buy it now button! We have 13 in stock today, go to our website (link on @__tinygrad__) to buy one. The $15k tinybox red is the best perf/$ ML box in the world. It's fully networkable, so that's the metric that matters." pic.twitter.com/gFxc873Q1y — August 26, 2024
tinygrad founder George Hotz announced the TinyBox's availability on X, saying the company has 13 units in stock as of August 27. He also claims it offers "the best perf/$ ML box in the world" and that it's fully networkable. Although the company faced challenges with the AMD GPUs used in its systems, it eventually found a workable solution. It also added the option of Nvidia GPUs to sidestep AMD's driver instability, although that choice comes at a 67% premium.
Specification | TinyBox red | TinyBox green
---|---|---
TFLOPS | 738 FP16 TFLOPS | 991 FP16 TFLOPS
GPU model | 6x RX 7900XTX | 6x RTX 4090
GPU RAM | 144GB | 144GB
GPU RAM bandwidth | 5,760 GB/s | 6,050 GB/s
GPU link bandwidth | 6x PCIe 4.0 x16 (64 GB/s) | 6x PCIe 4.0 x16 (64 GB/s)
CPU | 32-core AMD EPYC | 32-core AMD EPYC
System RAM | 128 GB | 128 GB
System RAM bandwidth | 204.8 GB/s | 204.8 GB/s
Disk size | 4TB RAID array + 1TB boot | 4TB RAID array + 1TB boot
Disk read bandwidth | 28.7 GB/s | 28.7 GB/s
Networking | 2x 1 GbE + open OCP 3.0 slot (up to 200 GbE) | 2x 1 GbE + open OCP 3.0 slot (up to 200 GbE)
Noise | <50 dB, 31 low-speed fans | <50 dB, 31 low-speed fans
Power supply | 2x 1600W, 100~240V | 2x 1600W, 100~240V
BMC | AST2500 | AST2500
Operating system | Ubuntu 22.04 | Ubuntu 22.04
Dimensions | 12U, 16.25 inches deep, 90 lbs. | 12U, 16.25 inches deep, 90 lbs.
Mounting | Freestanding or rack mount | Freestanding or rack mount
Driver quality | Acceptable | Great
Price | $15,000 | $25,000
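Hotz's "best perf/$" claim is easy to sanity-check against the spec table. Here's a minimal sketch in Python, using the FP16 TFLOPS figures and prices from the table above; raw throughput per dollar is of course only one possible metric:

```python
# Sanity check on the "best perf/$" claim using the FP16 TFLOPS
# figures and prices from the spec table above.
boxes = {
    "red (6x RX 7900XTX)": {"tflops": 738, "price": 15_000},
    "green (6x RTX 4090)": {"tflops": 991, "price": 25_000},
}

for name, spec in boxes.items():
    # Convert TFLOPS to GFLOPS (x1000) and divide by price in dollars.
    gflops_per_dollar = spec["tflops"] * 1_000 / spec["price"]
    print(f"{name}: {gflops_per_dollar:.1f} GFLOPS per dollar")
```

By this raw-throughput measure, the red box works out to roughly 49 GFLOPS per dollar versus about 40 for the green one, which is the arithmetic behind the perf/$ claim.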
Despite these seemingly high prices, the company reportedly already has 583 pre-orders for the system. And with the AMD GPU drama behind it, it looks like tinygrad is ready to go full steam ahead with TinyBox sales. Buyers can choose between the more affordable AMD system and the Nvidia one. However, as tinygrad put it in an earlier tweet, "If you like to tinker and feel pain, buy red. The driver still crashes the GPU and hangs sometimes, but we can work together to improve it."
"The factory is at full power! If you have a tinybox preorder and somehow haven't been contacted, reach out to support@tinygrad.org. Sales will open to the public shortly." pic.twitter.com/xljpYU1Hjv — August 26, 2024
We'd like to believe tinygrad has finally solved its issues with the AMD GPUs, hence the public launch. And given Nvidia's dominance in the AI accelerator space, offering an Nvidia option makes sense as well. It's unfortunate, though, that the Intel Arc-powered TinyBox remains a prototype with no plans to ship. It would have been nice to have that option too, so we hope the company changes its mind and eventually ships a 'blue' model of the TinyBox.
Jowi Morales is a tech enthusiast with years of experience working in the industry. He has written for several tech publications since 2021, covering tech hardware and consumer electronics.
-
vanadiel007
Hopefully this will not take off well, because otherwise we will have another shortage of consumer GPU's.
-
derekullo
vanadiel007 said: "Hopefully this will not take off well, because otherwise we will have another shortage of consumer GPU's."
A 4090 is barely a consumer gpu lol.
-
bit_user
derekullo said: "A 4090 is barely a consumer gpu lol."
One thing is fab capacity and GDDR supply. Nvidia recently had to downgrade some of their GPUs from GDDR6X to regular GDDR6, due to shortages of the former.
Another thing is that people using consumer GPUs for AI training will just move on to the next generation of consumer GPUs, when they launch. That could push up prices and shift demand down the product stack, resulting in higher prices even on lower-end models.
I'm not saying any of this will happen, but I think they're not impossible outcomes. Then again, people have been doing multi-GPU training with Nvidia consumer GPUs for like a decade, so this isn't really anything terribly new or novel (apart from any clever algorithms Tiny might've devised).
-
NinoPino
vanadiel007 said: "Hopefully this will not take off well, because otherwise we will have another shortage of consumer GPU's."
By price and power usage, those GPUs cannot be defined as consumer.
-
derekullo
NinoPino said: "By price and power usage, those GPUs cannot be defined as consumer."
If anything, a GeForce 4090 would be a prosumer gpu.
-
DS426
NinoPino said: "By price and power usage, those GPUs cannot be defined as consumer."
Ok, but that's not how consumer-class products work. It's also about support levels and some other assurances.
Also, the price class of the 7900 XTX is completely different from the 4090's: based on actual market prices, the 4090 is out on its own, costing more than 50% more than the 7900 XTX. Sure, they're both "prosumer" class products, but that still falls in the consumer category, not professional or datacenter.
We're kind of arguing semantics here, but the basics remain the same: you're getting one of the most affordable AI training servers for the performance given. Thankfully, there are alternatives to Nvidia's AI stranglehold, which the pricing really reflects. Now that clever folks will be tinkering with the AMD models and both ends are working to improve driver quality, it seems to me that both options are very compelling, albeit for somewhat different reasons (price-perf vs. all-out performance and "it just works").