The newer Ryzen 5 5600G (Cezanne) has replaced the Ryzen 5 4600G (Renoir) as one of the best CPUs for gaming. However, a trick has breathed new life into the Ryzen 5 4600G, transforming the budget Zen 2 APU into a 16GB graphics card to run AI applications on Linux.
Not everyone has the budget to buy or rent an Nvidia H100 (Hopper) to experiment with AI. With the current demand for AI-focused graphics cards, you may be unable to get one even if you have the money. Luckily, you don't need an expensive H100, an A100 (Ampere), or one of the best graphics cards for AI. One Redditor demonstrated how a Ryzen 5 4600G, retailing for $95, can tackle different AI workloads.
The Ryzen 5 4600G, which came out in 2020, is a six-core, 12-thread APU with Zen 2 cores that run at a 3.7 GHz base and 4.2 GHz boost clock. The 65W chip also wields a Radeon Vega iGPU with seven compute units clocked at up to 1.9 GHz. Remember that APUs don't have dedicated memory; they share system memory instead. You can set how much memory is allocated to the iGPU in the motherboard's BIOS. In this case, the Redditor had 32GB of DDR4 and allocated 16GB to the Ryzen 5 4600G. Typically, 16GB is the maximum amount of memory you can dedicate to the iGPU. However, some user reports claim that certain ASRock AMD motherboards allow for higher memory allocation, rumored up to 64GB.
The trick converts the Ryzen 5 4600G into a 16GB "graphics card," flaunting more memory than some of Nvidia's latest GeForce RTX 40-series SKUs, such as the GeForce RTX 4070 or GeForce RTX 4070 Ti, which are limited to 12GB. Logically, the APU doesn't deliver the same performance as a high-end graphics card, but at least it won't run out of memory during AI workloads, as 16GB is plenty for non-serious tasks.
AMD's Radeon Open Compute platform (ROCm) doesn't officially support Ryzen APUs. However, third-party companies, such as BruhnBruhn Holding, offer experimental ROCm packages that work with APUs. That means APUs can work with the PyTorch and TensorFlow frameworks, opening the gate to most AI software. We wonder whether AMD's latest mobile Ryzen chips, like Phoenix, which taps into DDR5 memory, can work and what kind of performance they bring.
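The article doesn't show how the Redditor configured ROCm, but a commonly reported community workaround for unsupported Vega APUs is setting the `HSA_OVERRIDE_GFX_VERSION` environment variable so ROCm treats the iGPU as a supported gfx900 part. A minimal sketch, assuming a PyTorch build installed from an ROCm wheel (the override value and the check are community-reported, not an official AMD-supported path):

```python
# Sketch: check whether a PyTorch ROCm build can see the Vega iGPU.
# HSA_OVERRIDE_GFX_VERSION=9.0.0 is a community-reported workaround that
# makes ROCm treat the officially unsupported Vega APU as a gfx900 GPU.
import os

os.environ.setdefault("HSA_OVERRIDE_GFX_VERSION", "9.0.0")

try:
    import torch  # assumes an ROCm build of PyTorch is installed
    # ROCm builds of PyTorch reuse the torch.cuda device API.
    if torch.cuda.is_available():
        print("iGPU visible:", torch.cuda.get_device_name(0))
    else:
        print("No ROCm-visible GPU found")
except ImportError:
    print("PyTorch is not installed")
```

On a machine without an ROCm-capable GPU (or without PyTorch), the script simply reports that rather than failing, which makes it a safe first diagnostic before attempting Stable Diffusion.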
The Redditor shared a YouTube video claiming that the Ryzen 5 4600G could run a plethora of AI applications, including Stable Diffusion, FastChat, MiniGPT-4, Alpaca-LoRA, Whisper, LLM, and LLaMA. Unfortunately, he only provided demos for Stable Diffusion, an AI image generator based on text input. He doesn't detail how he got the Ryzen 5 4600G to work with the AI software on his Linux system. The YouTuber has vowed to release a thorough video of the setup process.
As for performance, the Ryzen 5 4600G took only around one minute and 50 seconds to generate a 512 x 512-pixel image with the default setting of 50 steps. It's an excellent result for a $95 APU and rivals some high-end processors. The author said he used DDR4 memory but didn't list the specifications. Although the Ryzen 5 4600G natively supports DDR4-3200, many samples can hit DDR4-4000, so it would be fascinating to see how AI performance scales with faster memory.
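For context, those demo numbers work out to a little over two seconds per sampling step. A quick sketch of the arithmetic (the timing figures are from the demo; the per-step breakdown is our own back-of-the-envelope calculation):

```python
# Back-of-the-envelope throughput from the demo: one 512 x 512 image
# at the default 50 steps in roughly 1 minute 50 seconds.
total_seconds = 1 * 60 + 50            # 110 s for the whole image
steps = 50                             # default sampling step count
seconds_per_step = total_seconds / steps
print(f"{seconds_per_step:.1f} s per denoising step")  # prints "2.2 s per denoising step"
```

That 2.2 s/step figure also makes it easy to estimate other settings: halving the step count to 25 would land around 55 seconds per image, ignoring fixed overhead.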
The experiment is fantastic for those who own a Ryzen 5 4600G or Ryzen 5 5600G and want to play around with AI. For those who don't, throwing $500 into an APU build doesn't make much sense when you can probably get a discrete graphics card that offers better performance. For instance, AMD's Radeon 16GB graphics cards start at $499, and Nvidia recently launched the GeForce RTX 4060 Ti 16GB, which has a similar starting price.
Zhiye Liu is a Freelance News Writer at Tom’s Hardware US. Although he loves everything that’s hardware, he has a soft spot for CPUs, GPUs, and RAM.
> Unfortunately, he only provided demos for Stable Diffusion, an AI image generator based on text input. He doesn't detail how he got the Ryzen 5 4600G to work with the AI software on his Linux system.
Very typical in the home AI space: everyone is quick to show off their results, but no one ever wants to detail exactly how they did it, leaving the rest of us to bang our heads against the keyboard over endless CUDA, PyTorch, and driver errors, let alone finding the correct commands, models, and file configurations.
This is a 0GB GPU. Shared RAM is nowhere near the same thing. It's like calling a 3.5in floppy disk a hard drive or a USB 2.0 flash drive an SSD.
> Logically, the APU doesn't deliver the same performance as a high-end graphics card, but at least it won't run out of memory during AI workloads, as 16GB is plenty for non-serious tasks.

Stable Diffusion doesn't really run out of memory during AI workloads; at least the implementations I'm familiar with (Automatic1111 and ComfyUI) can work even with low-VRAM (≤4GB) GPUs at a speed penalty, moving stuff between DRAM and VRAM. Kobold and similar programs can do the same with text generation, but the speed penalty, in my experience, is so large that it doesn't make it worthwhile.
Ewitte said:
> This is a 0GB GPU. Shared ram is nowhere near the same thing. It's like calling a 3.5in floppy disk a Hard Drive or USB 2.0 flash drive a SSD.

When you set the RAM on an iGPU, you are reserving that amount of RAM specifically for the iGPU. That in turn means it is a 16GB GPU. Now, you do not get the same performance as if it had its own VRAM, due to shared bandwidth, but the frame buffer is the full 16GB.
That's exactly the HSA use case for which I bought my Kaveri, except that I didn't have ML in mind but figured there might be a smarter spreadsheet that would use 512 GPGPU cores to do a recalc instead of 4 integer and 2 FP cores. The ability to switch between CPU and GPU code at the subroutine level was just mind-blowing, if only anyone had been able to exploit that potential!
The problem was that nobody really wanted to rewrite their spreadsheets or parametric design software to really use GPGPU code back then. And while the unified memory space even allows GPUs to use "CPU RAM" today, the memory wall in front of DRAM over PCIe would likely annihilate much of the effort.
I hate to say it, because IMHO the last good Apple was an Apple ][.
Ewitte said:
> This is a 0GB GPU. Shared ram is nowhere near the same thing. It's like calling a 3.5in floppy disk a Hard Drive or USB 2.0 flash drive a SSD.

But for $95, is it worth it?
Admin said:
> Typically, 16GB is the maximum amount of memory you can dedicate to the iGPU. However, some user reports claim that certain ASRock AMD motherboards allow for higher memory allocation, rumored up to 64GB.
That is way more than I have ever seen a BIOS allow for iGPUs. I took a look; my Ryzen board only supports 2GB max.
But is simply allocating more memory in the BIOS really a "trick"? Is this like "one weird trick" or something? And is a simple BIOS setting change really worth a whole article? Was there an upgrade to coreboot that I missed?
> We wonder if AMD's latest mobile Ryzen chips, like Phoenix that taps into DDR5 memory, can work and what kind of performance they bring.

It's worth noting that Phoenix and later desktop APUs would not only include substantially better iGPUs and DDR5 support, but also the XDNA AI accelerator. I don't know if it would be any faster than using the RDNA graphics, but it could allow you to use your APU to game while running Stable Diffusion or whatever on the accelerator at the same time.
> Not everyone has to budget to buy or rent a Nvidia H100 (Hopper) to experiment with AI.
Until the AI is like The 6th Day, I'm not even going to bother with it. :geek:
It is an interesting proof of concept. If it could be adapted into an official feature, it would be of great benefit to schools and programs with computers that lack a dedicated video card.