Phoronix reports that driver updates for ATI's ancient R300 through R500 series Radeon GPUs are landing in Linux this year. Despite their age, the open-source community is keeping these GPUs alive with open-source drivers, enabling them to keep running on modern Linux operating systems.
The driver update changes how vertex shaders are lowered, moving that work into NIR (Mesa's intermediate shader representation). It is slated to ship this quarter in Mesa 24.0, so anyone still using an R300-R500 series Radeon GPU will get it later this year.
"This MR moves the most of the remaining backend lowering into NIR. Specifically, ftrunc, fcsel (when suitable) and flrp. The backend lowering paths are removed. This is a prerequisite for more backend cleanups, for example I have a MR ready to get rid of backend DCE for vertex shaders." -- Pavel Ondračka
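"Lowering" here means rewriting shader operations the old hardware cannot execute directly into sequences of simpler ones it can. The following Python sketch illustrates the kind of algebraic identities involved for two of the operations named in the MR, flrp (linear interpolation) and ftrunc; the function names and structure are illustrative only, not Mesa's actual code.

```python
import math

def lower_flrp(a, b, c):
    # flrp lowered to multiply/add:
    # flrp(a, b, c) = a * (1 - c) + b * c
    return a * (1.0 - c) + b * c

def lower_ftrunc(x):
    # ftrunc lowered via floor plus sign handling:
    # trunc(x) = sign(x) * floor(|x|)
    return math.copysign(math.floor(abs(x)), x)

print(lower_flrp(0.0, 10.0, 0.25))  # 2.5
print(lower_ftrunc(-3.7))           # -3.0
```

Doing these rewrites in NIR rather than in the r300 backend lets the common compiler infrastructure optimize the result, which is what enables the further backend cleanups the MR author mentions.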
The ATI R300 debuted in 2002 as part of the ATI Radeon 9700 PRO. The GPU featured an AGP interface (the bus that preceded PCIe), a 150nm process, 110 million transistors, a 325MHz core clock, 128MB of memory on a 256-bit bus, and 19.8GB/s of memory bandwidth.
When it launched, the Radeon 9700 Pro was the fastest GPU available, beating Nvidia's counterpart, the GeForce 4 Ti 4600, in almost every workload. It was also the most technologically advanced GPU of its day, being the first to support DirectX 9 in its entirety.
The R400 and R500 series GPUs that followed were mostly refinements of the R300 architecture, sporting far more pixel pipelines, more vertex shader engines, and faster, more capable memory configurations.
It is amazing that anyone is still running one of these cards, let alone supporting them in modern Linux operating systems, as the open-source community has done. Obviously, these GPUs can't do much on modern operating systems, other than display windows and text. But it is cool, nevertheless, that a GPU from 2002 can run a modern operating system at all.
Aaron Klotz is a freelance writer for Tom’s Hardware US, covering news topics related to computer hardware such as CPUs and graphics cards.
While I question the practicality of keeping this GPU (or any other ancient piece of hardware) fully functional, I do appreciate it and find it really cool.
We all like 60's Mustangs purring along, right?
I actually still own a fully functioning Radeon 9700 Pro 😀.
It's amazing they can still run on anything.
Gamers should move to Linux and forget GPU upgrades for a while…
Not gonna happen though!
I have several Linux servers with Radeon HD 47xx cards from 2008-2010 just to have a local TTY console. I mostly access these servers with ssh or other services, but it is sometimes convenient to be able to use a local console. I thank the Linux developer community for maintaining the drivers for these old but very functional graphics cards.
It is amazing that anyone is still running one of these cards, let alone supporting them in modern Linux operating systems, as the open-source community has done.
Not really, especially if all you need is a basic GPU for video output, since AMD CPUs don't include a basic GPU like (most) Intel ones do. Even if it's not that old, there's still the Radeon 5450.
-Fran- said: We all like 60's Mustangs purring along, right?
1964 1/2 Mustang my fav. But cool to see the old cards still have life, at least in Linux.
This is why I like the FOSS community: there is no malicious segmentation or deliberate dropping of "support" as a way to force consumers to buy more stuff. If the HW hasn't changed in two decades, then there is absolutely no reason drivers shouldn't keep on working.
With the way things are going, I think in a generation or two GPUs will generate frames instead of raster rendering them. I.e., you will buy a compute chip from Nvidia, and then the game devs will have a specific model that has been trained to display a game. Frames will then be generated so fast it will be indistinguishable from a rendered frame rate and super smooth, with infinite resolution and zooming if needed, with more detail being added the closer the zoom.

Current compute chips are still primitive; give it a decade or so and they'll precisely generate images without flaws in real time at 120fps at 8K. It only looks flawed now because we are still learning how to improve and prompt it into showing us exactly what we want to see. This will massively drop the power usage and performance cost of displaying a game. Stable diffusion with precise prompts is a far more efficient way to do game design. A custom game engine will then be made to precisely and accurately prompt a model into creating what is required. Think Unreal Engine but with AI prompts to get it to do what you want.

When this happens, all these render-based GPUs will seem outdated. Imagine how unique games can become when AI can generate new scenarios based upon input and display them in real time, in fluid motion, with near-instant latency from controls/input (i.e. making a character move or look left/right/up/down). When this tech matures, we can expect hyperrealism gaming with higher resolution than real life.