Was curious whether this straightforward question was ever answered with a straightforward solution.
Whilst I fully appreciate the reasoning about MS / ATI / NVIDIA eventually discontinuing support for 2003 x32, I don't need to be reminded of all that, nor should I have to argue against such responses.
We too are a development team working on W2K3 x32. For good reasons we have not moved up to a 64-bit platform: we use a variety of software licensed as x32, and it all runs smoothly without ever needing more than 4 GB of RAM (not even with VMware virtualization running side by side).
Yes, we could install x64 W2K3 (but why, when all we need is dual display working properly without flakiness, and when it would only add licensing costs?).
Yes, we could install 2008 (x32 or x64), but those OSes have heavier RAM requirements and carry licensing costs of their own (not just for the OS, but also for apps licensed to run under specific OSes). They work stunningly as production servers, but are overkill for development purposes.
Man, we could even go installing 2008 R2...
Point is: NO. Why? Yes, better and faster, more efficient utilization of the underlying hardware... I get all that! That's why we have these in production... but not for development servers.
Does someone out there have an answer to this? I can't believe for one second that there is absolutely not ONE SINGLE graphics card out there, whether PCI or PCI-e, that will work in W2K3 32-bit. It doesn't have to perform at Usain Bolt speeds, it just needs to work properly! It can have DVI, VGA or HDMI output... not fussed. Just one!?
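For anyone else chasing this, here is a minimal diagnostic sketch of my own (not from any vendor docs) that we use to sanity-check whether a candidate card's second head actually registered with the OS before blaming the hardware. It only calls the stock user32 EnumDisplayDevices API via Python's ctypes, so nothing beyond a Python build old enough to run on 2003 is assumed:

    import ctypes
    from ctypes import wintypes

    # Win32 DISPLAY_DEVICEW structure (wingdi.h), as consumed by EnumDisplayDevicesW.
    class DISPLAY_DEVICEW(ctypes.Structure):
        _fields_ = [
            ("cb", wintypes.DWORD),
            ("DeviceName", wintypes.WCHAR * 32),
            ("DeviceString", wintypes.WCHAR * 128),
            ("StateFlags", wintypes.DWORD),
            ("DeviceID", wintypes.WCHAR * 128),
            ("DeviceKey", wintypes.WCHAR * 128),
        ]

    DISPLAY_DEVICE_ATTACHED_TO_DESKTOP = 0x00000001
    DISPLAY_DEVICE_PRIMARY_DEVICE = 0x00000004

    def list_display_devices():
        # Walk the adapter list until EnumDisplayDevicesW returns FALSE.
        user32 = ctypes.windll.user32
        i = 0
        while True:
            dev = DISPLAY_DEVICEW()
            dev.cb = ctypes.sizeof(dev)
            if not user32.EnumDisplayDevicesW(None, i, ctypes.byref(dev), 0):
                break
            print("%s: %s (attached=%s, primary=%s)" % (
                dev.DeviceName, dev.DeviceString,
                bool(dev.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP),
                bool(dev.StateFlags & DISPLAY_DEVICE_PRIMARY_DEVICE)))
            i += 1

    if __name__ == "__main__":
        list_display_devices()

If both heads show up as attached to the desktop but the second screen is still flaky, at least you know the driver bound to the card and the fault lies there rather than with the hardware itself.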