Intel Drops DirectX 9 Support On Xe, Arc GPUs, Switches to DirectX 12 Emulation
A good time to make the switch, especially with no discrete-GPU DX9 track record to lose
Intel has officially removed native DirectX 9 hardware support from the Xe integrated graphics in its 12th Gen CPUs and from its A-Series Arc Alchemist discrete GPUs. In its place, all DirectX 9 support will be handled through emulation on top of DirectX 12.
The emulation runs on "D3D9On12", an open-source conversion layer from Microsoft. Instead of going to a native D3D9 graphics driver, DirectX 9 graphics commands are sent to the D3D9On12 layer, which translates them into D3D12 API calls. In effect, D3D9On12 stands in for a D3D9 driver, while the GPU itself only needs Intel's D3D12 driver.
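For games this routing happens transparently, but an application can also ask for the mapping layer explicitly when it creates its D3D9 object. The snippet below is a minimal sketch of that opt-in path, assuming the Direct3DCreate9On12 entry point and D3D9ON12_ARGS structure exposed by recent Windows 10/11 SDKs in d3d9on12.h; it is an illustration of how the layering works, not Intel's driver code.

```cpp
// Minimal sketch: explicitly requesting the D3D9On12 mapping layer instead of a
// native D3D9 driver. Assumes a Windows 10+ SDK that ships d3d9on12.h and
// Direct3DCreate9On12; link against d3d9.lib. Error handling omitted for brevity.
#include <d3d9.h>
#include <d3d9on12.h>

IDirect3DDevice9* CreateDeviceOn12(HWND hwnd)
{
    // Ask the runtime to route this IDirect3D9 object through D3D9On12,
    // which translates the D3D9 calls into D3D12 work under the hood.
    D3D9ON12_ARGS args = {};
    args.Enable9On12 = TRUE;   // opt in to the mapping layer
    // args.pD3D12Device may stay null so the layer creates its own D3D12 device.

    IDirect3D9* d3d9 = Direct3DCreate9On12(D3D_SDK_VERSION, &args, 1);

    // From here on, everything is ordinary D3D9 code, which is why existing
    // games do not need to be modified.
    D3DPRESENT_PARAMETERS pp = {};
    pp.Windowed = TRUE;
    pp.SwapEffect = D3DSWAPEFFECT_DISCARD;
    pp.hDeviceWindow = hwnd;

    IDirect3DDevice9* device = nullptr;
    d3d9->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hwnd,
                       D3DCREATE_HARDWARE_VERTEXPROCESSING, &pp, &device);
    return device;
}
```

Everything after the IDirect3D9 object is created is standard DirectX 9 code, which is why titles can keep shipping unmodified while the translation to D3D12 happens underneath them.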
Microsoft says the mapping layer has grown into a complete and relatively performant implementation of a D3D9 driver. As a result, performance should be close to, and in some cases match, native DirectX 9 hardware support.
As a result, this DX9 change looks like a smart move for Intel. The company can now divert driver development resources toward DirectX 11 optimization, an area where its performance is known to be weak right now, and it shouldn't suffer meaningful performance consequences, with DX9 optimizations "outsourced" to Microsoft entirely.
Given how performant Microsoft says D3D9On12 has become, it will be interesting to see whether Nvidia and AMD follow the same path as Intel. There could be downsides to the API translation, though, including higher CPU usage (the translation work happens in software, on the CPU) and potential side effects in older games. Nvidia and AMD also have almost 20 years of DirectX 9 driver experience, so switching to the DX12 emulation layer could cost them performance.
Intel, by contrast, only has DirectX 9 experience on its integrated graphics, and that experience does not necessarily carry over to its much higher-performing discrete graphics. So it makes a lot of sense for Intel to transition to emulation now, as it gets closer to launching Arc worldwide.
Aaron Klotz is a contributing writer for Tom’s Hardware, covering news related to computer hardware such as CPUs and graphics cards.
-Fran-
Well, at least when the hook is done at the game level there's a lot of performance benefits. This is without touching the game code, so it's good.
I'd actually love if AMD would do this, since their DX9, DX10 and even DX11 often have very underwhelming performance. Maybe they're already doing it, I don't know.
Regards.
TerryLaze
Are you guys letting AI write these stories?!
Admin said: Intel has made the decision to remove native DirectX 9 support from its latest Xe iGPs and Arc GPUs, and replace DX9 with DX12 emulation instead.
Intel Drops DirectX 9 Support On Xe, Arc GPUs, Switches to DirectX 12 Emulation : Read more
They are replacing Dx9 with emulation of it (dx9) on dx12, not with emulation of dx12 which is what you have written.
with DX9 optimizations "outsourced" to Microsoft entirely.
MS is outsourcing it to the community though.
Not that it matters, because this is just a translation of all dx9 commands into things that Dx12 can run. They might find a few tricks to use fewer dx12 commands to emulate some dx9 stuff, but the difference should be minimal no matter what they do.
Why open source?
The D3D9On12 mapping layer is included as an operating system component of Windows 10. Over the years and Windows 10 releases, it has grown in functionality, to the point where it is a complete and relatively performant implementation of a D3D9 driver. We are choosing to release the source to this component for two primary reasons:
To enable the community to contribute bugfixes and further performance improvements, which will improve the stability and performance of Windows 10. See CONTRIBUTING.
It is very doubtful that this will increase Dx9 performance since the bad performance comes from lack of hardware resources needed to run that old code and those will not magically appear just because dx12 is used.
It's not like they turn old games into dx12 versions of that game.
If it does use the CPU, as mentioned in the article, it might increase performance in some things, but there is nothing mentioned on the MS link of that being the case.
brandonjclark
I think the way this should read (and they wrote it as succinctly as possible), is that DX12 does the DX9 emulation layer.
TechyInAZ
TerryLaze said: Are you guys letting AI write these stories?! They are replacing Dx9 with emulation of it (dx9) on dx12, not with emulation of dx12 which is what you have written. ...
The word emulation implies the DX9 conversion.
rluker5
Even if they lose 30% performance, a game written for dx9 era hardware should be pretty easy to run on modern stuff.
I just want to see Arc out there already. Quick fixes are the fixes needed at this point.
InvalidError
"Intel on the contrary, only has experience with DirectX 9 on its integrated graphics, which does not translate into experience with its much higher performing discrete graphics."
Intel did have some decently fast IGPs when it did Iris Pro with eDRAM. The next iteration of that will come when we get IGP tiles/chiplets with 1-4GB of eDRAM and 100+GB/s access to system memory. At that point, IGPs will be every bit as serious business as low-to-mid range dGPUs today.
hotaru251
InvalidError said: At that point, IGPs will be every bit as serious business as low-to-mid range dGPUs today.
Isn't the rumor of the Zen4 iGPU similar to a low end GPU?
blppt
-Fran- said: Well, at least when the hook is done at the game level there's a lot of performance benefits. This is without touching the game code, so it's good. I'd actually love if AMD would do this, since their DX9, DX10 and even DX11 often have very underwhelming performance. Maybe they're already doing it, I don't know. Regards.
AMD's recent driver updates have bumped DX11 performance significantly, I've found. Almost to Nvidia's level.
I do like the idea of the "wrapper" being integrated into the drivers. Just about every game I've ever tried works better with the dxvk wrapper than native dx9, or even dx11 in some cases (on AMD). And that was written for Linux; that it works on Windows at all is pretty cool.
InvalidError
hotaru251 said: Isn't the rumor of the Zen4 iGPU similar to a low end GPU?
AFAIK, there will be a basic IGP worse than AMD's current ones for office-like environments where you need little more than video outputs, and a "big IGP" which may be aiming as high as the current 60-tier, though I think something between the RX6500 and RTX3050 is a more realistic expectation ceiling. I expect a spicy price tag on those.
hotaru251
InvalidError said: there will be a basic IGP worse than AMD's current ones for office-like environments where you need little more than video outputs and a "big IGP" which may be aiming as high as the current 60-tier
Can they even make RDNA2/3 worse than Vega's? And Vega was "not terrible".