According to a recent Intel Q&A, the company confirmed that driver optimizations for Arc GPUs -- addressing poor performance in DirectX 11 and DirectX 9 games -- will be a constant work in progress with no defined end point. In short, Intel's lack of experience in the discrete GPU driver space will keep its GPUs from being competitive with older APIs for quite some time.
This was made very apparent by a review from LinusTechTips, which found a roughly 50% performance delta between the DX11 and DX12 versions of Shadow of the Tomb Raider running on an Arc A770. In DirectX 11, the A770 managed only around 38 FPS, while in DirectX 12 mode the frame rate jumped to a whopping 80 FPS.
For the uninitiated, DirectX 11, DirectX 9, and other older APIs behave very differently from modern ones like DirectX 12 and Vulkan. These older APIs rely heavily on the GPU driver itself to do much of the heavy lifting, tweaking and configuring lower-level GPU settings unseen by the user.
This behavior was intentional, meant to reduce the burden on game developers. As a result, driver optimizations play a massive role in dictating the gaming performance of a GPU with these older APIs.
This is a night and day difference compared to DirectX 12 and Vulkan, where much of this driver baggage has been transferred to the game engine itself, with game developers responsible for handling lower-level optimizations such as video memory allocation (this is why DirectX 12 and Vulkan are referred to as "low level" APIs).
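To make the contrast concrete, here is a toy sketch (not real graphics API code; the class names and heap sizes are invented for illustration) of the two models. In the DX11-style model, the driver silently decides where resources live; in the DX12/Vulkan-style model, the engine reserves a heap up front and must track placement offsets itself.

```python
# Toy illustration of driver-managed vs. application-managed video memory.
# These are NOT real DirectX or Vulkan APIs -- just a simplified analogy.

class Dx11StyleDriver:
    """DX11-style: the driver hides allocation; the app just creates resources."""
    def __init__(self):
        self._pool = []  # driver-owned pool, invisible to the application

    def create_texture(self, size):
        # The driver decides placement, pooling, and eviction behind the scenes.
        self._pool.append(size)
        return len(self._pool) - 1  # opaque handle

class Dx12StyleApp:
    """DX12/Vulkan-style: the engine owns a heap and sub-allocates explicitly."""
    def __init__(self, heap_size):
        self.heap_size = heap_size
        self.offset = 0

    def create_texture(self, size, alignment=256):
        # Explicit placement: the engine, not the driver, picks the offset.
        aligned = (self.offset + alignment - 1) // alignment * alignment
        if aligned + size > self.heap_size:
            raise MemoryError("heap exhausted; the engine must manage residency")
        self.offset = aligned + size
        return aligned

driver = Dx11StyleDriver()
tex = driver.create_texture(4096)      # driver handles everything internally

app = Dx12StyleApp(heap_size=1 << 20)  # engine reserves a 1 MiB heap up front
off_a = app.create_texture(4096)       # engine tracks offsets itself
off_b = app.create_texture(100)        # placed at the next aligned offset
```

In the first model, every clever placement decision lives in the driver, which is exactly why years of driver tuning matter so much for DX11-era titles; in the second, that logic ships inside the game engine instead.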
The bad news for Intel is that it has very little experience with these APIs on discrete graphics (as opposed to integrated GPUs). Nvidia and AMD, on the other hand, have more than a decade of experience in the field and know all the little details and odd behaviors DX11 and DX9 might have.
As a result, Tom Petersen from Intel says the road towards better performance in APIs like DirectX 11 will be a "labor of love forever." It is a sad truth, but a truth nonetheless. These optimizations don't happen overnight, and there are infinite ways to optimize GPUs for DirectX 11 and its predecessors. This fact holds true even for experienced companies like AMD, which has seen big DirectX 11 driver gains in recent years.
Integrated Graphics Experience Has Made Things Worse for Intel
At first glance, it's easy to assume Intel's experience with integrated graphics would be beneficial. But unfortunately, it has not helped matters and has even made things worse for the company.
In a report we covered a week ago, CEO Pat Gelsinger noted that Intel made a fatal error on the driver side of development, falsely assuming it could take its integrated graphics driver stack and apply it to its discrete Arc GPUs.
This strategy showed Intel that its integrated graphics driver stack was utterly inadequate for its much more powerful Arc GPUs, since the architectural differences between its integrated and discrete GPUs are massive.
We suspect this could be a big reason Intel's Arc GPUs suffer exceptionally in DirectX 11. If Intel had started from scratch with a dedicated GPU driver stack, its developers would have had more time to optimize for older APIs.