It doesn’t really look like the switch from OpenGL to DirectX is becoming a trend; Autodesk is the only major company making this drastic change. For end-users, DirectX’s advantage is that they can do without specialized workstation cards, so long as they’re willing to forgo drivers optimized for specific applications, greater compute performance, and so on. Its disadvantage is the use of single-precision coordinates, which can easily lead to display errors in complex models, such as the dreaded push-through effect (z-fighting), where a surface lying just behind another pokes through it.
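The precision problem behind that push-through effect is easy to demonstrate. The sketch below (an illustration, not anything from Autodesk's or Microsoft's code) uses Python's `struct` module to emulate single-precision rounding; the coordinate values are hypothetical, chosen to sit far from the origin the way geometry in a large CAD model often does:

```python
import struct

def to_f32(x: float) -> float:
    """Round a Python double to IEEE-754 single precision."""
    return struct.unpack("f", struct.pack("f", x))[0]

# Hypothetical example: two surfaces 0.001 units apart,
# placed 100,000 units from the model origin.
near = 100000.0
far = 100000.001

# In double precision the surfaces remain distinct...
print(near == far)                   # False

# ...but at this magnitude, single precision can only resolve steps
# of about 0.0156, so both surfaces collapse onto the same coordinate
# and the renderer no longer knows which one is in front.
print(to_f32(near) == to_f32(far))   # True
```

The step size grows with distance from the origin, which is why the artifact tends to appear in large, complex models rather than small ones.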
Nvidia's GeForce GTX 780 Ti is the company's fastest consumer graphics card in this benchmark, and it rules supreme in the Cadalyst 3D suite. However, the GK110-based board does succumb to the latest Hawaii-based Radeon R9s in Inventor.



- GK110, Unleashed: The Wonders Of Tight Binning
- Meet The GeForce GTX 780 Ti
- Test Setup And Benchmarks
- Results: Arma III
- Results: Battlefield 4
- Results: BioShock Infinite
- Results: Crysis 3
- Results: Metro: Last Light
- Results: The Elder Scrolls V: Skyrim
- Results: Tomb Raider
- Results (DirectX): AutoCAD 2013 And Inventor
- Results (OpenGL): LightWave And Maya 2013
- Results (OpenCL): GPGPU Benchmarks
- Results: CUDA Benchmarks
- Gaming Power Consumption Details
- Detailed Gaming Efficiency Results
- Power Consumption Overview
- Noise And Video Comparison
- Unquestionably The Fastest Single-GPU Graphics Card
It could also come down to production variance between the chips. I've seen it before in manufacturing, and it's not pretty. Sounds like we're starting to hit the ceiling with these GPUs... Makes me wonder what architectural magic they'll come up with next.
If it has a negligible impact on what it looks like, I'm wondering how single cards perform on Ultra HD screens WITHOUT ANTI-ALIASING. Could you please investigate, or point me to somewhere that has? Cheers all!
Apples to apples, it looks like the 780 Ti will remain faster than the 290X even after we begin to see custom-cooled AMD cards... but at a high premium.
and yet, people will continue to eat up their products like mindless sheep. guess a lot of people have disposable income.
Well, it is possible, but highly unlikely, given that Nvidia has a hard-defined minimum clock rate and a much narrower range. Plus, this is a reference board; it's entirely possible that a retail card with a custom cooler will perform much better (like the Gigabyte 780 in this article).
And, it's not an issue with the Titan or the 780, which are both based on GK110...which has been out for months, and has stable drivers.
What I can't understand is why this has to be a ****ing war.
When they had no competition, they charged a lot of money for their top end cards. No one was forced to buy these overpriced cards. If no one bought them, they'd drop prices. When AMD released solid competition, they dropped prices.
That's how the market works. If you think AMD wouldn't do the same, well, what can I say...
AMD haven't been in a position to do that for a long time on either the GPU or CPU front, which is why they haven't.
When they tried to release an $800 FX CPU (this is without a monopoly or lead in the market, btw), no one bought it, and AMD had to drop prices by more than half.