
All of the cards we’re testing are fast enough to average playable frame rates at Battlefield 4’s Ultra quality preset, and several fare well at 2560x1440, too.
The GeForce GTX 780 Ti is 8% faster than the press-sampled R9 290X, which holds its clock rate most reliably. Our retail 290X drops to lower frequencies in the lab, allowing the 780 Ti to beat it by 29%.
No matter which card you pick, 3840x2160 is simply not playable.



Again, under the dual-GPU boards, Nvidia’s GeForce GTX 780 Ti is the victor at 1920x1080 and 2560x1440. Ultra HD is more muddled, but only because we’re dealing with eight cards crammed in under the 25 FPS mark.

Overall, observed frame time variance looks great. The only exceptions happen at 3840x2160, where average frame rates are too low anyway.
- GK110, Unleashed: The Wonders Of Tight Binning
- Meet The GeForce GTX 780 Ti
- Test Setup And Benchmarks
- Results: Arma III
- Results: Battlefield 4
- Results: BioShock Infinite
- Results: Crysis 3
- Results: Metro: Last Light
- Results: The Elder Scrolls V: Skyrim
- Results: Tomb Raider
- Results (DirectX): AutoCAD 2013 And Inventor
- Results (OpenGL): LightWave And Maya 2013
- Results (OpenCL): GPGPU Benchmarks
- Results: CUDA Benchmarks
- Gaming Power Consumption Details
- Detailed Gaming Efficiency Results
- Power Consumption Overview
- Noise And Video Comparison
- Unquestionably The Fastest Single-GPU Graphics Card
It could also come down to production variance between the chips. I've seen it before in manufacturing, and it's not pretty. Sounds like we're starting to hit the ceiling with these GPUs... Makes me wonder what architectural magic they'll come up with next.
If anti-aliasing has a negligible impact on image quality at that pixel density, I'm wondering how single cards perform on Ultra HD screens WITHOUT ANTI-ALIASING. Could you please investigate, or point me to somewhere that has? Cheers all!
Apples to apples, it looks like the 780 Ti will remain faster than the 290X even after we begin to see custom-cooled AMD cards... but at a high premium.
And yet, people will continue to eat up their products like mindless sheep. I guess a lot of people have disposable income.
Well, it's possible but highly unlikely, given that Nvidia defines a hard minimum clock rate and a much narrower range. Plus, this is a reference board; it's entirely possible that a retail card with a custom cooler will perform much higher (like the Gigabyte 780 in this article).
And, it's not an issue with the Titan or the 780, which are both based on GK110...which has been out for months, and has stable drivers.
And yet, people will continue to eat up their products like mindless sheep. I guess a lot of people have disposable income.
What I can't understand is why this has to be a ****ing war.
When they had no competition, they charged a lot of money for their top end cards. No one was forced to buy these overpriced cards. If no one bought them, they'd drop prices. When AMD released solid competition, they dropped prices.
That's how the market works. If you think AMD wouldn't do the same, well, what can I say...
AMD hasn't been in a position to do that for a long time on either the GPU or CPU front, which is why they haven't.
When they tried to release an $800 FX CPU (this is without a monopoly or lead in the market, btw), no one bought it, and AMD had to drop prices by more than half.