
Our Crysis 3 benchmark is based on real-world gameplay, and it appears to be fairly consistently platform-bound. It might be tempting to suspect a v-sync issue, given that average frame rates at 1920x1080 clump up at 60 FPS. However, if you look back at our R9 290X review, you'll notice averages in the 65 FPS range, roughly corresponding to our switch from a Core i7-4960X to a -3970X processor this time around.
One observation cannot be missed, though: the Radeon R9 290 looks a lot like our sampled 290X and the GeForce GTX Titan. The retail R9 290X is quite a bit slower, though, particularly at 2560x1440.
The frame rates drop too low at 3840x2160 to be usable, though that’s clearly where AMD’s Hawaii GPU excels. We’ve already tested the 290X in CrossFire and seen impressive results. However, we’re waiting for a second retail card before revisiting that configuration in a more realistic way.



Our dual-GPU numbers were generated by the FCAT tool suite, which is designed to factor out dropped and runt frames. And yet, the Radeon HD 7990 is somehow able to transcend the ceiling imposed on every other card at 1920x1080.
This is masked somewhat at 2560x1440, where the GeForce GTX 690 reminds us that it’s a very capable performer, too. Single-GPU boards like the Radeon R9 290X, 290, GeForce GTX Titan, and 780 all clump together though. There's a little more spread at 3840x2160, but only enough to see the retail 290X getting outperformed by the sampled 290. Both cards beat out Nvidia’s GeForce GTX 780.
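For readers unfamiliar with FCAT-style analysis, the filtering step can be sketched in a few lines. The 21-scanline runt cutoff and the capture data below are illustrative assumptions, not actual FCAT output:

```python
# Sketch of FCAT-style filtering: discard dropped frames (zero scanlines
# shown on screen) and runts (only a sliver of the frame shown) before
# computing an effective frame rate. The 21-scanline cutoff is the
# commonly cited FCAT convention, assumed here for illustration.

RUNT_THRESHOLD = 21  # minimum scanlines for a frame to count as delivered

def effective_frame_rate(scanline_counts, capture_seconds):
    """FPS after excluding dropped (0-scanline) and runt frames."""
    delivered = [s for s in scanline_counts if s >= RUNT_THRESHOLD]
    return len(delivered) / capture_seconds

# Hypothetical capture: 60 frames in one second, but 10 are runts and
# 2 were dropped entirely, so only 48 frames actually reached the display.
counts = [1080] * 48 + [10] * 10 + [0] * 2
print(effective_frame_rate(counts, 1.0))  # -> 48.0
```

This is why a multi-GPU card's raw software-reported FPS can look better than what FCAT credits it with: runts inflate the former but not the latter.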

Worst-case frame time variance is fairly low at 2560x1440. It gets worse at 1920x1080 and 3840x2160, though not in any consistent way. Only the GTX 690's higher numbers make sense, given the challenge of getting two GPUs to render frames consistently. Just remember that these are 95th-percentile numbers; the average and 75th-percentile results are excluded to avoid data overload.
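As a rough illustration of the metric, frame time variance here means the frame-to-frame change in render time, reported at a chosen percentile. The nearest-rank percentile method and the sample frame times below are assumptions for illustration, not our actual capture data:

```python
import math

def percentile(values, pct):
    """Nearest-rank percentile: smallest value with at least pct
    percent of the samples at or below it (no interpolation)."""
    ordered = sorted(values)
    rank = math.ceil(pct / 100.0 * len(ordered))
    return ordered[max(0, rank - 1)]

def frame_time_variance(frame_times_ms, pct=95):
    """Percentile of the absolute frame-to-frame time deltas."""
    deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    return percentile(deltas, pct)

# Mostly smooth ~16.7 ms frames with one 25 ms hitch; the 95th-percentile
# delta surfaces that worst-case swing, which an average would hide.
times = [16.7, 17.1, 16.9, 25.0, 16.8, 16.7, 17.0, 16.6, 16.9, 16.8]
worst_case = frame_time_variance(times, 95)
```

A card with a low average but a high 95th-percentile delta still feels stuttery, which is why the worst-case number gets the spotlight here.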
http://techreport.com/review/25602/amd-radeon-r9-290-graphics-card-reviewed/9
Chris, these results differ drastically from the real-world results 290X owners are reporting at OCN... I understand that your observations are anecdotal and based on a very small sample size, but would you mind looking into this matter further? Putting such a statement in bold in the conclusion, even though it contradicts the real-world experience of owners, just gives the uninformed reader a false impression...
The above claim has already escalated further than it should have... A Swiss site has already rebutted it by testing their own press sample against a retail model, concluding the following:
In quiet mode, where the dynamic clock frequencies have to work overtime, the situation becomes slightly murkier. A minor performance difference can be seen in some titles, and even if these are not considerable variations, the trend is clear. In the end, it amounts to an average deviation of only a few percent, i.e., nothing extreme. Possible reasons include slightly poorer contact with the cooler, or simply a change in ambient temperature.
Now to wait for the non-reference cards at the end of the month!
It looks like a good card for the price, as it even keeps up with the GTX 780, which costs $100 more. This is good, as Nvidia may drop prices even further, which means we could also see a price drop on the 290X. I wouldn't mind a new 290X Toxic for under $500.
Best to wait a month or two before buying to see how this all goes down.
Some people who need CUDA for work and a GPU for gaming will still get 780s, but no one will pay a $150 premium for the 290X just to get a couple more FPS over the 290. AMD shot itself in the foot before hurting Nvidia.
Nvidia did a very good job with its reference cooler (but you really pay for it)... do you think AMD could not have pulled off a "monster" cooler? Is it really hard to make a good cooler? No, it is expensive.
You could do this; you have your sources.
Strange thing, and I know some of us are going through this: I was thinking of getting a 280X on Black Friday/Cyber Monday, but the price tag is leaving me with something to think about. I think I'm just going to save up the few extra pennies to get something ($650+) that a month ago I thought was out of my price range ($300-450).