Our high-quality benchmarks are run using Far Cry 3's Ultra quality preset, with 4x MSAA added on top.


AMD's Radeon HD 7870 generally hovers under 30 FPS, below our rough target for playability. Meanwhile, the GeForce GTX 660 Ti and Radeon HD 7950 with Boost are only slightly quicker. Even the powerful Radeon HD 7970 and GeForce GTX 670 are humbled by average frame rates just above 35 FPS. Only the two Radeon HD 7870s in CrossFire and the GeForce GTX 660 cards in SLI manage averages in excess of 45 FPS.
Speaking of multi-card solutions, notice that the Radeons achieve higher averages, but suffer lower minimum frame rates. In the frame rate-over-time chart, you can see that the GeForce boards in SLI deliver more consistent performance than AMD's cards.
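That distinction between average and minimum results is easy to quantify from a frame-time log. As a rough sketch (the frame-time samples below are invented for illustration, not taken from our benchmark logs), a simple consistency check looks like this:

```python
# Sketch: why a card with the higher average FPS can still feel less smooth.
# Frame-time samples are illustrative placeholders, not real benchmark data.

def fps_stats(frame_times_ms):
    """Return (average FPS, minimum FPS, frame-time std dev in ms)."""
    n = len(frame_times_ms)
    avg_fps = 1000.0 * n / sum(frame_times_ms)      # frames per second overall
    min_fps = 1000.0 / max(frame_times_ms)          # worst single frame
    mean = sum(frame_times_ms) / n
    std = (sum((t - mean) ** 2 for t in frame_times_ms) / n) ** 0.5
    return avg_fps, min_fps, std

# Hypothetical card A averages higher but spikes; card B is steadier.
card_a = [18, 19, 45, 18, 19, 44, 18, 19]   # ms per frame
card_b = [25, 26, 25, 26, 25, 26, 25, 26]

print(fps_stats(card_a))  # higher average, lower minimum, larger std dev
print(fps_stats(card_b))  # lower average, but far more consistent
```

A lower frame-time standard deviation is what "smoother" looks like in the over-time chart, even when the headline average is lower.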


None of these cards are able to handle 5760x1080 using Far Cry 3's most demanding settings.
Although the Radeon HD 7970 and GeForce GTX 670 manage playable results at lower detail presets, I don't think we'll see a GPU able to handle this title at its Ultra detail settings using three screens until the next generation of hardware shows up.
We should also mention that we experienced some texture anomalies on the GeForce cards at this detail level. None of the results are playable, so the issue isn't particularly significant. But we did see something similar when Battlefield 3 debuted, which required a driver revision from Nvidia to fix.
I think it should read like this:
"The good news for folks with Piledriver-based processors is that the FX-8350 is nearly as quick as Intel's Core i7-3960X (never mind the fact that the Core i7 costs more than $500..). "
hehe....
anyways good review...
My God... Are the reviewers of this website paid to make AMD look bad? Anyone with a minimum of common sense can clearly see that there is virtually no difference between the FX 8350, the i3, the i5, and the i7. This is a big disservice to the community.
Why no middle ground? And why no 7970/680 tests in Crossfire/SLI? Why use single flagship cards, but then only use SLI/Crossfire for the medium bunch?
I'm very glad to see that this game uses Crossfire/SLI effectively, ~50% increase in performance for dual GPU configurations.
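For anyone who wants to check that figure themselves, scaling efficiency is just arithmetic on the averages. A quick sketch with placeholder numbers (not the review's exact results):

```python
# Sketch of multi-GPU scaling math. The FPS values below are
# illustrative placeholders, not the exact figures from the charts.

def scaling(single_fps, dual_fps):
    """Percent gain from adding a second card, and per-card efficiency."""
    gain = (dual_fps / single_fps - 1.0) * 100.0
    efficiency = dual_fps / (2.0 * single_fps) * 100.0
    return gain, efficiency

gain, eff = scaling(single_fps=30.0, dual_fps=46.0)
print(f"+{gain:.0f}% over one card, {eff:.0f}% per-card efficiency")
# → +53% over one card, 77% per-card efficiency
```

A ~50% gain works out to roughly 75% per-card efficiency, which is solid but well short of the ideal 100% doubling.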
LOL, truth! I bet the 8350, when OC'ed, can even close the tiny gap between it and the Intel processors. Can the i3 OC? I don't think so.
Thanks Don for the great review as always.
Edit: These still screen shots don't do it justice.
The good thing is the game doesn't scale up with Intel CPUs, making the 8350 look really good in comparison.
Dude, the writer is only trying to point out that using a dual-core i3 is more meaningful than using the 8-core FX-8350. And BTW, it's common sense that the latest games don't even benefit from so many cores. Stop moaning about whether or not the writer is an Intel fanboy just because AMD performed well in the GPU section.
I use the 310.70 drivers and EVGA GTX 580s in SLI.
It's how it was worded; they made it sound like the 8350 was at a grave disadvantage, when that really wasn't the case at all. In fact, AMD deserves praise: they made a good CPU for a change, one that's competitive with Intel's offerings in most tasks, not to mention the AMD chip is a multithreading beast.
Until you go to Eyefinity modes, in which case the 7870s not only pull away from the 660s, but maintain a far more consistent frame rate. Purely academic at those frame rates, though.
Also, the fact that a heavily overclocked i7-3960X cannot beat the i5-3550 suggests the test is extremely GPU-limited. Piledriver cores are notably weaker per thread than Ivy Bridge (or Sandy Bridge, for that matter), which could explain the minimum frame rate being a little lower. If we really want to see CPU bottlenecking, I'd retest with lower quality graphics.
Also, Tom's should have benchmarked the High quality settings as well, as that's the setting most people are going to play at.
Yeah, I get the feeling this article was a little rushed. There are quite a few settings that, when lowered slightly without any apparent decrease in visuals, can dramatically increase frame rates. Just going with HDAO and medium shadows raised my FPS from 35 to 48 on my GTX 570 OC'd to 855.
Though it is a bit much to ask the author to spend 15 minutes tweaking out each card . . .