
For the first time, we observe a significant difference between the FCAT- and Fraps-reported results using Thief’s in-game benchmark. FCAT reports a 62 FPS average, while Fraps returns 77 FPS. Fraps also claims the game dips as low as 6 FPS and spikes as high as 1174 FPS, outliers that surely throw its average out of whack. The FCAT numbers are far more believable: a low of 16 FPS and a peak just under 90 FPS. That works out to scaling in the 38% range, which is not great.
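The gap between the two tools comes down to arithmetic: averaging per-frame FPS values lets a couple of absurdly fast frames dominate the mean, whereas dividing total frames by total time does not. A minimal sketch with hypothetical frame times (not our captured data) shows the effect:

```python
# Hypothetical frame-time log: mostly ~60 FPS frames, a few 6 FPS
# hitches, and a couple of ~1100 FPS runt-like frames.
frame_times_ms = [16.7] * 95 + [160.0] * 3 + [0.9] * 2

# Naive approach: convert each frame to an instantaneous FPS, then average.
per_frame_fps = [1000.0 / t for t in frame_times_ms]
naive_avg = sum(per_frame_fps) / len(per_frame_fps)

# Sounder approach: total frames divided by total elapsed time.
true_avg = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

print(f"mean of per-frame FPS:        {naive_avg:.1f}")  # inflated by outliers
print(f"frames / total time:          {true_avg:.1f}")   # closer to what you feel
```

The two ultra-short frames contribute enormous instantaneous FPS values to the naive mean, pulling it well above the rate a player actually experiences.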

What’s up with the big difference between FCAT and Fraps in this title? With four GPUs, the Thief benchmark exhibits strange behavior: it starts, chops through a few seconds, and then renders faster than real time until it catches up with where the action is supposed to be. AMD suggests this could be the application compiling thousands of shaders up front, dragging down performance. If you play the game for several minutes, the frame rate does even out a bit. The big drop in performance at the end of the test, however, happens for no clear reason.

Two Radeon R9 295X2s in CrossFire again yield the highest worst-case frame time variance, though I normally don’t consider results in the 6 ms range problematic. We do know, however, that differences in the 5 ms range can be distinguished in blind testing, depending on the title.
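One simple way to express worst-case frame time variance is the frame-to-frame change in render time, with the worst case read at a high percentile so a single extreme outlier doesn’t dominate. The sketch below assumes that definition (our published metric may be computed differently):

```python
# Sketch: worst-case frame time variance as the 95th-percentile
# absolute change between consecutive frame times (an assumed
# definition for illustration, not necessarily the article's exact one).
def frame_time_variance(frame_times_ms, percentile=95):
    deltas = sorted(abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:]))
    idx = min(len(deltas) - 1, int(len(deltas) * percentile / 100))
    return deltas[idx]

smooth = [16.7, 16.8, 16.6, 16.7, 16.9]    # consistent pacing
stutter = [16.7, 16.8, 40.0, 16.6, 16.7]   # one hitch mid-run

print(frame_time_variance(smooth))   # small value: even pacing
print(frame_time_variance(stutter))  # large value: the hitch shows up
```

Two runs can have identical average FPS yet very different variance numbers, which is why the metric catches stutter that averages hide.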

Big frame time variance spikes point to our biggest problem with the Thief benchmark: severe stuttering. As with Battlefield 4, the experience in Thief simply isn’t acceptable. We confirmed the issue by playing the actual game, where we encountered the same stuttering.
I can't believe the reviewer just shrugged off the fact that the games obviously look CPU-limited by saying "well, we had the fastest CPU you can get," when they could have used Mantle in BF4 to lessen CPU usage.
For that to happen, IMO, the time from one GPU release to the next would have to be so long that users needed more than two high-end GPUs to handle games in the meantime.
As it is, there's really no gaming setup that can't be reasonably managed by a pair of high-end graphics cards (Crysis back in 2007 is the only example I can think of where that wasn't the case). Three or four cards will always just be for people chasing crazy benchmark scores.
But to say one company has the other cornered is a bit biased. Not a bit, just straight-up biased. I like both companies; they are both doing great, IMO.
After my last burn with SLI GTX 295s, I will never go back to quad SLI. I'm still having trouble leaving my SLI GTX 680s at 1300 MHz core / 7 GHz RAM. Then again, I'm still at 1080p, like 99% of gamers.
4K isn't ready until the refresh rate is bumped from 60 Hz to 120 Hz and HDMI standards improve.
I think Tom went mad trying to catch Jerry...