GeForce GTX 760 Review: GK104 Shows Up (And Off) At $250
With its last graphics card introduction until the end of Fall, Nvidia isn't trying to impress anyone with groundbreaking performance. Rather, the company is pulling better-than-GeForce GTX 660 Ti frame rates down to a $250 price point, creating value.
Test Setup And Benchmarks
Test Hardware | |
---|---|
Processor | Intel Core i7-3770K (Ivy Bridge), 3.5 GHz base clock overclocked to 4.0 GHz (40 * 100 MHz), LGA 1155, 8 MB Shared L3, Hyper-Threading enabled, Power-savings enabled |
Motherboard | Gigabyte Z77X-UD5H (LGA 1155), Z77 Express Chipset, BIOS F15q |
Memory | G.Skill 16 GB (4 x 4 GB) DDR3-1600, F3-12800CL9Q2-32GBZL @ 9-9-9-24 and 1.5 V |
Hard Drive | Crucial m4 SSD, 256 GB, SATA 6Gb/s |
Graphics | Nvidia GeForce GTX 760 2 GB |
 | Nvidia GeForce GTX 770 2 GB |
 | Nvidia GeForce GTX 660 Ti 2 GB |
 | Nvidia GeForce GTX 660 2 GB |
 | Nvidia GeForce GTX 670 2 GB |
 | AMD Radeon HD 7950 with Boost 3 GB |
 | AMD Radeon HD 7950 3 GB |
Power Supply | Cooler Master UCP-1000 W |
System Software And Drivers | |
Operating System | Windows 8 Professional 64-bit |
DirectX | DirectX 11 |
Graphics Drivers | AMD Catalyst 13.6 (Beta 2) |
 | Nvidia GeForce Release 320.39 |
Getting Frame Time Variance Right
Astute readers will notice that the numbers on the following page (and those thereafter) are quite a bit more conservative than the corresponding page in my Radeon HD 7990 review, and there is a reason for this. We were previously reporting both raw and real-world frame rates, but then showing you frame time variance data with runt and dropped frames still included. The thing is, if runts and drops aren't what you actually experience on-screen, it isn't fair to point to the raw frame time latencies and hammer AMD on them.

This is why we're now giving you the more practical frame rates over time, along with frame time variance numbers that match. The outcome is far less exaggerated, though still very telling in the games where AMD struggles.
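For readers curious what that filtering implies in practice, below is a minimal Python sketch of the idea: discard runt and dropped frames from per-frame capture data, then derive the practical frame rate and a simple consecutive-frame variance figure. To be clear, this is not the FCAT toolchain used to generate the charts; the field names, the 21-scanline runt threshold, and the percentile-based variance metric are assumptions made purely for illustration.

```python
# Minimal sketch (not the actual FCAT toolchain): filter out runt and
# dropped frames from per-frame capture data, then report a "practical"
# frame rate and a simple frame-to-frame variance figure. Field names and
# the 21-scanline runt threshold are illustrative assumptions only.

RUNT_SCANLINE_THRESHOLD = 21  # frames covering fewer scanlines count as runts


def practical_frame_times(frames):
    """frames: list of dicts like {"time_ms": float, "scanlines": int, "dropped": bool}.
    Returns frame times (ms) of frames the viewer actually perceives.
    Note: a real pipeline would fold a runt's display time into the adjacent
    frame; for brevity we simply discard it here."""
    return [f["time_ms"] for f in frames
            if not f["dropped"] and f["scanlines"] >= RUNT_SCANLINE_THRESHOLD]


def average_fps(frame_times_ms):
    """Average frame rate over the run, computed from perceived frame times."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)


def frame_time_variance(frame_times_ms, percentile=95):
    """Sort the consecutive frame-to-frame time differences and report the
    given percentile as a rough 'worst typical stutter' number."""
    deltas = sorted(abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:]))
    if not deltas:
        return 0.0
    index = min(len(deltas) - 1, int(len(deltas) * percentile / 100))
    return deltas[index]


# Example: a short capture containing one runt and one dropped frame.
capture = [
    {"time_ms": 16.7, "scanlines": 1080, "dropped": False},
    {"time_ms": 17.1, "scanlines": 1080, "dropped": False},
    {"time_ms": 0.4,  "scanlines": 8,    "dropped": False},  # runt, filtered out
    {"time_ms": 18.0, "scanlines": 1080, "dropped": True},   # dropped, filtered out
    {"time_ms": 16.9, "scanlines": 1080, "dropped": False},
]

times = practical_frame_times(capture)
print(f"{average_fps(times):.1f} FPS, {frame_time_variance(times):.2f} ms variance")
```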
Benchmarks And Settings | |
---|---|
Battlefield 3 | Ultra Quality Preset, v-sync off, 1920x1080, DirectX 11, Going Hunting, 90-Second playback, FCAT |
Far Cry 3 | Very High Quality Preset, DirectX 11, 2x MSAA, v-sync off, 1920x1080, Custom Run-Through, 50-Second playback, FCAT |
Borderlands 2 | Highest-Quality Settings, PhysX Low, 16x Anisotropic Filtering, 1920x1080, Custom Run-Through, FCAT |
Metro: Last Light | High Quality Preset, 16x Anisotropic Filtering, Normal Motion Blur, DirectX 11, 1920x1080, Built-In Benchmark, Scene D6, FCAT |
The Elder Scrolls V: Skyrim | Ultra Quality Preset, FXAA Disabled, 1920x1080, Custom Run-Through, 25-Second playback, FCAT |
BioShock Infinite | Ultra Quality Settings, DirectX 11, Diffusion Depth of Field, 1920x1080, Built-in Benchmark Sequence, FCAT |
Crysis 3 | High System Spec, MSAA: Low (2x), High Texture Resolution, 1920x1080, Custom Run-Through, 60-Second Sequence, FCAT |
Tomb Raider | Ultimate Quality Preset, FXAA Enabled, 16x Anisotropic Filtering, TressFX Hair, 1920x1080, Custom Run-Through, 45-Second Sequence, FCAT |
SiliconWars: This doesn't look faster than the 7950 Boost to me. Maybe you should check your scores and update your conclusion to reflect reality?

pauldh, quoting 11035777: "This doesn't look faster than the 7950 Boost to me. Maybe you should check your scores and update your conclusion to reflect reality?"

Re-read the conclusion in question below. He doesn't say it is faster; he says this card will replace Don's recommendation for best $250 card and displace the 7950 Boost, i.e., Don won't keep recommending a $300 card that trades blows with, or barely beats, a $250 card. If both were to end up at $250, things change.

Quote: "A quick reference to Best Graphics Cards For The Money: June 2013 shows that Don is currently recommending the Tahiti-based Radeon HD 7870 for $250. With almost certainty, the GeForce GTX 760 will take that honor next month, displacing the Radeon HD 7950 with Boost at $300 in the process."

mapesdhs: Chris, what is it about the GTX 580 that makes it so slow in the CUDA Fluidmark test, given it does so well in the other CUDA tests, especially iRay and Blender?

Btw, I don't suppose you could include 580 SLI results for the game tests? ;) Or do you have just the one 580?

My only gripe with the 760 is the misuse of a model number that lets one infer it should be quicker than older cards with 'lesser' names (660, etc.) when in fact it's often slower. I really wish NVIDIA would stop releasing products that exhibit such enormous performance overlap. Given the evolutionary nature of GPUs, and the time that has passed since the 600s launched, one might reasonably expect a 760 to beat the 670 too, but it never does. To me, the price drop is the only thing it has going for it. The endless meddling with shader counts, clocks, bus widths, etc., creates an utter muddle of performance depending on the game. One really has to judge based on the individual game rather than any general product description or spec summary.

I just hope Skyrim players with 660s don't upgrade on the assumption that newer model names mean better performance, but I expect some will.

Ian.

tomfreak: The GTX 760 is an upgrade for GTX 460/560 users, and despite that, you didn't throw those cards into the benchmarks. Seriously?

Novuake: Nice review as per usual, Chris.

Amazing performance at $250. The 256-bit memory interface does wonders for GK104.

Now I am wondering if there will even be a GTX 760 Ti. While there is a large enough gap in the product stack, I have a feeling there may not be a "Ti" version. Anyone know more?

sarinaide: AMD will have to release a new interim Radeon series; the existing family is too outdated to be stretched much longer.

horaciopz: So, maybe there will be a GTX 760 Ti for about 300 bucks with the performance of a GTX 670... huh? Nvidia really should. This recalls the GTX 400 and 500 series... Nvidia is doing it all over again.