Watch Dogs is one of the most anticipated games of 2014. So, we're testing it across a range of CPUs and graphics cards. By the end of today's story, you'll know what you need for playable performance. Spoiler: this game is surprisingly demanding!
AMD's Hawaii GPU makes its workstation debut as the FirePro W9100. Does this $4000 card have what it takes to displace Nvidia's Quadro K6000, or is it a more conservative performer? We throw an exhaustive benchmark suite at it to find out.
Two years and two GPU generations have passed since the last major update to our famous graphics card performance charts. It's time to get them back up to speed with modern benchmarks, new measurement equipment, and fresh methodology.
We spent our weekend benchmarking the sharp-looking iBuyPower Erebus loaded with a pair of Radeon R9 295X2 graphics cards. Do the new boards fare better than the quad-GPU configurations we've tested before, or should you stick to fewer cards in CrossFire?
Judging from the R9 290X Lightning's hefty build, it takes a lot of metal to cool the Hawaii GPU properly. But what does this massive card give you aside from sharp looks? How about impressive acoustics? Does it deliver a premium experience worthy of its $750 price tag?
SPECviewperf 12 sets out to be the standard for evaluating workstation graphics cards by including the latest professional applications, more complex models, and synthetic workloads pulled from important market segments. We test 19 cards in the new suite.
“Do you have what it takes?” AMD asks, presumably referring to the big budget and beefy power supply you need before buying its new Radeon R9 295X2. We benchmark the 500 W, dual-GPU beast against several other high-end configs before declaring a winner.
We're not particularly fond of AMD's reference Radeon R9 270-series cooling solution. Fortunately, most of the company's board partners have their own heat sinks and fans. We take 10 cards and measure their clock rates, thermals, and acoustics.
Liquid cooling solves the thermal challenges presented by AMD's Hawaii GPU much more elegantly than a big heat sink and loud fan. But the requisite parts also add cost. Does VisionTek's CryoVenom R9 290 deliver maximum performance at a fair price?
Pricing between its Radeon R7 260X and R9 270 is all over the map, and AMD wants to plug that gap. To that end, it's introducing a Curaçao-based Radeon R7 265 with better-than-Radeon HD 7850 performance at $150. Will that be enough to stave off Maxwell?
The name might be new, but we're already intimately familiar with AMD's Radeon R7 250X (formerly known as the Radeon HD 7770). Can AMD take an old piece of hardware and turn it into something you want to spend money on in 2014? Let's have a quick look...
AMD announced its Radeon R7 260 in December of last year, and we were excited about a $110 Radeon HD 7770 replacement. Almost two months later, one model is available on Newegg for $140. Today, we're testing the card and pondering its curious position.
We like the idea of two GK104 GPUs in SLI on one graphics card. Sounds like a GeForce GTX 690, right? Except that board costs $1000 and Asus' Mars 760 sells for $650. In a world with sub-$700 GeForce GTX 780 Tis, can this dual-GPU stunner still impress?
Now that AMD's Radeon R7 240 and 250 are here, we want to know a little more about what the sub-$100 market looks like. Can the new Oland-based boards serve up playable performance in the latest titles, or are there other hidden gems to discover?
We're in the process of testing Radeon R9 290X cards from AMD's board partners, and we wondered how they all fare in a closed chassis. Corsair's deluxe Obsidian 900D offers more airflow than most builders enjoy, so we dusted off a more mainstream $80 case to test with instead.
Back when the GeForce GTX 780 sold for $650, slinging a bunch of 760s together looked like a great deal. Now that the 780's down to $500, is there still value in going three-way SLI with GK104, or are you better served by a pair of GK110-based 780 cards?
You've long faced this dilemma: disable V-sync and live with image tearing, or enable it and tolerate annoying stutter and lag. Nvidia promises to make that choice obsolete with a variable refresh rate technology we're previewing today.