Does it come as any surprise that a second graphics card sporting AMD’s Hawaii GPU, lightly altered, appears identical to the Radeon R9 290X? Given the lack of evolution that went into 290X’s thermal solution, we wholly expected 290 to be indistinguishable. Today’s description gets a whole lot easier as a result.
In short, this is the same 11-inch-long, dual-slot board with a 75 mm centrifugal fan.
Its top edge prominently features the same eight- and six-pin auxiliary power connectors, and a distinct lack of CrossFire connectors. To that point, Radeon R9 290 benefits from the xDMA engine built into Hawaii’s on-die compositing block. Right out of the box, two of these boards support CrossFire configurations with frame pacing enabled at Ultra HD and multi-screen resolutions. What they don’t yet support is frame pacing in DirectX 9 games like Skyrim or OpenGL-based titles. AMD still claims that the beta driver adding that capability will be available before the end of 2013.
Display output connectivity is the same, too. Modified from my 290X coverage:
The R9 290 card we received has two dual-link DVI ports, a full-sized HDMI output, and one DisplayPort connector. Its Hawaii GPU features an updated display controller, though, which includes a third independent timing generator. So, although the board comes equipped with one fewer display output than the R9 280X we recently reviewed, you can actually hook up six screens operating at different resolutions and timings to the R9 290 using an MST hub.
Hawaii’s new display controller will also enable the 600 MHz pixel rates needed to support upcoming single-stream Ultra HD displays at 60 Hz. Currently, the only way to drive a 4K screen at that refresh rate is through two HDMI ports or one DisplayPort 1.2 output with MST support. These correspond to a pair of 1920x2160 tiles that come together as a 2x1 Eyefinity array. Next-generation scalers will make 3840x2160p60 possible without tiling; they’ll simply require higher pixel clocks. Radeon R9 290 can do it for sure, but AMD isn’t certain whether its older display controllers will.
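To put that 600 MHz figure in context, here's a back-of-the-envelope pixel clock calculation. The blanking totals below assume VESA-style reduced-blanking timings (our assumption for illustration, not AMD's published numbers):

```python
# Rough pixel clock needed for single-stream 3840x2160 at 60 Hz.
# Total (active + blanking) dimensions assume reduced-blanking
# timings of 4000 x 2222 -- an assumption, not an official figure.
h_active, v_active, refresh_hz = 3840, 2160, 60
h_total, v_total = 4000, 2222  # active pixels plus blanking intervals

pixel_clock_hz = h_total * v_total * refresh_hz
print(f"{pixel_clock_hz / 1e6:.2f} MHz")  # prints "533.28 MHz"
```

At roughly 533 MHz, a single-stream 4K60 signal fits comfortably under the 600 MHz ceiling Hawaii's display controller supports, which is why tiling becomes unnecessary once displays with faster scalers arrive.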
We'll go into more detail in the pages that follow, but it's also worth noting that AMD claims that Radeon R9 290 bears the same 250 W typical board power as the 290X. That was a conservative estimate for the 290X, and the same likely goes for 290, too. Suitably, AMD also arms this board with one eight- and one six-pin power connector.
- Digging Deeper Into Hawaii’s Behavior
- Sidebar: Variability Turns Into A Graphics Card Crapshoot
- Meet The Radeon R9 290
- Test Setup And Benchmarks
- Results: Arma III
- Results: Battlefield 4
- Results: BioShock Infinite
- Results: Crysis 3
- Results: Metro: Last Light
- Results: The Elder Scrolls V: Skyrim
- Results: Tomb Raider
- Results (DirectX): AutoCAD 2013 And Inventor
- Results (OpenGL): LightWave And Maya 2013
- Results (OpenCL): GPGPU Benchmarks
- Gaming Power Consumption Details
- Detailed Gaming Efficiency Results
- Power Consumption Overview
- Noise And Video Comparison
- Do-It-Yourself Upgrade With Arctic's Accelero Xtreme III
- Radeon R9 290: Priced Right Where We’d Peg It




http://techreport.com/review/25602/amd-radeon-r9-290-graphics-card-reviewed/9
Chris, these results differ drastically from the real-world results reported by 290X owners at OCN... I understand your observations are anecdotal and based on a very small sample size, but would you mind looking into this further? Putting such a statement in bold in the conclusion, even though it contradicts owners' real-world experiences, just gives the uninformed reader a false impression...
The above claim has already escalated further than it should... A Swiss site has already rebutted it by testing their own press sample against a retail model, concluding the following:
In quiet mode, where the dynamic frequencies have to work overtime, the situation becomes slightly murkier. A minor performance difference can be seen in some titles, and even if the variations aren't considerable, the trend is clear. In the end, the average variance amounts to only a few percent, i.e., nothing extreme. The reason may be slightly poorer contact with the cooler, or simply a change in ambient temperature.
Now to wait for the non-reference cards at the end of the month!
It looks like a good card for the price, as it even keeps up with the GTX 780 that costs $100 more. This is good, as Nvidia may drop prices even further, which means we could also see a price drop on the 290X, and I wouldn't mind a new 290X Toxic for under $500.
Best to wait a month or two before buying to see how this all goes down.
Some people who need CUDA for work and a GPU for gaming will still get 780s, but no one will pay a $150 premium for the 290X just to get a couple more FPS over the 290. AMD just shot themselves in the foot before hurting Nvidia.
Nvidia did a very good job with its reference cooler (but you really pay for it)... Do you think AMD couldn't have pulled off a "monster" cooler? Is it really hard to make a good cooler? No, it's just expensive.
You could do this, you have your sources.
Strange thing, and I know some of us are going through this: I was thinking of getting a 280X on Black Friday/Cyber Monday, but the price tag is leaving me with something to think about. I think I'm just going to save up a few more pennies to get something that, a month ago, I thought was out of my price range ($300-450): a $650+ card.