
PowerColor Devil 13 Dual Core R9 290X 8 GB Review: Dual Hawaii on Air

PowerColor’s Devil 13 graphics card, with its two Hawaii GPUs and massive heat sink, weighs in at more than two kilograms and exudes luxury. But can it compete with AMD’s dual-GPU reference design with closed-loop water cooling? Let’s find out!

PowerColor Devil 13 Dual Core R9 290X 8 GB Review: Dual Hawaii on Air : Read more
  1. Looking at page 4, that screen is insane!! Love it!
  2. Was this card staying at 1000 MHz during those benchmark runs?
    The FPS difference at 1080p is ~7%, yet the clock difference is less than 2%.

    Also,
    Did you find the Devil's maximum stable overclock? With all that power available, I'd imagine this thing could achieve better overclocks than the 295X2.
  3. OC makes no sense, because the card will be really loud. And please read the review carefully! The performance difference between the two cards reflects the difference in power consumption almost 1:1! To make this air cooling workable, the PowerColor card uses a lower power target. Since AMD's PowerTune and Nvidia's Boost, the nominal core clock rate alone says nothing about the final performance! To my eyes, this is also a good case study of the limits of an air cooler.

    Take a look at the page with the high-resolution power draw. This card isn't a perpetual-motion machine: less power consumption = less gaming performance. OC gains you practically nothing. OK, you can destroy your ears... (or the card). We had to handle this rare card very carefully, so I was not able to break the voltage barrier.
  4. FormatC said:
    OC makes no sense, because the card will be really loud. And please read the review carefully! The performance difference between the two cards reflects the difference in power consumption almost 1:1! To make this air cooling workable, the PowerColor card uses a lower power target. Since AMD's PowerTune and Nvidia's Boost, the nominal core clock rate alone says nothing about the final performance!

    Take a look at the page with the high-resolution power draw. This card isn't a perpetual-motion machine: less power consumption = less gaming performance. OC gains you practically nothing. OK, you can destroy your ears...


    "Destroy my ears"?
    "OC makes no sense because the card will be really loud"? Are you serious??????
    Since when has that stopped anyone?
    This thing is quiet compared to high end cards from 5+ years ago....

    Have we become so spoiled by the advances in technology, that has enabled higher performance at lower noise levels, that we will not push the limits in fear of a little noise!?


    It's an ultra-high-end graphics card made for the kind of people who like to push the limits. It should absolutely be overclocked and benchmarked, with a big, fat, mind-blowing power usage chart showing figures higher than any card has ever pushed!

    Also, clock rates still directly correlate with performance. Lowering the PowerTune limit will limit clock rates, and raising it will raise them. Lower clock rates equal lower performance, but lower power does not always equal lower performance.
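
    To make the PowerTune point concrete, here's a minimal Python sketch of a boost limiter (invented constants, not measured data): the card chases its boost clock but backs off whenever estimated board power would exceed the power target, so two cards with identical rated clocks but different targets sustain different speeds.

        # Toy model of a PowerTune-style limiter (hypothetical numbers).
        def sustained_clock(power_target_w, boost_mhz=1018, base_mhz=727,
                            watts_per_mhz=0.45):
            # Assume board power rises roughly linearly with clock here.
            power_limited_mhz = power_target_w / watts_per_mhz
            return min(boost_mhz, max(base_mhz, power_limited_mhz))

        for target_w in (500, 450, 400):  # purely illustrative targets
            print(target_w, "W ->", round(sustained_clock(target_w)), "MHz")
        # 500 W -> 1018 MHz, 450 W -> 1000 MHz, 400 W -> 889 MHz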
  5. EDIT: Typo Fixed :D
  6. I just got done reading three other reviews of this card, and each of them had it slightly above the 295X2 in their gaming benchmarks (despite the Devil's boost clock being slightly lower).
    That seems a little odd.

    The games, game settings, drivers, average clock rates, and BIOS mode used for the benchmark comparison are not listed in this review (unusual for Tom's). That information would be very helpful.
    I suspect that the 295X2 was maintaining a higher average clock rate in this comparison (more than 2% higher, anyway).
    PowerColor almost certainly made sure that this card meets or beats the performance of the 295X2 before sending it off for review.
  7. It's interesting to see this kind of enormous, powerful, noisy card being developed in a market where most games are designed to fit the limits of console hardware.
  8. Why have they called this a 290X? Rather confusing, it should be called a 295X.
    Having it listed in the 290X section on seller sites is dumb. Also, it's not an 8GB
    card, it's a 2x4GB card. I really wish tech sites would stop GPU vendors from
    getting away with this inaccurate product spec PR. Call it for what it is, 2x4GB,
    and if vendors don't like it, say tough cookies. The user will never see '8GB' so
    the phrase should not be used as if they could (though PowerColor seems happy
    to have such misleading info on its product page). I'm assuming you agree with this
    Igor, because the table on p. 11 does refer to the Devil 13 as a 2x4GB... ;)

    Btw, checking a typical seller site here (UK), the cheapest 290X is 1040MHz core,
    so given the Devil 13 uses 3 slots anyway, IMO two factory oc'd 290Xs make more
    sense, and would save more than 300 UKP.

    Ian.

    PS. The typo Mac266 mentioned is still present.
  9. Why did the person who wrote this article focus primarily on power consumption and efficiency?

    This review has 4 pages of power consumption/efficiency data, with some impressively detailed information. But it only has 2 pages of actual performance data, with almost no details at all.

    Who wouldn't want to see this card overclocked to a ridiculous extent, with plumes of smoke coming off of it, and power consumption figures showing it drawing more power than any other card ever made?


    (I had to edit this comment. That first revision was a little crazy.)
  10. jlwtech said:
    The primary focus of this review is absurd!
    Why did the person who wrote this article focus primarily on power consumption and efficiency?

    This review has 4 pages of power consumption/efficiency data, with some impressively detailed information. But it only has 2 pages of actual performance data, with almost no details at all.



    The performance is so close to that of the liquid-cooled R9 295X2 that it would basically be a repeat of that review. If you want an idea of its performance, just re-read that review and knock roughly 3% off each frame rate.

    OT: 60 dB? Into the trash it goes. I don't care how expensive a card is; if it's so loud that I can't have a goddamn normal conversation near my computer, it's going in the skip. I don't want to surround myself in an anti-social bubble of noise-induced hearing loss every time I want to game.
  11. bemused_fred said:
    jlwtech said:
    The primary focus of this review is absurd!
    Why did the person who wrote this article focus primarily on power consumption and efficiency?

    This review has 4 pages of power consumption/efficiency data, with some impressively detailed information. But it only has 2 pages of actual performance data, with almost no details at all.



    The performance is so close to that of the liquid-cooled R9 295X2 that it would basically be a repeat of that review. If you want an idea of its performance, just re-read that review and knock roughly 3% off each frame rate.

    OT: 60 dB? Into the trash it goes. I don't care how expensive a card is; if it's so loud that I can't have a goddamn normal conversation near my computer, it's going in the skip. I don't want to surround myself in an anti-social bubble of noise-induced hearing loss every time I want to game.



    So what if the performance is close? Every other review Tom's has done of aftermarket flagship cards has been performance-based; why would that be a reason to stop now?
    This is the first custom version of the fastest graphics card on earth; why wouldn't we want to see a performance-based review?
    Why would anyone want to see power-efficiency numbers as opposed to performance numbers on a card like this? Does that really matter to the kind of person who would actually buy a $1500 graphics card? Of course not.

    "anti-social bubble"? That's pretty thin...
    Do you tend to make conversation with people in the room while gaming at full tilt? Sounds kinda distracting...
    5+ years ago, 60 dB+ was par for the course for any high-end card like this: GTX 480, GTX 295, HD 6990, HD 5970, HD 4870 X2; the list goes on and on....
    Besides, 60 dB is on an open test bench at close range. Real-world usage will be much quieter. Check some other reviews to see more realistic figures.

    Everyone is getting spoiled these days.....
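
    On the distance point, a rough free-field estimate (a minimal sketch assuming a point source and no reflections, which real rooms and cases will violate): sound pressure level falls about 6 dB per doubling of distance.

        import math

        def spl_at(spl_ref_db, d_ref_m, d_m):
            # Inverse-square falloff: SPL2 = SPL1 - 20*log10(d2/d1).
            return spl_ref_db - 20 * math.log10(d_m / d_ref_m)

        # Hypothetical: 60 dB measured at 0.5 m on an open bench.
        print(round(spl_at(60, 0.5, 1.0)))  # ~54 dB at 1 m
        print(round(spl_at(60, 0.5, 2.0)))  # ~48 dB at 2 m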
  12. Quote:
    OT: 60 dB? Into the trash it goes.

    Would you really throw a $1500 card in the trash, simply because it was loud?
    I'd just wear headphones....
  13. Quote:

    "Destroy my ears"?
    "OC makes no sense because the card will be really loud"? Are you serious??????
    Since when has that stopped anyone?
    This thing is quiet compared to high end cards from 5+ years ago....

    Have we become so spoiled by the advances in technology, that has enabled higher performance at lower noise levels, that we will not push the limits in fear of a little noise!?


    It's an ultra-high-end graphics card made for the kind of people who like to push the limits. It should absolutely be overclocked and benchmarked, with a big, fat, mind-blowing power usage chart showing figures higher than any card has ever pushed!

    Also, clock rates still directly correlate with performance. Lowering the PowerTune limit will limit clock rates, and raising it will raise them. Lower clock rates equal lower performance, but lower power does not always equal lower performance.


    You seem to have missed the part where he said, "We had to handle this rare card very carefully, so I was not able to break the voltage barrier." Hence no overclocking.

    Quote:
    Also, clock rates still directly correlate with performance.


    You've not tested many cards, I can see. Clock rates have diminishing returns once you pass a certain point, and that point varies from GPU to GPU, even between individual cards. Past that point, the ridiculous amount of extra noise isn't worth a 1-5% gain, not to mention the instability that would follow from a lack of reliable voltage.
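
    A minimal sketch of that diminishing-returns argument (invented numbers): frame rate tracks core clock only until some other limit, say memory bandwidth, takes over, after which extra clock buys almost nothing.

        def fps(core_mhz, other_bound_fps=95.0, fps_per_mhz=0.09):
            # Core clock helps only until another bottleneck caps the frame rate.
            return min(core_mhz * fps_per_mhz, other_bound_fps)

        for mhz in (950, 1000, 1050, 1100, 1150):
            print(mhz, "MHz ->", round(fps(mhz), 1), "fps")
        # 85.5, 90.0, 94.5, 95.0, 95.0 - past ~1055 MHz the curve flatlines.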
  14. mapesdhs said:

    Why have they called this a 290X? Rather confusing, it should be called a 295X.
    Having it listed in the 290X section on seller sites is dumb. Also, it's not an 8GB
    card, it's a 2x4GB card. I really wish tech sites would stop GPU vendors from
    getting away with this inaccurate product spec PR. Call it for what it is, 2x4GB,
    and if vendors don't like it, say tough cookies. The user will never see '8GB' so
    the phrase should not be used as if they could (though PowerColor seems happy
    to have such misleading info on its product page). I'm assuming you agree with this
    Igor, because the table on p. 11 does refer to the Devil 13 as a 2x4GB... ;)

    Btw, checking a typical seller site here (UK), the cheapest 290X is 1040MHz core,
    so given the Devil 13 uses 3 slots anyway, IMO two factory oc'd 290Xs make more
    sense, and would save more than 300 UKP.

    Ian.

    PS. The typo Mac266 mentioned is still present.



    Legally they're not allowed to. AMD wants that moniker reserved for its reference model.
  15. LOL, the Titan Z also uses 400 W, just like this 3-slot 295X2. Kepler is so much more power-efficient. /s
  16. Quote:

    "anti-social bubble"? That's pretty thin...
    Do you tend to make conversation with people in the room, while gaming at full tilt? Sounds kinda distracting...
    5+ years ago, 60DB+ was par-for-the-course for any high-end card like this. GTX480. GTX295. HD6990. HD5970. HD4870x2, the list goes on and on....
    Besides, 60DB is on an open test bench at close range. Real-world usage will be much quieter. Check some other reviews to see more realistic figures.

    Everyone is getting spoiled these days.....


    60 dB is too loud when quieter options are available for the same price or less. Most people get around 58 dB from the stock 290X blower when it hits its "über" mode at 55% fan speed or whatever:

    http://www.youtube.com/watch?v=QQhqOKKAq7o - Uhhh no thx.
  17. jlwtech said:
    The primary focus of this review is absurd!
    Why did the person who wrote this article focus primarily on power consumption and efficiency?

    This review has 4 pages of power consumption/efficiency data, with some impressively detailed information. But it only has 2 pages of actual performance data, with almost no details at all.


    Read the review!
    Quote:
    We're using the 2014 VGA Charts database for comparative benchmarking. This gives us a great basis for creating an index with all games and tests taken into consideration. If you want the individual test results, check out the charts section itself, where you'll find all of the individual numbers composing the index and have the option to create comparison tables.


    For anyone who missed it, here is the link again separately:
    http://www.tomshardware.com/charts/2014-vga-charts/benchmarks,175.html
  18. jlwtech said:
    I just got done reading three other reviews of this card, and each of them had it slightly above the 295X2 in their gaming benchmarks (despite the Devil's boost clock being slightly lower).
    That seems a little odd.

    The games, game settings, drivers, average clock rates, and BIOS mode used for the benchmark comparison are not listed in this review (unusual for Tom's). That information would be very helpful.
    I suspect that the 295X2 was maintaining a higher average clock rate in this comparison (more than 2% higher, anyway).
    PowerColor almost certainly made sure that this card meets or beats the performance of the 295X2 before sending it off for review.



    I haven't seen a review yet where this card outperforms the R9 295X2. In every review where this card goes into a case, it thermal-throttles pretty badly; the only time it doesn't seem to throttle is on an open bench.

    I think everyone is missing the point of this card. It isn't meant to be used with the stock cooler. With four 8-pin power connectors, this card is meant to be put under a water block and then overclocked to the moon. It was designed to shove as much power as you can into it; the air cooler is an afterthought.

    They likely saved more money sticking the air cooler on it, and bundling that mouse, than a stock water loop would have cost them.

    What they need to do is review this card the way it was intended to be used: on a water block.
  19. Quote:
    What they need to do is review this card the way it was intended to be used: on a water block.

    Prohibited by the vendor. After the third reviewer has disassembled, stressed, and "repaired" the card, this piece is simply waste. OC with this hoover? Beware...

    And for the others: the card is not able to hold its stock clock rates stable without increasing the fan RPM to a horrible level. Have fun with that orchestra. My conclusion was short and simple: this cooler is the limit.
  20. At my house in Russia, I use this to stay warm in winter and to scare away brown bears.
  21. This thing sounds like a million bees. Be careful with the bears :D
  22. FormatC said:
    Quote:
    What they need to do is review this card the way it was intended to be used: on a water block.

    Prohibited by the vendor. After the third reviewer has disassembled, stressed, and "repaired" the card, this piece is simply waste. OC with this hoover? Beware...

    And for the others: the card is not able to hold its stock clock rates stable without increasing the fan RPM to a horrible level. Have fun with that orchestra. My conclusion was short and simple: this cooler is the limit.


    Gotcha. I thought it was something like that. Shame you can't talk PowerColor into letting you test this without that hair dryer attached, because the only thing about this card that stands out is the incredible power delivery it allows... which is just crying out for a water block.

    Good review. Shame this card didn't just come with a custom water block.

    Had you stuck it in a case, I'm sure you'd have had the same issue everyone else seems to have when they stick it into a case... thermal throttling.
  23. I hate this sample rotation - too bad. :(
  24. Am I the only one who thinks those temps at full load in performance mode are crazy low? I mean, a dual-Hawaii card, air-cooled, that's getting around 70°C at full load? That's crazy.
  25. anthony8989 writes:
    > Legally they're not allowed to. AMD wants that moniker reserved for its
    > reference model.

    Then they should call it something else, but certainly not 290X, which is the
    standard name for the single-GPU card. I had hoped we'd be past the insane
    GPU naming madness by now, but it seems not.

    Ian.
  26. Quote:
    Have you even tried to read other reviews?

    Take a look at our charts :D This link shows a purely synthetic benchmark (try playing it), and it's the only one where the PowerColor card is measurably faster for me. The rest is more or less equal and depends on the chip quality of each sample! I'm using my own retail R9 295X2, and that card is really faster than the first media sample AMD sent us!

    I can OC the 295X2 above 1.15 GHz without problems using the right PSU (be quiet! Dark Power Pro P10 1200W). The main problem is not the current as such, but the high frequencies. You need a PSU with good cables and larger caps inside. I had a problem with a 1200W Enermax; that unit simply switched off. Interestingly, due to undervoltage. That shows us that the CWT platform inside the Platimax is real crap: the caps are too small, because the card's current peaks arrive faster than the PSU can refill them. The Chroma wasn't able to reveal such problems, because even this expensive tool is too slow for current graphics cards. Such fast peaks can't be simulated, only seen with a real card. :D
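
    The cap complaint is easy to sanity-check with the basic capacitor equation, dV = I * dt / C (a sketch with illustrative values, not the Platimax's actual specs): a fast load spike that arrives before the regulation loop responds sags the output in proportion to how small the caps are, and on a 12 V rail a sag past roughly the ATX -5% tolerance (~11.4 V) can trip undervoltage protection.

        # Voltage droop on output caps during a fast load spike: dV = I*dt/C.
        def droop_v(spike_amps, duration_s, capacitance_f):
            return spike_amps * duration_s / capacitance_f

        # Hypothetical 30 A spike lasting 100 microseconds:
        print(round(droop_v(30, 100e-6, 2200e-6), 2))  # ~1.36 V sag, 2200 uF
        print(round(droop_v(30, 100e-6, 6800e-6), 2))  # ~0.44 V sag, 6800 uF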
  27. anthony8989 said:
    Quote:

    "anti-social bubble"? That's pretty thin...
    Do you tend to make conversation with people in the room, while gaming at full tilt? Sounds kinda distracting...
    5+ years ago, 60DB+ was par-for-the-course for any high-end card like this. GTX480. GTX295. HD6990. HD5970. HD4870x2, the list goes on and on....
    Besides, 60DB is on an open test bench at close range. Real-world usage will be much quieter. Check some other reviews to see more realistic figures.

    Everyone is getting spoiled these days.....


    60 dB is too loud when quieter options are available for the same price or less. Most people get around 58 dB from the stock 290X blower when it hits its "über" mode at 55% fan speed or whatever:

    http://www.youtube.com/watch?v=QQhqOKKAq7o - Uhhh no thx.


    I agree. I'm not saying that noise shouldn't be taken into consideration when making a purchase, or that people should just "put up with it". I wouldn't buy it, and I wouldn't recommend buying it.
    My point was that loudness is no reason not to fully test a card's capabilities. This is a review, after all, and noise or not, I suspect most readers would still like to see what this card can do.
  28. anthony8989 said:
    Quote:

    "Destroy my ears"?
    "OC makes no sense because the card will be really loud"? Are you serious??????
    Since when has that stopped anyone?
    This thing is quiet compared to high end cards from 5+ years ago....

    Have we become so spoiled by the advances in technology, that has enabled higher performance at lower noise levels, that we will not push the limits in fear of a little noise!?


    It's an ultra-high-end graphics card made for the kind of people who like to push the limits. It should absolutely be overclocked and benchmarked, with a big, fat, mind-blowing power usage chart showing figures higher than any card has ever pushed!

    Also, clock rates still directly correlate with performance. Lowering the PowerTune limit will limit clock rates, and raising it will raise them. Lower clock rates equal lower performance, but lower power does not always equal lower performance.


    You seem to have missed the part where he said, "We had to handle this rare card very carefully, so I was not able to break the voltage barrier." Hence no overclocking.

    Quote:
    Also, clock rates still directly correlate with performance.


    You've not tested many cards, I can see. Clock rates have diminishing returns once you pass a certain point, and that point varies from GPU to GPU, even between individual cards. Past that point, the ridiculous amount of extra noise isn't worth a 1-5% gain, not to mention the instability that would follow from a lack of reliable voltage.



    To your first reply:
    I didn't miss that part.
    I understand it's a heavy, large, and cumbersome card, and I can understand that Tom's didn't want to break it or their equipment. That's why they didn't get to test it inside a case. But they still could have overclocked it on the bench. In fact, they did overclock it to 1018 MHz. Surely they could have gone further...


    To your second reply:
    So what are you trying to say here? That clock rates do not directly correlate with performance?
    It seems that your reply has very little to do with the statement you quoted.
    That comment was in reply to the following (inaccurate) statements made by FormatC:
    "Since AMD's PowerTune and Nvidia's Boost, the nominal core clock rate alone says nothing about the final performance!
    Less power consumption = less gaming performance. OC gains you practically nothing."

    Less power does not always equal less performance; it's not a given. Actual clock rates still dictate performance: not the clock rates stated on the box, mind you, but the actual clock rate of the card while performance is being measured. There is a direct correlation.
    Noise, stability, and diminishing returns are a different matter entirely.
  29. ingtar33 said:
    jlwtech said:
    I just got done reading three other reviews of this card, and each of them had it slightly above the 295X2 in their gaming benchmarks (despite the Devil's boost clock being slightly lower).
    That seems a little odd.

    The games, game settings, drivers, average clock rates, and BIOS mode used for the benchmark comparison are not listed in this review (unusual for Tom's). That information would be very helpful.
    I suspect that the 295X2 was maintaining a higher average clock rate in this comparison (more than 2% higher, anyway).
    PowerColor almost certainly made sure that this card meets or beats the performance of the 295X2 before sending it off for review.



    I haven't seen a review yet where this card outperforms the R9 295X2. In every review where this card goes into a case, it thermal-throttles pretty badly; the only time it doesn't seem to throttle is on an open bench.

    I think everyone is missing the point of this card. It isn't meant to be used with the stock cooler. With four 8-pin power connectors, this card is meant to be put under a water block and then overclocked to the moon. It was designed to shove as much power as you can into it; the air cooler is an afterthought.

    They likely saved more money sticking the air cooler on it, and bundling that mouse, than a stock water loop would have cost them.

    What they need to do is review this card the way it was intended to be used: on a water block.



    Here is a pair of reviews that show the Devil beating the 295X2:
    http://www.guru3d.com/articles_pages/powercolor_radeon_290x_295x2_devil13_review,23.html

    http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/66784-powercolor-devil-13-r9-290x-dual-core-review-7.html

    There are more out there. All you have to do is look.
  30. FormatC said:
    jlwtech said:
    The primary focus of this review is absurd!
    Why did the person who wrote this article focus primarily on power consumption and efficiency?

    This review has 4 pages of power consumption/efficiency data, with some impressively detailed information. But it only has 2 pages of actual performance data, with almost no details at all.


    Read the review!
    Quote:
    We're using the 2014 VGA Charts database for comparative benchmarking. This gives us a great basis for creating an index with all games and tests taken into consideration. If you want the individual test results, check out the charts section itself, where you'll find all of the individual numbers composing the index and have the option to create comparison tables.


    For anyone who missed it, here is the link again separately:
    http://www.tomshardware.com/charts/2014-vga-charts/benchmarks,175.html



    I did read the review. Several times.
    Yes, there is a lot more info in the 2014 VGA Database. Why not put it all in the review, instead of giving the readers homework?
    Also, the 2014 VGA Database still doesn't provide enough details.
    What drivers were used for each card?
    Which BIOS mode was the Devil set to?
    What were the average and maximum clocks of each card during each benchmark? (If a card throttles, performance is severely impacted. That's critical information, especially when comparing one card to another.)
    Tom's had all this info (and more) in the 290X launch review; why not include it now? Why not always include it?
    There is a lot of inconsistency between reviews, and that makes it difficult to compare one review to another.
  31. For the record, the reason I am asking so many questions about the performance figures in this review, and how they were obtained, is this:

    The results are counterintuitive:
    The performance figures show the 295X2 being 6% faster than the Devil at 1080p, and 3% faster at 2160p.
    The Devil should have been ahead at 2160p, and no more than 2% behind at 1080p, if no throttling occurred.
    The review doesn't mention whether the Devil throttled, but it seems to predict these counterintuitive results from the power consumption/efficiency data. Since the two cards use totally different PCBs, power circuitry designs, and cooling methods, it's safe to assume that efficiency (per frame) will also differ.
    The PowerColor board has three fans to spin up and higher memory clocks to feed. I imagine it would use more power than the 295X2.

    The facts:
    Both cards have the same core clock.
    The 295x2 can boost up to 1018Mhz while the Devil is stuck at 1000Mhz. A 2% difference.
    The Devil's memory is clocked at 5400Mhz, while the 295x2 is 5000Mhz. An 8% difference.

    Estimated performance results:
    They should perform almost identically on an open test bench, with the Devil having a small advantage (3-5%) at higher resolutions due to its faster memory clock. The 295X2's slightly higher boost clock should equate to a less-than-2% performance advantage at lower resolutions.

    2% faster boost clock does not equal 6% better performance. There is some other contributing factor.
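
    The back-of-the-envelope math behind those estimates, as a quick Python check (assuming performance scales at most 1:1 with core clock and only partially with memory clock):

        boost_295x2, boost_devil = 1018, 1000
        core_delta = boost_295x2 / boost_devil - 1
        print(f"295X2 core clock advantage: {core_delta:.1%}")   # ~1.8%

        mem_devil, mem_295x2 = 5400, 5000
        mem_delta = mem_devil / mem_295x2 - 1
        print(f"Devil memory clock advantage: {mem_delta:.1%}")  # ~8.0%

        # A measured 6% gap at 1080p against a <=1.8% clock gap points to
        # throttling or a lower power target, not the rated clocks.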
  32. I want this card or the AMD reference card, but I will wait five years until it's $120.00. For you people with a huge bank account: enjoy this beast.
  33. Quote:
    2% faster boost clock does not equal 6% better performance. There is some other contributing factor.


    Every GPU is unique; each has its own chip quality! The performance difference between AMD's press sample and my retail card, for example, is above 5%! And I've measured that the Devil uses a lower power target. That means the same nominal clock rate overall, but a little less performance and less power consumption. The R9 295X2 allows higher short GPU voltage peaks than the Devil; take a look at the power consumption. I have detailed power draw data here at 10-microsecond intervals. For more detail, please read our Hawaii launch review and its coverage of PowerTune 2.0.

    On the other hand:
    I test every card, for every single benchmark, only after a longer warm-up period! Most sites benchmark cards only "as is". That is not objective and can make a difference of 3-5%! So the question is: which is better under these conditions? OK, the Devil 13 performs more or less well at the start, but its performance measurably decreases after 10-15 minutes. The water-cooled card keeps its performance in every case. Warmed-up cards are the real world; everything else is waste. And clock rates in charts? Unusable, because those average numbers aren't stable enough to reproduce exactly every time.

    The power consumption of the memory is too low in either case to make a difference between the two cards. Edit: just compared - the OCed memory of the 295X2 needs 0.3 watts more. That's a real joke, well within all tolerances :D

    Which needs more power: three smaller fans, or one large 12V fan on the radiator plus a pump?

    I wouldn't say that the power-supply section on the Devil's PCB is better than the one on the R9 295X2's PCB. I see only one big advantage: more connectors. But with a good PSU it ends up equal. The reference card's principle of handling all voltages over PCI-E is, from my point of view, a lot better. The Devil 13's idle power consumption is really worse, and the ZeroCore Power feature doesn't work properly on the Devil. It can't work, because it seems PowerColor's design doesn't support it.

    Gaming performance doesn't scale linearly with memory clocks; that's an urban legend. Only at 4K can you see an advantage; otherwise it's a marketing gimmick. I've also tried the R9 295X2 with higher memory clocks - at 1080p it's nearly useless.

    Quote:
    Why not put it all in the review, instead of giving the readers homework? Also, the 2014 VGA Database still doesn't provide enough details. What drivers were used for each card?
    Which BIOS mode was the Devil set to?


    To be honest: if someone needs more info, the charts are only one click away, and I don't think it's an advantage to put 20 more pages into a review. Whoever needs them can simply click; whoever doesn't gets to the conclusion a lot faster.

    The charts are more or less driver-independent. I have all the reference cards here, and every time a new driver appears it takes a lot of work to figure out whether a benchmark result can be improved by the new version or not. This problem was one of the reasons for selecting somewhat "older" games with long-optimized drivers. If I see driver improvements, the charts are re-benched every time! Nobody sees this horrible work, but we do it every time!

    As I wrote in the review, there is effectively only ONE BIOS mode. The only difference is the fan speed, nothing more: same power target, same voltages, same clock rates. The card runs a little cooler and is as noisy as a hoover. That's all. This card is far from a must-have for me.
  34. I could not imagine spending this much money on a PowerColor card. They do not honor their warranties, and their cards break; after two cards from them I am done with them forever. I was a big fan after my first card from them lasted years and years, but all the other ones I've tried have been bad experiences.
  35. FormatC,
    I was not aware that you were the person who wrote this review.
    Thank you for taking the time to talk with me.

    From what you have said in your comments, it seems that you do not believe clock rates determine performance. I may have misunderstood you, but that's how it looks to me.
    That is not the case.
    For the most part, power consumption is proportional to performance these days, especially when comparing two cards of the same architecture. But power consumption is a byproduct of performance.
    It is not always true that more power = more performance, or vice versa, even with two cards of the same type. In the end, clock rates determine performance.
    Higher clock rate = higher performance. Always.

    Quote:
    Every GPU is unique; each has its own chip quality! The performance difference between AMD's press sample and my retail card, for example, is above 5%! And I've measured that the Devil uses a lower power target. That means the same nominal clock rate overall, but a little less performance and less power consumption.

    They do not have the same clock rates overall. They may have the same base clock and boost clock, but that does not mean they will operate at the same clocks all the time. These boosting technologies allow a card to alter its clock rates on the fly: the card can raise its clock rate as long as it stays within a predefined set of power and temperature limits.
    The reason for the 5% performance difference between the press sample and the retail sample is that the press sample is a cherry-picked, low-leakage part. This allows it to maintain higher clock rates than the retail sample while using the same amount of power and producing the same amount of heat.

    Before boosting technologies arrived on video cards, a press sample and a retail sample would produce identical benchmark figures, because they both operated at the same clock rate.
    Back then, the press sample would use less power, which means less heat, which means less noise. The press sample was also usually a better overclocker than the retail sample.
    If you were to take the 295X2 and the Devil, lock them down to the same core and memory clocks, ensuring that neither card could alter its clock rates, and then benchmark both of them, they would perform identically. They might have different power consumption figures, but the performance would be the same.

    Quote:
    And clock rates in charts? Unusable, because those average numbers aren't stable enough to reproduce exactly every time.

    It may not be stable enough to repeat identically each time, but it will be pretty close. There will be a relationship between average clock rate and performance in each benchmark, if you take the time to find it.
    Besides, when you bench a game 10 times in a row, the average FPS figures are not the same for each run. That is just as "unstable" as the average clock rate figures you mentioned. The FPS figures for each run are averaged out into one final figure that gets used for the review/chart/whatever. Why is that not good enough for average clock rates?
    Maximum, minimum, and average clock rates over the course of a benchmark run are relevant figures that would help provide a better overall picture.
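
    Summarizing a clock log per run and averaging across runs is the same statistics already applied to FPS. A minimal sketch (assuming a per-sample clock log from a tool such as GPU-Z; the numbers below are invented):

        from statistics import mean

        def summarize(clock_log_mhz):
            # One run's sampled clocks -> min/avg/max, like an FPS summary.
            return min(clock_log_mhz), mean(clock_log_mhz), max(clock_log_mhz)

        runs = [
            [1000, 992, 967, 1000, 985],  # hypothetical samples, run 1
            [1000, 978, 960, 995, 990],   # run 2
        ]
        per_run_avgs = [summarize(run)[1] for run in runs]
        print("per-run average clocks:", per_run_avgs)
        print("overall average clock:", mean(per_run_avgs))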

    Quote:
    The power consumption of the memory is too low in either case to make a difference between the two cards. Edit: just compared - the OCed memory of the 295X2 needs 0.3 watts more. That's a real joke, well within all tolerances :D

    I never said how big of a difference it would make, just that it would make a difference, and it does....

    Quote:
    Gaming performance doesn't scale linearly with memory clocks; that's an urban legend. Only at 4K can you see an advantage; otherwise it's a marketing gimmick. I've also tried the R9 295X2 with higher memory clocks - at 1080p it's nearly useless.

    I'm not sure what you are getting at here. You seem to be disagreeing with me, even though:
    1) I never said that gaming performance scaled linearly with memory clock.
    2) I never said memory clocks affected performance at 1080p.
    3) I did say that the increased memory clocks should result in higher performance at 2160p (aka 4K).

    Quote:
    The charts are more or less driver-independent. I have all the reference cards here, and every time a new driver appears it takes a lot of work to figure out whether a benchmark result can be improved by the new version or not. This problem was one of the reasons for selecting somewhat "older" games with long-optimized drivers. If I see driver improvements, the charts are re-benched every time! Nobody sees this horrible work, but we do it every time!

    So what are you saying here? That the figures in the charts are not obtained with the same drivers?
    If the performance data in the charts is obtained with a mishmash of various drivers, then those figures are not nearly as accurate as they could be, or should be. Some cards will have an unfair advantage, making them seem better or worse than they really are. When comparing video cards, it's pretty much standard practice to use the same drivers on both cards, to eliminate any possibility of an unfair advantage. Tom's did at least one driver performance comparison review that I know of, and the performance differences were substantial.
    A 5% difference can alter the pecking order in the charts, and drivers can easily alter performance by more than 5%. Sometimes a LOT more.

    I understand that it is a serious undertaking to re-bench every card in the charts, but isn't accuracy the most important consideration for something like the VGA Charts? Shouldn't we strive for excellence? Surely there are some serious bragging rights in having the most accurate VGA database on the web. Plus, all the readers it would attract...

    Quote:
    As I wrote in the review, there is effectively only ONE BIOS mode. The only difference is the fan speed, nothing more: same power target, same voltages, same clock rates. The card runs a little cooler and is as noisy as a hoover. That's all.

    That is far from correct, sir.
    The BIOS modes (Quiet and Performance) do not directly impact performance, but they do alter fan speeds, fan speeds alter temperatures, and temperature can affect clock rates. Higher fan speeds keep the card cooler, which keeps clock rates up, which results in better performance. Higher clock rates = higher performance.
    Every review that has compared the performance of the two BIOS modes shows improved performance in Performance mode. Why is that? Because the card is throttling in Quiet mode, reducing its clock rate, which reduces performance. A toy model of that chain follows.
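
    Here is that fan speed -> temperature -> clock chain as a minimal Python sketch (all constants invented; the throttle threshold is only roughly where Hawaii's sits, in the mid-90s °C):

        # Toy steady-state model: a lower fan cap means a hotter GPU, and the
        # boost logic sheds clock once the temperature passes the threshold.
        def steady_state(fan_cap_pct, ambient_c=25, heat_rise_c=95,
                         cooling_per_pct=0.5, throttle_at_c=94,
                         mhz_shed_per_c=8, boost_mhz=1000):
            temp = ambient_c + heat_rise_c - cooling_per_pct * fan_cap_pct
            overshoot = max(0, temp - throttle_at_c)
            return temp, max(0, boost_mhz - mhz_shed_per_c * overshoot)

        for mode, fan_cap in (("Quiet", 40), ("Performance", 70)):
            temp, clock = steady_state(fan_cap)
            print(f"{mode}: ~{temp:.0f} C, sustains ~{clock:.0f} MHz")
        # Quiet: ~100 C, ~952 MHz; Performance: ~85 C, the full 1000 MHz.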

    In summary, the point I am trying to make here is this: clock rates directly impact performance; power consumption does not (although they are closely related).
    Thank you for reading.
  36. Wow, the rage is strong in this comment section... Can't imagine why?

    Wait, 60 dB??? Deary me, that's mildly insane...
  37. It also doubles as a blunt weapon, so you can game and beat the crap out of thieves at the same time.
  38. Hold up - no frame time variance charts? That was always the crux of CrossFire; does this card handle it better than normal?
  39. jerrolds said:
    Hold up - no frame time variance charts? That was always the crux of CrossFire; does this card handle it better than normal?


    Why would it?
  40. What is that static noise heard in the HD6990, HD7990, R9 295X2, and Devil13 R9 295X2 100% load noise comparison videos on page 9?
  41. On page 8, the power consumption for the other cards (GTX 780/Ti and R9 290/X) is much lower than the power consumption I saw in these reviews:
    http://www.tomshardware.com/reviews/radeon-r9-290-and-290x,3728-4.html
    http://www.tomshardware.de/gigabyte-windforce-gtx-780-ti-ghz-edition,testberichte-241428-4.html

    Is it because the measurements here use a 2 ms interval?
  42. Better equipment, yes :)

    And for Maxwell (and later) I will combine TWO scopes, one for current and one for voltage, so I get 8 channels for measuring and logging. One of the HAMEGs will run in slave mode :)
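
    The sampling-interval effect is easy to demonstrate with synthetic data (a sketch, not the article's measurements): a meter that samples every 2 ms can step right over microsecond-scale spikes that a 10 µs capture sees.

        # Synthetic power trace at 10 us resolution: 320 W baseline with a
        # short 250 W spike every 2 ms. Coarse sampling misses the spikes.
        samples = []
        for i in range(2000):                         # 20 ms of fake data
            spike = 250 if 50 <= i % 200 < 54 else 0  # 40 us spike per 2 ms
            samples.append(320 + spike)

        print("true peak:", max(samples), "W")               # 570 W
        print("average:", sum(samples) / len(samples), "W")  # 325 W
        coarse = samples[::200]                              # 2 ms sampling
        print("peak at 2 ms sampling:", max(coarse), "W")    # 320 W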