AMD Radeon R9 Fury X Power And Pump Analysis

AMD introduced its highly anticipated Radeon R9 Fury X in June; now we’re taking a deeper look at its power consumption and pump noise.

Introduction

Since we were able to procure our own retail board, we can finally take a much closer look at different aspects of the card’s power consumption. Our findings prove to be very interesting indeed! Furthermore, we’ve also taken several days to analyze the AMD Radeon R9 Fury X’s pump and the annoying noises that it produces. Even more interesting findings await!

We’re using the same benchmark system that we used for our AMD Radeon R9 Fury X launch article, though with a few tweaks. Idle power consumption is now measured on a system carrying the kind of software that tends to accumulate over time, whereas last time the system was completely “fresh.” In addition, linear interpolation is now applied to the data by the oscilloscope itself rather than later during analysis.
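
For anyone curious what that interpolation step amounts to, here’s a minimal, purely illustrative Python sketch (the time base and sample values below are invented; our oscilloscope performs the equivalent operation internally before the data is exported):

```python
import numpy as np

# Hypothetical, irregularly spaced oscilloscope samples: time in microseconds,
# instantaneous power draw in watts.
t_raw = np.array([0.0, 0.9, 2.1, 2.9, 4.2, 5.0])
p_raw = np.array([198.0, 305.0, 175.0, 290.0, 210.0, 240.0])

# Resample onto a uniform 1 µs grid via linear interpolation -- the same basic
# operation the scope now applies before exporting the data.
t_uniform = np.arange(0.0, 6.0, 1.0)
p_uniform = np.interp(t_uniform, t_raw, p_raw)

print(p_uniform.round(1))          # interpolated readings on the uniform grid
print(round(p_uniform.mean(), 1))  # average power over the capture window
```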

The biggest change, though, concerns the content, and it’s one we’ll stick with for all future launch articles as well.

We’ll look at power consumption in direct relation to gaming performance, and we’ll do so separately for 1920x1080 and 3840x2160, since there are major differences between the two. We’ll also look at several different games, and even run some of them with different settings, such as tessellation. In addition, we’ve added some applications that aren’t related to gaming. Life’s not always just about gaming, after all.

Bear in mind that these tests are merely snapshots in specific applications. A benchmark not part of our suite will likely fall somewhere between our average figure and the torture test peak. There are no absolutes these days; best estimates will always consist of a range.

Gaming @ 1920x1080 (FHD)

We know from our AMD Radeon R9 Fury X 4GB review that the Fury X doesn’t fare as well against Nvidia’s GeForce GTX 980 Ti as many enthusiasts would have hoped. The averages of all of our results are very similar. At first glance, AMD’s 202W result doesn’t look any worse than Nvidia’s outcome (incidentally, also 202W). If gaming performance per watt is taken into account, though, then there’s a gap in Nvidia’s favor. But this doesn’t tell the whole story.

Flipping through the graphs, the individual results for power consumption (chart two) and the corresponding benchmarks (chart three) stick out. The insight gained from their comparison is quite simple: the reason that power consumption isn’t higher is that the graphics card limits it. This effect is particularly pronounced if the game involves tessellation. Metro: Last Light shows very clearly that tessellation is the Radeon R9 Fury X’s primary performance killer. What’s even more shocking is that Fiji's power consumption jumps way up with tessellation, whereas GM200 isn't as affected.

Gaming @ 3840x2160 (UHD)

UHD poses much more of a challenge for both graphics cards. Consequently, their power consumption increases significantly. On average, the scoreboard now shows 255W (283W peak in Metro: Last Light) for AMD’s offering and a much more reasonable 220W (233W peak in Metro: Last Light) for Nvidia’s card. The Radeon's power consumption does increase disproportionately, but its performance approaches that of Nvidia’s GeForce GTX 980 Ti as well. This means that efficiency takes a big hit. At least the performance is competitive, as can be seen in the benchmark results (chart three).

All of this means that the GeForce GTX 980 Ti is forcibly held back by GPU Boost and the restrictive 250W power target, but really doesn’t need any more than this to maintain the performance crown by a slight margin. Things are very different for the Radeon R9 Fury X, which occasionally draws up to 350W, even though few games hit the extreme reaches of more than 300W. Overall, there are two different philosophies at work here: GPU Boost’s hard limit versus PowerTune’s more generous allocation.

Lowering Consumption By Decreasing The Limit

What would happen if the power limit (not to be confused with Nvidia’s power target) were lowered manually? Would it be possible to trade a small, expected performance hit for significantly lower power consumption akin to what we see at Full HD? Unfortunately, no.

Check out the following graph. The y-axis shows the W/FPS ratio, which is to say it shows how many watts are needed per frame per second. The x-axis shows the power limit settings. The numbers next to the lines stand for the actual FPS at the power limit in question.
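
To make the metric concrete, here’s a short Python sketch of the calculation behind that graph. The 267W and 170W endpoints match the measurements described in the next paragraph; the frame rates and intermediate points are placeholders, not our actual data:

```python
# Hypothetical power-limit sweep: (limit offset in %, avg. watts, avg. FPS).
sweep = [
    (-50, 170.0, 21.0),
    (-25, 230.0, 33.0),
    (0, 267.0, 38.0),
    (25, 267.0, 38.0),   # raising the limit changes essentially nothing
    (50, 267.0, 38.0),
]

for limit, watts, fps in sweep:
    ratio = watts / fps  # watts needed per frame per second
    print(f"Power limit {limit:+d}%: {watts:.0f} W at {fps:.0f} FPS -> {ratio:.2f} W/FPS")
```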

Even though the power consumption decreases from 267W to 170W when the power limit is set to -50 percent, the resulting frame rates just aren’t in the playable range any more. Higher power limit settings do not show improved results, either; the minimum and average frames per second, as well as the power consumption, stay the same. Things look about the same at Full HD, but the power consumption is low enough in that scenario that there’s no point in changing the settings anyway.

Efficiency For Different Applications

Let’s shift away from gaming for a moment. Four tests show that professional software needs to be evaluated on a title-by-title basis, because driver optimization, and with it application performance, can vary wildly. Generalizing any results just isn’t possible, but we’re still offering a quick performance snapshot here.

We set the Radeon R9 Fury X’s performance as 100 percent, since the different benchmarks have dissimilar scoring systems.
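
As a quick illustration of what that normalization looks like in practice (the scores below are invented purely for this example), each card’s result is simply expressed relative to the Fury X result in the same benchmark:

```python
# Invented raw scores on incompatible scales; within each benchmark, the
# Radeon R9 Fury X result serves as the 100 percent baseline.
raw_scores = {
    "Benchmark A (points)": {"R9 Fury X": 1420.0, "GTX 980 Ti": 1510.0},
    "Benchmark B (FPS)":    {"R9 Fury X": 64.2,   "GTX 980 Ti": 71.8},
}

for bench, scores in raw_scores.items():
    baseline = scores["R9 Fury X"]
    for card, score in scores.items():
        print(f"{bench}: {card} = {100.0 * score / baseline:.1f} %")
```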

Not taking performance into account, the overall power consumption duel can be summarized like this:

Bottom Line

We again see that one or two benchmarks aren’t enough to provide a fair evaluation. Things have gotten so complicated that we had to rethink what we’ve been doing, and we updated our procedures accordingly.

It’s hard to draw a definitive conclusion about the Radeon R9 Fury X knowing that AMD says upcoming drivers will tease more performance out of it. At this point, all we can say is that AMD’s new graphics card catches up to the GeForce GTX 980 Ti’s power consumption at Full HD, but can’t keep up when it comes to performance. At UHD, its performance is competitive, but its power consumption is much higher compared to Nvidia’s graphics card.

We’re including links to overviews of the results for all tested graphics cards below. The graphs include the new numbers and open in a separate window.

Exploring Pump Noise

Let’s go back in time a few days to right before the Radeon R9 Fury X’s launch. Our German team (the guys measuring thermals, power and acoustics) was only able to test the new card for a few hours due to a scarcity of press samples. But that short time was enough to give us an earache. Our colleagues reported the same problems, which confirmed the existence of a whistling noise. In addition, there was some kind of buzz: a low-frequency, oscillating grinding noise. Together, these noises formed annoying acoustic fireworks.

Since we weren’t the only ones who contacted AMD about this, all reviewers received a preemptive email, attributing the noise to a “very small batch” of cards, that was supposed to calm and reassure everyone.

Very small batch?! We wanted to know just how small this batch really was, and got our hands on our own retail sample from Newegg, just as a customer would. Our model was made by XFX, and we'll say more about it in a small footnote at the end.

Apart from the Cooler Master label being somewhat off-center, our press sample and the retail board, along with their pumps, are identical. By now, there’s talk on some forums about other labels and even engraved Cooler Master logos, but we haven’t seen a card like that in the lab yet.

The retail card’s noise profile is different from that of the press sample in the sense that the XFX card doesn’t just have a whistling noise, but also a lower saw-like noise. Really, it sounds like a buzz saw with an oscillating out-of-balance blade. It even includes the saw’s motor noise. The graphics card’s obviously not as loud, but the acoustic profile is quite similar. Consequently, we’ve labeled this noise “buzz.”

We tried to trace the origin of the sound with a special microphone designed to measure structure-borne noise. This led us to where the small heat pipes are connected above the voltage converters. Even though these are designed poorly from a flow perspective (sharp bends, unnecessary decreases in diameter, possibly a rough inner surface), they were neutral as far as the noise was concerned.

This means that micro-bubbles aren’t the culprit. This conclusion is confirmed by the fact that decreasing the pump’s voltage to slow it down halves the whistle's frequency, but doesn’t make it any less loud. All that remains as a possible source is the pump itself.

Searching For The Actual Manufacturer

To get to the bottom of this, the first thing we needed to find out was who actually manufactured the pumps. There might be a huge Cooler Master label attached to them, but let’s be clear: Cooler Master buys their components. They don’t make them.

According to our information, the Taiwanese company AVC (Asia Vital Components) made the pumps. This is the same company that manufactures Cooler Master’s Seidon and Nepton products. It’s not a small player in the OEM field, either; in fact, it sees itself as the world market leader in some areas. Despite a number of personal contacts, we were met with nothing but a wall of silence when we reached out to AVC.

Norms: Not the Norm?

Almost all large OEMs have been working to ISO 9001:2008, the quality management standard, for years; internationally, it’s the most common one. It specifies minimum requirements for the quality management system a company has to implement in order to meet customer requirements and provide sufficient product and service quality. This most definitely includes a decrease in error rates.

An additional factor is process orientation, meaning that all major operational processes are included and checked continuously, which can and should lead to ongoing optimization. We haven’t been able to find any negative experiences from other AVC customers. In fact, what we heard was quite the opposite: products like Enermax’s Liqtech have very small and quiet pumps, and the Seidon and Nepton pumps are also acceptable. At this point, it looked like Cooler Master might simply have chosen the wrong product for its purpose.

But is that really the problem? We heard through the grapevine that there have been discussions about glue problems with the batch in question. We couldn’t figure out if the issue was the amount or consistency of the glue, which means that this information’s just too vague to make a big deal out of it.

It is a plausible explanation, though: remnants of glue inside the tubing and its connectors could cause turbulence, which would account for the permanent noise. It would also explain why the severity of the problem differs from one account to the next. That wouldn’t be due to customers’ varying sensitivity to noise, but simply to quality fluctuations.

Cooler Master & AMD

If Cooler Master is basically the man in the middle, offering a product that is composed of components made by other companies, then it’s Cooler Master’s responsibility to verify that each batch meets the required quality standards. This is especially true for the first batch. It takes some additional effort to dedicate personnel to the quality assurance of a newly started production line, but this should still be covered by the certification.

This very common flaw was either missed or ignored during the time leading up to the launch. Why this might have happened is unclear.

It shouldn’t be overlooked that parts made by the original OEM (AVC) are often sent directly to the final customer’s (AMD) next OEM (Sapphire/PC Partner) and are immediately included in the final assembly on the manual insertion line without being checked at the point of arrival. The previous OEM is trusted not to have made any mistakes. Unfortunately, it seems that neither Cooler Master nor PC Partner conducted any spot checks.

AMD’s Statement

A short statement by AMD reached us in the final minutes before we posted this article. We had shared our measurement results and insights with them leading up to its publication. The statement confirmed our information that a problem with the glue was the source of the noise problem.

Information About The RMA

In the end, a number of parties are suffering due to this massive failure to implement decent quality assurance: AMD, its partners (which simply buy these graphics cards from AMD after having them labeled and packaged in China), the stores and the customers.

Looking at the first of AMD’s emails that we quoted at the beginning, it’s pretty clear by now that the “very small batch” was, at best, an understatement. We’ve contacted some of AMD’s partners directly to ask them if they are aware of the problem and willing to do spot checks. We also wanted to know how they are handling cards that customers found to be defective and if they have stopped delivering affected cards to stores.

We’d first like to note that all of AMD’s partners told us the exact same thing. We’re not reporting their names, since this information was mainly given by the R&D departments of the companies in question, and there haven’t been, and most probably won’t be, any official statements. This isn’t much of a problem, since the main message was that all of the spot checks yielded graphics cards with the same pump problem, even though its severity varied. None of AMD’s partners are planning to return the cards directly to AMD at this point for a variety of reasons and to avoid ending up on AMD’s bad side.

The good news is that AMD will apparently reimburse its partners for any losses suffered due to customers actually returning their graphics cards. Is this a ploy to sell at least part of the affected stock, because some customers aren’t that sensitive to noise and others don’t want to go to the trouble of an RMA? This would limit the financial damages, of course. However, it might still lead to undesirable results due to the damages to AMD’s and its partners' images. It’s questionable if the financial gain is worth it.

Ending On A High Note

We’ve been able to ascertain that there will be AMD Radeon R9 Fury X graphics cards with quiet pumps. Ultimately, the problem was found and fixed. The new revision won’t be identifiable by just looking at the package, though. It also stands to reason that everybody will first try to get rid of their old defective cards before pushing out the new ones. AMD could really have helped this situation by putting its foot down at the first sign of trouble. Then again, taking the high road does have to be financially feasible first. In spite of everything having been cleared up, a bad aftertaste remains.

Puzzled By Clueless Manufacturers

We’d like to add one quick note about the retail card before we get to our actual test, with its video, on the next page. The package states that the card has 4GB of memory. Right next to that claim, two black strips of paper were glued on. The strips were fairly loose, so we took them off. Underneath, it says GDDR5. GDDR5 with a full 4096-bit interface? The card uses HBM, of course; whoever designed this packaging really didn’t have any clue about the technology inside.

At least they caught it before it was too late. In addition, Mantle is explicitly advertised, even though this is the Radeon R9 Fury X and games now use that API as little more than a fallback. Windows 10, and with it DirectX 12, is right around the corner, and AMD’s own website emphasizes the card’s compatibility with the Microsoft API.

Let’s get to the good stuff!

Testing The Retail Card

We’re using a high-quality supercardioid attachable microphone. The following results should be interpreted accordingly.

The first thing we hear in the video are gurgling noises from the lower radiator, which didn’t go away even after three days of continuous use. Since we’ve measured water temperatures in (and sometimes exceeding) the 60-degree Celsius range, and since there’s no reservoir, there has to be enough air in the system to keep the expanding water from bursting the tubes or the pump. Overall, the system is designed pretty close to the limit.
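
To put a rough number on that expansion, here’s a back-of-the-envelope estimate in Python using textbook water densities; the assumed loop volume is our guess, not something we measured:

```python
# Approximate densities of water (kg/m^3) from standard reference tables.
rho_25c = 997.0   # at 25 °C
rho_60c = 983.2   # at 60 °C

# Assumed total coolant volume for a compact closed loop like this one.
volume_cold_ml = 100.0
volume_hot_ml = volume_cold_ml * rho_25c / rho_60c

expansion_ml = volume_hot_ml - volume_cold_ml
expansion_pct = 100.0 * (rho_25c / rho_60c - 1.0)
print(f"Expansion: {expansion_ml:.1f} ml ({expansion_pct:.1f} %)")
# Roughly 1.4 percent -- without a reservoir, only an air pocket can absorb it.
```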

Spectrogram Frequency Analysis

The noise that sounds a bit like a saw consists of a permanent “carrier signal” between 1.93 and 1.97 kHz and an oscillating buzz that’s located a bit lower on the spectrum. It’s really annoying. Let’s take another look at the frequency analysis that was recorded with a high-quality measurement microphone from a distance of 30 cm.

The deeper parts aren’t as pronounced as the whistling noise, but can still be easily made out on the spectrogram. The higher frequencies far exceed 20 kHz.
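
For readers who want to try this kind of analysis on their own recordings, a minimal Python/SciPy sketch along the following lines is enough to make the roughly 1.95 kHz carrier and the lower-frequency buzz visible. The file name is a placeholder, and this is not the exact tool chain we used:

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.io import wavfile
from scipy.signal import spectrogram

# Load a recording of the pump (placeholder file name).
rate, audio = wavfile.read("fury_x_pump.wav")
if audio.ndim > 1:
    audio = audio.mean(axis=1)     # fold a stereo recording down to mono
audio = audio.astype(np.float64)

# 2048-sample windows (about 46 ms at 44.1 kHz) give roughly 21 Hz resolution,
# enough to separate the 1.93 - 1.97 kHz whistle from the lower-frequency buzz.
freqs, times, Sxx = spectrogram(audio, fs=rate, nperseg=2048, noverlap=1024)

plt.pcolormesh(times, freqs, 10 * np.log10(Sxx + 1e-12), shading="auto")
plt.ylim(0, 22000)                 # the whistle's harmonics reach past 20 kHz
plt.xlabel("Time (s)")
plt.ylabel("Frequency (Hz)")
plt.colorbar(label="Power (dB)")
plt.show()
```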

Do Dampening & Pressure Help?

We tried to glue dampening material to the inside, where it also applied pressure to the pump’s housing. This helps, but only a little. It doesn’t really impact the whistling noise, but cuts down the buzz a bit.

We also found that the part of the noise we call buzz subsided a bit after 70 to 80 hours of continuous operation, which points to a manufacturing defect. The whistling noise decreased a little as well, but only marginally.

Results Overview

We used the same test setup that we used for our launch article to take our new measurements. This yielded the following results:


Measurement                                                 Pump        Fan         Total
Gaming Loop, Press Sample                                   33.1 dB(A)  33.7 dB(A)  34.2 dB(A)
Gaming Loop, Retail Card                                    34.5 dB(A)  33.8 dB(A)  35.2 dB(A)
Gaming Loop, Retail Card (After 80 Hours of Operation)      33.6 dB(A)  33.6 dB(A)  34.6 dB(A)
Torture Test, Press Sample                                  33.5 dB(A)  34.6 dB(A)  35.6 dB(A)
Torture Test, Retail Card                                   34.9 dB(A)  34.9 dB(A)  36.2 dB(A)
Torture Test, Retail Card (After 80 Hours of Operation)     34.2 dB(A)  34.8 dB(A)  35.8 dB(A)

Conclusion

What advice can we give to our readers interested in an AMD Radeon R9 Fury X? Wait, or play the lottery? Importantly, AMD has apparently guaranteed that it will cover RMA-related losses, and it has already started doing so. The partners certainly hope that not all of the graphics cards end up back with them, though. People outside of Europe tend to care less about noise levels, which might explain why AMD chose to take this route. However, it’s not exactly great for the company’s image.

Given its solid performance, the Radeon R9 Fury X and its cooling solution can’t be dismissed outright as an upgrade. That wouldn’t be fair to the quieter pump revision that might well be available soon.

From our point of view, Cooler Master and the OEMs it hired are responsible for this failure. Norms need to be adhered to and executed, especially if you proudly display certificates. We’re sure that AMD will get its money back from these companies, since these kinds of things are always subject to air-tight contract clauses in this business.

MORE: Best Graphics Cards For The Money
MORE: All Graphics Content
MORE: Graphics Cards in the Forum
MORE: How Well Do Workstation Graphics Cards Play Games?

Igor Wallossek is a Senior Contributing Editor for Tom's Hardware Germany, covering CPUs and Graphics.

Follow Tom's Hardware on Twitter, Facebook and Google+.

  • Cryio
    I'm seriously curious to see what 2015's Omega Driver does to Fury X's efficiency and overall performance.
  • coolcole01
    Way to go, AMD: $650 for less performance and manufacturing defects. Bravo.
  • thor220
    More pages are spent on the whine than the actual performance of the card. It's an issue that's already been addressed; tom's really didn't have to dedicate so much to such a small issue.

    Comparatively, at least it doesn't reduce performance like the last 0.5 GB on the 970, and I don't remember tom's being on Nvidia's case about that much more serious issue.
  • i7Baby
    The ISO standard is more about making sure you settle on a quality standard and stick to it rather than improving quality. You can get certification by having a system in place that churns out crap - systematically.

    Somehow I don't think a lot of PC stuff is rigorously specified. In this case I think samples were signed off by sales and marketing. This is how much of the PC industry is run.
  • crisan_tiberiu
    With AMD, many users are going to be using a liquid cooler in their PC for the first time, so many won't have previous experience to compare against and will just say that the sound is normal. These sound issues, I am sure, were known issues; I am sure of it. Still, I would buy the Fury; I am sure that the tiny pump on the card has less than zero chance of beating the noise levels of my aquarium pumps that sit next to my desktop PC (I have 1 air pump and 1 filter motor..., so yeah, I know what loud means) :D :D :D
  • hannibal
    http://forums.anandtech.com/showpost.php?p=37527347&postcount=43

    They are really changing the pump. Someone tried to increase their margins by cutting quality. But all in all, even the older pump is very quiet, and with the right version the situation gets even better.
    The Fury X is a very good card. Luckily, so is the 980 Ti, even in bigger measurements! It will be interesting to see how the air-cooled Fury changes the situation. All in all, the 980 Ti will make the Fury X cheaper, and the Fury X forced Nvidia to offer customers something other than the Titan X. Competition is good!
    The 980 Ti is a very good card and the Fury X almost gets there. Because of AMD's problems with DX11, the situation may be reversed in DX12 games, so in the longer run things even out, as they should for the customers.
    Hopefully we will soon see a retest when Tom's gets the upgraded pump in for testing. Also, it seems that some parts that are not water-cooled can get quite hot, so the air-cooled basic Fury will be a very interesting card to test.
    The situation in the GPU market is better than it has been for a long time! Next year will be even more interesting, because we may see the first FinFET-based GPUs and also the second generation of HBM.
  • FormatC
    316517 said:
    More pages are spent on the whine than the actual performance of the card. It's an issue that's already been addressed; tom's really didn't have to dedicate so much to such a small issue. Comparatively, at least it doesn't reduce performance like the last 0.5 GB on the 970, and I don't remember tom's being on Nvidia's case about that much more serious issue.


    This is a follow-up, not the launch review. In the first part of this follow-up, covering the power consumption measurements, I tested both cards in a lot of games and applications - together with the specific power draw for each card in each benchmark. That is more than other sites have published. 10,240,000 individual values - this wasn't done in a few hours.

    Simply use the slider pics to compare power consumption, performance and watts/FPS. :)

    Quote:
    I don't remember tom's being on Nvidia's case about that much more serious issue

    This is only one example:
    http://www.tomshardware.com/news/MSI-GTX-660-670-overvolting-PowerEdition,18013.html
    (the original, detailed review was in German)
  • synphul
    It's unfortunate that they continue to slap the cheapy aio pump combos on everything. Rather than improving it sounds like quality just keeps dropping. Stating that the fan is so noisy it will probably drown out the annoying budget pump isn't really much consolation. In addition to performance charts, notations like these are important in a review. If a product has undesirable quirks it would be much nicer to know before dropping close to $700 on it. If this was a $200-300 card it might be more tolerable, not at those prices though. People paying top dollar should receive top quality. I'd rather they charge $10 more and do it right. Especially considering aio's have been around long enough they're no longer in their quirky 'infant' stage.
  • Mousemonkey
    60597 said:
    http://forums.anandtech.com/showpost.php?p=37527347&postcount=43 They are really changing the pump. Someone tried to increase their margins by cutting quality. But all in all, even the older pump is very quiet, and with the right version the situation gets even better. The Fury X is a very good card. Luckily, so is the 980 Ti, even in bigger measurements! It will be interesting to see how the air-cooled Fury changes the situation. All in all, the 980 Ti will make the Fury X cheaper, and the Fury X forced Nvidia to offer customers something other than the Titan X. Competition is good! The 980 Ti is a very good card and the Fury X almost gets there. Because of AMD's problems with DX11, the situation may be reversed in DX12 games, so in the longer run things even out, as they should for the customers. Hopefully we will soon see a retest when Tom's gets the upgraded pump in for testing. Also, it seems that some parts that are not water-cooled can get quite hot, so the air-cooled basic Fury will be a very interesting card to test. The situation in the GPU market is better than it has been for a long time! Next year will be even more interesting, because we may see the first FinFET-based GPUs and also the second generation of HBM.


    Interesting that you see AMD somehow forcing Nvidia's hand; maybe Nvidia is just doing its own thing and AMD is the one playing catch-up, having to release untested products in order to try to stay in the game. The GDDR5 bit is amusing enough, as it indicates a design change that could explain why there was mention of the card's RAM using a fair bit of juice. It's also good to know that it can use Mantle, but I didn't see any mention of FreeSync?
  • Nuckles_56
    My question is why Total War: Attila was causing such a spike in watts/fps on the 980ti?
  • FormatC
    1670073 said:
    My question is why Total War: Attila was causing such a spike in watts/fps on the 980ti?

    The power consumption is more or less OK, but the performance isn't. It's not an Nvidia-optimized game ;)
  • army_ant7
    Is it just me, or have the individual performance and efficiency conclusions basically changed here vs. the launch article?

    Did those testing method changes listed at the beginning really have that much of an effect or was it a difference in software and/or settings used to benchmark?

    I'd like your insight on this Igor, and I'll have to do an article comparison myself.
  • FormatC
    250158 said:
    ..Did those testing method changes listed at the beginning really have that much of an effect or was it a difference in software and/or settings used to benchmark? I'd like your insight on this Igor, and I'll have to do an article comparison myself.


    I've been measuring in FHD for years because most gamers prefer this resolution. But the Fury X in particular is very limited in FHD. The goal now was to find the best and most plausible method and game selection to show the differences in various situations. In UHD you get higher numbers, and I will use Thief in UHD as a reference in the near future for the less important custom cards.

    This review shows only a small part of all 21 tested games and apps. I've selected the most representative ones for this suite and I will use it for all upcoming launches. Only one game is not enough to give a fair and objective description / conclusion. The range is too large.
  • AS118
    Well this is all well and good, but I'm more interested in the air-cooled Fury and the Fury Nano myself.

    I feel like the Nano, with its Fiji / HBM architecture with small size and high power efficiency may be the breakout product of this generation. As for the air-Fury, if it has 980Ti levels of performance at $100 cheaper, that'd be quite interesting, and it'd be easier to Crossfire as well.

    Anyhow, I'm done giving Nvidia my money when it has such a big sales lead and there are only 2 GPU manufacturers on the market. It's not like it is with cars or fridges, where there's lots of competition. I want AMD to not only stay in business, but have enough money to give Nvidia and Intel a run for their money.

    So that means I'll keep buying AMD to make sure they have funds for R&D. Their products are good value and perform well enough (within spitting distance of Nvidia) so it's all good to me.
  • random stalker
    Quote:
    From our point of view, Cooler Master and the OEMs it hired are responsible for this failure. Norms need to be adhered to and executed, especially if you proudly display certificates. We’re sure that AMD will get its money back from these companies, since these kinds of things are always subject to air-tight contract clauses in this business.


    That is not how it works. From a quality management point of view: "outsourcing a process/manufacturer does not relieve a subject of the responsibility of ensuring a product/process meets all required standards as specified by some sort of an internal engineering norm/drawing, etc..."
    So it is entirely AMD's fault - outsourcing the pump to CM doesn't make AMD magically immune to anything CM may cause.
    Thus there are now two options:
    - AMD needs better quality control so it wouldn't allow this kind of card on the market,
    - AMD knew about it and released the bad card anyhow.

    Quote:
    The ISO standard is more about making sure you settle on a quality standard and stick to it rather than improving quality. You can get certification by having a system in place that churns out crap - systematically. Somehow I don't think a lot of PC stuff is rigorously specified. In this case I think samples were signed off by sales and marketing. This is how much of the PC industry is run.


    It is usually the quality department and the R&D that approve the samples; but yes, in a broad sense it works like that.
  • FormatC
    The OEM business is a bitch. You get a few samples from your OEM and everything looks fine. Then you start mass production and the first batch is around 1000 pieces. That doesn't sound like much, but it's all you need to head into a disaster without a final inspection. :)
  • random stalker
    The batch size always depends on a part. Basically if you want me to make you some stuff, you need to give me all the norms and all the specifications you deem to be important, so I can set up my production and quality control.

    After that, there is a part approval process, which is a hardcore version of the following:
    - I make some paperwork that everything is OK on my side.
    - I make initial samples, send them to you, you take them apart and give me green, yellow or red light.
    - I set up the production tools and start making beta samples.
    - Then you send your people to audit my processes - and you will send the most fierce evil people you can find so they find any nonconformity even the smallest one.
    - I ship you another batch or two/three/four; you set up your processes based on this second/third/fourth... batch. During this series I set up additional production equipment and tune it.
    - After that comes the 2-day test production, where I demonstrate the capability of manufacturing the required amount of parts at the required quality. You also send some of your people to observe that I don't cheat.
    - Then, after a while, you audit me once again - to check that I've fixed everything.
    - After that I send you some parts with increased quality control both yours and mine.
    - And after a while (which can be several tens of thousand parts) you allow me to remove some obsolete quality control measures.

    This 'dance' can usually take about a year, but it can also be shortened (if we have already collaborated on the same product); it is seldom shorter than half a year (well, I've seen production for a new product being set up in under three months, audits, paperwork, machines and everything, but those guys were crazy...) :D
    So, there is a very small chance that AMD didn't notice those problems before.
  • eklipz330
    Quote:
    More pages are spent on the whine than the actual performance of the card. It's an issue that's already been addressed; tom's really didn't have to dedicate so much to such a small issue. Comparatively, at least it doesn't reduce performance like the last 0.5 GB on the 970, and I don't remember tom's being on Nvidia's case about that much more serious issue.

    if that's how you perceive things sure. this just shows how much of a fanboy you are towards AMD. they've spent a total of one article on the noise, and they've already done a full blown review.

    and yes, they did address the .5gb issue on the gtx 970, in lengthy detail. a simple search would have saved you (and everyone that gave you a TU) from looking like a tool:

    http://

    http://

    http://

    it's people like you who give TH a bad name.
  • FormatC
    @random stalker:
    I'm very often in Asia because I'm not only an editor. The main problem is the typical man-in-the-middle story:
    AVC -> Cooler Master -> Sapphire -> AMD. Split competences, and no one feels responsible. But I see Cooler Master as the guilty party.

    If everything looks fine, from the mockup to the pre-production samples / small series, it is common to give the green light for mass production. Cooler Master has to check AVC, Sapphire has to check Cooler Master, and AMD has to check Sapphire. I'm sure that AMD saw the first real samples only after they had shipped to branch offices and distributors - and exactly that was too late. The whole launch was a disaster.
  • random stalker
    Agreed. But the one that is responsible for shipping a NOK (not OK) product is always the end-of-line manufacturer.
    Thus, he is responsible for making sure that everything sent to consumers meets all the required criteria (and there are a lot of them - not only functional, qualitative, regulatory...). So without the final OK from AMD, no AMD cards can leave the factory.

    But here is the catch - there are various levels of nonconformity/problems/faults (name it as you like):
    - minor - a general annoyance, which some sensitive people may see as a fault (e.g. the logo is dark red instead of bright red)
    - major - the component may not work correctly and/or regular customers may RMA the part (i.e. an RMA if a customer finds out)
    - critical - the component will not work and/or can be hazardous to health without warning (100% RMA and/or a lawsuit impending)

    Based on the level of a problem, there is a general guideline for what to do with such a part. The first and most important step when handling major and critical problems is to ensure that a product containing said defective part cannot be used in production (and especially not be shipped to customers). This means not only sorting out the stock, but also marking all the OK parts and the first (or first few) batches after said sorting action.

    Based on AMD's reaction, it seems AMD sees this kind of problem as 'only an annoyance'.
    And it wouldn't be the only time a faulty product was released, because the company releasing it was in a bind :D
  • FormatC
    Quote:
    Agreed. But the one that is responsible for shipping a NOK (not OK) product is always the end-of-line manufacturer. Thus, he is responsible for making sure that everything sent to consumers meets all the required criteria (and there are a lot of them - not only functional, qualitative, regulatory...)
    These cards were made by Sapphire, not AMD. This is typical outsourcing, and I'm sure everything was shipped by PC Partner, not AMD :)

    In the end, AMD's fault is the missing final inspection in Asia. Maybe it was too hot in China for the pinstripe yuppies :D
  • random stalker
    Oh sorry, I must have overlooked that in the article. I thought AMD sells the cards under its own brand :(
    If Sapphire makes them and sells them under the Sapphire brand, then Sapphire is at fault (the distributor doesn't matter, as he is only moving the stock from point A to point B), unless all other manufacturers suffer from the same problems. In that case the product design should be reviewed and the quality requirements should be rechecked - especially the criteria for sound conformity. If the design is faulty, you cannot blame manufacturers for producing a faulty product, and the ball is again in AMD's court. If it is only Sapphire that has these kinds of problems, then we should all ditch Sapphire cards until the problems are sorted out and go with MSI or Gigabyte cards :D
  • FormatC
    Sapphire produced ALL of these cards for AMD and shipped them to other brands to be labeled. PC Partner is AMD's OEM, and Sapphire is a brand of PC Partner. All cards are reference designs and were only labeled later. The same goes for Nvidia's Titan X; no AIB has any influence. Companies like Gigabyte initially bought only 100 of these cards from AMD to put in their own boxes :D
  • Garrek99
    I've been reading reviews here for well over a decade and obviously really appreciate the info that you guys output.
    What I'm unhappy about is the new image viewer control that has the left and right arrows so opaque.
    Please make these arrows transparent so that they stop blocking content when the images are viewed on small screens.
    Thanks.