Gigabyte's CMP 30HX Mining Card Launches With Three-Month Warranty
Short warranty for miners?
Gigabyte has officially launched its CMP 30HX board for cryptocurrency mining. The board uses the same components as the company's mid-range graphics cards, but naturally lacks display outputs. Perhaps the most interesting detail about the product is that it comes with a three-month warranty.
Gigabyte's CMP 30HX D6 6G board is powered by Nvidia's TU116 graphics processor (possibly with some units disabled) clocked at 1785 MHz and connected to 6GB of GDDR6 memory over a 192-bit interface. The board measures 224.5 × 121.2 × 39.6 mm and has one eight-pin auxiliary PCIe power connector, so expect its power rating to be around 125W, the same as Palit's CMP 30HX board.
To ensure longevity in tough conditions, Gigabyte uses Ultra Durable certified components for the card, a valuable feature for miners who run their boards 24/7. Gigabyte's WindForce 2X cooling system consists of an aluminum radiator, a composite copper heat pipe, and two fans that spin in opposite directions to exhaust hot air above the card, which is also useful in densely packed mining rigs.
What miners will not be happy about is the three-month warranty that Gigabyte offers with its CMP 30HX D6 6G board. The card is more than likely to survive longer if cooled properly, but Gigabyte wants to play it safe. Meanwhile, the EU mandates a two-year warranty on electronics, so in Europe this product could get the same warranty as regular graphics cards.
Nvidia introduced its Cryptocurrency Mining Processor (CMP) lineup in mid-February. Nvidia originally planned to earn $50 million selling GPUs for cryptocurrency mining, but now the company expects its CMP revenue to be about $150 million in the first quarter. Gigabyte is among the first makers of graphics cards to officially confirm that it sells a CMP product, but it will almost certainly not be the only manufacturer to do so.
Anton Shilov is a contributing writer at Tom's Hardware. Over the past couple of decades, he has covered everything from CPUs and GPUs to supercomputers, and from modern process technologies and the latest fab tools to high-tech industry trends.
velocityg4: Understandable, as some miners will pack the cards right next to each other with no space for air to move, then OC the VRAM way higher than it should go.
d0x360: Sounds fair to me. Miners run 24/7; the average gamer plays a few hours a day at most.
They probably did some math and figured that instead of the usual three years, a miner would hit that amount of use time in three months, and under constant intense loads while packed in with other GPUs all generating tons of heat.
I'd have cut the warranty down too if I were the company, even though I know that, generally speaking, a GPU used for mining is perfectly acceptable to resell to a gamer, and it will probably hold up longer than they need.
cryoburner, quoting the article: "Meanwhile, the EU mandates a two-year warranty on electronics, so in Europe this product could get the same warranty as regular graphics cards."
Isn't that for consumer goods sold through retailers, though, whereas this would more likely be a direct business sale, probably in bulk to larger mining operations? Also, apparently that law puts the burden of proof on the consumer after the first six months to show that the failure was a result of a manufacturing defect and not misuse. I suspect it wouldn't be hard for the manufacturer to deny such claims if the card is being run under harsh, 24/7 mining conditions. And most of these cards will probably be going to China anyway, with EU sales being less common.
Abion47, quoting d0x360: "I'd have cut the warranty down too if I were a company even though I know that generally speaking a gpu used for mining is perfectly acceptable to resell to a gamer and they will probably get more than needed from it in terms of it not breaking."
CMP cards are quite literally unusable for gaming, remember? How are you gonna game on a GPU with no display outputs?