Sapphire Technology is expanding its graphics card lineup with the introduction of the Sapphire RX 570 16GB HDMI Blockchain Graphics Card, a model that, as its name suggests, specializes in mining Grin Coin and other cryptocurrencies.
Sapphire's new Blockchain graphics card features a dual-slot design and employs the brand's Dual-X cooling system with two dual-ball-bearing 95mm cooling fans. The fans carry Sapphire's Quick Connect Fan Technology, whereby only a single screw holds each fan in place, allowing consumers to swap out fans with relative ease. The graphics card is still based on AMD's Polaris 20 silicon, which is fabricated on GlobalFoundries' 14nm process node.
Just like any other Radeon RX 570, the Sapphire RX 570 16GB HDMI Blockchain Graphics Card is equipped with 2,304 Stream Processors. The only noteworthy upgrade is the 16GB of GDDR5 memory. However, the memory still communicates across a 256-bit interface, so memory bandwidth remains unchanged at 224 GB/s. Sapphire didn't reveal the graphics card's core or memory clock speeds.
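As a quick back-of-the-envelope check, the quoted 224 GB/s is consistent with GDDR5 running at a 7 Gbps effective data rate per pin on that 256-bit bus; note that the 7 Gbps figure below is inferred from the stated bandwidth, not published by Sapphire:

    # Back-of-the-envelope check of the 224 GB/s figure quoted above.
    # GDDR5 bandwidth = (bus width in bytes) x (effective data rate per pin).
    # The 7 Gbps rate is inferred from the stated bandwidth and bus width,
    # not a figure Sapphire has published.
    bus_width_bits = 256
    effective_rate_gbps = 7.0  # Gbps per pin, typical for RX 570-class GDDR5

    bandwidth_gb_s = (bus_width_bits / 8) * effective_rate_gbps
    print(f"{bandwidth_gb_s:.0f} GB/s")  # -> 224 GB/s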
The Sapphire RX 570 16GB HDMI Blockchain Graphics Card sports a single 8-pin PCIe power connector. According to the manufacturer, the graphics card draws around 175W. Its only display output is an HDMI port.
For cryptominers, the graphics card supports the recently launched Grin Coin and its Cuckaroo 29+ and Cuckatoo 31+ proof-of-work algorithms. Sapphire claims that the card can deliver performance in the range of 0.42 GPS (graphs per second) in Cuckatoo 31+. The graphics card is also capable of mining Cuckatoo 32+ with the help of a small driver update. The Cuckaroo and Cuckatoo algorithms can consume anywhere between 5.5GB and 11GB of memory, so the card's 16GB of memory will certainly come in handy for serious cryptocurrency miners.
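For readers unfamiliar with Grin's proof-of-work: Cuckaroo and Cuckatoo are variants of Cuckoo Cycle, where the miner pseudorandomly derives a huge bipartite graph from the block header and hunts for a fixed-length cycle in it, which is why memory use scales with graph size. The toy sketch below only illustrates the idea; the real algorithms derive edges with siphash, use billions of edges, and require a 42-edge cycle, and the parameters here are illustrative assumptions:

    import hashlib
    from collections import defaultdict
    from itertools import count

    # Toy sketch of the Cuckoo Cycle idea behind Cuckaroo/Cuckatoo. The real
    # algorithms derive edges with siphash, use 2^29 to 2^31 edges (hence the
    # 5.5GB-11GB footprint), and require a 42-edge cycle; EDGE_BITS and the
    # 4-cycle target here are illustrative assumptions to keep the demo tiny.
    EDGE_BITS = 12
    NUM_EDGES = 1 << EDGE_BITS
    NODE_MASK = NUM_EDGES - 1

    def edge(header, i):
        """Pseudorandomly derive edge i's two endpoints from the header."""
        h = hashlib.blake2b(header + i.to_bytes(4, "little"), digest_size=8)
        v = int.from_bytes(h.digest(), "little")
        return v & NODE_MASK, (v >> EDGE_BITS) & NODE_MASK

    def find_4cycle(header):
        """Search the derived bipartite graph for a 4-cycle.

        Storing all the edges up front is what drives the memory cost.
        """
        adj = defaultdict(set)
        for i in range(NUM_EDGES):
            u, w = edge(header, i)
            adj[u].add(w)
        seen = {}  # pair of right nodes -> left node connected to both
        for u, ws in adj.items():
            ordered = sorted(ws)
            for a in range(len(ordered)):
                for b in range(a + 1, len(ordered)):
                    pair = (ordered[a], ordered[b])
                    if pair in seen:  # two left nodes share two right nodes
                        return seen[pair], u, pair
                    seen[pair] = u
        return None

    # Mining: try nonces until the derived graph happens to contain a cycle.
    for nonce in count():
        cycle = find_4cycle(b"example-header" + nonce.to_bytes(4, "little"))
        if cycle:
            print("nonce", nonce, "->", cycle)
            break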
Sapphire didn't reveal pricing for the Sapphire RX 570 16GB HDMI Blockchain Graphics Card, but said it'll be available to order soon on the company's official website.
Zhiye Liu is a news editor and memory reviewer at Tom’s Hardware. Although he loves everything that’s hardware, he has a soft spot for CPUs, GPUs, and RAM.
TechyInAZ: Huh, if Crossfire ever gets really good support, you could put two or three of these together for 4K-5K gaming and never run into VRAM bottlenecking.
bloodroses: A little late to the party. The cryptocurrency boom is over now and the only people left are using ASICs. Just ask Nvidia and their investors.
cryoburner:
TechyInAZ said: Huh, if Crossfire ever gets really good support, you could put two or three of these together for 4K-5K gaming and never run into VRAM bottlenecking.

I don't think VRAM is really much of a limitation in today's games at 4K, at least so long as the card has 8GB. I don't recall seeing any benchmarks where the 2080's 8GB caused any performance issues relative to the 1080 Ti's 11GB. Even the 2060's 6GB seems to be enough to handle current games just as well as the 1070 Ti's 8GB at 4K, though I could see that potentially impacting performance more within the next couple of years. It seems like 8GB should be enough for a while, though, so it probably wouldn't be worth using these cards in a multi-card setup for gaming, since they undoubtedly cost more than 8GB RX 570s.
As for 5K, it won't likely be relevant for gaming for quite a while, and I can't see even three RX 570s handling it well. Considering how poorly supported SLI and Crossfire tend to be, you would be stuck running a single RX 570 in any game that didn't work well with it, meaning you would be falling back to middling performance at 1080p in those titles. I can't see multi-card support getting better any time soon either, seeing as at least Nvidia seems to be moving away from it, only supporting SLI on their $700+ cards this generation.
And even if multi-card setups were widely supported by game developers, it still probably wouldn't be a very good idea to run these cards in Crossfire at 4K resolution or higher. Even with good multi-card scaling, a single 2060 or 1070 Ti would likely outperform two of these cards in most cases, at around half the power draw and heat output. And even those cards only provide mediocre performance at 4K, requiring graphics settings to be lowered to maintain reasonable frame rates in newer games. Three RX 570s might draw close to 500 watts when fully utilized, making such a setup even less practical.
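A quick sanity check on that estimate, using the ~175W board power quoted in the article (a rough figure that ignores workload variation and the rest of the system), bears it out:

    # Rough multi-card power estimate using the ~175W board power quoted in
    # the article; actual draw varies with workload, and this ignores the
    # CPU and the rest of the system.
    BOARD_POWER_W = 175
    for cards in (1, 2, 3):
        print(f"{cards} card(s): ~{cards * BOARD_POWER_W} W")
    # 3 cards -> ~525 W, in line with the "close to 500 watts" estimate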
TJ Hooker:
bloodroses said: A little late to the party. The cryptocurrency boom is over now and the only people left are using ASICs. Just ask Nvidia and their investors.

There are always new coins and algorithms coming out, some of which are ASIC-resistant (at least for a time). I hadn't heard of the "grin coin" mentioned in the article before, but apparently it uses a new proof-of-work algorithm that's supposed to be mostly resistant to existing ASICs, making it friendly for CPU/GPU mining. I'm sure there are other coins out there that can be GPU-mined as well; I think people are still GPU mining Ethereum, though I have no idea how profitable it is.
bloodroses:
TJ Hooker said: There are always new coins and algorithms coming out, some of which are ASIC-resistant (at least for a time). I hadn't heard of the "grin coin" mentioned in the article before, but apparently it uses a new proof-of-work algorithm that's supposed to be mostly resistant to existing ASICs, making it friendly for CPU/GPU mining.

Hmm, interesting that they're trying to make them ASIC-resistant. It seems kind of counterproductive to making a 'standardized' coin that gets universally adopted, though. I can see other 'grey area' uses for it. Either way, as you said, it's interesting as a proof-of-work concept.

As for mining, I honestly see the fad completely dying out as far as being useful/profitable, since businesses/governments will just spin up their own currency type if they choose to switch to crypto. This article is a perfect example of what Japan is trying to accomplish:
https://www.technologyreview.com/s/611656/will-people-ditch-cash-for-cryptocurrency-japan-is-about-to-find-out/
InvalidError:
bloodroses said: I see the fad completely dying out as far as being useful/profitable, since businesses/governments will just spin up their own currency type if they choose to switch to crypto.

I can't imagine governments switching to crypto for currency; that's far too inefficient. It's much simpler to stick with regular currency and the existing banking infrastructure, which doesn't require massive computing power per transaction.

Keep in mind that one of the core reasons behind cryptocurrencies' massive overhead is the lack of trust, since there are no trusted central authorities to back them up. Cryptocurrencies are little more than computer science experiments.
TJ Hooker:
InvalidError said: Keep in mind that one of the core reasons behind cryptocurrencies' massive overhead is the lack of trust, since there are no trusted central authorities to back them up. Cryptocurrencies are little more than computer science experiments.

Well yeah, that's kind of the whole idea behind cryptocurrencies: that you don't need to trust a centralized authority in order for the currency to have value.