Palit, Gainward Packs GTX 580 Cards with 3 GB
Two new GeForce GTX 580 cards sport an impressive 3 GB of GDDR5 memory.
Tech Connect reports that both Gainward and parent company Palit Microsystems have announced GeForce GTX 580 graphics cards with 3 GB of GDDR5 memory each.
According to the report, the Gainward offering sports the Phantom brand and a custom cooler with three 80-mm PWM fans, six heatpipes, a copper base and 44 aluminum fins. The Palit card also comes with a custom cooler, though it uses only two fans.
Of the two, Gainward's GTX 580 Phantom looks more like a miniature radiator. Beyond that, however, the card features the typical GTX 580 setup, including 512 CUDA cores, a core clock of 783 MHz, a shader clock of 1566 MHz, a memory clock of 4020 MHz, a 384-bit memory interface and DirectX 11 support. There are also dual DVI ports plus HDMI and DisplayPort connections. The card follows Gainward's previous GeForce GTX 570 Phantom card with 1280 MB of memory, and the GeForce GTX 580 GOOD with 1536 MB of memory.
As for the new Palit GeForce GTX 580 graphics card, it appears to simply double the memory found on Palit's previous 1536 MB version. Like the Gainward version, the card offers the same GTX 580 basics including 512 CUDA cores, a core clock of 783 MHz, a shader clock of 1566 MHz, a memory clock of 4020 MHz and so on.
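For those curious what those numbers add up to, here's a quick back-of-the-envelope memory bandwidth calculation from the specs quoted above, assuming the 4020 MHz figure is the effective GDDR5 data rate:

```python
# Rough memory bandwidth estimate for the GTX 580 specs quoted above.
# Assumption: 4020 MHz is the effective (quad-pumped) GDDR5 data rate.
effective_clock_mhz = 4020   # effective memory clock, MHz
bus_width_bits = 384         # GTX 580 memory interface width

bandwidth_gbs = effective_clock_mhz * 1e6 * bus_width_bits / 8 / 1e9
print(f"{bandwidth_gbs:.2f} GB/s")  # ~192.96 GB/s
```

Note that doubling the memory to 3 GB changes capacity, not this bandwidth figure, which is why the extra memory mainly matters at very high resolutions.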
"Top notch performances with nice design, the dual-fan cooler design of Palit GTX 580 emits the heat more effectively but remains the noise level under control at the same time," the company said in regards to the previous 1 GB model. "Palit GTX 580 provides exceedingly quiet environment in idle but also generates less temperature when it's under heavy loading during gameplay."
Currently it's unknown when the new Palit and Gainward cards will be available. Pricing was also not provided, so stay tuned; we should see something official any day now.
You will have less money to spend on games.
Will only show any gains if you are doing multi-monitor, high-resolution, heavy-AA gaming, etc. That's what these cards were built for (note the DisplayPort on the cards).
Due to the architecture of the card, they couldn't go to 2 GB like the HD 6950/70. 2 GB would be enough for now, but as it is, they had to double the memory to 3 GB. Hey, you might be able to take advantage of that in heavy multi-monitor gameplay, or in regular play in 2 years (when high-end GPUs will probably come with 3 GB; it will be a mainstream card by then, but still pretty usable).
Did you see the new charts comparing the GTX 560 Ti? There was no visible correlation between the 1 GB and 2 GB memory amounts, even at the highest resolutions with AA turned on. This product is for bragging rights only. Just like Intel's $1,000 Extreme chips. No noticeable performance gains, but many people still buy them.
As for the Extreme chips... they have the unlocked multipliers necessary for the highest overclocks, and they're also binned to go further on less voltage. They have a higher coded and unlockable TDP, allowing for even further overclocking.
You are a retard if you run an extreme chip at stock speeds
funny and very true
That will be the price of the 6990
Well, I happen to own a 980X, and it just so happens that there is a significant performance gain over pretty much everything on the market. You should look at the CPU Hierarchy Chart. Oh, and in terms of performance gains from my last system? I was using a P4 3.0GHz single-core processor with HT. So you could say that it performs a lil better...
Is that aluminum fins? 44 fans means a lot of power for the cooler itself!
It has to be fins, just...just has to. Doesn't it?
Actually, HardOCP recently did SLI and CrossFire reviews at 5760x1200. There were some signs that SLI 580 did hit the memory wall.
Who the hell wants to use multiple monitors!? just get a freaking HDTV!
Wow, tiny freaking monitors placed beside each other OR 1 big screen on your wall... what would be better? I'd go with 1 big screen! And if they want some realistic 3D-ish effect, then get a 3DTV. Why the hell would someone waste $800 on this thing and then another $400-$500 on 3 monitors?