More Details on Galaxy's Dual-GPU GTS 250
3DMark Vantage reveals the card's core and shader clock speeds.
A few weeks ago, we reported that Galaxy is working on a dual-GPU graphics card that uses two 55nm G92 GPUs (each sporting 128 processing cores) on a single PCB. Each GPU is backed by a dedicated 1 GB of GDDR3 memory on a 256-bit memory interface. Without SLI fingers for expansion, the otherwise Quad-SLI-capable GPUs are locked into their current configuration.
Now more details have emerged on Galaxy's dual-GPU GTS 250. The company has released additional images showing the board with a second layer of three cooling fans mounted above, so it's safe to assume that Galaxy's card will occupy at least three motherboard slots. The card does look sharp when installed, however, lighting up the interior of the rig with a blue LED mounted in each fan.
On the performance front, the card scores slightly higher in 3DMark Vantage than the older single-GPU GTX 280, according to ShaneBaxtor.com. The core is overclocked to 675 MHz, and the shader clock is cranked up to 1,696 MHz. Galaxy's Xtreme Tuner HD overclocking software even reported a GPU temperature of 45°C, though that reading was likely taken at idle.
As we previously reported, it's unknown whether Galaxy plans to push this dual-GPU design into mass production. The product images provided by Galaxy suggest that production is possible, and we may even see the card in action at a future event. Still, we question why the company is using two-year-old GPUs in the first place. Perhaps this will be an "economy SLI combo card," or perhaps this is just a way to clear out old technology.
The only way the dual GTS 250 will sell is if it sells for well under $300.
Agreed. If it's in the $220-250 range, it should sell decently.
If it's 200 bucks, a lot of people will care.
Well, there are hardly any dx11 titles out there to take advantage of the hardware. But I would say it would be a better move to grab two 5770s or 5750s instead of two gts 250s...
Nvidia needs to come out with its new cards already so we can see a price cut from ATI.
Because dx11, like dx10, simply doesn't mean anything. It's a new version with new features that 95% of games aren't going to use in a meaningful way, because developers need to move a million copies to break even and they need dx9 customers too...
http://www.newegg.com/Product/Product.aspx?Item=N82E16814102809&Tpk=4850x2