hispeed120

Distinguished
Mar 13, 2008
ATI's 780G mobo has been out for some time now. Earlier, there was news of nVidia releasing the 'MCP7a', a similarly featured motherboard chipset for Intel processors. I'm not against either brand, but for the price, AMD processors can't touch Intel at the moment.

Unfortunately, it seems that right now only ATI cards can be paired with AMD processors for a 'hybrid' setup. Does anyone have any insight/info on the topic? Also, does Hybrid CrossFire power down the discrete GPU and only run it when graphically intensive programs are being run, or is that perk exclusive to nVidia's Hybrid SLI?

Thanks.

joefriday

Distinguished
Feb 24, 2006
Currently there is no hybrid GPU scheme for Intel motherboards. Both AMD and Nvidia have a very basic hybrid design for Socket AM2, but Nvidia has yet to develop something that advanced for Intel platforms.

Nvidia's current IGPs for Intel motherboards, launched last November, carry the MCP73 name and are designated 7050, 7100, and 7150. None of them are competitive with the current top-of-the-line IGPs from Intel, AMD, or even Nvidia's own chips for AM2. The MCP73 supports single-channel RAM only, has no HDCP, and lacks any sort of PureVideo technology (not even standard-def MPEG-2 acceleration). It's basically a port of Nvidia's old 6xxx series of IGPs for AMD boards. The MCP73 is built to compete with the low-end IGP boards, as an upgrade over using a VIA or SiS chipset, and a decent rival to Intel's old GMA 9xx-based IGPs found on everything but the G35 and G965 chipsets, which use the much more powerful X3000/X3500 IGP.
Unfortunately, right now there's no such thing as a great IGP for Intel boards. Maybe the G45 will make things better... TBH, with all the action moving toward Nehalem, I don't see Nvidia/SiS/VIA, and especially not ATI, spending any more resources developing northbridge chipsets for the old Intel FSB architecture.

hispeed120

Distinguished
Mar 13, 2008
That's too bad. I really think that 'hybrid' technology would have been a great step in the right direction. Power consumption continues to climb with video cards, and this would have been a great way to combat it. Not everyone needs the power of a high-end GPU 100% of the time. The idea just makes sense.