According to market data, integrated chipsets outsell discrete graphics cards each quarter. The obvious reason behind this phenomenon is cost: systems with graphics integrated into the chipset are less expensive than ones with a separate graphics card. While these integrated platforms may help keep greenbacks in your wallet, they are generally underpowered compared to systems with a dedicated graphics card.
This raises a few questions regarding integrated chipsets. First, how well can integrated chipsets handle graphically intensive games, as opposed to Flash animation on the Web? Of course, integrated graphics cannot handle titles like Elder Scrolls IV: Oblivion, BioShock or Crysis, but I would expect current integrated chipsets to be able to play Doom 3 (OpenGL) or F.E.A.R. (DX 9.0c). I am not expecting them to handle massive resolutions like 2560x1600, but I would expect them to deliver playable frame rates at 1024x768 (CRT) and possibly 1280x1024 (LCD). And if they cannot handle frame rates at these "normal" resolutions, at what settings can we get decent frame rates?
Lastly, if the chipsets can't play games well, can they instead serve in a home theater PC (HTPC) for DVD and HD DVD playback? And if a chipset cannot play movies well enough, would you be better off with an inexpensive graphics card? That is what we intend to nail down. We did not have Nvidia's 6150 chipset for this review, but the article is targeted at the broader question: is integrated okay? To see how well Nvidia fares, we are planning a follow-up article when Nvidia's next integrated chipset is unveiled later this month.