Video Capture: Available On AMD, Sometimes On Nvidia

High Definition - Hype And Practicality

You might have cables like these, but video capture might not work.

Looking back over the past year to year and a half, there has been a huge push towards high definition (HD) content. What was once idling on the back burner for consumer graphics cards is now front and center. The marketing machinery of the consumer electronics companies has been telling us that HD is the best thing since sliced bread, and most consumers are buying into that message. While the outlook for HD has not always been as optimistic as it is today, the trend towards greater quality in our digital entertainment is here to stay.

The previous major push for theater quality at home came in the form of audio: we have made great strides over the years, moving from simple stereo all the way to Dolby 7.1 surround sound. The story today, though, is image quality. Looking back, one forecast from 1998 indicated that only 1 million HDTVs would be in North American homes by 2003. Last year, the Consumer Electronics Association predicted that HDTVs would outsell analog television sets by 89% in 2006, and that HDTV sales would reach 15.9 million units. A study earlier this month from Pacific Media Associates commented that "unit sales for flat panel televisions and business displays in North America doubled from October to November, increasing an unprecedented 109%. Their Consumer Flat Panel Display Sell-Through Tracking Service also shows that the average street price fell 8% for the month, resulting in an increase of 93% in revenues over October."

While it is unclear which storage format will win in the end, content has been pouring out of Hollywood on both Blu-ray and HD-DVD. As these storage formats were being introduced, the graphics industry was showing off its prowess at handling the challenges of video playback. H.264 and other buzzwords hit the media, and another craze began as consumer graphics makers overhauled their video products. ATI and Nvidia both wanted to show consumers that they were ready for the next generation of content. Nvidia revamped its video playback technologies, previously marketed under CineFX: it not only improved on what it had done in the past, it gave the technology a new identity, PureVideo. With the launch of its R520 graphics processor, ATI (now AMD) took similar steps: it took its Theater product for video capture and playback, made some upgrades, and repackaged it with its Radeon products to create AVIVO (ATI Video In and Video Out).

The focus of this article is not AMD, Nvidia, or any one company in particular. After all, fads come and go, sometimes built on a gimmick or a single product, but trends rarely do: they usually reflect a shift in people's perception of a need or a want, the pendulum swings and quantum leaps. The current image quality trend has been in motion for some time, and new technologies are introduced every year to advance the desire for better moving pictures. With that in mind, I decided to look into something that not many people have been talking about: can graphics cards capture video?
