I was going through my current setup, and interlaced video playback always seems to be a problem. Basically I'm playing back TV shows, mostly videotape-sourced 70's shows like All in the Family, Sanford and Son, 70's Saturday Night Live, etc. I'm running a dual core, and my current video card is an Nvidia GeForce 7300SE/7200GS running S-Video out to a CRT monitor (most of the stuff I watch was made before 1985, so HD really isn't a concern).

Most of my stuff is ripped from DVD to Xvid using Auto GK for encoding, and I play it through playback software called Raduga (a video playback interface built on Windows Media Player 11). As far as I know, Auto GK will leave interlaced video interlaced when encoding, and I don't believe I have any de-interlacing filters turned on (most of my codecs go through ffdshow).

When playing back any of the above shows, the video looks "good" but not quite there in terms of maintaining the "video" look. Static scenes still look interlaced, but anything with a lot of motion looks de-interlaced. I've also played raw VOBs directly through Windows Media Player 11 and get the same problem.

I was thinking of a new video card, but wanted to make sure there isn't anything I should tweak or check before running out and buying one. Also, if I do need a new card, could anyone suggest something that handles interlaced sources well and has an S-Video output?
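One thing I figured was worth double-checking first is whether the rips are actually still interlaced, rather than assuming Auto GK left them alone. A quick way to test that (this assumes ffmpeg is installed; `episode.avi` is just a stand-in for one of my Xvid files, not an actual filename) is ffmpeg's `idet` interlace-detection filter:

```shell
# Sample the first 500 frames through ffmpeg's idet (interlace
# detection) filter; -f null discards the output so nothing is
# written to disk. "episode.avi" is a placeholder filename.
ffmpeg -hide_banner -i episode.avi -vf idet -frames:v 500 -an -f null - 2>&1 | grep idet
```

If the detection summary reports mostly TFF/BFF frames, the file is still interlaced and any combing or smoothing is being introduced at playback; if it reports mostly progressive frames, the de-interlacing already happened somewhere in the encoding chain.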
Thanks in advance!
I'm saddened by the lack of response to your post. The computer industry has shown very little interest in supporting standard video formats that have been in use for decades. It's only in the last couple of years that Microsoft has put proper support in for 50Hz video, which is the standard refresh rate outside the Americas and Japan (that's most of the world!). The only way I've got decent interlaced playback from my media centre PC is by using an Xbox 360 as a media centre extender. Not an ideal solution, as it has extremely noisy fans, is quite laggy in its control of the PC, and uses even more electricity. I gave up trying a few years ago.
Wouldn't it be great to have a video card that output video in its native format without changing the frame rate or interlacing? I'd love to know if things have improved, though. Anyone know?