Help! TV tuner input quality is crap

December 2, 2007 6:53:18 AM

here

This is the card I picked up for $30, on sale from $60... and it won't really display the PS2 very well. In fact, it's horrible...

Why is this?? :( 
December 2, 2007 7:31:07 AM

How are you connecting?
December 2, 2007 7:32:08 AM

S-Video and the yellow RCA composite; they're both the same quality.
December 3, 2007 8:00:37 PM

Your PC's resolution is probably at least 1024x768, while standard-def TV is only 704x480. When you blow that image up to your PC's higher resolution, the lower quality of the standard-def source becomes obvious. You might need to play with the deinterlacing settings for the card or your playback software to get something acceptable-looking.
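To put rough numbers on the stretching (a quick sketch of the arithmetic; the desktop resolution here is just an assumed example):

```python
# Quick sketch: how much an SD frame gets stretched on a desktop.
# The desktop resolution is an assumption -- plug in your own.
sd_w, sd_h = 704, 480      # standard-def NTSC capture size
pc_w, pc_h = 1024, 768     # a typical desktop resolution

print(f"horizontal stretch: {pc_w / sd_w:.2f}x")   # ~1.45x
print(f"vertical stretch:   {pc_h / sd_h:.2f}x")   # 1.60x
```

Every source pixel has to cover roughly one and a half screen pixels, so the scaler has to interpolate and the image goes soft.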
December 3, 2007 8:32:41 PM

As said, your computer screen is far higher in resolution than a TV, and it's a progressive display, whereas a TV is interlaced...

For NTSC (Canada and the US; Japan has its own variant, NTSC-J), this is normally 640x480 (720x480 for DVDs).
For PAL (think Europe), it's 768x576 (720x576 for DVDs).

Now on to interlaced video... SD TV refreshes every second line (an easy way to put it), so it can make things look wrong on a monitor. There is software that can deinterlace, but it may cause a slight delay (even up to a second on a hardware TV card).
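Here's a minimal sketch (numpy only, not any real tuner API) of what "refreshes every second line" means in practice:

```python
# An interlaced frame is really two half-height fields grabbed ~1/60 s
# apart, so anything that moved between them shows comb artifacts when
# the lines are displayed together on a progressive monitor.
import numpy as np

frame = np.zeros((480, 704), dtype=np.uint8)   # one SD NTSC frame, grayscale

even_field = frame[0::2]   # lines 0, 2, 4, ... captured at time t
odd_field  = frame[1::2]   # lines 1, 3, 5, ... captured ~1/60 s later

# Simplest fix ("bob"): drop one field and line-double the other.
# No combing, but vertical detail is halved.
bob = np.repeat(even_field, 2, axis=0)
print(bob.shape)           # (480, 704) -- back to full frame height
```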

If you want to try, look for a program called DScaler. See what it does for you; it may make things look good enough to play, but the slight delay (varies with cards) may make some games no fun... never hurts to try...

http://deinterlace.sourceforge.net/

A bit more info on interlaced video (well, a picture :) )


First is Interlaced - this will be seen during motion, as each line is refreshing at a different time...

Next is De-interlaced - looks better... If I had a good TV signal it would be a lot better, but you get the idea.

Last is Field Duplication - looks OK, but not as good as de-interlaced. It looks horrible on cartoons and some games. Many low-end TV cards do this since it keeps CPU utilization low...
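For anyone curious what those three modes actually do to the pixels, here's a rough numpy sketch (my own illustration, not what any particular card's driver literally does):

```python
# Three ways a tuner/player can turn two half-height fields into a frame.
import numpy as np

def weave(even, odd):
    """Interleave both fields as captured -- shows combing on motion."""
    out = np.empty((even.shape[0] * 2, even.shape[1]), even.dtype)
    out[0::2], out[1::2] = even, odd
    return out

def field_duplicate(even, _odd):
    """Line-double one field -- cheap (low CPU), but halves vertical detail."""
    return np.repeat(even, 2, axis=0)

def motion_adaptive(even, odd, threshold=16):
    """Weave static areas, line-double where the fields disagree (motion)."""
    out = weave(even, odd)
    moving = np.abs(even.astype(int) - odd.astype(int)) > threshold
    out[1::2][moving] = even[moving]   # kill combing only where it moved
    return out
```

Field duplication is the cheapest of the three, which is why low-end cards default to it.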
December 5, 2007 2:21:14 AM

Well, it turns out deinterlacing was already on. I'm guessing I just got what I paid for...
December 5, 2007 2:47:47 AM

What card is it?

Did you try DScaler?
December 5, 2007 2:58:01 AM

The pictures shown are what's possible with software deinterlacing, and that's hard to get working in the display chain anyway. If you have a decent graphics card, there should be few to no jagged artifacts when using hardware deinterlacing. Make sure the card is outputting via DXVA or has hardware acceleration turned on.

Make sure the latest DirectX is installed. Go into the graphics driver and make sure deinterlacing is enabled; this is under Avivo Quality in the ATI Catalyst drivers. Try all possible combinations of tuner and driver settings, and restart the tuner between changes. Even the ATI TV Wonder USB tuner is not good at making changes on the fly.