Any success in using TV output for games?

cakecake

Distinguished
Apr 29, 2002
741
0
18,980
Does anyone here have any experience using TV output for Direct3D and OpenGL games, and if so, which cards would you recommend?

This little cathode light of mine, I'm gonna let it shine!
 

Jake75

Distinguished
Aug 30, 2001
2,770
0
20,780
I use my GeForce2 MX card's TV output to watch DVD movies.
It also works fine with games (no need for AA when using the telly! :smile: )

The only problem is that you don't get the whole picture; you need a program called TV Tool or similar.

I've heard that ATI's cards can use the whole screen without any extra program, and that the quality is better.


This is my old <font color=blue>Downdated</font color=blue>, <font color=green>Uneatable</font color=green> & <font color=red> rotten </font color=red> sig.
 

cakecake

Distinguished
Apr 29, 2002
741
0
18,980
It would be neat if one of the new cards coming out in the next month would have good TV out support. It's too bad reviewers never ever test TV out (at least I've never seen anyone try to test it). They just assume that if it has TV out it works. The last time I saw TV out tested was in a review of the ATi Rage Pro chipset (more than 4 years ago). I mean, look at this post to see what I mean:

<A HREF="http://forumz.tomshardware.com/hardware/modules.php?name=Forums&file=viewtopic&p=49072#49072" target="_new">http://forumz.tomshardware.com/hardware/modules.php?name=Forums&file=viewtopic&p=49072#49072</A>

This little cathode light of mine, I'm gonna let it shine!
 

Vince604

Distinguished
Mar 1, 2002
741
0
18,980
Well, this all actually comes down to your TV. A standard TV, I believe, is 480i: roughly 480 visible horizontal lines with interlaced scanning.
(i = interlaced)
Interlaced is the standard TV scanning process where every other horizontal line of the image is drawn in 1/60th of a second, so a complete frame takes 1/30th of a second.
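To put numbers on that (a quick sketch using the nominal NTSC rate of 60 fields per second, which is an assumption on my part, not a figure from this thread):

```python
# Interlaced scanning timing, assuming a nominal 60 fields/s (NTSC).
fields_per_second = 60                      # one field = every other scanline
frames_per_second = fields_per_second / 2   # two fields make one full frame

field_time = 1 / fields_per_second          # time to draw half the lines
frame_time = 1 / frames_per_second          # time to draw the complete picture

print(round(field_time, 4))   # 0.0167 s per field
print(round(frame_time, 4))   # 0.0333 s per frame
```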


Monitors and LCD displays, on the other hand, can have resolutions of up to 1920x1440 pixels.
(A pixel is a single dot on the display of your monitor/LCD.)
So basically, the higher the resolution, the more pixels are used, thus creating a better image.
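For a rough comparison (a sketch with assumed numbers; SDTV's ~480 visible lines don't map exactly to square pixels, so 640x480 is just a common approximation):

```python
# Pixel counts: a high-end monitor resolution vs. an approximate SDTV frame.
monitor_pixels = 1920 * 1440   # resolution mentioned above
sdtv_pixels = 640 * 480        # rough square-pixel approximation of NTSC

print(monitor_pixels)                 # 2764800
print(sdtv_pixels)                    # 307200
print(monitor_pixels // sdtv_pixels)  # 9 -> the monitor draws ~9x as many pixels
```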

Well, if you have lots of money to spend, you can get an HDTV (High Definition TV), which supports 1080i. That's really good and will benefit you in the future, as I'm pretty sure everything on TV will eventually be broadcast in HDTV format. (Currently, I think Japan is the only country in the world using this format.)

And new technologies are coming out, such as LCD TVs, which have resolutions of up to 1280x768.

But if you want the best on the market, you can buy a plasma TV, which can cost as much as a new car. I think the maximum resolution, or at least the highest I have seen, is 1366x768, but I'm not sure about that.

Like it or not, it all really comes down to your TV. Your video card shouldn't be the problem unless it's really old, and even then most cards are capable of displaying a pretty high resolution.


And I'm not sure which post it was, but I remember you said S-Video (Y/C) and composite have the same video quality. That is not quite true: S-Video is a lot better than composite because it carries the brightness (Y) and color (C) information as two separate signals, so they never get mixed together. Composite, using a single RCA connector, squeezes the entire video image into one signal.
Think of sound, for example. Would you rather have 5 separate speakers and a separate subwoofer, or would you try to cram all of that sound into two speakers?
You would obviously get substantially poorer sound quality trying to put all that into two speakers.

Anyway, I hope this cleared a lot of things up!
 

cakecake

Distinguished
Apr 29, 2002
741
0
18,980
Thanks for your input. I didn't notice any difference between S-Video and the RCA cables, but then again, the TV resolution isn't good enough to show much difference anyway. What I was referring to was the whole crop of bad and buggy drivers that left few video cards able to use TV out correctly. Someone recently posted about their MSI 4400 not being able to use TV out, and I remember one of my ATi cards couldn't use TV out in Direct3D or OpenGL even with updated drivers. This is not about 2D; 2D on TV-out-enabled video cards basically always works, that's a given. What I'm worried about is buggy TV output support for 3D games.

This little cathode light of mine, I'm gonna let it shine!
 

Crashman

Polypheme
Former Staff
I got the el cheapo Radeon LE TVO (NOT the 8500LE, the original), and it works fine for games on TV, except for text in games, which tends to be too small for the TV's low resolution. I've had various ATI and nVidia cards, and the ATIs have always had the best TV-out.

<font color=blue>At least half of all problems are caused by an insufficient power supply!</font color=blue>
 

cakecake

Distinguished
Apr 29, 2002
741
0
18,980
Cool, thanks! I look forward to getting the new R300 or the Parhelia if either of those have TV out.

This little cathode light of mine, I'm gonna let it shine!
 

Vince604

Distinguished
Mar 1, 2002
741
0
18,980
Hmm, I was able to get Direct3D and OpenGL to work fine on my TV. But to get the TV out working, you have to plug in the cables before you turn on your computer, right? Otherwise your computer won't detect that it's on a TV, unless you also have a monitor connected and switch the output setting to TV from the monitor.

What happened when you tried using Direct3d or OpenGL with one of your ATI cards?
 

cakecake

Distinguished
Apr 29, 2002
741
0
18,980
The game would crash right back to the desktop or it would give an error like "could not start renderer".

This little cathode light of mine, I'm gonna let it shine!
 

Vince604

Distinguished
Mar 1, 2002
741
0
18,980
Hmm, that's odd...
Anything running on the computer should have nothing to do with the monitor or TV. It's like running a computer without any display attached: it should still work perfectly fine.
But I'm not too sure about this...
 

cakecake

Distinguished
Apr 29, 2002
741
0
18,980
1. It rendered 3D games fine on my D-sub (VGA) monitor.
2. It crashed and/or could not initialize the renderer when it was hooked up to my TV, using either S-Video or RCA.

This little cathode light of mine, I'm gonna let it shine!
 

Tiberius13

Distinguished
Jan 28, 2002
247
0
18,680
It should be pointed out that the quality of the cables used makes a huge difference in the picture you see on a TV. The cables that come with a TV / VCR / graphics card are usually just basic quality cables. If you go out and get high quality gold plated cables, you WILL notice a difference. (Well... maybe you won't, but I did some tests myself where I looked at crisp white text on a news/weather channel, switched between 'regular' and 'gold plated' RCA cables, and noticed a significant difference. It was enough to convince me to drop the $50 for the gold plated cables over the cheapo ones.) I'm sure the same holds true for composite, S-Video and component cables...

<font color=green><b>More salt than just a grain you will need with posts of mine. - Yoda</b></font color=green>
 

Vince604

Distinguished
Mar 1, 2002
741
0
18,980
Hmm, what kind of video card are you using, and at what resolution?
I remember some games wouldn't work on my monitor because my video card didn't support the correct resolution for OpenGL, but after I changed the resolution setting for OpenGL, it worked fine.
 

Tiberius13

Distinguished
Jan 28, 2002
247
0
18,680
Can you explain why gold plated cables are better?
Sure I can explain why gold is better. The little creatures that live inside all things (but are especially fond of electronics equipment) see that you have used gold plated cables, and since they know the value of gold, they are pleased and do their job in transmitting signals that much better.

Beyond that obvious explanation, there is the more vague and clouded explanation that has something to do with how gold handles electricity. At contact points where a signal must travel from one connector to another, using a material that transmits the signal with as little distortion or loss as possible is critical to ensuring a good connection. I believe the property of gold that makes it good for video connectors is that it conducts well and doesn't corrode.

All cables have some resistance, capacitance, and inductance. As such, all cables act as filters. There are other factors, such as skin effect, but suffice it to say that what comes out is always different from what went in. The idea is to minimize the loss. What you are experiencing is the result of distortion caused by the laws of electricity.

(that last paragraph I ripped off of a website - hometheaterhifi)
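As a rough illustration of the "cables act as filters" point (a sketch with made-up but plausible numbers; the 75-ohm impedance and per-metre capacitance are my assumptions, not figures from that site):

```python
import math

def rc_cutoff_hz(r_ohms: float, c_farads: float) -> float:
    """-3 dB corner frequency of a simple RC low-pass: f_c = 1 / (2*pi*R*C)."""
    return 1.0 / (2.0 * math.pi * r_ohms * c_farads)

# Model a cheap video cable as its total shunt capacitance driven by a
# 75-ohm source (the standard analog video impedance).
r = 75.0              # ohms
c = 100e-12 * 2.0     # assume ~100 pF per metre, 2 m of cable

f_c = rc_cutoff_hz(r, c)
print(f"{f_c / 1e6:.1f} MHz")   # ~10.6 MHz, above the ~4.2 MHz NTSC video band
```

A longer or higher-capacitance cable pulls that corner frequency down toward the video band, which is one concrete way cable quality can soften the picture.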

<font color=green><b>More salt than just a grain you will need with posts of mine. - Yoda</b></font color=green>
 

Vince604

Distinguished
Mar 1, 2002
741
0
18,980
Ah, I see. Thanks! Well, is gold plating the best right now, or is there something better?
And why is Monster Cable so expensive? If you compared one of their S-Video cables to a generic one that is also gold plated, would you see any difference?
 

cakecake

Distinguished
Apr 29, 2002
741
0
18,980
It's not hooked up to my system right now, but it was an extremely old 4MB card based on the Rage Pro chipset. It was probably just bad drivers. I do know for a fact that it ran D3D games fine on my monitor, but when it came to my TV, every single one kept crashing back to the desktop. It doesn't matter; it's an old card. What worried me the most (and made me post this thread) was that someone else said their GF4 Ti4400's TV-out didn't work. Telling about my own experiences was just incidental.

This little cathode light of mine, I'm gonna let it shine!
 

mulletkid

Distinguished
Dec 21, 2001
106
0
18,680
Contrary to what Vince604 had to say, I believe the TV has something to do with the image quality of TV-out display from your computer, but not everything. A lot of your image quality will come from the quality of the TV-out encoder chip on your video card. I have an old Asus TNT card with TV out on my RCA 27" (circa 1980), and it looks far crisper than my brother's Leadtek GeForce2 Ultra on a 27" Sharp (circa 1999). So if you want to even have a chance of reading text on your TV, I would make sure to buy a video card with good TV out, such as ATi's stuff. Oh, and computer gaming on a TV with surround sound is far more enjoyable than on a dinky computer monitor with dinky speakers. Even the Klipsch 5.1 system doesn't sound nearly as good as a good home theatre system. Computer games belong on big screens.
 

Vince604

Distinguished
Mar 1, 2002
741
0
18,980
Yes, you're exactly right, mulletkid. Thanks for the post; the encoder chip does also play a role in image quality.

I'm not exactly sure, but is the TV-out handled by an entirely separate chip?

And does it have anything to do with the decoder chip?