analog vs digital in gaming

mypinhead

Distinguished
Jul 28, 2004
6
0
18,510
I just purchased a Samsung SyncMaster 193P TFT LCD display and am using a GeForce FX 5200. I have the option of connecting analog or digital, and I need advice on which connection to use for gaming. I like to play shooters such as Half-Life, Unreal Tournament, Doom 3 (eventually), etc.

What would give the best performance and visual combination? Analog or digital?

thank you
 

coylter

Distinguished
Sep 12, 2003
1,322
0
19,280
Wait a second... you want to play Doom 3 with your FX 5200? OK.

hahahahahahaha, sorry, hilarious moment...

HELL, I don't think an 8-year-old, broken, 12-inch analog monitor would make a difference with your card...

Athlon 2700xp+ (oc: 3200xp+ with 200fsb)
Radeon 9800pro (oc: 410/360)
1024mb pc3200 (5-3-3-2)
Asus A7N8X-X
Edited by coylter on 07/27/04 07:48 PM.
 

priyajeet

Distinguished
May 21, 2004
2,342
0
19,780
Forget playing that game with your card, as coylter said.
But use the digital connection, as it has better refresh rates.

Hope you're not (and I'm not) confusing the analog vs. digital connection on an LCD with an LCD monitor (digital) vs. a CRT (analog),

i.e.
digital vs. analog in
1] the way the data gets from the card to the monitor
2] the way the monitor displays the data on the screen

A CRT gives better gaming because of much higher refresh rates and a higher, sharper contrast ratio... but LCDs are coming close, and they are easier on the eyes because the pixels are ALWAYS on (I think, even to show black). Unlike a CRT, where showing black means not firing the electrons at the screen.

:eek: Futile is resistance, assimilate you we will. :eek:
 

pauldh

Illustrious
I would use the digital input on that Samsung as long as your video card has DVI out. But, seriously, that FX 5200 needs to go if you want to play the latest shooters.


ABIT IS7, P4 2.6C, 1GB Corsair XMS 4000 Pro Series, Radeon 9800 Pro, Santa Cruz, TruePower 430watt
 

coolsquirtle

Distinguished
Jan 4, 2003
2,717
0
20,780
THAT's AWESOME

I CAN'T F***ING WAIT TO SEE YOUR FACE WHEN YOU INSTALL DOOM 3 and the screen says

"this computer is too f***ing crappy to play this game at the lowest loser setting, please get a life and don't insult Doom 3"

:D

It doesn't matter if you have two 23-inch top-of-the-line LCD monitors, you can't even beat the first level without restarting the computer more than 50 times.

RIP Block Heater....HELLO P4~~~~~
120% nVidia Fanboy+119% Money Fanboy
GeForce 6800 Ultra--> The Way we thought FX 5800Ultra is meant to be played
THGC's resident Asian and nVboy :D
 

mypinhead

Distinguished
Jul 28, 2004
6
0
18,510
OK, OK, OK, I got the point. Is there a decent card that I can get for $250 or less? I need to stay within some budget.

I guess my next problem is my processor? I have an AMD XP 2400 with 1 GB of PC3200 DDR RAM.
 
But use Digital connection as it has better refresh rates.
Huh?

No, actually your DVI connection will be bandwidth-limited due to the TMDS speed and the way the signal is handled.

DB-15 connectors have better bandwidth (the best being BNC), and will scale better too.

The max for MOST DVI connectors is just above 1600x1200 at 60 Hz. High-quality dual-channel cards (which the FX 5200 definitely is NOT) will go to around 1920x1440 at 60 Hz. DB-15 can handle higher resolutions and refresh rates. Of course, BNC connectors are the best for truly serious configurations, either that or multi-DVI solutions.
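As a back-of-the-envelope check on those limits (my own arithmetic, not from the thread; the ~25% blanking overhead is an assumed ballpark, since real mode timings vary), you can estimate the pixel clock a mode needs and compare it against single-link (165 MHz) and dual-link (330 MHz) TMDS:

```python
# Rough pixel-clock math for DVI links (illustrative numbers only).
# The ~25% blanking overhead is an assumption, a common ballpark for
# CRT-style timings; actual VESA modes differ somewhat.

def pixel_clock_mhz(width, height, refresh_hz, blanking=0.25):
    """Approximate pixel clock (MHz) a mode needs, including blanking."""
    return width * height * refresh_hz * (1 + blanking) / 1e6

SINGLE_LINK_MHZ = 165   # one TMDS link
DUAL_LINK_MHZ = 330     # two TMDS links sharing one clock

for w, h, hz in [(1600, 1200, 60), (1920, 1440, 60), (2048, 1536, 60)]:
    need = pixel_clock_mhz(w, h, hz)
    fits = ("single" if need <= SINGLE_LINK_MHZ
            else "dual" if need <= DUAL_LINK_MHZ
            else "neither")
    print(f"{w}x{h}@{hz}Hz needs ~{need:.0f} MHz -> {fits} link")
```

By this estimate, 1600x1200 at 60 Hz squeezes under one 165 MHz link, while 1920x1440 and 2048x1536 at 60 Hz need a dual link, which lines up with the single-link vs. dual-link distinction being argued here.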


- You need a licence to buy a gun, but they'll sell anyone a stamp (or internet account)! - RED GREEN GA to SK :evil:
 

priyajeet

Distinguished
May 21, 2004
2,342
0
19,780
Can you provide a link for what you say?
For now I am going to stick to my theory. Also, DVI doesn't need digital-to-analog and analog-to-digital conversions. The articles/white paper below show how high TMDS bandwidth can be and what resolutions DVI can support.

http://computer.howstuffworks.com/monitor5.htm
http://www.proxima.com/downloads/pdf/DVI-WhitePaper.pdf
http://www.blackbox.com/tech_docs/tech_overviews/video_conn.html

And why do the articles say:
1] "The DVI interface provides high bandwidth for today’s devices as well as plenty of headroom for the future."

2] "DVI is based on Silicon Image’s Transition Minimized Differential Signaling (TMDS) technology, which provides a high-bandwidth digital connection between the host computer and a display device."

3] "allows digital displays to reach resolutions up to 2048 X 1536 (QXGA) and beyond."

If the industry considered HD15 (DB15) or BNC to have higher bandwidth, why a new standard?

:eek: Futile is resistance, assimilate you we will. :eek:
 

mypinhead

Distinguished
Jul 28, 2004
6
0
18,510
Thanks. So, a 9800 with 128 MB of RAM is decent enough? Would this still be better than getting the 9600 with 256 MB of RAM?
 

priyajeet

Distinguished
May 21, 2004
2,342
0
19,780
see <A HREF="http://forumz.tomshardware.com/hardware/modules.php?name=Forums&file=viewtopic&p=407967#407967" target="_new">this thread</A>

:eek: Futile is resistance, assimilate you we will. :eek:
 

mypinhead

Distinguished
Jul 28, 2004
6
0
18,510
Thank you. Is there a decent card that I can get for $250 or less? I need to stay within some budget. I see you have the 9800 Pro. Is 128 MB of RAM enough?

I guess my next problem is my processor? I have an AMD XP 2400 with 1 GB of PC3200 DDR RAM.
 
Actually READ your second whitepaper.

They describe the single channel/link limitation right in it:

"The limit is a bandwidth of about 165MHz, which equates to 165 million pixels per second. A single TMDS link has a bandwidth of 165 MHz, which is enough to display resolutions of up to 1600 x 1200 (UXGA) at 60Hz."

And your quote about the MAX misses the ESSENTIAL point I made about dual/multi channels/links, which is mentioned just before your quote, or in this similar quote:

<b>"which is the first standard specifically written for the TMDS digital interface allows for up to two TMDS links, a total of 6 channels sharing a single clock, to be integrated into a single DVI connector to support a minimum bandwidth of 330 mega pixels per second.</b> That is enough bandwidth to enable digital displays to reach resolutions of up to 2048 x 1536 (QXGA)."

Most cards (outside of Matrox cards) are single-link/channel.

If the industry considered HD15 (DB15) or BNC to have higher bandwidth, why a new standard?
Quality. DVI also sends more information per second for the same resolution, and thus runs out of bandwidth quickly on most DVI cards, which use a single link. Another advantage is of course the reduction of noise in the signal. And the last one you know already: avoiding image/signal degradation from converting to analogue and back again.

If you want to see how the cards fare, and how often they FAIL to even reach 1600x1200, forget whitepapers and look at this actual test by ExtremeTech:

<A HREF="http://www.extremetech.com/article2/0,1558,1367918,00.asp" target="_new">http://www.extremetech.com/article2/0,1558,1367918,00.asp</A>

(been a while since I posted that)

BTW, DVI connectors aren't just for digital, and there are more than 2 types, unlike what Blackbox says.
But I'm sure you already knew that, right? :wink:


- You need a licence to buy a gun, but they'll sell anyone a stamp (or internet account)! - RED GREEN GA to SK :evil:
 

priyajeet

Distinguished
May 21, 2004
2,342
0
19,780
How about giving me a link showing DB15 bandwidth and a comparison to DVI? I think the argument was about which has higher bandwidth, not limitations.

And yes, I know the different standards/types of DVI.
BTW:
BTW, DVI connectors aren't just for digital, and there are more than 2 types, unlike what Blackbox says.

I think you should READ that line carefully. There is a word - main - in it.

:eek: Futile is resistance, assimilate you we will. :eek:
 
I think the argument was about which has higher bandwidth, not limitations.
What do you think we are talking about, theoretical? No, I'm not talking about the potential bandwidth of the design, but the actual bandwidth of real cards. BTW, the question (not argument) was about refresh rates. And the FX 5200 will NOT achieve better refresh rates with its DVI connection, because of its design limitations.

Your statement was "But use Digital connection as it has better refresh rates." to which my reply was straightforward: "No, actually your DVI connection will be bandwidth limited due to the TMDS speed and the way the signal is handled." That is directly related to the FX's design with a single-link TMDS; you will NOT find a DUAL link on a low-end graphics card, let alone a QUAD (which is not supported by the DVI connector you're thinking of). And if you want to talk about theoretical, let me pick the cable length. For average cards (especially the one mentioned), the DVI connector will provide lower refresh rates.

Now as for the technology and the theoretical limits: DVI maxes out on the connectors you're talking about at 330 MHz. A NORMAL DB15 is usually limited to 350 MHz; however, a 'wide' DB15 maxes out at 700 MHz, and an 'ultra-wide' HD DB-15 has a max above 1 GHz. As for BNC, each channel can carry above 300 MHz. So you figure it out. Even a quad DVI would have a max of just over 650 MHz. HD DB-15 is designed for above 1 GHz, and BNC can handle even more. The reason they 'usually' don't is because of issues like EMI.
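Taking those claimed ceilings at face value (the MHz figures are the poster's numbers above; the ~25% blanking overhead is my own assumption), here is a rough sketch of the maximum refresh rate a given pixel-clock ceiling allows at a resolution:

```python
# Rough estimate of the max refresh rate a pixel-clock ceiling allows.
# The ~25% blanking overhead is an assumption; real mode timings vary.

def max_refresh_hz(clock_mhz, width, height, blanking=0.25):
    """Highest refresh rate (Hz) a clock ceiling supports at a resolution."""
    return clock_mhz * 1e6 / (width * height * (1 + blanking))

# Single-link DVI (165 MHz) at common gaming resolutions of the era:
for w, h in [(1280, 1024), (1600, 1200)]:
    print(f"{w}x{h}: ~{max_refresh_hz(165, w, h):.0f} Hz max")
```

By this estimate, a single 165 MHz link tops out around 100 Hz at 1280x1024 but under 70 Hz at 1600x1200, which is why the single-link vs. dual-link distinction matters for high-refresh gaming.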

And yes, I know the different standards/types of DVI.
OK, list them.


- You need a licence to buy a gun, but they'll sell anyone a stamp (or internet account)! - RED GREEN GA to SK :evil:
 

priyajeet

Distinguished
May 21, 2004
2,342
0
19,780
DVI 1.0 specification:

DVI-D: Digital only.
DVI-I: Digital and analog.
Dual Link: Dual-link DVI supports 2x165 MHz (2048x1536 at 60 Hz, 1920x1080 at 85 Hz). A dual-link implementation utilizes all 24 of the available pins.
Single Link: Single-link DVI supports a maximum bandwidth of 165 MHz (1920x1080 at 60 Hz, 1280x1024 at 85 Hz). A single-link implementation utilizes 12 of the 24 available pins.
DVI-A: Analog only; not part of the specification.


:eek: Futile is resistance, assimilate you we will. :eek:
 
Nice cut and paste.

Right down to the TV-based resolution.

I'm surprised you didn't include the minimum resolution.

Hmm, reminds me of someone else.

For the single-link/channel FX 5200, it's pretty obvious which one would provide the max bandwidth.


- You need a licence to buy a gun, but they'll sell anyone a stamp (or internet account)! - RED GREEN GA to SK :evil:
 

Wolfy

Distinguished
Aug 30, 2003
1,036
0
19,280
You have to respect the way you guys completely took this topic over :) Is anyone going to answer the man's question?

I live in Ireland so I don't know how far $250 will take you, but if you're planning on playing D3 you will want something like a 9800 Pro AT LEAST (according to one topic, a 128MB card is for medium textures, a 256MB card for high, and 512MB, haha, for ultra), and you will want a better CPU. In fairness, until it comes out no one reeeeeally knows what the game will play like, but with an XP2400, even with something like a 9800 Pro or XT, the game won't be playable at max settings... D3 aside, your FX 5200 is a piece of muck, and your system would benefit from a 9800 Pro in any other game, but that's your choice.

EDIT: Oh, and if you are into gaming: while a static picture is sharper on a TFT, unless the response time is excellent a TFT isn't as good as a CRT for playing games, particularly FPSs.

"Its only when you look at ants closely with a magnifying glass on a sunny day that you realise how often they burst into flames"
Edited by wolfy on 07/28/04 05:22 AM.
 
Well, really, coylter answered it correctly from the start, so the rest is simply an open thread for discussion/bickering.

But talking about NOT answering the man's questions, what the heck does an R9800 have to do with deciding which connector to use on a hybrid LCD?

As for the original question, it doesn't matter. DVI will probably give him the truest image, but it likely won't make a big difference either way.


- You need a licence to buy a gun, but they'll sell anyone a stamp (or internet account)! - RED GREEN GA to SK :evil:
 

gobeavers

Distinguished
Jun 7, 2003
446
0
18,780
Hate to say it, Grape (I'm one of your Grape-Nutz), but you did hijack his thread. His original question was answered, but he asked the follow-up question: "I see you have the 9800 Pro. Is 128 MB of RAM enough? I guess my next problem is my processor? I have an AMD XP 2400 with 1 GB of PC3200 DDR RAM." So I don't see it as an open thread; he still has a question. A 9800 doesn't have anything to do with LCD connectors, but did you not see his other questions?

In response to the original poster: a 9800 Pro with 128 MB of RAM will be worlds better than your 5200, but as for whether to splurge for 256 MB, I don't know if the card is fast enough to take advantage of it or not. Grape could tell you that if he would stop arguing about DVI :D... From what I have heard, Doom 3 is more video-card dependent than processor dependent, so your processor may do fine (but we won't know till it actually comes out).

"Go forward until the last round is fired and the last drop of gas is expended...then go forward on foot!" -Patton
 

pauldh

Illustrious
Tough question. A Radeon 9800 Pro with 128MB of RAM would be better than a Radeon 9600XT with 256MB of RAM, no doubt about that. How much more would it cost you for a 256MB R9800 Pro? Probably too much. I don't think it's worth $50+ for the extra memory when, up to this point, they perform identically for the most part, and so far the price difference is often $70-$100. As far as Doom 3, we are just days away from seeing HardOCP put out a hardware guide that should be awesome for helping see how various systems will play Doom 3 and at what resolutions/quality. So the 128MB vs. 256MB question is soon to be answered for sure.

I really wouldn't look to spend much over $200 or so on a video card to pair with that XP2400+, unless a mobo/CPU upgrade is in the near future. It would surely help in some games, but not across the board, as the XP2400+ will more often than not hinder the new X800/6800 cards from performing noticeably better than the 9800 Pro. That's my opinion, anyway.


ABIT IS7, P4 2.6C, 1GB Corsair XMS 4000 Pro Series, Radeon 9800 Pro, Santa Cruz, TruePower 430watt
 

priyajeet

Distinguished
May 21, 2004
2,342
0
19,780
I never said I remember the DVI types, but I knew all three, as I use each one of them at work...

Apple has another type of DVI connector... a weirdo one. I think it links USB and FireWire too... but I'm not sure.

:eek: Futile is resistance, assimilate you we will. :eek:
 

mypinhead

Distinguished
Jul 28, 2004
6
0
18,510
"For the single-link/channel FX 5200, it's pretty obvious which one would provide the max bandwidth."
_________________________________________

Thank you everyone for the spirited discussion to help me!

I am a little more confused. Let's forget the current video card I have, the FX 5200. I am currently shopping around to buy a new one, and I am going in the direction of the Radeon 9800 Pro.

I did not realize that there were so many variables involved in this. With the 9800 Pro, is it better to go analog or digital, or does it not really matter?

Thank you again!
 

cleeve

Illustrious
With the 9800 Pro, is it better to go analog or digital, or does it not really matter?

I think the answer is "it doesn't really matter", especially since every 9800 PRO out there will have a DVI as well as an analog output. Hell, try them both and see which you like better. I suspect it won't make a perceivable difference in a real-world situation, however. But that's just a guesstimate.

The 9800 PRO is an awesome card, and 128 megs should do you just fine for a couple of years at least. Much better than, say, a 9600 PRO.

On a final note: I'd try out Doom 3 with your 5200 before assuming it won't run it. Because Nvidia and id are in cahoots, I wouldn't be surprised if the 5200 ran Doom 3 a bit better than, say, a 9600 non-Pro at lower resolutions. I seem to recall a benchmark of the D3 alpha confirming this a while back... just a thought, anyway.


________________
Radeon 9700 PRO (o/c 329/337)
AthlonXP ~2750+ (2400+ @ 2208 MHz)
3dMark03: 4,876