Your question

analog vs digital in gaming

Last response: in Graphics & Displays
July 27, 2004 11:13:46 PM

I just purchased a Samsung SyncMaster 193P TFT LCD display and use the GeForce FX 5200. I have the option of connecting analog or digital. I need advice on which connection to use for gaming. I like to play shooter games such as Half-Life, Unreal Tournament, Doom 3 (eventually), etc.

What would give the best performance and visual combination? Analog or digital?

thank you


July 27, 2004 11:45:53 PM

Wait a second... you want to play Doom 3 with your FX 5200? OK.

hahahahahahahahaha, sorry, hilarious moment...

HELL, I don't think an 8-year-old, broken, 12-inch analog monitor would make a difference with your card...

Athlon 2700xp+ (oc: 3200xp+ with 200fsb)
Radeon 9800pro (oc: 410/360)
1024mb pc3200 (5-3-3-2)
Asus A7N8X-X
July 28, 2004 12:56:18 AM

Forget playing that game with your card, as coylter said.
But use the digital connection, as it has better refresh rates.

Hope you're not (and neither am I) confusing analog vs. digital connections on an LCD with
LCD monitor (digital) vs. CRT (analog),

i.e., digital vs. analog in
1] the way the data gets from the card to the monitor
2] the way the monitor displays the data on the screen

A CRT gives better gaming because of much higher refresh rates and a higher, sharper contrast ratio... but LCDs are coming close, and they're easier on the eyes because the pixels are ALWAYS on (I think even to show black). Unlike a CRT, where showing black means not firing the electrons at the screen.
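As a rough illustration of the refresh-rate point above, here is a minimal sketch comparing panel response time to frame time. The 16 ms and 25 ms response-time figures are assumed typical values for 2004-era TFT panels, not numbers taken from this thread:

```python
# Sketch: does an LCD's pixel response keep up with the frame rate?
# The 16 ms and 25 ms figures below are assumed typical response times
# for 2004-era TFT panels, not measurements from this thread.

def frame_time_ms(refresh_hz):
    """Time between frames at a given refresh rate, in milliseconds."""
    return 1000.0 / refresh_hz

for response_ms in (16.0, 25.0):
    for hz in (60, 85, 100):
        ft = frame_time_ms(hz)
        ok = response_ms <= ft
        print(f"{response_ms:>4} ms panel @ {hz} Hz: frame time {ft:.1f} ms -> "
              f"{'keeps up' if ok else 'ghosting likely'}")
```

At 60 Hz the frame time is about 16.7 ms, so a 25 ms panel smears across frames while a 16 ms panel just keeps up, which is why fast-response TFTs matter for shooters.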

:eek: Futile is resistance, assimilate you we will. :eek:
July 28, 2004 1:09:39 AM

I would use the digital input on that Samsung as long as your video card has DVI out. But, seriously, that FX 5200 needs to go if you want to play the latest shooters.


ABIT IS7, P4 2.6C, 1GB Corsair XMS 4000 Pro Series, Radeon 9800 Pro, Santa Cruz, TruePower 430watt
July 28, 2004 1:47:18 AM

THAT'S AWESOME

I CAN'T F***ING WAIT TO SEE YOUR FACE WHEN YOU INSTALL DOOM 3 and the screen says

"this computer is too f***ing crappy to play this game at the lowest loser setting, please get a life and don't insult Doom 3"

:D 

It doesn't matter if you have two 23-inch top-of-the-line LCD monitors; you can't even beat the first level without restarting the comp more than 50 times.

RIP Block Heater....HELLO P4~~~~~
120% nVidia Fanboy+119% Money Fanboy
GeForce 6800 Ultra--> The Way we thought FX 5800Ultra is meant to be played
THGC's resident Asian and nVboy :D 
July 28, 2004 2:48:46 AM

OK, OK, OK, I got the point. Is there a decent card I can get for $250 or less? I need to keep to a budget.

I guess my next problem is my processor? I have an AMD XP 2400 w/ 1GB of PC3200 DDR RAM.
July 28, 2004 4:26:29 AM

Quote:
But use Digital connection as it has better refresh rates.

Huh?

No, actually your DVI connection will be bandwidth limited due to the TMDS speed and the way the signal is handled.

DB-15 connectors have better bandwidth (the best being BNC), and will scale better too.

The max for MOST DVI connectors is just above 1600x1200 at 60Hz. High-quality dual-channel cards (which the FX 5200 definitely is NOT) will go to around 1920x1440 at 60Hz. DB-15 can handle higher resolutions and refresh rates. Of course, BNC connectors are the best for truly serious configurations, either that or multi-DVI solutions.
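The single- vs. dual-link limits being argued here can be sanity-checked with raw pixel rates. A minimal sketch, using the 165 and 330 megapixel-per-second budgets quoted in this thread; it ignores blanking overhead, so real-world ceilings are somewhat lower:

```python
# Sketch: raw pixel rate for a video mode vs. the TMDS link budgets
# quoted in the thread (165 Mpx/s single link, 330 Mpx/s dual link).
# Blanking intervals are ignored, so actual limits are a bit lower.

SINGLE_LINK = 165e6  # pixels per second
DUAL_LINK = 330e6

def pixel_rate(width, height, refresh_hz):
    """Active pixels pushed per second for a given mode."""
    return width * height * refresh_hz

for w, h, hz in [(1600, 1200, 60), (1920, 1440, 60), (2048, 1536, 60)]:
    rate = pixel_rate(w, h, hz)
    link = ("single link" if rate <= SINGLE_LINK
            else "dual link" if rate <= DUAL_LINK
            else "beyond dual link")
    print(f"{w}x{h} @ {hz} Hz needs {rate / 1e6:.0f} Mpx/s -> {link}")
```

Even this optimistic count puts 1920x1440 at 60 Hz just past the single-link budget, which lines up with the claim that only dual-channel cards reach that mode.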


- You need a licence to buy a gun, but they'll sell anyone a stamp (or internet account)! - RED GREEN GA to SK :evil:
July 28, 2004 4:53:37 AM

Can you provide a link to back up what you say?
For now I am going to stick to my theory. Also, DVI doesn't need digital-to-analog and vice-versa conversions. The articles/white papers below show how fast the TMDS bandwidth can be and what resolutions DVI can support.

http://computer.howstuffworks.com/monitor5.htm
http://www.proxima.com/downloads/pdf/DVI-WhitePaper.pdf
http://www.blackbox.com/tech_docs/tech_overviews/video_...

And why do the articles say
1] "The DVI interface provides high bandwidth for today’s devices as well as plenty of headroom for the future."

2] "DVI is based on Silicon Image’s Transition Minimized Differential Signaling (TMDS) technology, which provides a high-bandwidth digital connection between the host computer and a display device."

3] "allows digital displays to reach resolutions up to 2048 X 1536 (QXGA) and beyond."

If the industry considered HD15 (DB15) or BNC to have higher bandwidth, why a new standard?

:eek: Futile is resistance, assimilate you we will. :eek:
July 28, 2004 5:04:53 AM

Thanks. So, a 9800 w/ 128MB of RAM is decent enough? Would this still be better than getting the 9600 with 256MB of RAM?
July 28, 2004 5:07:26 AM

Thank you. Is there a decent card I can get for $250 or less? I need to keep to a budget. I see you have the 9800 Pro. Is 128MB of RAM enough?

I guess my next problem is my processor? I have an AMD XP 2400 w/ 1GB of PC3200 DDR RAM.
July 28, 2004 5:48:05 AM

Actually READ your second whitepaper.

They describe the single channel/link limitation right in it;

<font color=purple>"The limit is a bandwidth of about 165MHz, which equates to 165 million pixels per second. A single TMDS link has a bandwidth of 165 MHz, which [is] enough to display resolutions of up to 1600 x 1200 (UXGA) at 60Hz."</font color=purple>

And your quote about the MAX misses the ESSENTIAL point I made about dual/multi channels/links, which is mentioned just before your quote, or the similar quote;

<font color=purple><b>"which is the first standard specifically written for the TMDS digital interface allows for up to two TMDS links, a total of 6 channels sharing a single clock, to be integrated into a single DVI connector to support a minimum bandwidth of 330 mega pixels per second.</b> That is enough bandwidth to enable digital displays to reach resolutions of up to 2048 x 1536 (QXGA)."</font color=purple>

Most cards (outside of Matrox Cards) are single link/channel.

Quote:
If the industry say HD15 (DB15) or BNC to be of higher bandwidth, y a new standard.

Quality. DVI also sends more information per second for the same resolution, and thus runs out of bandwidth quickly for most DVI cards that use a single link. Another advantage is of course the reduction of noise in the signal. And the last one you know already, which is avoiding image/signal degradation from converting to analogue and back again.

If you want to see how the cards fare, and how often they FAIL to even reach 1600x1200, forget white papers and look at this actual test by ExtremeTech;

http://www.extremetech.com/article2/0,1558,1367918,00.a...

(been a while since I posted that)

BTW, DVI connectors aren't just for digital, and there are more than 2 types, unlike what Blackbox says.
But I'm sure you already knew that, right? :wink:


- You need a licence to buy a gun, but they'll sell anyone a stamp (or internet account)! - RED GREEN GA to SK :evil:
July 28, 2004 6:08:48 AM

How about giving me a link showing DB15 bandwidth and a comparison to DVI? I think the argument was about which has higher bandwidth, not about limitations.

And yes, I know the different standards/types of DVI.
BTW
Quote:
BTW, DVI connectors aren't just for Digital, and there's more than 2 types unlike what blackbox says.


I think you should READ that line carefully. There is a word - main - in it.

:eek: Futile is resistance, assimilate you we will. :eek:
July 28, 2004 7:02:23 AM

Quote:
I think the argument was about which has higher bandwidth and not limitations.

What do you think we are talking about, theoretical? No, I'm not talking about the potential bandwidth of the design, but the actual bandwidth of real cards. BTW, the question, not argument, was about refresh rates. And the FX5200 will NOT achieve better refresh rates with its DVI connection because of its design limitations.

Your statement was <font color=blue>"But use Digital connection as it has better refresh rates."</font color=blue> to which my reply was straightforward; <font color=purple>"No, actually your DVI connection will be bandwidth limited due to the TMDS speed and the way the signal is handled."</font color=purple> which is directly related to the FX's design with single-link TMDS(s). You will NOT find a DUAL link on a low-end graphics card, let alone a QUAD (which is not supported by the DVI connector you're thinking of). And if you want to talk about theoretical, let me pick the cable length. For average cards (especially the one mentioned) the DVI connector will provide lower refresh rates.

Now, as for the technology and theoretical limits: DVI maxes out on the connectors you're talking about at 330MHz. A NORMAL DB15 is usually limited to 350MHz; however, 'wide' DB15 maxes at 700MHz, and 'ultra-wide' HD DB-15 has a max above 1GHz. As for BNC, each channel can carry above 300MHz. So you figure it out. Even a quad DVI would have a max of just over 650MHz. HDDB-15 is designed for above 1GHz, and BNC can handle even more. The reason they 'usually' don't is because of issues like EMI.
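Taking the pixel-clock ceilings in the post above at face value (they are the poster's figures, not official specs), a rough sketch of the refresh rate each connector could drive at 2048x1536, assuming a flat 25% blanking overhead (real timing formulas vary):

```python
# Sketch: max refresh rate each connector could drive at 2048x1536,
# using the pixel-clock ceilings claimed in the post above (the
# poster's figures, not official specs). A flat 25% blanking overhead
# is assumed; real video timings vary.

CEILINGS_MHZ = {
    "DVI dual link": 330,
    "DB-15 (normal)": 350,
    "DB-15 (wide)": 700,
    "HD DB-15 (ultra-wide)": 1000,
}

def max_refresh(width, height, ceiling_mhz, blanking=0.25):
    """Approximate max refresh (Hz) for a resolution under a pixel clock cap."""
    total_pixels = width * height * (1 + blanking)  # active + blanking
    return ceiling_mhz * 1e6 / total_pixels

for name, mhz in CEILINGS_MHZ.items():
    print(f"{name:>22}: ~{max_refresh(2048, 1536, mhz):.0f} Hz at 2048x1536")
```

Under these assumptions dual-link DVI tops out around the low 80s of Hz at that resolution, while the higher analog ceilings leave far more headroom, which is the scaling point being made.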

Quote:
And yes I know diff standards/types of DVI.

OK, list them.


- You need a licence to buy a gun, but they'll sell anyone a stamp (or internet account)! - RED GREEN GA to SK :evil:
July 28, 2004 8:16:58 AM

DVI 1.0 Specification:

DVI-D: Digital only
DVI-I: Digital and analog
Dual link: supports 2x165 MHz (2048x1536 at 60 Hz, 1920x1080 at 85 Hz). A dual-link implementation utilizes all 24 of the available pins.
Single link: supports a maximum bandwidth of 165 MHz (1920x1080 at 60 Hz, 1280x1024 at 85 Hz). A single-link implementation utilizes 12 of the 24 available pins.

DVI-A: Analog only; not part of the specification.


:eek: Futile is resistance, assimilate you we will. :eek:
July 28, 2004 8:55:22 AM

Nice cut and paste.

Right down to the TV-based resolution.

I'm surprised you didn't include minimum resolution.

Hmm, reminds me of someone else.

For the single link/channel FX5200, it's pretty obvious which one would provide the max bandwidth.


- You need a licence to buy a gun, but they'll sell anyone a stamp (or internet account)! - RED GREEN GA to SK :evil:
July 28, 2004 9:19:30 AM

You have to respect the way you guys completely took this topic over :) Is anyone going to answer the man's question?

I live in Ireland so I dunno how far $250 will take you, but if you're planning on playing D3 you will want something like a 9800 Pro AT LEAST (according to one topic, a 128MB card is for medium textures, a 256MB card for high, and 512MB, haha, for ultra), and you will want a better CPU. But in fairness, till it comes out no one reeeeeally knows what the game will play like; with an XP 2400, even with something like a 9800 Pro or XT, the game won't be playable at max settings... but D3 aside, your FX 5200 is a piece of muck and your system would benefit from a 9800 Pro for any other game, but that's your choice.

EDIT: Oh, and if you are into gaming: while a static picture is sharper on a TFT, unless the response time is excellent a TFT ain't as good as a CRT for playing games, particularly FPSs.

"Its only when you look at ants closely with a magnifying glass on a sunny day that you realise how often they burst into flames"
July 28, 2004 9:26:43 AM

Well, really, coylter answered it correctly from the start, so the rest is simply an open thread for discussion/bickering.

But talking about NOT answering the man's questions, what the heck does an R9800 have to do with deciding which connector to use on a hybrid LCD?

As for the original question, it doesn't matter. DVI will probably give him the truest image, but it likely won't make a big difference either way.


- You need a licence to buy a gun, but they'll sell anyone a stamp (or internet account)! - RED GREEN GA to SK :evil:
July 28, 2004 9:33:15 AM

Hate to say it, grape (I'm one of your grape-nutz), but you did hijack his thread. His original question was answered, but he asked the follow-up question "I see you have the 9800 pro. is 128 mb of ram enough? I guess my next problem is my processor? I have a amd xp 2400 w/ 1 gig of pc3200 ddr ram" So I don't see it as an open thread; he still has a question. A 9800 doesn't have anything to do with LCD connectors, but did you not see his other questions?

In response to the original poster: A 9800 Pro with 128MB of RAM will be worlds better than your 5200, but as for whether to splurge for 256MB, I don't know if the card is fast enough to take advantage of it or not. Grape could tell you that if he would stop arguing about DVI :D ... From what I have heard, Doom 3 is more video-card dependent than processor dependent, so your processor may do fine (but we won't know till it actually comes out).

"Go forward until the last round is fired and the last drop of gas is expended...then go forward on foot!" -Patton
July 28, 2004 12:55:12 PM

Tough question. A Radeon 9800 Pro with 128MB of RAM would be better than a Radeon 9600XT with 256MB of RAM. No doubt about that. How much more would it cost you for a 256MB R9800 Pro? Probably too much. I don't think it's worth $50+ for the extra memory, when up to this point they perform identically for the most part. And so far the price difference is often $70-$100. As far as Doom 3, we are just days away from seeing HardOCP put out a hardware guide that should be awesome for helping see how various systems will play Doom 3 and at what resolutions/quality. So the 128MB vs 256MB question is soon to be answered for sure.

I really wouldn't look to spend much over $200 or so on a video card to pair with that XP2400+, unless a mobo/CPU upgrade is in the near future. It would surely help in some games, but not across the board, as the XP2400+ will more often than not hinder the new X800/6800 cards from performing noticeably better than the 9800 Pro. That's my opinion anyway.
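On the 128MB vs. 256MB question above, a rough sketch of how much of that memory the framebuffer alone consumes. The buffer model below is a simplified assumption (32-bit color, double buffered, plus a 32-bit depth/stencil buffer, all scaled by the AA sample count), not a measurement of the 9800 Pro; real hardware compresses these buffers, so actual use is lower:

```python
# Rough sketch of framebuffer memory at various settings, to show why
# 128MB vs 256MB mostly matters for texture storage rather than the
# framebuffer itself. Simplified model: 32-bit color (front + back
# buffer) + 32-bit depth/stencil, scaled by AA samples. Real cards
# (including the 9800 Pro) compress these buffers, so actual use is lower.

def framebuffer_mb(width, height, aa_samples=1):
    bytes_per_px = 4
    color = 2 * width * height * bytes_per_px * aa_samples  # front + back
    depth = width * height * bytes_per_px * aa_samples      # depth/stencil
    return (color + depth) / (1024 * 1024)

for res in [(1024, 768), (1600, 1200)]:
    for aa in (1, 4):
        mb = framebuffer_mb(*res, aa_samples=aa)
        print(f"{res[0]}x{res[1]} {aa}xAA: ~{mb:.0f} MB framebuffer")
```

Even the worst case here (1600x1200 with 4xAA) leaves a sizeable chunk of a 128MB card for textures, which is why the GPU's speed, not the extra VRAM, tends to be the limiting factor.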


ABIT IS7, P4 2.6C, 1GB Corsair XMS 4000 Pro Series, Radeon 9800 Pro, Santa Cruz, TruePower 430watt
July 28, 2004 3:30:20 PM

I never said I remember the DVI types. But I knew all three, as I use each one of them at work...

Apple has another type of DVI connector... a weirdo one. I think it links USB and FireWire too... but not sure.

:eek: Futile is resistance, assimilate you we will. :eek:
July 28, 2004 3:30:52 PM

"For the single link/channel FX5200, it's pretty obvious which one would provide the max bandwidth."
_________________________________________

Thank you everyone for the spirited discussion to help me!

I am a little more confused. Let's forget the current video card I have, the FX5200. I am currently shopping around to buy a new one. I am heading in the direction of the Radeon 9800 Pro.

I did not realize that there were so many variables involved in this. With the 9800 Pro, is it better to go analog or digital, or does it not really matter?

thank you again!
July 28, 2004 3:33:06 PM

digital.

:eek: Futile is resistance, assimilate you we will. :eek:
July 28, 2004 4:12:14 PM

Quote:
With the 9800pro is it better for analog or digital or it doesn't really matter?


I think the answer is "it doesn't really matter", especially since every 9800 PRO out there will have a DVI as well as an analog output. Hell, try them both and see which you like better. I suspect it won't make a perceivable difference in a real-world situation, however. But that's just a guesstimate.

The 9800 PRO is an awesome card, and 128 megs should do you just fine for a couple years at least. Much better than, say, a 9600 PRO.

On a final note: I'd try out Doom 3 with your 5200 before assuming it won't run it. Because Nvidia and id are in cahoots, I wouldn't be surprised if the 5200 would run Doom 3 a bit better than, say, a 9600 non-pro at lower resolutions. I seem to recall a benchmark of the D3 alpha confirming this a while back... just a thought, anyway.


________________
Radeon 9700 PRO (o/c 329/337)
AthlonXP ~2750+ (2400+ @ 2208 MHz)
3dMark03: 4,876
July 28, 2004 5:31:10 PM

I've used both DVI and DB15 and the difference is negligible. One thing to keep in mind: when you are using DVI, your screen's auto-adjust buttons will not work, so if you have an off-center display from switching resolutions you'll have to correct it by hand.

Personally I think at higher resolutions you can see a slightly sharper image (especially text) with DVI, but in games you can't really tell.

As far as the VRAM amounts... I don't think it will make any difference at all unless we are talking about X800- or 6800-based cards. You can't expect the GPU to handle the higher detail settings just because your VRAM can cache the texture data.

"Who is General Failure, and why is he reading my drive?"
P4 3.0 HT, Intel D865GBF, 512MB Crucial PC3200 DDR, WD 36GB Raptor 10,000RPM, BBA Radeon 9800PRO, SB Audigy, Hauppage WinTV
July 28, 2004 9:20:42 PM

True, I was really just a little taken aback by the generalized statement about DVI. Anywhoo, I was tired and cranky, and I admit it. I really thought the thread was dead, but maybe that's just me reading more into it than I should have.

I still think the answer is it won't make much difference. DVI will offer the opportunity for better IQ, but I don't think the difference will be noticeable on moving images for the most part. Really static images are the most demanding for quality; HD video is up there, but not as critical IMO. The main thing is that DVI will likely reduce a lot of the surrounding noise from things like TV (when I turn on either of my two video monitors for editing, they mess with my VGA-connected monitor for a split second, but my DVI is fine; the same could happen from sympathetic or parasympathetic interference).

The thing about the setup is that D3 really scales with BOTH the system and the cards. If you look at the impact of the DDR2 and the faster CPU in [H]'s original review (don't know if a new one is up yet), you can see that impact. Also, good 256MB memory on the R9800 would likely help with things like AA or higher resolutions, but I'm not sure if an R9800 Pro with 256MB GDDR2, which is somewhat OC limited, would benefit as much from the extra memory as seriously overclocking the 128MB DDR(1) would help. Really, only an actual investigation would answer that; right now I've seen no one talk about it at any length.

Likely there will be settings for everything: lower this setting for slower CPUs, lower that one for a lesser VPU/VRAM. Expect the FX5200 to be near the bottom of the pile even with optimizations (Cleeve, remember Carmack removed the NV30 path [folding in some optimizations], so the FXs perform slightly slower than their ATI counterparts for the most part; still no real benchies of the FX5200 yet).

My advice: wait for the [H] update; they have a whole mess of hardware they will be testing. It should be released soon; Kyle talked about it at B3D on the weekend.

Anywhoo, sorry for the hijack, but the statement just caught me the wrong way after a long day.

BTW, I'll soon KNOW what it's like to run D3 on an FX5200. :redface:


- You need a licence to buy a gun, but they'll sell anyone a stamp (or internet account)! - RED GREEN GA to SK :evil:
July 29, 2004 9:55:39 AM

lol, you said parasympathetic interference. Try working words like vicarious and discombobulate into your next post :p God I'm bored at work today.

"Its only when you look at ants closely with a magnifying glass on a sunny day that you realise how often they burst into flames"
July 29, 2004 2:56:56 PM

:wink:

Well it's really a question of what's on. Jerry Springer is the worst for that.


- You need a licence to buy a gun, but they'll sell anyone a stamp (or internet account)! - RED GREEN GA to SK :evil: