Newbie needs video card help!

August 31, 2003 2:26:42 PM

Hey there folks. I'm looking to get a new video card. Right now I'm just using the onboard video on my Compaq. I don't play games; I mainly surf the net and watch some AVIs, DivX files, etc. All I really want from the card is to take some strain off my processor and get a little improvement in display quality. I have a NEC LCD 1512 monitor. Sorry if I sound dumb, but I'm really new to this stuff.


August 31, 2003 2:45:54 PM

Your onboard video is not taking much CPU power; instead it "steals" your RAM to store things. So in your situation it is better to boost your total RAM instead of getting a new video card.
August 31, 2003 3:01:27 PM

Actually you ARE better off getting a new card. Your onboard will still be slow, still steal memory, and still just be crap, even for DivX stuff. Get a cheap R7000 or higher (I recommend a 64MB R9100) or a cheap GF2/3 or GF4MX with at least 32MB. This will speed up the graphics side of things. If you already have enough memory (depends on your setup), then this will help.
Another thing that helps with DivX decompression is a faster CPU. Memory is only an issue if you are running less than 128MB (on Win 98/2000) or less than 256MB on XP.
Try a cheap video card first, then maybe upgrade your processor. Upgrading your processor might be a good idea anyway, but we'd really need to know your full spec/setup to know what would work best for sure.


- You need a licence to buy a gun, but they'll sell anyone a stamp (or internet account)! RED GREEN (http://www.redgreen.com) GA to SK :evil:
August 31, 2003 3:12:13 PM

My setup is:

Compaq S3000NX
256 MB DDR RAM
Athlon XP 2000+ (1.67GHz)
Windows XP

I'm thinking of adding another 256MB of RAM and a video card. This computer works well for me, but it can get a little slow when multitasking. Thanks!
August 31, 2003 6:05:55 PM

That's a good idea. Pop another 256MB stick in there, and get a 64MB Radeon 9100, which costs less than $100. Your CPU is fine :)  ATI cards usually have better image quality than nVidia cards, AND they have an ATI-only feature known as FULLSTREAM, which works with the latest DivX and, I think, the latest WMP version? (They should be incorporating it into more and more things as time passes.)

"Mice eat cheese." - Modest Mouse

"Every Day is the Right Day." -Pink Floyd
August 31, 2003 6:24:06 PM

If he's really tight on cash, I don't believe he'll even notice the difference in speed between an old 16MB TNT card and a 32MB GeForce2 card, or even a 64MB GeForce3 or Radeon 9100, for 2D stuff.

By the way, are the Radeon 9100 & 9200 a castrated R200 core just like the Radeon 9000 core was?

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
August 31, 2003 7:05:32 PM

Does your motherboard have an AGP slot?

If not, you will have limited choices.
There are PCI graphics cards (both ATI and Nvidia), but the selection is small.

The loving are the daring!
August 31, 2003 8:18:23 PM

I do have an AGP slot. Think I'm best to go with the 9100 or 9200?
September 1, 2003 12:00:52 AM

For AGP, the GeForce4 Ti4200 gives a lot of bang for the buck.

Only a place as big as the internet could be home to a hero as big as Crashman!
Only a place as big as the internet could be home to an ego as large as Crashman's!
September 1, 2003 6:52:34 AM

Actually the R9100 is just like the R8500LE and does rather well. And more memory than 256MB? Who are you kidding? Yeah, that's nice for multi-tasking, but he's not RENDERING scenes. The difference in many applications will depend on his resolution and colour depth.
The amount of RAM tied up by his onboard graphics will be freed up, and it will also shunt things faster. As much as you may THINK it won't make a difference, it likely will make a large difference. And there isn't an onboard chipset right now that comes anywhere CLOSE to an R9100, so it's definitely a NICE step up, especially due to the video-smoothing of Fullstream that it comes with. Adding memory alone would do LITTLE (very LITTLE) to improve things. XP is a memory hog, granted, but it can easily get away with running on even 128MB for basic apps. Now, editing or authoring DivX/AVI or other media WOULD require much more memory, but since he has 256MB I think he's covered for the basics.
I know this from experience: even offloading the video from the onboard graphics in my old IBM Celeron to an AIW Rage 128 PCI 16MB sped things up noticeably (yes, even in 2D apps); screen refreshes were visibly faster and page swapping happened faster. Then moving to the R8500DV and RageFury on AGP allowed me much higher resolutions where I wouldn't see cursor lag or delays in opening pages or displaying images.

If he were running 64MB or less, I'd say memory is the biggest issue; if it were around 128MB, I'd say it's an even toss and BOTH are a good idea. But with 256MB the priority is GPU, then memory, which considering the cost of memory isn't a bad addition anyway.

But then again, that's just MY two frames' worth.


- You need a licence to buy a gun, but they'll sell anyone a stamp (or internet account)! RED GREEN (http://www.redgreen.com) GA to SK :evil:
September 1, 2003 7:03:16 AM

I would once again recommend the R9100 64MB. The R9100 is much more powerful (that obscure word) than the R9200, and 64MB should do you for most every application. The only reason I don't recommend the GF4s is that they cost a little more for power you definitely won't use (the R9100 64MB in Canada goes for just under $60, so in the US it should be even cheaper, like $45-ish). I did recommend something along the lines of a GF4MX (God help me), but that's because you won't use much more power than that; however, I'd say the ATIs have BETTER visual quality, IMO (and others').

Another option would be a 32MB Matrox card: G200/G450/G550. They have the BEST image quality and work well with most apps (not really gaming cards by any means), and they would also meet most of your requirements. However, the ATIs are sometimes cheaper and definitely have some additional tweaks. It's really a matter of what you like once you see it, read the reviews, and see the results; if you get an idea of what they offer before you buy, you'll do best. I think anything more powerful (like a GF4, an R9600, an FX5200, etc.) is overkill for you, as you won't use it for gaming, and the lower-end cards will pretty much master the day-to-day apps you throw at them. Going with a DX8/8.1 card gives you access to some nice effects for programmed demos, etc., but none of these cards will be stressed by normal video.
Picking up a cheap stick of 256MB or 512MB of RAM while you're there (PC2700 sells for less than $40 in Canada for 256MB) might be a good idea too, just for the multi-tasking you speak of. It can't hurt if you can spare the $.

EDIT: Also, what resolution do you like to run at? For anything above 1280x1024 with 32-bit colour I'd recommend a 64MB video card (not necessary, just nice).


- You need a licence to buy a gun, but they'll sell anyone a stamp (or internet account)! RED GREEN (http://www.redgreen.com) GA to SK :evil:
September 1, 2003 7:08:40 AM

Grape, how often do you use or take a look at AVERAGE people's computers? So what if they're not doing 5 intensive, or even just memory-hogging, things at once; they've usually got unnecessary crap running in the background (usually a lot of it).

For this and other reasons I think he would see a difference with 512MB of RAM.

"Mice eat cheese." - Modest Mouse

"Every Day is the Right Day." -Pink Floyd
September 1, 2003 7:26:33 AM

Is there really that much of a difference between a 16MB TNT and a 64MB 9100 in standard 2D apps? I remember when I had 1MB PCI video cards on my Pentium 1 systems a while back, with Voodoo2 add-on cards. I thought the 1MB cards were decent for navigating Windows 95 and word processing back then; I didn't have any issues. But then again, I didn't use 2D much except for homework. How much difference would there be in the 2D architecture of a TNT/TNT2/GeForce DDR/GeForce2 MX/GeForce2 Ultra/GeForce4 MX/GeForce4 Ti/GeForce FX/Rage/Radeon/Radeon 8500/Radeon 9800 Pro? I just listed some tentatively; you don't have to discuss all of them. I'm dead in the bucket as far as knowledge about 2D improvements goes. Also, would 2D users even notice the difference between PCI & AGP?

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
September 1, 2003 7:30:13 AM

When I go over to a friend's house and check out their computer and their system resources are at about 40%, I get this temporary urge to lead them outside, line 'em up against a wall, and shoot 'em (not really). But still, I think it's insane to have ten cazillion programs load up in your task manager and taskbar. Usually more than 85% of that bull-crap they have installed they don't even use, or have possibly never even heard of.

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
September 1, 2003 7:32:35 AM

Unfortunately, one of my gaming systems was equipped with a Matrox G400 16MB *cough*. The most I could say for it was that it ran UT decently at High Quality & Jedi Outcast on medium-ish settings.

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
September 1, 2003 10:13:28 AM

Quote:
one of my gaming systems was equipped with a Matrox G400 16MB *cough*. The most I could say for it was that it ran UT decently at High Quality & Jedi Outcast on medium-ish settings.

Right, but like he said, he's not gaming. And like I mentioned, it's NOT for gaming; it's simply the best 2D series of cards out there.

As for the advantage of better-architecture cards in 2D applications, it comes down mainly to the speed of the VPU/GPU (good for all things), the dedication of the memory (not shared), and the speed of the RAMDACs (most important for 2D). Would you notice a difference between an R7200 64MB and an R9100 64MB? No, not really, except for the raw speed differences of the VPU, and then likely only at very high res and very fast frame rates (for video or the like) for something needing to be rendered. The RAMDACs being the same, a video will appear the same. The advantage is that a lesser VPU (especially onboard) usually runs at a much lower clock rate and also needs to share resources, accessing system memory much more often, where a separate card usually wouldn't tap those resources in most 2D apps. Most modern stand-alone cards also come with some hardware-assisted MPEG features that are not usually found on integrated chipsets.

The main difference is simply how much of the computer's resources you are freeing up and how powerful the integrated video is. Something like the 9100IGP would likely show very little difference (except the raw memory issues) between itself and a stand-alone card. However, most modern integrated chipsets are so poor that they are barely up to the task. His is a ProSavage KM266 with an S3 ProSavage8 graphics chip (consuming 32MB), which is a very low-end graphics chip. We are not talking the nForce's MX equivalent or EVEN an Intel Extreme. You can pop another 512MB in this puppy and it will still run slow, simply because it IS slow. The access to memory alone is 2.1GB/s, compared to even the early Radeon's 6.4GB/s, the original GeForce's 4.8GB/s, or the GF4MX's 6.4GB/s. This matters at high res and high colour depth.

The RAMDAC speed isn't listed at VIA, but it's likely the typical 230-260MHz kind found on most cheaper/older integrated setups. That means the max colour depth for 1600x1200 is 8-bit at 60Hz, and who wants that! Even at 1280x1024 you max out at about 24-bit; only 1024x768 and just above run smoothly in 32-bit colour. The speed of the RAMDAC is very important for 2D. Stand-alone boards usually come with at least 350-400MHz RAMDACs; even my RageFuryPro was 300+MHz. The Matrox cards from the G400 you had and up (even the PCI versions) have 360MHz + 230MHz RAMDACs (primary/secondary). Of course your older cards would have had slower RAMDACs, just like the G200, which has a 250MHz RAMDAC; the TNT also had/has a 250MHz RAMDAC and memory bandwidth of 1.2-2.4GB/s.
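
If you want to sanity-check those numbers yourself, here's a rough Python sketch of the scanout arithmetic. The 1.32x blanking overhead factor is my assumption (typical for CRT-era modelines); exact figures depend on the timings actually used.

# Rough sketch of the scanout arithmetic behind the RAMDAC claims above.
# The 1.32x blanking overhead is an assumed, typical value.

def pixel_clock_mhz(width, height, refresh_hz, blanking=1.32):
    """Approximate RAMDAC pixel clock (MHz) needed for a given mode."""
    return width * height * refresh_hz * blanking / 1e6

def scanout_bandwidth_gbs(width, height, refresh_hz, bits_per_pixel):
    """Approximate memory bandwidth (GB/s) eaten just refreshing the screen."""
    return width * height * refresh_hz * (bits_per_pixel / 8) / 1e9

for w, h in [(1024, 768), (1280, 1024), (1600, 1200)]:
    print(f"{w}x{h} @ 85Hz, 32-bit: "
          f"~{pixel_clock_mhz(w, h, 85):.0f} MHz pixel clock, "
          f"~{scanout_bandwidth_gbs(w, h, 85, 32):.2f} GB/s scanout")

At 85Hz, 1600x1200 already needs roughly a 215MHz pixel clock, so a 230-260MHz integrated RAMDAC has almost no headroom left at that res.
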
I'm not saying that ALL integrated graphics are bad, just MOST, and if he's currently having trouble with 256MB, he'll have just about as much with 512+.

If you really want to test this in the real world, compare the systems while running multiple windows and playing video/Flash animations at the same time as you're scrolling or something. You will see visible slowdown. Also, pushing the resolution up will make the problem worse, like I said. The issue is not the complexity of the engine so much as the SPEED of, and resources available to, the graphics. If something hits the CPU hard and it needs the memory, it will compromise the graphics, and vice versa. Whereas with a separate card with its own resources, memory dumps and all that work in the background don't kill the graphics, just the processes.
Hey, I know it's all part of the WHOLE equation. Heck, the computers at my previous job had Savage 32 AGP CARDS married to a PIII, and when one did an NTFS defrag it shut everything down, even page draws. I even tried increasing memory on the system (cannibalized from another computer on the night shift), going from 256 to 768, and that did NOTHING, because the SYSTEM was bottlenecked at the CPU by all those tasks (heck, it even dropped keystrokes), and that was on a system that HAD a separate AGP card. So like I said, it depends on the system and what it's doing. In MY case at work the only solution (other than disabling features) was a CPU upgrade, which wasn't up to me.
The only time you will ever notice the difference (whether it's between PCI and AGP, or integrated and stand-alone) in 2D applications is at high res and good colour depth.
In this case I still say he's better off with ANY modern stand-alone card. IMNSHO.

Sorry the post is so long and meandering; it's just a stream of consciousness, whatever hit me at the time, and I didn't want to go back and edit. I also had to double-check the speeds and specs of a few things.

Anywhoo hope that helps.


- You need a licence to buy a gun, but they'll sell anyone a stamp (or internet account)! RED GREEN (http://www.redgreen.com) GA to SK :evil:
September 1, 2003 10:24:10 AM

Actually, I am OFTEN saddled with solving the problems of a lot of my 'average' friends and their below/equal/above-average computers and computer problems. I seriously need a 'No, I will NOT fix your computer!' shirt from ThinkGeek.com.

Yes, many have things running in the background, but someone looking here should know how to avoid that and clean up background activity, which is still not a memory problem so much as a question of how their system USES memory. OK, yes, he would be better off running with more memory if he were running a system with poor memory management (XP is better than the others), but it would still eventually collapse nonetheless. An external card would not be affected by that.

I don't deny that a boost in memory will help overall performance, but it won't impact video performance as much as an add-in card will. Heck, a CPU upgrade would also help. Even with 256MB of memory he will still have a crippled graphics system that, once again, is poaching his resources.

Anywhoo, I still say add both. Memory is freakin' cheap, and so are the cards I'm recommending. If he's REALLY worried, sure, try the 256MB or 512MB of extra memory, but I'm sure it will only slightly improve the situation if it is truly a graphics issue. Unfortunately, lowering the resolution or colour depth while doing the same tasks won't tell us much, since less strain on the graphics frees up resources for the system as well, unlike when deciding whether to upgrade from one stand-alone card to another.

Well, that's my view. Yours may differ, but that's 'cause you're wrong! :tongue: JK


- You need a licence to buy a gun, but they'll sell anyone a stamp (or internet account)! RED GREEN (http://www.redgreen.com) GA to SK :evil:
September 1, 2003 4:13:24 PM

Hey! My name is Tyler also!!! This forum has a limit of one Tyler so get the hell out!

Well actually you can stay if you agree to become my minion of darkness.

Gaaaahh! My hand is all old and wrinkly!
September 1, 2003 4:19:50 PM

You could either get more RAM or get yourself the Radeon 9000 64MB card. Either way is good for you.

F-DISK-Format-Reinstall DO DA!! DO DA!!
September 1, 2003 4:26:06 PM

What the hell is a minion of darkness? Someone who does your overclocking for you? (Wow, I'm a loser...)

nVidia cheated on me so I left her for ATi. ATi's hotter anyway...
September 1, 2003 5:27:47 PM

Actually, I often enjoy the longer posts, Grape, as they are usually more helpful and provide more of the necessary background information on the issues at hand.

From what you were saying, it sounds like you were suggesting that the speed of the RAMDAC is far more important than even the memory amount or VPU speed. Tons of people with older computers use TNT1/TNT2/GF2MX cards. My question is whether people will really notice the difference in 2D apps between those cards and the beefy newer ones. I can't. My 16MB Matrox G400 card didn't seem any slower in 2D than the GeForce4 card I upgraded to; there was a GIGANTIC leap in 3D performance, though, and that's it. As for the old 1MB PCI cards, I did see a difference in 2D performance AND visual quality compared to the Matrox G400. But then again, I use no higher than 1024x768 for either 3D games or 2D desktop resolutions. It hurts my eyes to go to very high resolutions, and the text gets too small for me. In fact, when I load up other people's computers, their resolution is usually set to one of these three: 640x480, 800x600, or at most 1024x768. Most old people will complain if their desktop resolution is higher than 640x480 and even say "Something's wrong with my 'puter" if their grandchildren crank up the res.



As for the difference between 16-bit color and 32-bit, there is not that great a difference, and sometimes I can't even tell. There was a much greater leap from 256 colors to 16-bit than from 16-bit to 32-bit. Even here, I sometimes see a 32-bit vs. 16-bit argument floating around. The difference is so small IMO that I would MUCH rather play a game on max settings with the color reduced to 16-bit than play at lower settings keeping the color at 32-bit; it's just a bandwidth hog. In fact, on the Matrox card, playing at 16-bit vs. 32-bit meant the difference between great playability at a good detail setting in 16-bit and unacceptable framerates at even the lowest detail settings in 32-bit. 32-bit in 3D apps was nothing more than a bandwidth killer on the Matrox card, to say the least.

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
September 1, 2003 6:48:12 PM

Uhhh, yes! I'm too busy to overclock my cards, so I make my minions do it. Bwahahaha!

Gaaaahh! My hand is all old and wrinkly!
September 2, 2003 2:15:57 PM

I'm running at 1024x768. I'm gonna check out the Radeon 9100 and pop in 256MB of RAM.
September 2, 2003 7:21:37 PM

Likely the best bet on all counts. That way everyone is happy. :wink:

Let us know how it goes.



- You need a licence to buy a gun, but they'll sell anyone a stamp (or internet account)! RED GREEN (http://www.redgreen.com) GA to SK :evil:
September 3, 2003 3:54:41 PM

Just an interesting follow-up: Lars' recent article about integrated graphics chipsets. Look at the Sysmark (2D) figures to see the negligible effect of a 'GOOD' integrated solution (with good RAMDACs) versus even an R9800.

That's the main issue: it needs to be a good integrated solution (check the SiS results vs. Intel and nV).

Here's a link to the very page focusing on the Sysmark results: http://www.tomshardware.com/graphic/20030903/integrated...

It's worth a read; I just wish he had gotten his hands on an ATI IGP9100 chipset, especially up against that R9200 card.

I think there will be two divergent futures: very FEW sales of low-end CARDS (integrated performance will match them closely), and then a focus on the mid-to-high end for stand-alone cards. That's just my take on it.


- You need a licence to buy a gun, but they'll sell anyone a stamp (or internet account)! RED GREEN (http://www.redgreen.com) GA to SK :evil:
September 3, 2003 4:04:27 PM

That's quite possibly right. According to the "crystal ball", the future is uncertain. Why is AMD saying the Hammer chipset is going to change the integrated graphics market?

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
September 3, 2003 4:13:40 PM

UFO, I have to tell you, you are SOO SOO wrong (about 16 vs 32-bit color).

Now when you're surfing teh intarweb, or staring at your desktop, sure, you may not notice much if any difference.

Play any game with fog or culling effects and you will see a huge difference.

Tell me, UFO: if there wasn't much of a diff between 16 and 32-bit color, why the hell would developers and vid card manufacturers be moving to 16/24/32-bit FLOATING POINT color? Because there just aren't enough colors for a truly realistic image and smooth transitions.

Maybe your eyes suck ass.

"Mice eat cheese." - Modest Mouse

"Every Day is the Right Day." -Pink Floyd
September 3, 2003 4:59:09 PM

I would tend to agree with WS, even in the day-to-day stuff. I don't mind the 24-bit on my laptop, but 16-bit just bugs me. It looks not quite as 'clean' (hard to describe).

There's just something about it, especially since some of the colours, even when surfing, cause strange banding artifacts due to colour interpolation.

Anywhoo, I prefer 24+, but hey, I'm just glad to be beyond the era of EGA/MCGA, and even that was a nice boost from just CGA. 256 colours SUX for everything!


- You need a licence to buy a gun, but they'll sell anyone a stamp (or internet account)! RED GREEN (http://www.redgreen.com) GA to SK :evil:
September 3, 2003 5:43:06 PM

I'm not saying there's no difference between 32-bit color and 16-bit; I'm just saying that if your graphics card takes a hit on high detail with 32-bit, lowering the color to 16-bit is a much better compromise than sacrificing texture resolution. The difference each color upgrade makes seems to have less and less of an impact as time goes on.

Take monochrome: when graphics went to 16-color VGA, that was way better than monochrome. The bump from 16 colors to 256 colors was also a giant leap in color quality. The jump from 256 to 16-bit wasn't as big as the other two leaps, but it was still a nice enhancement. I even remember playing Rebel Assault II in the mid-90s, and the FMV sequences looked terrific in 256 colors. I thought graphics were getting so good back then, but FMV-based games don't use 3D hardware rendering; the graphics in Rebel Assault II were surprisingly good for the time, but limited in nature, because it didn't offer the freedom of movement of a 3D engine. I've compared UT (original UT GOTY) in 16-bit vs. 32-bit many, many times. I see a difference in color quality, but it's just not that great or noticeable if you're concentrating on actually killing your opponents. Yes, I do play in 32-bit color, but I also definitely think playing in 16-bit is worthwhile if framerate or texture resolution must be compromised to play in 32-bit. Did you see my comments about the Matrox card? I can play at high quality with great framerates on that card IF AND ONLY IF I use 16-bit color; in 32-bit, the game is unplayable at any resolution or level of detail except really low ones. If you think about it, 16-bit is thousands and thousands of colors, and how many wavelengths are there on the visible light spectrum in increments of 1 nanometer? Not more than a thousand, I think. Here, let me dig it up in my series of "Barron's EZ-101 Study Keys":

Quote:
Visible light is nothing more than that part of the electromagnetic spectrum to which the human eye is sensitive. The visible region is only a small portion of the electromagnetic spectrum, ranging from about 4 x 10^-7 m (400 nm) to 7 x 10^-7 m (700 nm).

A length of even 1 nanometer is invisible to the human eye. With a small range of about 300 nanometers, how many different wavelengths can you cram into that range? In theory, millions; infinite, in fact. But if you were to look at the visible light spectrum at 650 nm red, would you see the difference between that and 651 nm red? Maybe, but there won't be much difference. 16-bit color covers about 65,000 colors in this small range. So if you divide the 300 nm visible light range by approximately 65,000, you find that each increment of color on the 16-bit digital visible light spectrum is 0.004615 nanometers wide.

The calculation I used was:

300 nm / 65,000 = 0.004615 nm increments

As you can see, each increment is less than 0.5% of a nanometer. This is actually pretty good. The more bits you add, the more you are merely fine-tuning the color quality. The increment size of 16-bit color is much better than the (300 nm) gap of monochrome. I assume the gap is 300 nm because neither black nor white exists on the "visible light" spectrum; they're supposed to be void colors.
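
If anyone wants to redo the arithmetic, here's a quick Python sketch. Keep in mind the model is my simplification: real 16-bit color is a 5-6-5 RGB encoding, not 65,536 evenly spaced wavelengths.

# Back-of-envelope version of the calculation above. The model is a
# simplification: 16-bit color is really RGB565, not evenly spaced
# points on the wavelength spectrum.

visible_range_nm = 700 - 400    # visible spectrum, ~300 nm wide
colors_16bit = 2 ** 16          # 65,536 values ("about 65,000")

print(visible_range_nm / colors_16bit)  # ~0.004578 nm per increment

Using the exact 65,536 instead of 65,000 gives about 0.004578 nm per step, the same ballpark as the figure above.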

I'll be very interested in awaiting replies, Williamette, or anyone else for that matter, on this issue. :smile:

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
September 3, 2003 5:55:45 PM

We have yet to see an article that compares how well an IGP440 with dual-channel memory stacks up against a GeForce4 MX 440 add-in card. I really think it was necessary for them to have included it in that integrated chipset article; the article would have been more logical that way. Then we could see how much of a "true" performance hit an integrated chip that borrows onboard memory takes compared to an exact equivalent add-in card. That would have gone nicely with Grape's suggestion to include the IGP9100 chip in the article. That article just left too many variables unresolved as far as integrated chipsets go.

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
September 3, 2003 10:30:24 PM

Bumped, because an important reply remains unanswered by Williamette & Grape. Sorry guys, I've just been anticipating your answer all day. Oohh, I can't wait.

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
September 3, 2003 11:08:10 PM

Xeen, that spectrum only represents colors, not intensity.

There's infinity times infinity possibilities within that spectrum.

16-bit color sux.

I don't have a lot of time, and I don't know offhand (I'm sure someone does), but if you look at JUST the shades of, say, blue that 16-bit color can represent, it's not a lot. L00X like crap.

Someone elaborate :)

"Mice eat cheese." - Modest Mouse

"Every Day is the Right Day." -Pink Floyd
September 3, 2003 11:57:04 PM

Quote:
There's infinity times infinity possibilities within that spectrum.

Agreed, I even mentioned that in my post.

65,000 colors, or 65,000 wavelengths, exist on the 16-bit digital visible light spectrum. There are six basic colors on the digital light spectrum. Even though the width of the range varies from color to color, dividing 65,000 by 6 gives you the average number of different wavelengths for each color within the parameters of the spectrum I specified. This means there is an average of 10,833.333 varying wavelengths per basic color on the visible light spectrum. That figure, however, isn't even remotely accurate for colors like yellow, which only covers 10 nm of the visible light spectrum. If you want more precise values for each basic color, you can calculate them from the wavelength ranges below (a quick sketch of that calculation follows the list).

1. Violet 400 - 424 nm
2. Blue 424 - 491 nm
3. Green 491 - 575 nm
4. Yellow 575 - 585 nm
5. Orange 585 - 647 nm
6. Red 647 - 700 nm
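
Here's that per-band calculation in Python, splitting the 65,536 values in proportion to each band's width (same simplified spectrum model as before, not how RGB565 actually allocates its bits):

# Divides the 16-bit palette across the wavelength bands listed above,
# in proportion to each band's width in nanometers.

bands = [
    ("Violet", 400, 424),
    ("Blue",   424, 491),
    ("Green",  491, 575),
    ("Yellow", 575, 585),
    ("Orange", 585, 647),
    ("Red",    647, 700),
]

total_nm = 700 - 400  # full visible range used in this model
for name, lo, hi in bands:
    share = (hi - lo) / total_nm * 2 ** 16
    print(f"{name:6s} {hi - lo:3d} nm -> ~{share:5.0f} values")

Yellow's narrow 10 nm band gets only about 2,185 of the 65,536 values, which is exactly the imbalance I mentioned.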

Quote:
16-bit color sux.

I definitely agree that 16-bit color in 3D games is not quite as good as 32-bit, but I don't think it looks "downright horrid" or "not even worth playing on".

I think my argument has been taken a little out of context. It's true I'm overemphasizing the wrong points, but I think they're worth a look.

The crux, if you will, of my original argument before your first reply was that "sacrificing color quality is a better compromise than sacrificing texture detail."

So let me put it this way, all I am trying to ask is the following:

Would you rather play a game at full detail @ 16-bit color or low detail @ 32-bit color?

There is indeed a noticeable difference between 16-bit and 32-bit. But what I was trying to say is that this difference becomes less and less apparent with each color-bit upgrade.

For example: 32-bit makes a more noticeable improvement over 16-bit than 64-bit will over 32-bit.

64-bit makes a more noticeable improvement over 32-bit than 128-bit will over 64-bit.

128-bit makes a more noticeable improvement over 64-bit than 256-bit will over 128-bit.

Aww, I'm confusing myself! But basically I am saying that each progressive color upgrade won't make as big a noticeable impact as the one before it.

Yes, I know 256 colors sucks by today's standards, but it was a much greater improvement over 16 colors than 16-bit was over 256 colors. But 16-bit was a really nice upgrade too.

Mathematically, the visible difference between each color upgrade is logarithmic IMO, even though the number of wavelengths available in the palette increases exponentially with each upgrade.
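
A quick sketch of what I mean, assuming (my assumption) that what you perceive is the gap between adjacent shades rather than the raw palette size:

# Each added bit per channel doubles the number of levels but only
# halves the gap between adjacent shades, so every upgrade buys
# visibly less than the one before it.

for bits_per_channel in (4, 5, 6, 8, 10):
    levels = 2 ** bits_per_channel
    gap = 1.0 / (levels - 1)   # normalized gap between adjacent shades
    print(f"{bits_per_channel:2d} bits/channel: {levels:4d} levels, gap {gap:.5f}")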

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
September 4, 2003 12:32:25 AM

I don't think that 32-bit color makes a bigger difference over 16-bit than 24/32-bit FP will/does over 32-bit. Period. It's necessary.

32-bit color over 16-bit DOES NOT make a large performance impact on current cards! So your hypothetical question would never be two real-world choices. And I would always choose 32-bit, as 16-bit color does look horrid. HL2 in 16-bit color? Turn that beautiful pixel-shaded water into a blocky piece of crap.
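
If you want to see why, here's a sketch of what 16-bit (RGB565) does to a smooth gradient: the 5 bits it gives blue collapse 256 shades into 32 visible bands, and those bands are the blockiness you see in fog and water.

# What RGB565 does to a smooth blue gradient: 256 shades collapse
# into 32 visible bands.

def blue_through_rgb565(b8):
    """Quantize an 8-bit blue value to 5 bits, then expand back to 8."""
    b5 = b8 >> 3                  # RGB565 keeps only the top 5 bits
    return (b5 << 3) | (b5 >> 2)  # standard 5-to-8-bit expansion

distinct = sorted({blue_through_rgb565(b) for b in range(256)})
print(f"{len(distinct)} distinct shades instead of 256")  # -> 32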

"Mice eat cheese." - Modest Mouse

"Every Day is the Right Day." -Pink Floyd
September 4, 2003 12:56:34 AM

You're absolutely correct, Williamette, that that's a non-real-world hypothetical situation on current cards. It was a real-world situation on that crappy Matrox card, though. I've got a feeling that 64-bit color gaming is on the horizon pretty soon. When the first 64-bit-color-capable 3D accelerators hit the market, they probably won't have acceptable framerates in games that provide options for 64-bit color. So we're back to the hypothetical situation, except this time around it's 64-bit vs. 32-bit: you'll be able to run all your games with awesome framerates in 32-bit color on high detail, but when you switch to 64-bit mode your framerate takes a big dive. Would you still go to a lower texture res @ 64-bit rather than using hi texture & hi screen res @ 32-bit color + better framerates? I respect your opinions and all; if you disagree with me on this last point, I can live with that. I'm just curious, that's all. However, I can almost assure you that you will see a much greater difference in 16-bit vs. 32-bit than you will ever see in 32-bit vs. 64-bit.

One last note: I've noticed that 16-bit on newer cards usually looks more garbled than 16-bit on older cards. In fact, my Voodoo2 didn't have an "airy" or "noisy" appearance in 16-bit like my 8500LE did. I even thought my 8500LE was defective because of the garbled image in 16-bit compared to older cards I'd owned. This could be happening on some of the other not-so-old cards. If you've seen this "airy" appearance that looks like television static on newer cards, I'd definitely understand why you'd probably think I'm downright crazy. I never thought about this until you mentioned the word "blocky", but 16-bit doesn't look so "noisy" on other cards. Maybe Grape has an explanation for this. What I'm saying is, you might be experiencing the same effect on your newer graphics card as well.

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
September 4, 2003 5:13:43 AM

Well, as it wasn't directed at me, I didn't answer it. However, since you wanted a follow-up, I will say this as someone who has been downgrading 36/48-bit images all week: the difference is there and it's perceptible, but only in the sense that the grass doesn't look quite right, the flowers and people look a little duller, and even some colours look a SLIGHT bit different. Someone who hadn't seen the originals wouldn't notice at all and would say they look perfect. It's usually the transitions that suffer. Now when it comes to games, it all depends on what you are playing. Like I said about creeper games versus jump-n-frag FPSes: the creeper doesn't need as high a framerate but would benefit from better IQ at the cost of fps, whereas a shooter that involves a lot of movement doesn't need the same level of precision to look good, because most of the images are moving; but by the same token, losing fps will be very noticeable because of the way the brain perceives motion.

Once again it all comes down to personal preference: what YOU can perceive and what you find acceptable when you are doing a given task.

BTW, just FYI, there is a VERY rare condition (in women only, something like 1 per ~100 million) where they have an extra cone, which gives them even sharper colour differentiation and also sharper vision (usually associated with rods, oddly enough). The case I read about was a woman working for a paint maker. So I'm sure her perception would be different.


- You need a licence to buy a gun, but they'll sell anyone a stamp (or internet account)! RED GREEN (http://www.redgreen.com) GA to SK :evil:
September 4, 2003 6:55:31 AM

Quote:
There really is that much of a difference between a 16 meg TNT and a 64 meg 9100 in standard 2d apps?

Believe it or not, there actually is. From personal experience, I have seen the slow 2D performance of a TNT card when it comes to alpha transparency. This is noticeable in Windows XP with the feature that shows shadows under menus: with those shadows turned on, menus open slowly.

Although I don't know the reason for sure (someone suggested RAMDAC speed?), I know that the TNT card does not support a feature called per-pixel alpha blending. Once you move up to a GeForce 1 level card, there is support for per-pixel alpha blending. I would venture to say (though I don't know for sure) that anything GeForce or better would probably give better performance in this area of 2D.
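
For what it's worth, the blend itself is simple math. Here's a minimal sketch (with illustrative values) of what has to happen for every pixel under a shadowed menu; without hardware support, the CPU grinds through this loop itself, which would explain the slow menus:

# Per-pixel alpha blending: out = src*alpha + dst*(1 - alpha),
# applied once per pixel under the shadow.

def blend_pixel(src, dst, alpha):
    """Blend one RGB pixel; alpha is the source opacity in [0, 1]."""
    return tuple(int(s * alpha + d * (1 - alpha)) for s, d in zip(src, dst))

shadow = (0, 0, 0)            # black menu shadow (illustrative)
desktop = (200, 180, 160)     # desktop pixel underneath (illustrative)
print(blend_pixel(shadow, desktop, 0.4))  # -> (120, 108, 96)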
September 4, 2003 7:21:53 AM

What about standard apps under Windows 98? I know it's an old OS, but it's commonly used and commonly supported. Three of my systems use Windows 98 SE because we want full compatibility with older DOS programs, like games. Do you ever get a hunger for nostalgia that you can't seem to fill? It happens all the time on my laptop, which uses Windows XP. The only way I'm able to play classic DOS titles is if they have a 32-bit Windows port with 3D acceleration, which sadly leaves some of them, like Wolfenstein, Rise of the Triad, & Dark Forces, out of the camp.

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
September 4, 2003 4:56:32 PM

Windows 98 wouldn't have the same kind of lag. Optionally, you can turn the feature off in Windows XP, but it's still something to consider if you don't want to resort to turning off features.
As for old DOS games, I have one myself that I would like to play on my XP system. What I may do is just buy DOS (for $8) and make myself a boot disk to use for DOS games.