
Graphics card battle

Last response: in Graphics & Displays
February 23, 2006 12:02:28 AM

X1900XT
7800GTX
Your first born child

Here's a question to ponder. Whether or not you have the most insane video card, one that runs 50, 60, 70+ FPS, does the human eye actually pick up the difference at that many FPS (frames per second)? If an X1900 runs Doom 3 at 75 FPS and the 7800GTX runs Doom 3 on the same settings but only scores 70 FPS, do you really see the difference at that high a frame rate? The only way we know one scores faster is the numerical value put on the card's performance. Interesting questions I've been pondering lately about this whole graphics card race.

I personally own two 512MB 7800GTXs, and honestly I really don't see a visual difference from my friend's comp that runs one 512MB 7800GTX. We need to start wearing some super-sensitive glasses while gaming so we can be like... damn, only 88 FPS, it's freaky choppy as all hell. New video card time!

Another question to those gaming in the MMORPG department...
What actually factors into the FPS of those types of games (EQ2, WoW, etc.)? How do you find out where your bottleneck is in those types of games? Is FPS also affected by internet speed? Just wondering 8)

~Maxiius


February 23, 2006 8:31:15 AM

Personally, running a game at 70 or 75 FPS you wouldn't see a difference, but your eyes will feel it.
Let me explain, and I'll keep it short because I hate typing.
When I sit in front of my PC for hours and play games at 25 FPS or less, I get eye strain, my eyes hurt, and I used to get dizzy spells as well; but if it's 50-70 FPS I can play all night.
There will also be a difference in a year or two, when the latest games will run at 20-40 FPS on your present card. True, you will upgrade by then, but many don't, so they, like me, get the best they can when they do upgrade.

Internet games depend on your modem speed first and then PC specs.
February 23, 2006 1:55:33 PM

So there is less eye strain with higher frames per second, but the fact that you can play all night at anywhere from 50-70 FPS with no eye strain brings it right back to the first point. There is a point of diminishing returns in video cards: once you can play any game at a minimum of, let's say, 65 FPS, visually there is no difference, only a numerical one.

I'm pretty intrigued by the subject. It would be interesting to see a study done to prove that past a certain FPS it's unnoticeable. Maybe play a game at 50 FPS for 5 hours straight, then play the same game at 70 FPS, and see how each affects different aspects of humans.

Here's another question... Does anyone ever buy a video card based on the card's image? (As in, you buy an MSI card because they've got that cool-looking chick printed on the card.)
February 23, 2006 8:20:14 PM

I like black/blue/red PCBs, lol. Green is tiresome.
But hey, a chick helps too, lol. I don't like the ATI chicks... ugly.
February 23, 2006 8:30:08 PM

The whole thing about frame rate, to me, is the following: if a game averages 70 FPS, it gives me some headroom for low FPS at certain points in a game, so that I'm not playing at 40 FPS and dropping to sub-25. A high average FPS just makes the dips in intensive scenes more bearable.

Also, the battle over FPS is a waste of time to me. It should just serve as a reference to see if your setup is being utilized to its fullest potential compared to similar systems on the web via TOMS/ANAND etc.

Lastly, benchmarking to me (as has been stated countless times) is not real-world performance. It's a collection of frames recorded every second, and this is where the bugger-up comes in. Let's say a benchmark runs for 2 minutes. In those 2 minutes we have 1 minute at 25-30 FPS, while 30 seconds go to 30-45 FPS and the other 30 seconds go to 200 FPS. Because we use the average to get a result, the 30 seconds at 200 FPS translate into a higher total average FPS, yet 50% of the time we were running at the minimum considered enjoyable. This is a crude example, but it applies to most benches you'll see. Benchmarks as found on TOMS etc. only serve to compare previous-gen cards with current ones, since there is a common reference point, and a lot of people miss this.
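The arithmetic behind that crude example can be checked in a few lines of Python. The segment durations and FPS ranges come from the post above (midpoints used for the ranges); the 30 FPS "minimum enjoyable" threshold is an assumption:

```python
# Time-weighted average FPS vs. time spent near the playable minimum,
# using the 2-minute benchmark example from the post above.
segments = [  # (duration in seconds, FPS during that stretch)
    (60, 27.5),   # 1 min at 25-30 FPS (midpoint)
    (30, 37.5),   # 30 s at 30-45 FPS (midpoint)
    (30, 200.0),  # 30 s at 200 FPS
]

total_time = sum(t for t, _ in segments)
total_frames = sum(t * fps for t, fps in segments)
avg_fps = total_frames / total_time

THRESHOLD = 30  # assumed "minimum enjoyable" FPS
low_time = sum(t for t, fps in segments if fps <= THRESHOLD)

print(f"average: {avg_fps:.1f} FPS")  # ~73 FPS overall
print(f"at or below {THRESHOLD} FPS: {100 * low_time / total_time:.0f}% of the run")
```

The average lands around 73 FPS even though half the run sat below 30 FPS, which is exactly the point being made: one short burst of 200 FPS drags the mean far above what the run actually felt like.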

And the packaging of a gcard is a nice-to-have for me. If I were going to shops and could see 3 different versions of card X next to each other, then packaging might play a role. But seeing that cards are expensive (the ones I stupidly buy every 3 months - bye bye Subaru :)  ) and I mainly order online, I go first for a good, known manufacturer and secondly for the best price; packaging is then trivial. If they started shipping games that aren't yet released (by 2 weeks or so), I'd throw packaging back into the equation. But when was the last time anyone got a bundled game that could really showcase a high-end card's capabilities??
February 23, 2006 8:33:24 PM

I try to pick out a nice looking card but it doesn't matter much to me. For example, I'll probably try to avoid a card with a green PCB and a reference HSF and go for something more "unique". However, since my case doesn't have a window, it's all about performance for me.

As for the FPS question, I don't really notice much. In WoW, I can't really tell between 75 FPS and 85 FPS. At lower FPS it's more noticeable, such as comparing 15 to 30. But once you hit 65+, there's not a big difference except for how your eyes "feel".
February 23, 2006 8:51:32 PM

The average human eye is only about 60 FPS fast. But having a higher frame rate still makes a difference in gaming. Not all PCs are capable of maintaining frame rates over 60 FPS. So even if your PC can achieve an average of 60 FPS, in some cases the frame rate can drop, especially at times when there's a lot going on in the game at once. Take for example the game FEAR: in a corridor with no enemies, a good PC gives 60 FPS, but when there's so much action going on, like shooting your opponents and grenading them, it puts so much stress on both the CPU and the graphics card that the frame rate drops.

Also, your eyes may not notice a frame rate of 60 FPS when there's no movement, but try looking left to right repeatedly as quickly as you can and you will notice the frame rate.

So I would say that having a rig that can push way over 60fps is a lot better for gaming and also reduces eye strain. :D 
February 23, 2006 9:52:26 PM

I think the FPS battle is all about future games, because you can't see any difference between, let's say, an X1900XT and a 7800GTX (of course, as long as you don't play at an insane resolution :wink: )
February 23, 2006 11:40:08 PM

Somebody explained this subject really well last month.

Supposedly the human eye can't see or feel the difference over 60 FPS. And also, the video card supposedly can't produce more than 60 FPS on the screen, because the AC current in the wall is a 60 Hz signal, meaning 60 flickers a second; that's why you can't tell (or maybe you can, if you look very closely like I did) that the lights flicker. Supposedly in Europe the signal is 55 Hz and you CAN see the lights flicker a little. I don't know why, but Europe always seems to get the short end of everything. :lol: 
February 24, 2006 4:46:06 AM

So you're saying that even if you had a computer that ran these top-end games at 75+ FPS, the visual output is still maxed out at 60 Hz? Which in turn would only produce a maximum of 60 FPS from the monitor?
February 24, 2006 5:05:50 AM

Quote:
Somebody explained this subject really well last month.

Supposedly the human eye can't see or feel the difference over 60 FPS. And also, the video card supposedly can't produce more than 60 FPS on the screen, because the AC current in the wall is a 60 Hz signal, meaning 60 flickers a second; that's why you can't tell (or maybe you can, if you look very closely like I did) that the lights flicker. Supposedly in Europe the signal is 55 Hz and you CAN see the lights flicker a little. I don't know why, but Europe always seems to get the short end of everything. :lol: 

I believe Vsync does not allow your monitor to show a higher FPS than your refresh rate (60 Hz for most people). However, you can disable Vsync in the nVidia/ATi control panel or the in-game settings. So yes, it is possible to render over 60 FPS in a game, but your eye wouldn't know the difference between 60 and 75. The only way to measure accurately is to use an FPS program such as FRAPS.
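The basic idea behind a counter like FRAPS is just counting completed frames in a fixed window. A minimal sketch in Python, with a `sleep` standing in for rendering work (so the numbers are illustrative, not a real GPU measurement):

```python
import time

def measure_fps(render_frame, sample_seconds=1.0):
    """Count how many frames complete in a sampling window:
    the same idea an FPS overlay uses."""
    frames = 0
    start = time.perf_counter()
    while time.perf_counter() - start < sample_seconds:
        render_frame()
        frames += 1
    return frames / (time.perf_counter() - start)

# Stand-in for real rendering work: ~10 ms per frame, so roughly 100 FPS.
fps = measure_fps(lambda: time.sleep(0.010))
print(f"{fps:.0f} FPS")
```

Note this counts frames the program *produces*, which is exactly why such a counter can report more than 60 FPS even though a 60 Hz monitor only ever displays 60 of them per second.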
February 24, 2006 7:54:26 AM

I'm surprised to see so many old myths being repeated in this thread.

This should clear up any questions you have about the human eye.


Just remember, everyone is different. My eyes are extremely sensitive, and I can easily distinguish the difference between 140 FPS and 160 FPS. Not that it's something to brag about; it's actually quite a hindrance if you've chosen computers as your field of expertise.
February 24, 2006 8:38:33 AM

If one card is cheaper than the other, get that card - you won't notice the difference. But if they're about the same price, show your colors: either red or green.
February 24, 2006 8:54:01 AM

Europe gets 50 Hz because they use a 210 V AC line. It is supposed to be more efficient to transfer electricity over the line that way instead of 110 V / 60 Hz. We all know that we like to waste valuable energy in the US.

No matter how fast your video card is, we all still have to see it through our monitors, right? Older CRTs can do up to a 120 Hz refresh rate at low resolution settings, but if you crank the resolution all the way up, only a few can even do 85 Hz. At that point, no matter how fast your video card is, your picture on the monitor will only refresh 85 times per second. LCDs are even slower to respond compared to CRTs. When I played games on my old CRT, my refresh rate had to be at least 75 Hz or I would feel dizzy after an hour or so. Now that I use an LCD, things are much better.

MMORPGs depend more on RAM, CPU, and HDD speed than on the video card, but it's always a plus to have a nice video card. Most MMORPG engines don't even use the latest 3D technologies. Having 200-300 players fighting each other on your screen at the same time will bring most PCs to their knees.
February 24, 2006 11:37:38 AM

That was a very interesting article to read, Red_Frog. Now, relating that back to computers: what monitor / video card combo is able to put out that many frames per second?

Also, how does this Vsync work? Does turning it off have any negative effects on the monitor, like blowing it up after running 120 FPS or something?
February 24, 2006 12:05:25 PM

Sounds to me like you got the melt-o-vision goin' on, and it ain't too bad!
Pumped-up graphics, dude. That's what ya got.
Screamin' GTX duo.
February 24, 2006 1:07:52 PM

As more people have been noticing, the emphasis has been shifting away from max FPS to image quality. Yes, above 75 FPS, 90% of the human population will not see a difference.

However, EVERYONE can tell the difference between 75 FPS at 1280x1024 and 75 FPS at 1600x1200, not to mention the various filters and such (AA/AF).

My $0.02
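The pixel arithmetic behind that point is easy to work out; at the same 75 FPS, the higher resolution pushes roughly 46% more pixels every second:

```python
# Pixels per second at the same frame rate, for the two resolutions above.
fps = 75
modes = {"1280x1024": 1280 * 1024, "1600x1200": 1600 * 1200}

for name, pixels in modes.items():
    print(f"{name}: {pixels * fps / 1e6:.1f} Mpixels/s")

ratio = modes["1600x1200"] / modes["1280x1024"]
print(f"1600x1200 pushes {ratio:.2f}x the pixels of 1280x1024")
```

So even when the FPS counter reads the same, the card rendering 1600x1200 (plus AA/AF on top) is doing considerably more work per frame.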
February 24, 2006 1:20:54 PM

I have the same problem, man. I cannot run my monitor at lower than a 75 Hz refresh or it kills my brain. I regularly run at 85 just to keep the strain down. I notice huge differences from 60 FPS to 80, to 100, etc. I also notice much more subtle differences in quality before others do. Drives me nuts! It's a curse at best, as I wish I could enjoy some of the games on cheaper hardware like my friends do. Instead I fork out mo' money to play at the same enjoyment level.

really sucks.

Another thing to remember about FPS benchmarks is the quality level the given card is running those frames at. A 7800 GTX will run slightly faster than an X1800, but at a MUCH higher quality level... an X1900 at much higher quality than the 7800, and so on. Frames just give a sort of "apples to apples" comparison; the quality is more subjective, but it's the more important aspect to someone with this hypersensitive-eye thing that some of us have...
February 24, 2006 2:17:12 PM

SLI'd cards provide the benefit of being able to run at much higher resolutions or levels of detail at the same frame rates.

If you can't see a difference, increase your resolution and AA. Then you WILL see a difference over a single card... your frame rates might be the same, but the experience will be entirely different.
February 24, 2006 9:50:40 PM

Quote:
Europe gets 50 Hz because they use a 210 V AC line. It is supposed to be more efficient to transfer electricity over the line that way instead of 110 V / 60 Hz. We all know that we like to waste valuable energy in the US.

No matter how fast your video card is, we all still have to see it through our monitors, right? Older CRTs can do up to a 120 Hz refresh rate at low resolution settings, but if you crank the resolution all the way up, only a few can even do 85 Hz. At that point, no matter how fast your video card is, your picture on the monitor will only refresh 85 times per second. LCDs are even slower to respond compared to CRTs. When I played games on my old CRT, my refresh rate had to be at least 75 Hz or I would feel dizzy after an hour or so. Now that I use an LCD, things are much better.

MMORPGs depend more on RAM, CPU, and HDD speed than on the video card, but it's always a plus to have a nice video card. Most MMORPG engines don't even use the latest 3D technologies. Having 200-300 players fighting each other on your screen at the same time will bring most PCs to their knees.



Ummm, if that 110 is the volts, that's wrong. The volts are also at 210, because we need to run major appliances like washers and dryers, stoves, and fridges that would not run on 110 volts. The 55 Hz may be slightly more efficient, but a constant flickering of the lights would piss me off beyond belief, and avoiding it is worth the extra money, however much that is.
February 25, 2006 1:13:18 AM

We get 220 volts for appliances in the US through the main fuse-box wiring, by combining two 110-volt lines into one. Check your house fuse-box wiring to find out. All of the appliance relays are double relays instead of single ones.

About the flickering of the light bulbs: you must know that it only applies to the fluorescent type; incandescents don't flicker. And wherever people use fluorescent lights, they usually have them set up in sets of 2, 4, or more to reduce the flicker and make it easier on our eyes. Using 2 fluorescent bulbs with the right setup will turn it into a 120 Hz effect instead of 60 Hz.
February 25, 2006 1:38:49 AM

Yeah, I know "technically" it's 220 volts, but for some reason, in actuality it comes out as 210 <shrugs shoulders>. And yeah, I know the flickering only happens in fluorescent tubes; however, I'll have to find out if that applies to those high-efficiency bulbs, the fluorescent lamp bulbs. Not that they're mainstream yet, but they may be eventually if hydro keeps going up. I don't know if these things are the same in Canada and the States, but considering we're on the same grid, I'd think so.
February 25, 2006 1:43:31 AM

I just remembered that the line coming into the house is technically 120 volts, but actually works out to 110 volts in real life, so maybe the same thing happens with a double line. It probably has something to do with the ohms in the copper: perhaps the current leaves the generator at 120 V, but by the time it reaches you it's been reduced to 110 volts. Not sure.

A guy from hydro explained that to me last year.