
Is fps really noticeable - radeon 3870 at 1680x1050?

January 10, 2008 11:40:48 AM

Hi. I know for sure that the 8800GT 512MB performs better than the Radeon 3780 512MB. But what isn't really discussed is at what resolutions it's really that much better. I will have an LCD with a max resolution of 1680x1050. I plan to play games like COD2, 3, 4, etc., and also newer RTS games (won't bother with Crysis, have plenty of others to catch up on :p ). But at the above-mentioned resolution, won't the Radeon be just as good with high settings?
Let's say, for argument's sake, that with the Radeon 3780 I can get about 45fps (with all high settings) playing COD4 at 1680x1050. With the 8800GT, let's say it's about 55fps. Is a difference of 10fps that big? Or, what is the minimum fps needed for a game to run smoothly? Thanks, would like to hear your opinions/comments.

CORRECTED -> radeon 3870
January 10, 2008 3:47:12 PM

I'm no expert, but supposedly your eye can't see anything over 30fps. So what counts is how many fps you get when there is a bunch of stuff going on, like big explosions or tons of enemies. The 8800GT is less likely to drop below 30fps than the 3870, and therefore you're less likely to notice the slowdown when it occurs.
January 10, 2008 4:15:46 PM

I would get the 8800 just for that extra oomph. It's true that we can't see more than 30fps, but what we do notice is the lack of blur. The more FPS, the more "natural" blur and thus a more fluid look. With 30fps you don't get as much blur, so even though you can't see a difference frames-wise, it still won't look as natural as, say, 60fps.

TVs routinely display at 30fps because the images are already blurred when captured on tape, so that's why it looks so smooth. Some games, such as Crysis, will synthetically add blur.
January 10, 2008 4:22:35 PM

Wish the VGA benchmarks took into account the native resolutions of widescreens... I have a 22'' as well and am in the exact same dilemma as you, windie.
January 10, 2008 4:32:44 PM

The human eye is not limited or measured by a simple framerate. I can notice quite a difference between 30fps and 60fps, but some claim to be unable to distinguish the difference. When it comes to PC games, I generally want an average framerate well above 60fps to make sure the game can run as smoothly as possible during even the most graphically intense scenes. For console games, I don't really have a problem with 30fps, as they're usually far more optimized than their PC counterparts, so a severe framerate drop is far less likely.
January 10, 2008 4:54:40 PM

Hmmm...I have an 8800GTX (in the rig in my sig) and @ 1680x1050 in Lost Planet I see it drop down to around 30 fps during intense scenes. It gets notably choppy around 25fps. Perhaps I should do some tuning or disable my 2nd display. Anyway, I'd say if you're not dropping below 30-35fps during intense scenes you should be pretty good.
January 10, 2008 5:15:14 PM

leo2kp said:
It's true that we can't see more than 30fps,


No, No, No. That is absolutely false.

25-30 fps is the MINIMUM for smooth animation, not the maximum. The human eye can detect hundreds of fps.

However, your monitor can only display one frame per refresh cycle, so a 60 Hz display can only show 60 fps.
Most people will agree 60 fps is very smooth.

When buying a videocard, the minimum framerate is a more important benchmark than the average framerate... think about it. If a game has 50 fps on average, but dips to 5 fps when things get busy, it'll suck. Ideally, a game's MINIMUM framerate won't dip BELOW 30 fps.
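
To make that concrete, here's a rough sketch with totally made-up frame times (Python, just for illustration) showing how a decent-looking average can hide an ugly minimum:

# Hypothetical per-frame render times in milliseconds (made-up numbers).
# Most frames are quick, but a few heavy scenes cause big spikes.
frame_times_ms = [15] * 57 + [100] * 3

total_seconds = sum(frame_times_ms) / 1000.0      # about 1.16 s of gameplay
avg_fps = len(frame_times_ms) / total_seconds     # roughly 52 fps on average
min_fps = 1000.0 / max(frame_times_ms)            # 10 fps during the worst frame

print("average fps: %.1f" % avg_fps)
print("minimum fps: %.1f" % min_fps)

The average looks perfectly playable, but those 100 ms frames are exactly the stutter you feel when things get busy.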

Aside from all that... a Radeon 3850 should be fine for 1680x1050.
January 10, 2008 5:42:50 PM

leo2kp said:
It's true that we can't see more than 30fps,


And what medical study did you get that from? Cleeve's correct; the normal human eye can perceive far more than 30 FPS. I personally don't like it when frame rates drop below 50 fps, which is why I haven't loaded any recent games, as my old video card won't run them decently. This 30 fps claim is nothing more than an old wives' tale, perhaps used by video companies to sell old cards, or by people who want to justify their old cards.
January 10, 2008 10:00:52 PM

Thanks for your replies, everyone. Well, I might not notice a higher frame rate, need new glasses :p .
I do have a lot of old games to catch up on (haven't played in like 4-5 years :I). A more demanding game like Crysis, no biggie, I could play it even after a year or two when cards can run it at its best.
I read the other day (forgot the source) on a forum, people were discussing that ATI would release drivers for its graphics cards which would further increase performance. Can a driver update alone really do that?
A 3780 is about US$336 (converted from AUD :p ) right now. I believe it spiked up due to the shortage of the 8800GT. An 8800GT I could get for around $370. Hoping prices will drop a bit more in a couple of weeks as more units get made (if they are increasing production).
January 10, 2008 10:03:11 PM

Quote:
Hmmm....never heard of a 3780, when did those come out?


ati radeon 3780 and 3750, I think came out sometime mid December 07. Not sure the exact time frame, but its very new.
January 10, 2008 10:20:54 PM

windie said:
ati radeon 3780 and 3750, I think came out sometime mid December 07. Not sure the exact time frame, but its very new.


Either someone is asleep at the keyboard or they need to look up sarcasm in the dictionary :whistle: 
January 10, 2008 10:32:45 PM

hahaha. My sarcasm running days died about 3 weeks ago :( . in blank mode now :I
January 10, 2008 10:34:28 PM

like Homer says, 'doh!'
January 10, 2008 10:38:01 PM

windie said:
ati radeon 3780 and 3750, I think came out sometime mid December 07. Not sure the exact time frame, but its very new.


Oh boy, and people wonder why the world's going downhill. :pt1cable: 

The correct numbers are 3850 and 3870. The OP obviously got his fingers mixed up as he posted and now more people are repeating his mistake. :non: 
January 10, 2008 10:50:40 PM

You won't notice any difference between the HD 3870 and 8800GT with the naked eye at that resolution, with the possible (and I stress that word) exception of Crysis.

Keep in mind, though, that the HD 3870 pwns in CF.
January 10, 2008 10:53:24 PM

Argh, again I'll say DOH! Ok, I promise I'm going to get my eyes checked soon!

January 10, 2008 10:59:03 PM

cleeve said:
However, your monitor can only display one frame per refresh cycle, so a 60 Hz display can only show 60 fps.

Though that would only be true for a CRT yes? Since LCDs don't refresh.
January 10, 2008 10:59:12 PM

SEALBoy said:
You won't notice any difference between the HD 3870 and 8800GT with the naked eye at that resolution, with the possible (and I stress that word) exception of Crysis.

Keep in mind, though, that the HD 3870 pwns in CF.



Just a point to ponder: does CF or SLI do anything else other than increase the fps (using the same highest settings a single card could reach)?


I know Crysis is all over the place, as it's ahead of its time, but is the game itself (excluding graphics) that great? just wondering :)
January 10, 2008 11:03:08 PM

It's good, but not amazing. You will find some people love it, others say it sucks, and people like me say it's ok. Everyone will agree the ending was total rubbish though.
January 10, 2008 11:12:55 PM

Just thought I would point this out: the 30/24fps myth comes from the fact that film runs at roughly 24fps and looks smooth. However, film uses motion blur to compensate for low fps.

you can see well above 30fps in games. Also note that even if you average 30fps, or max out at 30 fps... your minimum frame rates are going to be pretty far below that.
January 10, 2008 11:17:57 PM

windie said:

I read the other day (forgot the source) on a forum, people were discussing that ATI would release drivers for its graphics cards which would further increase performance. Can a driver update alone really do that?


Yes, some ATI cards have become noticeably faster in 2007 thanks to driver updates. nVidia's too, to a lesser degree.
January 10, 2008 11:21:18 PM

skittle said:
you can see well above 30fps in games. Also note that even if you average 30fps, or max out at 30 fps... your minimum frame rates are going to be pretty far below that.


That's exactly why I look more to what minimum frame rates are instead of maximum or average, and why I plan to get two cards when I make my next build as my new monitor seems a bit demanding on the present cards.
January 10, 2008 11:31:57 PM

skittle said:
Just thought I would point this out: the 30/24fps myth comes from the fact that film runs at roughly 24fps and looks smooth. However, film uses motion blur to compensate for low fps.

I read somewhere that you actually see every frame 3 times in a cinema :heink: 
January 10, 2008 11:34:01 PM

windie said:
Just a point to ponder: does CF or SLI do anything else other than increase the fps (using the same highest settings a single card could reach)?


I know Crysis is all over the place, as it's ahead of its time, but is the game itself (excluding graphics) that great? just wondering :)


Well, fps is really the main measure of any graphics card's performance. A high FPS gives you scope to increase the detail and AA levels of the game. The quality of the image should be the same with identical settings for an 8600 compared to an 8800 GTX; the difference is only going to be FPS. So while CF/SLI won't necessarily have better graphics, it will probably be faster in most cases, so you can turn up the settings and increase the graphics quality. One non-fps advantage CF or SLI gives you is the ability to hook up more screens.

I found Crysis to be a rubbish game. It brings nothing new to the genre apart from shiny graphics. I'd much prefer to spend my time on UT3 (hey, I know it's not revolutionary either, but the gameplay is faster and more dynamic imo).
January 10, 2008 11:45:40 PM

Note that it is nearly always better to buy one higher end card than two lower end cards for SLI/CF. Obviously that doesn't work in the 8800gt vs GTX case, but you wouldn't buy two 8600GTS's for SLI instead of an 8800GT.
January 11, 2008 1:18:38 AM

randomizer said:
Note that it is nearly always better to buy one higher end card than two lower end cards for SLI/CF. Obviously that doesn't work in the 8800gt vs GTX case, but you wouldn't buy two 8600GTS's for SLI instead of an 8800GT.


But when I do my next build, I want two higher end cards! No wait, make that two highest end cards! :kaola: 

Ok, what I want and what I will get may be two different things. :bounce: 
January 11, 2008 1:29:14 AM

I never pay >$350 for any hardware.
January 11, 2008 2:27:24 AM

The way I think of FPS is: if I'm getting under 30 I can't aim fast enough playing a first person shooter. Above 60 and I'm really starting to get some smooth, fast aiming like in CoD. When I'm getting over 100 I'm jumping in the air, spinning around, and aiming so much faster than the other guy... It's like I get a bead on them quicker (I do play with insanely fast mouse sensitivity too).

My test is, when playing a game, slide your mouse quickly like you're trying to see something behind you... If it's lagging or choppy, you're getting 30s and under.
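
Rough numbers behind that feeling, if anyone's curious (a quick Python sketch, nothing scientific): the higher the fps, the smaller the gap between new frames, so what you see is that much closer to where everything actually is.

# Time between frames at different framerates.
for fps in (30, 60, 100, 120):
    frame_time_ms = 1000.0 / fps
    print("%3d fps -> a new frame every %4.1f ms" % (fps, frame_time_ms))
# 30 fps -> 33.3 ms, 60 fps -> 16.7 ms, 100 fps -> 10.0 ms, 120 fps -> 8.3 ms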
January 11, 2008 2:52:04 AM

OK, help me figure this out, I'm totally confused. Your screen refresh rate is 60 Hz, right? When your card produces 100 fps, you still see only 60 because that's all that the monitor can do. Still, you get an advantage over somebody who gets 60 fps, and aiming is easier for you than for him?
January 11, 2008 3:17:49 AM

Yeah, because graphics cards can queue up frames. At 120 fps, for example, there will always be an extra frame ready, so there is less delay between frames than at, say, 30 fps, where the monitor refresh rate is faster than the computer can keep up with, so the monitor actually has to wait for the frame to be generated. I'm by no means an expert, so don't quote me on this. This is just what I heard.
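
If you want to play with that idea, here's a toy Python simulation based on my (possibly wrong) assumption that the monitor just grabs whatever frame finished most recently at each 60 Hz refresh. Higher render fps means the grabbed frame is, on average, less stale:

def average_staleness_ms(render_fps, refresh_hz=60, duration_s=1.0):
    # How old, on average, is the newest completed frame at each refresh?
    frame_time = 1.0 / render_fps
    refresh_time = 1.0 / refresh_hz
    staleness = []
    t = refresh_time
    while t <= duration_s:
        # Small epsilon to dodge floating-point edge cases.
        frames_done = int(t / frame_time + 1e-9)      # frames finished so far
        last_finished = frames_done * frame_time      # when the newest one finished
        staleness.append((t - last_finished) * 1000.0)
        t += refresh_time
    return sum(staleness) / len(staleness)

for fps in (30, 60, 100, 120):
    print("%3d fps render -> newest frame is on average ~%.1f ms old at each 60 Hz refresh"
          % (fps, average_staleness_ms(fps)))

Again, just a toy model of the idea, not how the hardware actually works.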
January 11, 2008 3:40:58 AM

Actually it would be easier for him, I would think. Assuming you are both at 60Hz, he is seeing all 60 of his frames every second, while you are seeing only some of the frames being output, meaning there is data missing at irregular points. I'm not really sure, someone else could tell ya.
January 11, 2008 4:28:05 AM

All good points mentioned above. From what I have read, refresh rates don't really apply to LCD monitors. But I still don't know enough to support either argument.
http://en.wikipedia.org/wiki/Refresh_rate

Thanks, I get the SLI/CF concept more now. I guess those with no budget issues could get the latest in SLI/CF mode. But it's good to see there's also an alternative to the 8800GT that is pretty good and decent.
January 11, 2008 4:34:16 AM

yay, moved up from newbie profile :p 
January 11, 2008 6:06:27 AM

The "refresh rate" of an LCD, as far as I know, is simply a backwards-compatibility thing for video cards, which are still designed with CRTs in mind (currently all of them). They don't actually refresh, yet interestingly tearing still occurs when the framerate exceeds the rated refresh rate. That's why vsync is used, but it normally limits you to 60FPS, or 75FPS on an LCD which supports 75Hz.
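
One more wrinkle with vsync, as I understand it (plain double buffering, no triple buffering, so treat this as an assumption): a frame that misses a refresh has to wait for the next one, so the framerate snaps down to 60, 30, 20, 15... instead of sliding down gradually. Rough Python sketch:

import math

REFRESH_HZ = 60
refresh_interval_ms = 1000.0 / REFRESH_HZ   # ~16.7 ms per refresh

for render_time_ms in (10, 16, 17, 25, 34):
    # A frame occupies however many whole refresh intervals it needs.
    refreshes_needed = math.ceil(render_time_ms / refresh_interval_ms)
    effective_fps = REFRESH_HZ / refreshes_needed
    print("%2d ms per frame -> ~%.0f fps with vsync on" % (render_time_ms, effective_fps))
# 10 ms -> 60 fps, 17 ms -> 30 fps, 34 ms -> 20 fps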
January 11, 2008 7:09:40 AM

ATI drivers have indeed increased performance a lot, and judging by the clock speeds of modern ATI cards, there should be more in the tank.
January 11, 2008 1:20:24 PM

wolfseeker2828 said:
Yeah, because graphics cards can queue up frames. At 120 fps, for example, there will always be an extra frame ready, so there is less delay between frames than at, say, 30 fps, where the monitor refresh rate is faster than the computer can keep up with...


Yes, but this comparison only works when the framerate is below the monitor's refresh rate.

Once you go above the refresh rate, a 'frame queue' would be useless. All extra frames will give you is visual tearing.
January 11, 2008 9:19:43 PM

But LCDs don't refresh, so how does that work with them? Unless the LCD still acts as though it had a refresh rate and only displays 60 frames, otherwise I can't think how it could be that LCDs produce tearing.
January 11, 2008 10:02:25 PM

windie said:
Hi. I know for sure that the 8800GT 512MB performs better than the Radeon 3780 512MB. But what isn't really discussed is at what resolutions it's really that much better. I will have an LCD with a max resolution of 1680x1050. I plan to play games like COD2, 3, 4, etc., and also newer RTS games (won't bother with Crysis, have plenty of others to catch up on :p ). But at the above-mentioned resolution, won't the Radeon be just as good with high settings?
Let's say, for argument's sake, that with the Radeon 3780 I can get about 45fps (with all high settings) playing COD4 at 1680x1050. With the 8800GT, let's say it's about 55fps. Is a difference of 10fps that big? Or, what is the minimum fps needed for a game to run smoothly? Thanks, would like to hear your opinions/comments.

CORRECTED -> radeon 3870



Here you go, a little info for those who are interested.

Human Eye Frames Per Second

http://amo.net/NT/02-21-01FPS.html



January 11, 2008 10:33:29 PM

Thanks, that was quite informative. Thanks to all for responding and having a good discussion :)