steve

Distinguished
Sep 10, 2003
Archived from groups: alt.comp.periphs.videocards.ati (More info?)

I'm a n00b to the whole graphics card scene and I'm about to buy a Radeon
9800Pro for my system, but I've been talking to some people and I was told by
someone that the human eye cannot notice the difference once you get above 20 or
so frames per second. Is this true, and if it is, what are the big bragging rights
about frames per second? I'm not looking to start a flame war, just for some
information.

Thanks,
Steve
 

Andrew

Distinguished
Mar 31, 2004
Archived from groups: alt.comp.periphs.videocards.ati (More info?)

On Sun, 18 Jul 2004 21:05:32 +0100, "Steve"
<bond_youknowtherest_uk@yahoo.co.uk> wrote:

>I'm a n00b to the whole graphics card scene and I'm about to buy a Radeon
>9800Pro for my system but I've been talking to some people and I was told by
>someone that the human eye cannot notice the difference once you get above 20 or
>so frames per second. Is this true and if it is what is the big bragging rights
>about frames per second? I'm not looking to start a flame war just for some
>information.

To sum up: You can see 60+ FPS. Anyone who states 20-30 FPS is the
limit is clueless and should be ignored.
--
Andrew. To email unscramble nrc@gurjevgrzrboivbhf.pbz & remove spamtrap.
Help make Usenet a better place: English is read downwards,
please don't top post. Trim messages to quote only relevant text.
Check groups.google.com before asking a question.
 

sunbow

Distinguished
May 14, 2004
Archived from groups: alt.comp.periphs.videocards.ati (More info?)

"Steve" <bond_youknowtherest_uk@yahoo.co.uk> wrote in message
news:2m03gcFhijskU1@uni-berlin.de...
> I'm a n00b to the whole graphics card scene and I'm about to buy a Radeon
> 9800Pro for my system but I've been talking to some people and I was told by
> someone that the human eye cannot notice the difference once you get above 20 or
> so frames per second. Is this true and if it is what is the big bragging rights
> about frames per second? I'm not looking to start a flame war just for some
> information.
>
> Thanks,
> Steve
>
>
I think what's more important is the change from, say, a game running at 60fps
and then slowing to 30. That's what becomes noticeable to your eyes. Avoiding
slowdown by having a powerful graphics card is preferable to being stuck
with a game that grinds to a halt every few minutes (like, say, if you have an
nVidia 5200 or something).
 
Guest
Archived from groups: alt.comp.periphs.videocards.ati (More info?)

Steve wrote:
> I'm a n00b to the whole graphics card scene and I'm about to buy a Radeon
> 9800Pro for my system but I've been talking to some people and I was told
> by someone that the human eye cannot notice the difference once you get
> above 20 or so frames per second. Is this true and if it is what is the
> big bragging rights about frames per second? I'm not looking to start a
> flame war just for some information.

It's actually a lot more complicated than that.

Basically you are always looking for the weakest link: the slowest frame
rate it will ever drop to. That might be half, or even less than half, of
the average frame rate. So to achieve a 20fps minimum, you may need an
average of 50fps.
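
To put rough numbers on that (made-up frame times, not measured from any real
card or benchmark):

    # Made-up per-frame render times in milliseconds, just to illustrate.
    frame_times_ms = [16, 17, 16, 18, 50, 16, 17, 60, 16, 17]

    avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
    worst_fps = 1000.0 / max(frame_times_ms)

    print("average: %.0f fps, worst frame: %.0f fps" % (avg_fps, worst_fps))
    # average: 41 fps, worst frame: 17 fps
    # The stutter lives in that worst frame; the average hides it completely.

The average looks respectable, but the dips are what you actually feel.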

Also, it's not all about FPS. With today's cards, there is such a variation
that the level of detail, accuracy and realism is defined not by the speed, but
by the features that the card has. I don't think it is reasonable to
compare cards on speed alone. You need to look at image quality. Defining
image quality is not easy, as it is subjective. My idea of image quality
will be different to yours.

To take a simple example, nobody has come up with a universally accepted
definition of "noise" for signal to noise ratio of compressed images - and
there we KNOW what the decompressed image should be (the original). It's
too complicated to factor in all the differences people perceive, and again,
it's subjective. Blocking artefacts in JPG images (DCT) are horrible to
look at, but seem not to affect many measures of SNR as greatly as we
perceive. Anyway, I'm rambling...

FPS is only a small part of a complicated set of interactions. People who
brag about FPS take a simple minded view, because they have simple minds :p

Ben
--
A7N8X FAQ: www.ben.pope.name/a7n8x_faq.html
Questions by email will likely be ignored, please use the newsgroups.
I'm not just a number. To many, I'm known as a String...
 

steve

Distinguished
Sep 10, 2003
Archived from groups: alt.comp.periphs.videocards.ati (More info?)

Ben Pope wrote:
> Steve wrote:
>> I'm a n00b to the whole graphics card scene and I'm about to buy a
>> Radeon 9800Pro for my system but I've been talking to some people
>> and I was told by someone that the human eye cannot notice the
>> difference once you get above 20 or so frames per second. Is this
>> true and if it is what is the big bragging rights about frames per
>> second? I'm not looking to start a flame war just for some
>> information.
>
> It's actually a lot more complicated than that.
>
> Basically you are always looking for the weakest link, the slowest
> frame rate it will ever get to. Now, that might be half or even less
> than the average frame rate. So, to achieve 20fps min, you may need
> an average of 50fps.
>
> Also, it's not all about FPS. With todays cards, there is such a
> variation that the level of detail, accuracy, realism is defined not
> by the speed, but by the features that the card has. I don't think
> it is reasonable to compare cards on speed alone. You need to look
> at image quality. Defining image quality is not easy as it is
> subjective. My idea of image quality will be different to yours.
>
> To take a simple example, nobody has come up with a universally
> accepted definition of "noise" for signal to noise ratio of
> compressed images - and there we KNOW what the decompressed image
> should be (the original). It's too complicated to factor in all the
> differences people perceive, and again, it's subjective. Blocking
> artefacts in JPG images (DCT) are horrible to look at, but seem not
> to affect many measures of SNR as greatly as we perceive. Anyway,
> I'm rambling...
>
> FPS is only a small part of a complicated set of interactions.
> People who brag about FPS take a simple minded view, because they
> have simple minds :p
>
> Ben

Wow guys, thanks for the info, that's very informative :)

Thanks,
Steve
 
Guest
Archived from groups: alt.comp.periphs.videocards.ati (More info?)

"Steve" <bond_youknowtherest_uk@yahoo.co.uk> wrote in message
news:2m03gcFhijskU1@uni-berlin.de...
> I'm a n00b to the whole graphics card scene and I'm about to buy a Radeon
> 9800Pro for my system but I've been talking to some people and I was told by
> someone that the human eye cannot notice the difference once you get above 20 or
> so frames per second. Is this true and if it is what is the big bragging rights
> about frames per second? I'm not looking to start a flame war just for some
> information.

You can't distinguish above 60fps as far as I am aware. Above that it's simply
for boasting.
 

ste

Distinguished
Nov 14, 2002
Archived from groups: alt.comp.periphs.videocards.ati (More info?)

Jamie_Manic wrote:
> "Steve" <bond_youknowtherest_uk@yahoo.co.uk> wrote in message
> news:2m03gcFhijskU1@uni-berlin.de...
>> I'm a n00b to the whole graphics card scene and I'm about to buy a
>> Radeon 9800Pro for my system but I've been talking to some people
>> and I was told by someone that the human eye cannot notice the
>> difference once you get above 20 or so frames per second. Is this
>> true and if it is what is the big bragging rights about frames per
>> second? I'm not looking to start a flame war just for some
>> information.
>
> You cant distinguish above 60fps as far as I am aware. Above that is
> simply for boasting.
You should tell the monitor makers, who have had 85Hz for years; I'm sure
they would thank you for saving them money.
Or laugh at you...

--
STE ;¬!
 
Guest
Archived from groups: alt.comp.periphs.videocards.ati (More info?)

STE wrote:
> Jamie_Manic wrote:
>> "Steve" <bond_youknowtherest_uk@yahoo.co.uk> wrote in message
>> news:2m03gcFhijskU1@uni-berlin.de...
>>> I'm a n00b to the whole graphics card scene and I'm about to buy a
>>> Radeon 9800Pro for my system but I've been talking to some people
>>> and I was told by someone that the human eye cannot notice the
>>> difference once you get above 20 or so frames per second. Is this
>>> true and if it is what is the big bragging rights about frames per
>>> second? I'm not looking to start a flame war just for some
>>> information.
>>
>> You cant distinguish above 60fps as far as I am aware. Above that is
>> simply for boasting.
> You should tell the monitor makers who have had 85Hz for years, I'm sure
> they would thank you for saving them money.
> Or laugh at you...

Like I said... this is a complicated issue.

You can see sharp contrast changes up to around 75-80 times per second
(depending on the conditions, I can tell the difference between 75Hz and 80Hz
refresh, even between 72Hz and 75Hz, on a monitor I know well in stable
conditions). 80Hz is about my limit, though.

For low "motion" frame changes with low contrast changes it might be a figure
closer to 25fps, or even less.

If all of your objects on screen moved at a max of 1 pixel per second, 1fps
would be enough. You would of course still need an 80Hz screen refresh on a
CRT. For 2 pixels/s of motion, 2fps would be enough.
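
As a back-of-the-envelope version of that rule (my own rule of thumb, not an
established formula):

    # Frames per second needed so nothing jumps more than max_step_px pixels
    # between consecutive frames (rough rule of thumb only).
    def fps_needed(speed_px_per_s, max_step_px=1.0):
        return speed_px_per_s / max_step_px

    print(fps_needed(2))          # 2.0 fps for the 2 pixels/s case above
    print(fps_needed(800, 10.0))  # 80.0 fps if you tolerate 10-pixel steps for
                                  # something crossing an 800-pixel screen per second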

So in fast action games like First Person Shooters (I'll refrain from using
"FPS" for that as well!), you need higher frames per second than, say, a
platform game (do they even make those any more?!).

So FPS, refresh rates and anything else you can measure over time really
do depend on a whole host of things. If the contrast of a monitor is set
really low, a 60Hz refresh on a CRT *might* be sufficient. With high
contrast, 80Hz might be required.

For low motion games, 25fps might be plenty; for FP shooters, 40+ might be
required.

Ben
--
A7N8X FAQ: www.ben.pope.name/a7n8x_faq.html
Questions by email will likely be ignored, please use the newsgroups.
I'm not just a number. To many, I'm known as a String...
 
Guest
Archived from groups: alt.comp.periphs.videocards.ati (More info?)

I'll also mention the difference between an LCD and a CRT:

A CRT's brightness decays over time. On each refresh a pixel goes from its
darkest to its brightest in a short space of time, hence a sharp contrast
change, and high refresh rates are required.

An LCD is not "refreshed": its brightness doesn't change unless you tell it
to. It is updated instead. Hence contrast changes tend to be lower, and lower
FPS, or update rates, can be quite satisfactory. A 25ms display is updated at
most 40 times/second, and those seem quite reasonable. Most (if not all)
16ms LCD displays are actually updated up to around 62 times/second, but
sometimes as low as around 30 times/s. The worst case is often worse than a
25ms display.
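
The arithmetic behind those figures, assuming the quoted response time is the
full time a pixel needs to finish changing (a simplification):

    # Best-case update rate implied by an LCD's quoted response time.
    def max_updates_per_second(response_time_ms):
        return 1000.0 / response_time_ms

    print(max_updates_per_second(25))  # 40.0  : the 25ms panel above
    print(max_updates_per_second(16))  # 62.5  : best case for a 16ms panel; its
                                       # slowest transitions can push the real
                                       # rate down to around 30 updates/s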

Thus, comparing the refresh/update rate of an LCD to a CRT is pointless.

Ben
--
A7N8X FAQ: www.ben.pope.name/a7n8x_faq.html
Questions by email will likely be ignored, please use the newsgroups.
I'm not just a number. To many, I'm known as a String...
 

JohnS

Distinguished
Apr 2, 2004
Archived from groups: alt.comp.periphs.videocards.ati (More info?)

> someone that the human eye cannot notice the difference once you get above 20 or
> so frames per second. Is this true

Who knows? I service and repair all kinds of monitors,
and I still don't know what the gamers mean by "frame
rate". When they talk about frame rates in the 100s,
there is no way they are talking about video screen
"frame rate". A video screen frame rate of 150 would
require a horizontal scan rate that no monitor can
produce... and a dot write rate somewhere up around
the speed of light. I run 3DMark, and it tells me that
I have "frame rates" going nearly 200 fps?
Bullshit! Maybe they can update a video buffer at
those rates, but there AIN'T NO WAY they are
pushing that buffer to the screen on every single update.
So what is video game "frame rate"? I used to write
a lot of beam steering code back in my Apple II days.
Is it something like that? And if you guys believe this
stuff about 200fps, show me a flyback that can take
it without screaming its lungs out and going up in smoke.
Those flybacks are sitting in tuned circuits, and you can't
steer their voltages all over the map and not have them
oscillate and break up. That scan rate determines
their voltage output, and it has to hold steady, or the
screen would be changing size all over the place. So
this frame rate business is a lot of doubletalk!

johns
 
Guest
Archived from groups: alt.comp.periphs.videocards.ati (More info?)

Can't notice a difference above 20fps? Uhm... no. They might have brought up
the fact that standard films are recorded at 24fps and that these films have
fooled the eye for years. However, what they don't realize is that the
technology used in a film projector is very different from the technology
used in TVs and monitors. Film projectors in theatres flash images on the
screen 24 times per second. Each frame is flashed in its entirety and all
at once... meaning every element of the image (pixel if you will, but
pixels don't really apply to non-digital film) hits the screen at the same
exact time. Conversely, TVs and monitors draw images on the screen by
horizontally scanning an electron beam, drawing one pixel at a time until
every horizontal line of resolution is drawn.

So... 24fps is enough to fool the eye into perceiving smooth, life-like
motion in the theatre because the brain, more specifically the visual
cortex, interpolates the missing data between each frame... ultimately
almost doubling the effective framerate (this is commonly referred to as
"motion blur"). The brain does not respond in exactly the same way to images
that are horizontally scanned (refreshed) or updated like in LCDs.

Reference this article: http://www.daniele.ch/school/30vs60/30vs60_3.html

The maximum frame rate is still debatable. The above article suggests 72fps
to be the max before the optic nerve is completely saturated with sensory
data. However, and I'm not suggesting that my optic nerve is genetically
superior to anyone else's, I can notice a BIG difference between 75fps and
85fps. I cannot really notice any additional fps over 85.

Long story short... I would say the game should never drop below 85fps if
you want life-like motion.

Damnit.. I think my boss caught me.

--
Tony DiMarzio
djtone81@hotmail.com
djraid@comcast.net
"Steve" <bond_youknowtherest_uk@yahoo.co.uk> wrote in message
news:2m03gcFhijskU1@uni-berlin.de...
> I'm a n00b to the whole graphics card scene and I'm about to buy a Radeon
> 9800Pro for my system but I've been talking to some people and I was told by
> someone that the human eye cannot notice the difference once you get above 20 or
> so frames per second. Is this true and if it is what is the big bragging rights
> about frames per second? I'm not looking to start a flame war just for some
> information.
>
> Thanks,
> Steve
>
>
 
Guest
Archived from groups: alt.comp.periphs.videocards.ati (More info?)

Actually, there are plenty of CRTs that can do a 160Hz vertical refresh
rate... albeit at lower resolutions. Aside from that, yes: when gamers
talk about frame rates, whether they know it or not, they are generally
referring to the rate at which frames are written to the frame buffer.
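
A minimal sketch of what a game's fps counter actually measures (toy code, not
from any real engine): it counts frames produced per second of wall-clock time,
regardless of how often the monitor refreshes.

    import time

    def render_and_swap():
        # Stand-in for real rendering work: pretend each frame takes ~4 ms to
        # draw into the back buffer and hand over to the card.
        time.sleep(0.004)

    frames = 0
    window_start = time.time()
    stop_at = window_start + 3.0           # run the demo for three seconds

    while time.time() < stop_at:
        render_and_swap()
        frames += 1
        now = time.time()
        if now - window_start >= 1.0:
            print("%d fps" % frames)       # frames produced, not frames displayed
            frames = 0
            window_start = now

So a 3DMark reading of 200 fps just means the card finished 200 frames in that
second; the monitor only ever shows as many of them as its refresh rate allows.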

--
Tony DiMarzio
djtone81@hotmail.com
djraid@comcast.net
"johns" <johnsxxx@mudbog.edu> wrote in message
news:cdgq5r$fih$1@kestrel.csrv.uidaho.edu...
>
> > someone that the human eye cannot notice the difference once you get above 20 or
> > so frames per second. Is this true
>
> Who knows? I service and repair all kinds of monitors,
> and I still don't know what the gamers mean by "frame
> rate". When they talk about frame rates in the 100s,
> there is no way they are talking about video screen
> "frame rate". A video screen frame rate of 150 would
> require a horizontal scan rate that no monitor can
> produce .... and a dot write rate somewhere up around
> the speed of light. I run 3DMark, and it tells me that
> I have "frame rates" going nearly 200 fps ?????????
> Bullshit! Maybe they can update a video buffer at
> those rates, but there AIN'T NO WAY they are
> pushing that buffer to the screen on every single update.
> So what is video game "frame rate" ???? I use to write
> a lot of beam steering code back in my Apple 2 days.
> Is it something like that? And if you guys believe this
> stuff about 200fps, show me a flyback that can take
> it without screaming its lungs out and going up in smoke.
> Those flybacks are sitting in tuned circuits, and you can't
> steer their voltages all over the map and not have them
> oscillate and break up. That scan rate determines
> their voltage output, and it has to hold steady, or the
> screen would be changing size all over the place. So
> this frame rate business is a lot of doubletalk. !!!!!
>
> johns
>
>
 
Guest
Archived from groups: alt.comp.periphs.videocards.ati (More info?)

On 7/19/2004 8:59 AM Tony DiMarzio brightened our day with:

>Actually. There are plenty of CRT's that can do 160hz Vertical Refresh
>rate... albeit at lower resolutions. Aside from that, "Yes", when gamers
>talk about frame rates, whether they know it or not, they are generally
>referring to the rate at which frames are written to the frame buffer.
>
>
>
And that "write to frame buffer" fps can be important...
A lot of the fps freak-out has to do with the Quake games. I can't
remember this bit exactly, but a good while ago Quake2 based games, and
perhaps Half-Life/Counter Strike, the number of updates to the server
was dependent upon how many frames you were generating, so super high
fps actually could equate to greater accuracy.
Quake3 had almost everyone striving to achieve 125 frames per second,
which had little to do with what one can notice visually and everything
to do with the game's physics engine. As has been tested and discussed
in many places before, 125 fps was ideal for getting longer jumps, so
you could reach the mega health or some out of the way armor power up.
The above situation led many experienced Q3 gamers to claim that they
could tell the difference between the game running at 125 fps and
running at 100 fps, this was effectively true since the games
performance was slightly difference at those different rates. A noob
would hear this and scream, "you can't tell the difference between such
high frame rates", and even a veteran Q3 player viewing the game
non-interactively couldn't make the distinction. Telling the difference
while playing is a whole 'nother story.
Here's what I think, if while you're playing a game your frame rates
drop from 85 to 60 and then shoot back up, just about everyone can tell
something happened. If you were to look at two screens side by side
with identical refresh rates of 100, and one was running capped at 85
fps, the other at 60 fps, you wouldn't be able to tell the difference.
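
Coming back to the 125 fps thing: here's a toy illustration (emphatically not
Quake 3's actual movement code, and the jump speed and gravity numbers are made
up) of how physics stepped once per rendered frame can come out slightly
different at different frame rates:

    # Simple per-frame Euler integration of a jump: the apex you reach
    # depends slightly on how many steps per second you take.
    def jump_apex(fps, jump_speed=270.0, gravity=800.0):
        dt = 1.0 / fps
        height, velocity, apex = 0.0, jump_speed, 0.0
        while velocity > 0.0:
            velocity -= gravity * dt      # gravity applied once per frame...
            height += velocity * dt       # ...then the move for that frame
            apex = max(apex, height)
        return apex

    for fps in (60, 100, 125):
        print("%3d fps -> apex %.2f units" % (fps, jump_apex(fps)))
    # Roughly 43.33, 44.22 and 44.49 units: higher fps, slightly higher jumps.

Q3's real quirk came from rounding the frame time to whole milliseconds, but
the moral is the same: tie the physics step to the render rate and the game
behaves slightly differently at different frame rates.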

--
"You keep using that word. I do not think it means what you think it means."

- Inigo Montoya
Steve [Inglo]
 

JohnS

Distinguished
Apr 2, 2004
Archived from groups: alt.comp.periphs.videocards.ati (More info?)

> Actually. There are plenty of CRT's that can do 160hz Vertical Refresh

Maybe some whopping expensive ones. None of the monitors I
service have a spec like that.

> rate... albeit at lower resolutions. Aside from that, "Yes", when gamers
> talk about frame rates, whether they know it or not, they are generally
> referring to the rate at which frames are written to the frame buffer.

I wish you would write a book. I would truly like to know more
about gamer "frame rates". It's been nearly 20 years since I picked
up a decent book on the subject, and now I just don't see them
in the stores anymore.

johns
 

JohnS

Distinguished
Apr 2, 2004
Archived from groups: alt.comp.periphs.videocards.ati (More info?)

> something happened. If you were to look at two screens side by side
> with identical refresh rates of 100, and one was running capped at 85
> fps, the other at 60 fps, you wouldn't be able to tell the difference.

I can see game differences based on the number of "sprites"
on the screen. Still the facts escape me... I have to assume
that the screen with more sprites takes longer to draw, and
that causes the "next" buffer (of two) to be incomplete?
Or what? Is the buffer just held at high impedance and
blanked, or what? I seem to be amazingly dense on this
subject. I don't understand what the fps is doing.

johns
 
Guest
Archived from groups: alt.comp.periphs.videocards.ati (More info?)

johns wrote:
>> something happened. If you were to look at two screens side by side
>> with identical refresh rates of 100, and one was running capped at 85
>> fps, the other at 60 fps, you wouldn't be able to tell the difference.
>
> I can see game differences based on the number of "sprites"
> on the screen. Still the facts escape me .. I have to assume
> that the screen with more sprites takes longer to draw, and
> that causes the "next" buffer ( of two ) to be incomplete ??
> or what ?? Is the buffer just held at high impedance and
> blanked or what? I seem to be amazingly dense on this
> subject. I don't understand what the fps is doing.

OK, here's roughly how it works:

You have 2 buffers, effectively (or sometimes more). You draw to buffer A,
building the image as required. Whilst you are doing this, you have the
RAMDAC of the video card pulling data out of buffer B. Then, when you have
finished drawing buffer A, you point the RAMDAC to buffer A, and start
drawing your new image in buffer B. This is called a page swap. You don't
usually have separate buffer chips to stick in high impedance states, you
just change the pointer to a different part of the same chip (since this is
after all just SDRAM).

Page Swaps can be synchronised to the vertical refresh of the monitor or not
(sometimes called wait for vertical synch or similar). Hence, page swaps
can be asynchronous to the screen redraw. You could display buffer A x
number of times before performing a page swap, or you could perform x page
swaps before displaying the picture again.

If you do not wait for vertical synch, the buffer can get changed half way
through a redraw and you *may* see "tearing", which is ugly...
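
In very rough pseudo-Python (a sketch of the idea only, with two lists standing
in for video memory, not any real graphics API):

    import time

    REFRESH_HZ = 85
    buffers = [bytearray(640 * 480), bytearray(640 * 480)]  # pretend frame buffers
    front, back = 0, 1            # the "RAMDAC" scans out buffers[front]
    vsync = True

    def draw_scene(buf, frame_no):
        buf[0] = frame_no % 256   # stand-in for actually rendering the frame

    def wait_for_vblank():
        # Pretend vertical blank: sleep until the next 1/85 s boundary.
        period = 1.0 / REFRESH_HZ
        time.sleep(period - (time.time() % period))

    for frame_no in range(10):
        draw_scene(buffers[back], frame_no)  # build the new image off-screen
        if vsync:
            wait_for_vblank()                # only swap between redraws: no tearing
        front, back = back, front            # the page swap is just swapping pointers
        print("now scanning out buffer", front)

With vsync off you would skip the wait and swap as soon as the frame is ready,
which is exactly where the tearing comes from.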

Ben
--
A7N8X FAQ: www.ben.pope.name/a7n8x_faq.html
Questions by email will likely be ignored, please use the newsgroups.
I'm not just a number. To many, I'm known as a String...