Radeon 2 vs. Geforce3
Tags:
- Graphics Cards
- Radeon
- AMD
- Princess
- Graphics
Last response: in Graphics & Displays
Anonymous
May 10, 2001 4:08:56 PM
kavbear
May 10, 2001 11:47:24 PM
Anonymous
May 10, 2001 11:49:28 PM
LDT = Lightning Data Transport
Some new AMD tech they have been praising in the CPU forum. LDT is the reason the GeForce3's bandwidth is 7.6 GB/s, leaving no bottlenecks for it.
The Radeon2, meanwhile, uses HyperZ, which saves a lot of the bandwidth needed.
Imagine a GeForce3 with HyperZ; it could get 10 GB/s of bandwidth.
I only steal the princess cause she is hot ;*)
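The 7.6 GB/s figure quoted above is roughly a back-of-the-envelope calculation from the GeForce3's memory interface. A quick sketch (assuming the card's 128-bit bus and 230 MHz DDR memory, figures not stated in the thread):

```python
# Rough memory-bandwidth arithmetic for the GeForce3 figure quoted above.
# Assumed specs: 128-bit memory bus, 230 MHz DDR (two transfers per clock).
bus_bits = 128
mem_clock_hz = 230e6
ddr_factor = 2

bytes_per_sec = mem_clock_hz * ddr_factor * (bus_bits / 8)
print(f"{bytes_per_sec / 1e9:.2f} GB/s")  # ~7.36 GB/s, close to the 7.6 quoted
```

The small gap between 7.36 and 7.6 likely comes down to rounding conventions in period marketing material.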
HolyGrenade
May 11, 2001 9:08:31 AM
Anonymous
May 24, 2001 11:38:23 AM
HolyGrenade
May 24, 2001 11:47:11 AM
Anonymous
May 24, 2001 12:09:57 PM
I wasn't saying it didn't. The guy said it gave more bandwidth; I was telling him what it does. And the Radeon 256 was only meant to compete with the GeForce 256; the fact that it beats out the GF2 cards is really funny to me. And this is the reason I'll only buy Radeon, aside from the quality being much better in the card and the picture.
Computer Shop owner and Head tech.
HolyGrenade
May 24, 2001 12:53:20 PM
It is funny that people say this card is "meant" to compare with that card. It just doesn't work like that. Compare it with other cards in the same price range.
If I were to go out and buy a card, I would compare my options in a certain price range. I won't compare cards on the basis that the manufacturer said this card is our answer to that card of that company.
The technology used in the GeForce 2 cards is similar to the technology used in the Radeon cards. They were released in the same timeframe. Just because they are the first generation of T&L cards from ATI doesn't mean they are meant to compare with the first generation of T&L cards from nVidia.
"My name is Ozymandias, King of Kings:
Look on my works, ye Mighty, and despair!"
Anonymous
May 24, 2001 1:02:21 PM
Well, basically price means nothing in comparing a card's performance; the type of people that will actually compare cards on price couldn't care less about how the card performs. And yes, according to ATI the Radeon was meant to compete with the GeForce 256, and the GF2 came out after the Radeon. My point was simply that once ATI brings out their new card, it takes a while for everyone to catch up. And quality, to anyone that really cares about the video card, is much more important than price. The GF3 is the only card that can say it matches and surpasses a Radeon in tech, and they were not the first T&L cards, just the first where the T&L actually worked. Look at quality, not price. As far as nVidia, they're overpriced for what you get. The Radeon's T&L engine is superior to the GeForce2's, and HyperZ is not on the GF2. Quality is #1, not price; I would pay more for a better card than for a cheaper, poor performer.
Computer Shop owner and Head tech.
HolyGrenade
May 24, 2001 1:45:12 PM
I've seen a lot of benchmarks, and the Radeons beat the GF2 GTS in only a tiny minority of them. I know the Radeons are technically superior; that alone should be a pointer to show that they were never meant to compete with the GF1 cards. If they were really meant to compete with them, they would have been released in the same price range.
Quote:
the type of people that will actually compare cards on price couldn't care less about how the card performs
That is entirely untrue. If you look in the threads in this forum, almost everyone asking for advice on which card to buy mentions cards of the same price range, or they state a price range. See, not everyone can afford the best-performing card with the highest quality. If it was like that, everyone would go out and buy a GF3.
Quote:
gf3 is the only card that can say it matches and surpasses a radeon in tech
This is only true feature-wise. The technology in the GF2 cards allows it to outperform the Radeon cards. So any nVidia guy may claim their technology to be superior. It is a matter of perception.
Right now, my advice to the people buying cards has been to get a Radeon-based card if they want a card for general use plus gaming. If they wanted gaming first and everything else second, I told them to get a GF2 card. After all, they are the best performers. You could talk about the Kyro 2; I always refrained from advising about that card. I'm just not certain about it, but I do think it is not a wise long-term investment.
"My name is Ozymandias, King of Kings:
Look on my works, ye Mighty, and despair!"
Anonymous
May 24, 2001 1:59:14 PM
Let me say first off, price means nothing.
They set the price from how much it costs to produce the chip and how they can make up the profit loss the fastest without way overpricing their cards.
holygrenade: This is only true feature-wise. The technology in the GF2 cards allows it to outperform the Radeon cards. So any nVidia guy may claim their technology to be superior. It is a matter of perception.
blade: Features, yes. The only reason the GF2 can outperform is clock speed; the features of the GF2 aren't more advanced. You're taking something like the Radeon, which has to work harder to create a better-quality picture, which has been proven over and over, against something that simply cannot keep up in areas of quality. Yeah, of course it will operate faster; it's clocked higher than most Radeon cards.
holygrenade: Right now, my advice to the people buying cards has been to get a Radeon-based card if they want a card for general use plus gaming. If they wanted gaming first and everything else second, I told them to get a GF2 card.
Blade: I simply cannot understand why, since the Radeon is way more than enough of a gaming card and can do a lot more than that, and the picture quality is better. Don't say it gets more FPS, because who cares; are you getting more than 30 fps? Then you're playing the game just fine. The GeForce cannot create a better-quality picture than the Radeon because it fails to use the tech in the Radeon. OK, OK, let's ramp clock speed, then it's faster. So what? That's like the P4 ramping speeds to be faster instead of building a good CPU. If you're going for the best, why buy a GF3? There is nothing out there that is going to use its new tech, so the picture quality gain is nil, and I fail to see the point in that. Speed? Yet again, speed is nothing; if you're getting over 30 fps, it's the quality of the picture that matters more. So from what I was reading of your post, the Radeon 2 is technically superior to the GF3. Does that mean it's in a class of its own? I doubt it. They just care about making a quality product instead of ramping clocks and hype to sell their cards. Honestly, if ATI had got the Radeon out first, we would probably be arguing the other side. That's about the only reason I can see the GF cards being considered better: they've got the name established, like Intel. Being first may not mean a lot, but it has an impact on the less savvy computer users.
Computer Shop owner and Head tech.
HolyGrenade
May 24, 2001 2:23:46 PM
Calm down and fix your post; I'll try reading it again. From what I understood so far, I'll reply:
Price does matter.
The difference in quality between a Radeon and a GF2 only becomes significant when texture compression is used.
The Radeon 2 is not out yet, so I won't make any specific comparative remarks on it. If you want to know what I think about it, check noko's thread.
The fact is GF2s do run faster than the Radeons, which allows them to perform better. Your analogy with the P4 seems to break down right there.
"My name is Ozymandias, King of Kings:
Look on my works, ye Mighty, and despair!"
Anonymous
May 24, 2001 2:47:03 PM
Actually, you totally missed my point on the P4 comparison. It was meant to say that nVidia ramps clocks to gain performance instead of fixing their current chip, like Intel is doing with the P4. Sorry about the broken post, but I was writing as I was going. The Radeon is more advanced than the GF2; it's been written several times. The GF3 is way better in options and quality of picture, assuming software is designed to take advantage of its new tech. But my initial point is that to say either is better for gaming isn't right; the quality of the picture was my real point. Any card that can render at 30+ fps is good enough for glitch-free play. And I fail to see what price has to do with it, when one company selling the same thing as another could be selling it for a lot more. If you want it to compete, it's a good idea not to overprice the product, but the price is nil in most cases.
Computer Shop owner and Head tech.
HolyGrenade
May 24, 2001 2:53:40 PM
The thing is, if it can run everything fast, you can afford to run everything at high resolutions with all the settings at maximum to enjoy the game more. And not everyone would agree that 30 fps is enough.
"My name is Ozymandias, King of Kings:
Look on my works, ye Mighty, and despair!"
Anonymous
May 24, 2001 2:55:45 PM
Sihs
May 24, 2001 4:03:55 PM
Anonymous
May 24, 2001 4:06:23 PM
killall
May 24, 2001 4:19:38 PM
I personally think that the Radeon will be better. The GF3 would be a great card if it was clocked higher, so I could appreciate its "effects" at 2048x1536. But it will be close, and remember, that's a GOOD thing: the closer AMD is to Intel, and the closer ATi is to nVidia, the better it is for us. It drives prices down (the GeForce 3 will have to become a lot cheaper to compete with the Radeon 2) and improves the market as a whole.
you do not strengthen the weak by weakening the strong
Gog
May 24, 2001 4:29:47 PM
ajmcgarry
May 24, 2001 4:30:42 PM
Hehe, it's like tennis: Blade-Holy-Blade-Holy.
Personally, what summed it up for me was when I read a post in this forum from a guy who upgraded from the old Riva 128ZX to a GeForce 2 Ultra.
He complained that the games ran a lot faster but the image quality was exactly the same.
I had the STB Velocity 128, which uses the Riva 128 chip. In its time I thought it was kick-ass, but that was 3 years ago.
I would go with the Radeon first for that very reason. But then again, I'm not a heavy gamer; I just play for fun every now and again.
Is there any difference in image quality between a GeForce 2 and a Quadro 2?
I've read guides on how to unlock your GeForce card and make it a Quadro card. Apparently it's the same chip, but the GeForce has some features disabled.
Little Miss Muffet was arachnophobic
Anonymous
May 24, 2001 4:34:30 PM
I wouldn't know; I've never used one of those. I'm sorry to say I never used a GeForce at all. But I was reading the article you did, and if you're not a big gamer, go Radeon LE; it's nice for games but not a lot of money, and it spanks the GeForce cheapo cards. And you can flash it to a full retail-version Radeon if you use retail drivers and a BIOS flash. You might want to add a fan if you can, but it doesn't need it.
Computer Shop owner and Head tech.
Anonymous
May 24, 2001 7:44:40 PM
Here's my take. Last summer I was building my own computer and debating on a vid card. At the time, the GF2 was on top. Then the Radeon came out and took the lead. I was like, "Wow, that's what I'm going to get!!!!" So I bought the card, and not one week later, nVidia released new drivers and the GF2 was spankin' the Radeon. Am I the only one that remembers this? It's been less than a year, and nVidia has had more driver updates (to my knowledge) than ATI. Even when I got the new Radeon drivers, I didn't notice any difference. I think my Radeon's going to be on eBay soon, and I'm going to get a GF3 or GF3 Ultra-something-or-other when it comes out.
peace out,
paks
FA|Paksman-MTK-
Warlord on the Sacrifice division of the Fallen Angels
Bud
May 24, 2001 8:29:40 PM
I can see the difference between 50 fps and 90 fps. You must not play games seriously. I play online, and believe me, 30 fps is NOT fine. I want 70 fps minimum! And I'll even lower my res to get 70 fps.
40-50 is good for single player (by yourself), but when you're playing against humans who shoot back... 30 fps is a joke.
I thought I was wrong once, but I was mistaken.
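For what it's worth, the 30 vs. 70 fps gap being argued over here is easier to feel as frame times; a quick sketch of the reciprocals:

```python
# Per-frame delay at the frame rates mentioned in the thread:
# the interval between frames is simply 1000 ms divided by the rate.
for fps in (30, 50, 70, 90):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")
# 30 fps -> 33.3 ms per frame
# 70 fps -> 14.3 ms per frame
```

At 30 fps the player waits more than twice as long between frames as at 70 fps, which is the kind of delay a competitive online player could plausibly notice regardless of how "smooth" the motion looks.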
Anonymous
May 25, 2001 2:04:19 AM
I don't understand how you're the only human that can see more than 60 fps, but then again you have to be right, because you say you are. Over 30 fps is clean play no matter what you're talking about; going online adds more variables. Over 60, gameplay does not change, since humans, unlike you of course, cannot see over 60 fps; anything over 60 is just wasted overhead that acts like a buffer. I've played at 30 to 100, and I laugh at "not a serious gamer." HA! As long as it stays above 30 fps, it's clean play. But I forgot, you implanted some new eyes to see more fps, so I'll leave this convo at that. Just to add to it: I have a Rage Fury 32 MB AGP 2x card. My comp does not slow down except in Quake and Unreal at 1024x768x32, where the action drops the card under 30 fps. My card is like 3 years old and can still play both games glitch-free to a point. Heh, made by ATI even.
Computer Shop owner and Head tech.
OzzieBloke
May 25, 2001 3:22:15 AM
Oh, not the flamin' frame rate thing again.
Let's sort this out right now, with a little Physiology 101.
Nerves transmit info by action potentials. The average maximum number of action potentials an optic neuronal cell can transmit is about 25 per second. Transmission is asynchronous, so signals are sent from individual retinal receptor cells in a dynamic fashion relative to all the other cells, giving us visual capabilities that can perceive movement faster than 25 fps.
Wave a finger in front of your face, side to side, really fast *under natural light*, and then in front of a computer monitor while it is on. Under natural light, the finger blurs because the light bouncing off it is continuous, and the stimulation of the neurons is such that the brain processes the signals temporally, forming a blurred image. In front of a monitor, you see individual *frames* of your finger, because the light coming from it is intermittent, depending on your monitor's refresh rate and how fast you move your finger. The brain can detect individual frames because the retinal cells are all being stimulated at the same time, at the frequency of your monitor. You are removing the dynamic processing aspect of the retinal cells, so you can detect individual frames beyond 100 fps: even though the neurons transmit at only 25 action potentials per second, the retina will register three images of your finger (if your monitor's refresh is 75 Hz; four images at 100 Hz) and transmit that, so you see a semi-static image of 3 (or 4) fingers.
Why then do we perceive movies as smooth at only 25 (or 50 double-flashed) frames per second? Because the images recorded on the film blur as they are recorded. Even though the brain is only seeing individual frames, the blurring of each frame encourages the cells of the retina to fire at different rates relative to the change in shade or colour of the image. The blurry bits synthetically re-create the latency that would normally be present in real-life lighting situations, where if you watch a car zooming by without moving your head or eyes, it is nothing but a blur. See? Now, a film on the telly runs at 25 fps and looks smooth... but wave your finger in front of the telly. You can see the individual frames of your finger reflecting the refresh of the telly, but the picture remains smooth due to the synthetic latency created by motion blur. Do the same thing in a movie theatre. Notice that your finger blurs in front of the screen instead of forming frames like on a monitor or telly. Why? Because the light passing through the film from the projector is continuous; only the frames of the film are staggered.
So, in conclusion, 30 frames per second on a computer will be perceived as smooth as long as the speed of movement on the screen is not so fast that individual frames of characters or things moving overlap by less than 50% at the pixel level. A vertical line only 1 pixel thick moving left and right therefore will not look smooth at 30 fps if it is moving more than 30 pixels across the screen each second. Thus you can do two things: increase the frame rate and refresh rate and let the brain combine the images, which will create the same effect as waving your finger in front of a monitor, or motion-blur the image, which will create a more realistic, natural-looking image at a frame rate of 30 fps.
The former method has the problem that, for an object moving at 200 pixels a second, the brain will not process the images into a *believable* smooth blurred image unless the refresh rate is at least as fast, that is, 200 fps with your monitor at 200 Hz.
The pixel rate of movement only holds true, however, so long as the resolution of the screen is below the resolution of the eye. Each receptor in the eye detects about 26 seconds (angular) of visual field; that is 13/1800th of a degree (0.00722... degrees). If you sit 2 feet away from a 17-inch monitor, you are viewing about 40 degrees horizontal and 30 degrees vertical of your visual field. That is 5,538 by 4,154 "pixels" in that area that your eye can detect. So when a 17-inch monitor reaches that resolution (har har), not only will anti-aliasing no longer be needed, but movement of 200 pixels per second at a refresh of 201 Hz would be perceived as *absolutely* smooth. Of course, something moving across the entire width of the screen (5,538 pixels) would need an equivalent refresh rate to be perceived as absolutely smooth, so that is when you revert to motion-blurring frames, for at that resolution detail would not be lost, and it would reflect almost perfectly what the human eye would perceive in normal conditions.
Okay, that should clear things up.
Cow with legs spread wide either dead or playing 'cello.
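The receptor arithmetic in the post above can be checked directly; a quick sketch, taking the post's own assumptions (26 arcseconds per receptor, a 40° x 30° field of view at 2 feet from a 17-inch monitor) at face value:

```python
# Verifying the "pixels the eye can detect" figures from the post above.
# Assumption (from the post): one receptor covers ~26 arcseconds of visual
# field, and the monitor spans roughly 40 degrees by 30 degrees of view.
arcsec_per_receptor = 26
h_deg, v_deg = 40, 30

# Convert each angular span to arcseconds and divide by one receptor's span.
h_px = h_deg * 3600 / arcsec_per_receptor
v_px = v_deg * 3600 / arcsec_per_receptor
print(round(h_px), round(v_px))  # -> 5538 4154, matching the post's numbers
```

So the 5,538 x 4,154 figure is internally consistent with the 26-arcsecond assumption, whatever one makes of that assumption physiologically.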
noko
May 25, 2001 3:46:19 AM
My mouth is still hanging open, W.
W!! So are you telling me that if I showed you a 45 FPS animation and a 60 FPS animation, you would be able to tell the difference and the rate? Or during gameplay, when the frame rate changes from the max refresh rate of the monitor, let's say 100 Hz (100 FPS), to let's say 50 FPS, then 61 FPS, 55 FPS, and then back to 100 FPS, that you would notice it? Are you really saying that above around 25 FPS the brain uses other cues to determine speed, such as blurring, because of the limitations of the signals from eyes to brain? Obviously it is the eyes that do the blurring and not the brain. The brain, as you say, gets a 25 FPS image rate. If the image is blurred, then the brain is cued by it, and the amount of blurring is used to determine speed. Obviously the strobe effect of CRT monitors limits the natural blurring of an image by the eyes.
That was a very good explanation, by the way. :smile:
Edited by noko on 05/24/01 11:48 PM.
noko
May 25, 2001 4:12:35 AM
Anonymous
May 25, 2001 4:32:10 AM
that has to be the most in depth description i have ever seen. respect is warrented. but i was simply saying as long as frames will stay above 30+ it owuld be smoth taking out some variobles of course and at 60 framsyou cannot proccess higher. that was interesting at any rate. i wasnt realy getting in to refresh thats dependant on monitor as well as video if the card refreshes at 200 lets say and the monitor at 60hrz then you get a bottle neck. in video quality. anyways im sure having 200fps is good so it will stop slow down that wasnt what i was trying to say however. and sorry but im used to talking to idiots that listen to the people that make the stuff insted of using it and doing there own resurch. i must say this was a welcome convo.
Computer Shop owner and Head tech.
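The "200 FPS card on a 60 Hz monitor" point above can be sketched in one line: with vsync enabled, the monitor's refresh rate is a hard ceiling on how many frames actually reach the screen. The numbers here are illustrative, not measurements:

```python
# With vsync on, displayed frames can never exceed the monitor's refresh
# rate, so a card rendering 200 FPS into a 60 Hz monitor shows only 60.
def displayed_fps(rendered_fps, refresh_hz):
    """Frames per second that actually reach the screen with vsync enabled."""
    return min(rendered_fps, refresh_hz)

print(displayed_fps(200, 60))   # card outruns the monitor -> capped at 60
print(displayed_fps(45, 100))   # card is the limit -> 45
```

The extra rendered frames aren't wasted entirely (they leave headroom for heavy scenes), but they are never shown, which is the bottleneck being described.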
Bud
May 25, 2001 4:52:02 AM
I'm fully aware that 30+ FPS is only just smooth... yes, I will agree with you there. But in FPS games, just barely smooth to the naked eye is not the same as being fast! I'm talking about traveling as fast as I can over the game terrain, not just smooth to the eye.
...I'll kick your ass all over town in LandWarrior if you're at 30 FPS...lol
There's a big difference between "smooth" to the naked eye... and raw speed, which drives my character across the gamescape.
I can't process higher than 60 FPS? ...according to you anyway... BUT MY CPU CAN!!!!!!! And that creates an advantage in what I see vs. what my opponent sees... It's not about pleasing my eye... or rods... or cones... or some refresh rate... or some anal-retentive techno crap.
...It's about driving the program fast.
Haven't you ever played a really old game on a new computer and seen how freaking blazing fast it is??
Hell... try scrolling in Warcraft II on a new computer... LOLOLOL
But what's the point in telling a smart-ass know-it-all like you. Just come on in, and I'll show you... hehehehe.
I thought I was wrong once, but I was mistaken.<P ID="edit"><FONT SIZE=-1><EM>Edited by bud on 05/24/01 11:15 PM.</EM></FONT></P>
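Bud's observation about old games running "blazing fast" on new hardware has a real cause: many older games moved the world a fixed amount per frame, so a higher frame rate literally sped up the game. Modern engines scale movement by elapsed time instead. A minimal sketch (all names and numbers here are hypothetical, not from any particular game):

```python
# Why old games speed up on new machines: movement tied to frames scales
# with FPS, while movement scaled by elapsed time does not.
def per_frame_position(frames, step_per_frame=1.0):
    """Old style: move a fixed step every frame, regardless of frame rate."""
    return frames * step_per_frame

def delta_time_position(seconds, speed_per_second=60.0):
    """Modern style: move by speed * elapsed time, independent of frame rate."""
    return seconds * speed_per_second

# One second of gameplay at different frame rates:
print(per_frame_position(60))    # at 60 FPS: 60 units
print(per_frame_position(200))   # at 200 FPS: 200 units -- the game "speeds up"
print(delta_time_position(1.0))  # 60 units at any frame rate
```

This is also why higher FPS in a frame-locked game can translate into a genuine gameplay advantage, which is the heart of bud's argument.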
Sihs
May 25, 2001 7:21:39 AM
You tell him, bud. I'd like to see Blade2G playing Quake 3 at 30 FPS so I can kick his ass all over the place. Listen Blade, I am a gamer and I know how it works. People at 30 FPS are usually fragbait. Your computer must be "fast enough" so it won't drop frame rates when things get ugly. But for a Radeon, I heard 40 FPS is good enough because it has consistent frame rates; they don't drop as fast as the GTS's do. That's different from 26 FPS on TV, which appears really smooth.
Sh!t Happens.
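Sihs's point that consistency matters more than the peak number can be made concrete: two cards with the same average FPS feel very different if one dips hard under load, because what you feel in a firefight is the worst frame, not the average. A sketch with made-up frame-time traces (these numbers are illustrative, not benchmarks):

```python
# Two hypothetical cards with identical average FPS: one steady, one with a
# nasty dip. Frame times are in milliseconds.
steady = [20, 21, 20, 19, 20, 20]   # flat, about 50 FPS throughout
spiky  = [12, 12, 12, 60, 12, 12]   # fast most of the time, one big dip

def avg_fps(frame_times_ms):
    """Average frame rate over the whole trace."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def min_fps(frame_times_ms):
    """Frame rate during the single worst frame."""
    return 1000.0 / max(frame_times_ms)

for name, trace in (("steady", steady), ("spiky", spiky)):
    print(f"{name}: avg {avg_fps(trace):.0f} FPS, worst frame {min_fps(trace):.0f} FPS")
```

Both traces average 50 FPS, but the spiky card's worst frame is under 20 FPS, which is exactly the "drop when things get ugly" being described.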
Sihs
May 25, 2001 7:26:51 AM
HolyGrenade
May 25, 2001 8:46:29 AM
noko
May 25, 2001 3:00:21 PM
Well, I wasn't 100% right with my above post anyway. Sarcastic? Who, me? Naaaaaa. Sometimes I do come off that way, unintentionally I might add. Rods and cones of the eye do play a part in this discussion. For those who are interested to know if someone can really tell the difference between 30 and 60 FPS, go <b>HERE</b>:
<A HREF="http://www.penstarsys.com/editor/30v60/30v60p1.htm" target="_new">http://www.penstarsys.com/editor/30v60/30v60p1.htm</A>
Let us know what you think.
Anonymous
a
b
U
Graphics card
May 25, 2001 8:18:27 PM
I didn't know you did, but by any means it's all good. I wasn't pointing out games, I was pointing out the video cards. I was pointing out clarity of picture and the fact that 60 FPS is all you need for a smooth, clear-looking picture. Sure, I hate playing at 30 to 50 FPS, since I hardly get over 35 FPS, but I don't have problems. I wasn't taking into account that my computer config could be better than others', and I apologize for that. But I will say I'm not a software tech, only a hardware tech, and I am a gamer. I have clans in a few games, heh. But this was definitely a fun post.
Computer Shop owner and Head tech.