Question about the ATI Radeon versus the GeForce2.

Following my post about the price of the GeForce3: if it is too expensive for me, I have to choose between a GeForce2 and a Radeon. I was not really attracted to the Radeon, considering the GeForce2 much superior, but lots of people seem to consider the Radeon equal to or better than the GeForce2. I want to play at 1024x768x32, and for the Radeon I want no less than the 64 MB DDR. Is there more than one version of the 64 MB DDR? I also want your opinion of the Radeon. I should say that I don't only play games, so other advantages are welcome, but I don't do video and I don't plan to plug my TV into my computer. I hope that you will answer my big bunch of questions!
Thank you in advance!!! I love this forum so much!!!
(sorry, my joy exploded....)

Dasdrasvyet Sovyetskikh Soyuz! (not true)
  1. Well, the Radeon 64MB VIVO is a good choice; my Radeon 32MB DDR runs 1024x768x32 in Half-Life and it runs great. I have seen 64MB VIVO Radeons for $150.00.

    Nice Intel and AMD users get a Cookie.... :smile: Yummy :smile:
  2. If you can wait, I believe the new Radeon will be coming out shortly, and this will lower the prices of all the other cards, as competition has a way of doing. I use the AIW Radeon and it has done a good job with what I use it for.

    If it works for you then don't fix it.
  3. The GeForce2 is not superior to the Radeon at all! For playing in 1024x768 32-bit colour you will get the same frame rate, if not better, on the Radeon! The GeForce card is great in 16-bit, but when it comes to 32-bit the Radeon is equal if not better! Texture compression on the GeForce card sucks too! Image quality is also better on the Radeon!

    One more thing it's a great price!
  4. I have a Radeon 64 and I usually play above 1024x768x32 without any problems; the Radeon shines at high-res, 32-bit gaming. The quality is awesome. The new Radeon 64s are better than the original due to faster memory and a revised Radeon core. They are quite frankly screamers and beat GF2 Ultras in 3DMark2001 and in some games.
  5. The Radeon's a great card; the GF2 is too. There is a newer version of the retail 64MB DDR card that has faster RAM and a higher clock, so go for it.

    Cast not thine pearls before the swine
  6. Quote:
    I was not really attracted with the radeon, considering GeForce 2 much more superior

    A common mistake. NVIDIA has more partners and therefore more variations of its cards and more advertising.

    So far I've recommended Radeon cards to 2 friends of mine and both are nothing short of flabbergasted.
    Just because the GeForces can be configured to give enormous benchmark scores in 3DMark does not make it superior.

    <font color=blue>GeForce2 Cards.</font color=blue>
    Lightning fast, especially in 16-bit modes
    Driver support second to none
    Availability and support
    Relatively poor image quality; some people say little improvement over the Riva 128ZX chips
    Some incompatibility issues with Socket-A motherboards

    <font color=blue>Radeon Cards.</font color=blue>
    32-bit speed almost matches 16-bit speed.
    Superior image quality over the GeForce in games (on par with the Voodoo5)
    Superior 2D quality, and DVD playback second to none.
    Radeon LE (Ask and I'm sure someone will tell you)
    Relatively poor driver support.
    Some reported bugs in Win2K drivers.

    Basically, with the Radeon you're getting 32-bit speeds that are always close to the GeForce's, and in many games surpass it, because of HyperZ (Z occlusion and culling in memory). It loses out in 16-bit, but you don't care about that. You also get a higher quality image, and crisper DVD playback.

    <font color=blue>Smoke me a Kipper, I'll be back for breakfast!</font color=blue>
  7. You could wait for the Radeon 2 to come out for a price drop on the Radeon, or even get the Radeon 2. Personally, I can't wait for the Radeon 2. If it does have Truform (N-Patches), then NVIDIA is going to [-peep-] themselves. Unfortunately ATi isn't giving us the full story; all we know is that they are working on the technology.
    Check out this site or Anandtech for more on that.
    Or you could ask Noko.

    <font color=blue>Smoke me a Kipper, I'll be back for breakfast!</font color=blue>
  8. I disagree about the poor driver support; that is more of a reputation left over from the hell drivers of the Rage128 chip. I had a Rage Fury with a Super7 motherboard, so I know what hell drivers are. Radeon drivers are very good now. The Win9x drivers were rather good from the beginning, and now they are very good. The W2K drivers are like night and day: when first released they just plain sucked, but now they are also very good. I do most of my gaming in W2K without too many problems; in fact the only problem I have is with FLY2, the first real game problem I have had with my Radeon. Everything else just screams.
  9. Well, the Radeon gang has chimed in :) I like how, when someone posts this question, they all rush in to convert him.

    I suggest you look at benchmarks at various resolutions and detail
    levels. Don't just take someone's word for it...see the empirical
    evidence for yourself.

    I can tell you that GF2 really holds great frame rates at high res and detail.

    But the Radeon gang seems to love their cards too. I always get the impression that they're rebelling against the establishment, though.

    I'm not in touch with my feeings, and I like it that way! (Edited by bud on 07/13/01 11:43 AM.)
  10. Hehe, well, we just know it is a great card. I do have an MX400 card and the 3D quality really doesn't compare. Colors are much better, textures look better, and anisotropic filtering kicks major butt on the Radeon. Plus the superior 2D, DVD, and video quality, and more DX8 features such as environmental bump mapping, 3D textures, and four-matrix skinning. We know what we've got, and most of us are very, very happy with the Radeon. Even us W2K users. FPS benchmarks count for about 30% of my overall rating; too many other considerations need to be made, in which the Radeon does very well overall. Locking in on FPS alone, I think, is a big mistake.
  11. I'm so glad you are comparing your MX to a Radeon...NOT.
    Poop on the MX. That's not even the real GeForce, man! It don't mean a thing!

    We've been over this before and I simply disagree with you.
    The GF2 GTS/Pro/Ultra are all better cards for gaming than
    the Radeon...the Radeon is good, but not as good.

    And looking at frame rates is NOT "a big mistake", as you put it. It all depends on what you want.
    I want to kick people's asses on-line :) And in spades...with the GeForce2.

    I couldn't care less about slightly better image...Hell,
    I've instructed my GF2 (in my primary machine), to go for
    higher frames rates at the expense of image quality as it is!

    ...And even those lower quality images are really good.
    I mean seriously....if I can see and hit people at 300m with a SAW,
    or snipe people at 1000m over cascading landscapes (which actually REQUIRES HIGH image quality, unlike the dungeon motif in Quake where people are in your face at 50m or less), then the image quality is NO problem, is it?

    I tested my D3d settings at high image quality and it really wasn't ALL THAT much better.....but when I instructed GF2 for frame rate over image quality...I got ass kicking game speed...just ask anybody who plays DFLW who Bud Bogart is....
    heheh....People either love me or accuse me of cheating...depending what team they're on, LOL

    And I do no video watching or editing, so there's really no reason for me to own a Radeon simply because it has a little better image quality. Other people here (who actually own a GTS and a Radeon) have already said the Radeon is slower....and that confirms what I've already seen as far as benchmarks go....your propaganda isn't going to work on me :) The empirics are in: the GF2 is a better card for games.

    Now go be happy with your slower Radeon and I'll go be very happy with my faster GeForce.

    I'm not in touch with my feeings, and I like it that way! (Edited by bud on 07/13/01 04:49 PM.)
  12. The image quality of an MX200 is the same as a GF2 Ultra, dude. The speed may be different, but the image quality is the same for all the GF2s. I wasn't comparing speed between the two. Do you want me to show you some images showing the weaknesses of the GF2 line?????

    If you are happy with your setup thats great, keep it.

    So what benchmark are you talking about, dude??? I am sure my measly slow Radeon would wipe your GF2 all over the place in VillageMark, in 3DMark2001, and probably in a number of other benchmarks, while in Quake III the GF2 would do OK, as in a number of other programs. If the game is playing at a smooth, playable rate, then what the hell good is adding more FPS, especially when your FPS exceeds the monitor refresh rate? You're right: not a damn bit.

    So image quality should be improved, and that is what the Radeon does: it gives you a lot more substance in the game image, a lot more. Not propaganda but truth; just accept it.
  13. Oh yeah, the new Radeon 64s would indeed be faster than a GF2 due to their 5ns RAM and faster core. The new Radeon 64s are getting over 4500 in 3DMark2001; are you???
  14. "That's not even the real GeForce man! It don't mean a thing!"
    sports the same image quality as a gts/pro/ultra and that is what he was talking about. I don't think that you understand that if you have a certain amount of fps in a game having more won't make you better in a game. You obviously don't care about image quality so why are you even here? Changing your setting to go from 100fps to 120fps and making the image even worse is pointless. Man, you probably set all your games for 16bit...
  15. Man, you crack me up.

    "If the game is playing at a smooth playable rate then what the hell good is adding more FPS,.."

    Well, to enlighten you...it makes my program run faster...understand??
    And if my program runs faster, my game character runs faster, and when my game character runs faster than yours...you're toast....The people I'm shooting at have a harder time hitting me. It has something to
    do with lag also...but the frame rate and game speed tip that in my favor. I can prove it if you want to meet me in a LW room.

    I can get better image quality simply by forcing it in the properties menu…but RAW FPS is where it's at.

    Maybe I'll post some benchies when I get time.

    I'm not in touch with my feeings, and I like it that way!
  16. Blow me my previous post.

    I'm not in touch with my feeings, and I like it that way!
  17. I've got 5 and 5.5 nano on my cards...what are you talking about??? And since we're talking about RAM nanoseconds now, the GF3
    has 3-3.5 nano...whatever...

    I'm not in touch with my feeings, and I like it that way!
  18. So are you saying the GF2 makes you run faster, jump over tall buildings, and dodge bullets?? That the CPU will process the game code faster too?? So then a GF3 would really let you leave orbit, right?

    Really that is not how it works, at least in every game I've ever played. So then you should play in 640x480x16 all the time, so that you can be SuperBoy. Just kidding there.
  19. Actually, YES, IT DOES :)

    That IS how it works. The game runs faster. Is that hard for you to understand?? Haven't you ever played an old game on your new computer and noticed that it was totally too fricking fast? I've got tons of old games that simply run too freaking fast to play on my newer machines. Try scrolling Warcraft II, for instance.

    I can actually jump up on roofs (from fences) when I have faster frames (100+)...really. I couldn't do that at lower FPS.

    I can clear more ditches, dodge more bullets...

    You mediocre gamers think you know it all....LOLOL.

    Mess with the killers and we'll show you how it's done.


    I'm not in touch with my feeings, and I like it that way!
  20. Well, bud, if framerate is everything, why don't you get a card with no EMBM, no bump mapping, no FSAA, no nfiniteFX engine, no pixel shader, no vertex shader, no DX8 compatibility, no HDVP (it's in the GF3), no 32-bit Z/stencil, with 16 colours and 640x480 resolution, no Digital Vibrance Control (it's in your GeForce MX), no TwinView, no DVI support, no DVD hardware decoding? Then you just need to add some super-fast RAM and core clock, and it will become the perfect card for you? Correct???
  21. LOLOL....Dork. I play LW at 1024x768x32, bump and bit maps are on....detail is high...texture is high...color is 32-bit...keep ranting, dude. I play Superbike 2001 at
    1280x1024...or 1600x1200 in single player, where a simple 50 FPS will do....bit mapping on....details on "ultra", as high as they will go...

    ...put that in your pipe and smoke it...LOL

    You slay me, man....keep trying though.

    I'm not in touch with my feeings, and I like it that way! (Edited by bud on 07/13/01 05:41 PM.)
  22. All this talk about games is making me want to play....I need to go home!! ....hehehehe.

    I sold about 50 games on eBay: 2 sets of 25 CDs and manuals.

    And I still have about 50 games left.....I NEED MORE GAMES!

    I think I'll go buy a new game in a few minutes when I get off
    work. Any recommendations??

    You see, I play games for competition. Kicking all your
    inept asses relieves stress...aaahhhhhhh.

    I like to be #1 in a fifty player room!!! (Maybe #2...heheh)

    You play a couple rounds against the computer and you think
    you know it all....GOSH, what ignorance!

    You novices crack me up.....You're the little wimps who
    whine and cry that all the high scorers cheat....LOL

    Get a GeForce.

    I'm not in touch with my feeings, and I like it that way!
  23. I'll challenge you to a game of Quake 3 or Counter-Strike anytime, friend, and let's see if your 90fps to my 77fps makes a difference when my superior skill kicks in.
    Don't listen to that NVIDIA troll; frames per second are not everything! You want the card that suits you best overall. Who gives a flying [-peep-] if some guy gets 125 fps in 16-bit with his GF2 GTS? I don't know about you, but I play all my games in 32-bit. There are many benchmarks of the Radeon beating the GF2 GTS in 32-bit.
  24. READ THE POSTS...TARD...who said anything about 16-bit??
    I sold my Quake 3...and I've tired of CS....But I'll be hosting LW tonight in public at 9pm Pacific, buttmunch. I'll call it "radeonwussies"....hmmm, hope that fits. cya there, pussy

    I'm not in touch with my feeings, and I like it that way! (Edited by bud on 07/13/01 06:30 PM.)
  25. Quote:
    I can actually jump up on roofs :tongue: (from fences) when I have faster frames :eek: (100+)...really. I couldn't do that at lower FPS.

    This is a simple test that anyone can confirm and see that you are smoking something:

    3DMark2001, game test 3, Max Payne (the same engine as the upcoming Max Payne game).

    94.3 FPS: 37.28 seconds to complete
    66.5 FPS: 37.28 seconds to complete
    32.6 FPS: 37.26 seconds to complete

    You know what, the man did not jump any higher :frown: , didn't run any faster :eek: , didn't shoot any faster :redface: . As you can see, the frame rate changed dramatically, yet the run finished in the same amount of time. Run the test yourself and watch the timer as the benchmark runs. You are only fooling yourself, man; don't get a GeForce, get a life. I even underclocked my Radeon to 146MHz and the time results were the same, with slower frame rates.

    Anyone else can see that this is utter nonsense as long as the CPU is fast enough. It has nothing to do with the graphics card.

    <P ID="edit"><FONT SIZE=-1><EM>Edited by noko on 07/13/01 08:52 PM.</EM></FONT></P>
  26. Don't you know? Bud can jump and never come down to earth. :smile:
  27. LOL, either he was serious or he is very smart and got me to do all this testing. I remember the XT days, where you had to have a turbo switch to play the old DOS games that didn't have proper game timing controls built into them. That was years ago, no, decades ago, and games now run consistently on virtually any graphics card or CPU. Really, I had a very good laugh at all this; I hope he takes it well. Oh, was it you who asked about the GF3 MX? Here is some info from X-bit labs:

    Gainward to Ship One More GeForce2 Pro Based Graphics Card
    Posted 7/13/01 at 2:40 pm by Rat

    NVIDIA has already stopped shipping its GeForce2 GTS and reduced the shipments of GeForce2 Ultra chips to the graphics cards makers offering GeForce2 Pro instead. <b>However, GeForce2 Pro will also leave the market quite soon, because according to NVIDIA’s roadmap, GeForce3 MX, which is to come out in the fall, will be aimed at the same price group as the today’s GeForce2 Pro.</b> That is why NVIDIA doesn’t pay too much attention now to the way the customers use its GeForce2 family chips. Therefore, no wonder that more cards based on GeForce2 Pro and featuring non-standard clocking begin cropping up now. One more card like that based on GeForce2 Pro and working at non-standard frequencies is now shipped by Gainward.
    The card is called GeForce2 Pro 400 (Gainward seems to be driving at MX 400, which is an overclocked GeForce2 MX). This card is equipped with 64MB 4.5ns memory working at 450MHz frequency instead of the common 400MHz, which almost reaches the GeForce2 Ultra parameter. Also GeForce2 Pro 400 features an overclocked core, which works at 220MHz instead of 200MHz. The card goes with a TV-Out.
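A side note on those memory ratings being thrown around: a DRAM access time in nanoseconds implies a maximum clock of 1000/ns MHz, and DDR transfers on both clock edges, doubling the effective data rate. That is how 4.5ns parts end up quoted near 450MHz. A quick back-of-the-envelope check (plain arithmetic, not vendor data):

```python
def rated_clock_mhz(access_time_ns: float) -> float:
    """Maximum clock implied by a DRAM access time: one cycle per access."""
    return 1000.0 / access_time_ns

def ddr_effective_mhz(access_time_ns: float) -> float:
    """DDR transfers on both clock edges, doubling the effective rate."""
    return 2 * rated_clock_mhz(access_time_ns)

print(rated_clock_mhz(5.0))    # 200.0 MHz actual clock for 5ns parts
print(ddr_effective_mhz(4.5))  # ~444 MHz effective, close to the quoted 450
```

By the same arithmetic, the "166mhz memory, meaning 6ns DRAMs" remark later in the thread checks out: 1000/6 is roughly 166.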
  28. I just can't figure out whether he was serious or it was a joke on his part. If it wasn't a joke to him, it still is to us :)
  29. I had high hopes for those MX-400 boards until you described your benchmarks. Later, THG had <A HREF="" target="_new">this review</A>. I was surprised that an MX-400 with 3.5ns memory and clocked at 250/260 was completely outclassed by a mundane Geforce 2 GTS with only 6ns memory. The only advantage with the MX-400s of value to me is they have 64mb of memory.
  30. It was only behind by about 20 fps at 1280x1024 in Quake 3, which still put it in the playable zone. Only a 10fps loss in Benz racing and 1fps behind in Aqua (where both cards were far from playable).
  31. OK, I was concentrating on the Quake 3 results, but my point is that the MX-400 doesn't perform as well as a GeForce2 GTS with only 6ns memory. Now, the fast MX-400 was pretty much maxed out. How would it compare to a GeForce2 GTS with faster memory, or one that was overclocked?

    At Pricewatch the fastest MX-400 that I can find is the 4ns variety at $98. I'm sure one with 3.5ns memory will cost more. I also see a VisionTek GeForce2 GTS for $105. (I checked: 166MHz memory clock, meaning 6ns DRAMs.) Which would you rather buy?

    By the way, I missed a point in the review. The fast MX-400s (3.5ns and 4ns memory) only have 32mb of memory.
  32. God, you are retarded...and all that means nothing, dude....just a bunch of self-delusional bullshit, really.

    I love it when some retards try to give me irrelevant, over-analyzed figures that mean nothing in order to delude themselves into thinking they're clever. How can you convince me otherwise of what I already know to be true from my own first-hand experience?

    That somehow, since I can't see more than 30FPS, anything more is useless and won't improve my game play...whatever, man...you don't get it and you never will.

    So when we all hit 30fps, it's time to throw our GF2s away and get Radeons...because 30fps is all anyone can make use of...LOL...tard

    Just because your benchmark finishes in a certain time period means absolutely nothing to the point I'm making....which is, the more FPS I can get, the more fluidly my character is going to move. Is there a limit? Perhaps, but I ain't seen it yet.

    My hosted game finished in a certain time period also....
    But that had NO bearing on how fast I moved during the play.

    Like I said, you like your card...great....but stop bullshitting everyone, man. And for God's sake stop comparing
    low-end cards to the (non-existent) Radeon 2, 3, 4, 5....etc. Talk about coming down to reality....geez

    Oh, and you don't need raw FPS because the next version of NASCAR or Quake or whatever will NEEEEVER need more horsepower, because it won't have higher res, polys, textures, and details.....right...hahah...But at least it will look good on your Radeon...even if it only runs 10fps

    I'm not in touch with my feeings, and I like it that way! (Edited by bud on 07/14/01 02:48 AM.)
  33. Quote:
    If you can wait, I believe the new Radeon will be coming out shortly, and this will lower the prices of all the other cards, as competition has a way of doing.

    So when does the release happen? It's just that I'm putting off buying a Radeon because I want to get it cheaper.

    "Now drop your weapons or I'll kill him with this deadly jelly baby." :wink:
  34. Here's a radical thought....(a repeat of what I originally said when this thread began): let's let this guy search out the benchmarks for himself....I'm perfectly at ease with his ability to search and peruse the empirical evidence
    out there on which card is the best (for him).

    I don't come here to go around the mulberry bush with self-delusional champions of video card and chip makers. Like I said originally, the Radeon is good...almost as good as the GeForce.

    I've seen plenty of benchmarks and heard plenty of anecdotal
    testimony that confirms my point.

    And If I thought for one moment that Radeon was all you say it is, I'd RUN LIKE HELL right this second to get one ==8-)

    I'm not married to GeForce, ASUS, NVIDIA, or any damn manufacturer. I'll use whatever I think is best based on personal experience, reliable benchmarks, and testimony.

    And I'm not switching to Radeon any time soon, because I want the fastest card (32-bit color with all the Truform, pixel shader, HyperZ,...whatever) that I can get.

    But like I said....I'm not here to brainwash anyone (like you do).
    If raw FPS isn't your bag, and total image clarity is a must....get the Radeon....like I care.....

    I'm only here to lend a hand where I can...To the newbies who are frustrated as hell at all the computer tech stuff.
    I feel for them because several years ago I was one of them.
    And If I can help someone it makes me feel good. I have no
    interest whatsoever in PROSELYTIZING the way you do. And from here on out I'm gonna ride you on that.

    I'm not in touch with my feeings, and I like it that way! (Edited by bud on 07/14/01 03:26 AM.)
  35. "By the way, I missed a point in review. The fast MX-400s (3.5ns and 4ns memory) on have 32mb of memory."
    Your also forgetting that they have SDR not DDR.

    But I do agree. I'm not trying to say that someone should buy it over a normal GeForce2; what I'm saying is that if you had one of those MXs, it wouldn't be all bad. Right now there isn't a reason to buy one over a regular GeForce2, as the Pros only go for about $30 more and have 5.5ns or 5ns DDR RAM. On top of that, the Radeon 64MB VIVO with 5ns DDR is only another $30 more and you'd get way more performance. I think only a GeForce2 Ultra would beat the 5ns Radeon, and NVIDIA jacked up the price on the Ultra so no one would pick it over a GeForce3.

    "That somehow since I can't see more than 30FPS that therefore anything more is usless and won't imporove my game play...whatever man...U don't get and you never will." who said that?

    "Just becuase ur benchmark finishes in a certain time period means absolutely nothing to the point I'm making....which is, the more FPS I can get, the more fluid my character is going to move. Is there a limit? perhaps, but I ain't seen it yet."

    The point we are making is that over, say, 60fps, having more fps wouldn't make the characters seem more fluid, as that is pretty much the point of diminishing returns for what you are trying to say.

    You are young, but on top of that you are arrogant, and you are just plain misinformed. You obviously don't know much about what you're talking about.
  36. My gosh, you PRESUME a lot about me. Do you make presumptions
    like that in all aspects of your life?? Because that is foolish as hell.

    Young? No. Arrogant? Yes. But certainly no more arrogant than you. I'm 35, BTW, and don't mind admitting it. I also
    don't mind admitting I don't know everything. And I'll add to my last post that I also come here to glean information from others who know more than me.

    But I also know how to sort bullshit from fact, and I'm not impressed with bullshitters and people who spout irrelevant
    figures to impress youngsters like yourself.

    I could go into the fact that I do notice a difference even beyond 50-60 FPS...perhaps that's because LW uses more horsepower than the games you're used to playing. But it's late and I just finished kicking asses on-line with my HIGH-FPS GEFORCE CARD THAT DOESN'T CHOKE WHEN PEOPLE GET CLOSE; IT MAINTAINS ITS HIGH FRAME RATE CONSISTENTLY. You can call me a youngster, a liar, misinformed...whatever, but I know the truth...and so do the hard-core gamers. I don't expect anal technocrats
    to understand.

    Did you say no reason to buy a regular GeForce because
    the MX has 3-4 nano memory?? Are you forgetting the MX architecture...2 pipes...etc....

    Uh...I've installed and played on a few MX cards and they
    start to drop FPS under load...I don't think it's all about the nanosecond rating on the memory.

    I've got several GFs here and can testify that the GF2 Pro I bought a few weeks ago for $130 was worth the extra money over an MX. I've got the Pro up to 420MHz memory and 250MHz core, and it's doing very well.

    Goodnight now; I need sleep, because this old man needs to take his kids to the lake tomorrow.

    I'm not in touch with my feeings, and I like it that way! (Edited by bud on 07/14/01 04:22 AM.)
  37. Sorry for calling you young; I just assumed that from your conduct.

    "perhaps that's because LW uses more horsepower than the games you're used to playing"
    That's all I have to do. Uses more horsepower so the game runs faster if you have more fps? Jesus your wayyy off.
  38. Bud, I am an nvidiot, and Love the geforce line, but you are a moron.

    "Friends don't let friends buy Pentiums"
  39. Bud, I hope you stick around, because you are a good joke. Delusional and contradictory as well. How high someone jumps in a game is controlled by the program; do you understand??? Obviously not. I can take my MX or my Radeon, your TNT or a GF3, and do all sorts of resolution changes and clock changes, and you know what? In Quake III and Serious Sam the character jumps the same distance, moves just as fast from point A to B, shoots the same number of grenades at the same rate, etc., etc., etc. The only difference is the smoothness of the movement, not the speed.

    Really? You can tell the difference between 40FPS and 80FPS in an animation? You must have one big headache when you play, because when you play, the framerate is jumping all over the place, from the hundreds to the teens, except that the hundreds will be truncated by the refresh rate of your monitor, if you didn't know. Maybe you see that FPS counter and when it goes ballistic (as in 150FPS) you get a mini orgasm :lol: . It is nice knowing someone who can react in 1/60 of a second continuously. You must have very fast hands and fingers. Can you tap your finger 60 times a second, Bud? What is the quickest time someone can react from sight? How fast is the signal from the eye to the brain and then to the arms and fingers? Do you think it is faster than 1/30 of a second?

    I can type over 100 words per minute, hauling ass. Let's see: average word length is 6 characters, so how many characters per second is that? 100 words/min x 6 chars/word x 1 min/60 sec = 10 characters per second. That is with 10 fingers doing the action there, Bud; that is the fastest my well-trained mind can react to the visual stimulation of the print and the keyboard. Yet you are saying that with your finger(s) and arm (mouse) and small brain you will be able to react better seeing an 80FPS game than a 60FPS game? Or a 30FPS game? In other words, you are saying that you can react at 1/80-sec intervals, 1/60-sec intervals, etc., when in reality it is physiologically impossible. Just plain delusional. I do get the feeling that this is way over your head.

    Bring it on, Bud; we just can't wait for the crap that will ooze out of your fingers. Come on. We all need a good laugh. Bud, you just made yourself a new buddy. Don't you feel grand now?
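The typing arithmetic above, and the per-frame intervals the reaction-time argument turns on, can both be checked with the same unit conversion (simple arithmetic only, no claims about physiology):

```python
# 100 words/min at ~6 characters/word -> characters per second
chars_per_sec = 100 * 6 / 60
print(chars_per_sec)  # 10.0, matching the figure worked out above

# Time between frames at a few frame rates, in milliseconds:
# the window a player would have to "use" each extra frame.
for fps in (30, 60, 80):
    print(fps, "FPS ->", round(1000 / fps, 1), "ms per frame")
```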
  40. Hehe... I'm the original poster, and I didn't know this grew so much, since I didn't receive any email because you were replying to each other. I'm not angry about that, of course; I'm just very surprised to see how much the thread has evolved since I saw it last. All these people speaking from the heart is very exciting, LoL! Well, for myself, if you remember, I was asking what the best all-around card is between the Radeon 64MB, the GeForce GTS 32MB, and the GeForce Pro 64MB. Here, my dealer sells the GTS a bit cheaper than the Radeon, and the Pro is significantly more expensive than both others. In Canadian currency, that is: $260 CAN for the Radeon VIVO 64MB, $220 CAN for the GTS 32MB (PURE from ASUS), and $395 CAN for the Pro from the same company.
    I just consulted again the benchmarks that Tom made for the GeForce3 at
    and I was happy to see that those 3 cards are in the shootout. I looked only at the 32-bit benchmarks at resolutions of at least 1024x768, because that is what I am looking for. The difference between them is not very big, but my impression is that the ATI is the worst of the three, even though not by far. Except in 3DMark, where it is the best of the three, BUT that benchmark is much more theoretical, isn't it?
    My conclusion is that I had better go for the GTS, or perhaps the Pro.
    To enter the polemic: my opinion is that you never see any difference between 30 FPS and 200 FPS! However, when a benchmark says 30 FPS, nobody should be happy with that. That is because the benchmark reports an average; if it instead supplied a graph of the FPS as a function of time, we would clearly see that the line is not straight at all, and at some moments it dips below 30 and then goes back up, see what I mean? I mean that you would frequently see stutters. The conclusion is that we should always have an average FPS high enough that the real FPS never drops below 30 FPS at any time. That may explain why bud says he sees a difference between much higher frame rates. However, I must say that he may exaggerate. And about the claim that old games run too fast on new computers: that has been corrected in newer games. So there is no reason to believe that in Quake 3, for example, you would shoot faster than your opponents because your game is "running faster". Only lag affects the speed with which you kill faster than others.
    Another point is that even if the card really supplies 30 FPS, and our eyes also work at about 30 FPS, our eyes do not always refresh at exactly the same time as the video card does. That may also result in apparent flicker. I don't know, however, what frame rate would remove that disadvantage.

    What do you think of my point of view?

    Dasdrasvyet Sovyetskikh Soyuz! (not true)
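The point above about averages hiding dips can be made concrete: compute the mean and the minimum over a series of per-frame rates, and the mean can sit comfortably above 30 while individual frames fall well below it. A toy illustration with made-up frame times (not measurements from any card):

```python
# Hypothetical per-frame render times in milliseconds for a short run.
frame_times_ms = [20, 22, 25, 60, 21, 23, 55, 24, 22, 26]

fps_per_frame = [1000 / t for t in frame_times_ms]
avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000)  # true average
min_fps = min(fps_per_frame)                                  # worst moment

print(round(avg_fps, 1))  # 33.6 -> the headline number a benchmark reports
print(round(min_fps, 1))  # 16.7 -> the dip you actually feel as a stutter
```

A graph of FPS against time, as suggested above, would show exactly these dips that the single averaged number conceals.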
  41. Sounds pretty reasonable. Just note that the Radeon 64 in Tom's review was at 183MHz, not the 200MHz version selling now, which for the most part would have a significant impact on the benchmarks. Also, the game Max Payne is reflected in 3DMark2001, so that benchmark will become partially more usable when the MAX-FX engine is used more often in games, which I think it will be due to its advanced physics effects. If you think FPS makes something better overall (not that you think this personally), then that is fine with me, but I would disagree. Still, the GF2 Pro is a fine card and a great gaming card. I just think it is missing a few bells and whistles.
  42. If you want to understand the significance of benchmarking, I recommend you skim or read the following discerning and truthful article:

    <A HREF="" target="_new">Benchmarking: The Money Game</A>

    If you decide to read this very insightful breakdown of benchmarking, make sure you hit the <b>Next</b> button at the bottom of the page; it is actually very in-depth and long, but well worth it.<P ID="edit"><FONT SIZE=-1><EM>Edited by noko on 07/14/01 03:36 PM.</EM></FONT></P>
  43. Listen, you Nvidia Anal Licker, you're not worth my time!
    As if I gave a flying [-peep-] about what you say or do with your GeForce in 16-bit.
    <---Cares Ya!
    I wouldn't come to your [-peep-] server; you'd probably cheat when I start beating you.

    Go troll some more for Nvidia!
  44. I'm not afraid at all of reading on the internet. I'm always searching for good articles, and the one you gave me is a pearl. Well, I haven't read it yet, because there was a link at the beginning which was really good, and I found another at the same site which is even better:
    The case against Nvidia:
    It's talking about component prices: everything for computers is less expensive than two years ago, except video cards.
    I'm now going to read your original article, which seems very interesting too. BTW, the address you gave me doesn't work; it's

    Now I know that benchmarking is not a totally trustworthy way to compare products, and I now think that the Radeon 64MB is better than the GTS. Do you think that at 200MHz the Radeon would have beaten the GTS in Tom's article?

    Dasdrasvyet Sovyetskikh Soyuz! (not true)
  45. The above link should work now; at least it does for me. The Radeon did already beat the GF2 in a few things and would probably have done much better with the added speed. The new Radeons are much more overclockable than the previous Radeons, so now we are not talking about a 183MHz Radeon but a Radeon that does 230MHz-plus, and even that is still not the limit of the new Radeons. So basically all those benchmarks are virtually meaningless for you if you bought the 200MHz Radeon, because its performance would be way better than indicated there. Also, realize what would happen if someone went by those benchmarks, bought a GF3, and stuck it into his 400MHz Celeron: he would never even come close to what Tom got. The article I mentioned above takes the blinders off about benchmarks.

    The best thing is to get accurate knowledge and then base your decision on facts rather than emotional outbursts. I can show you benchmarks where the Radeon is within one frame of a GF3; what does that mean? Benchmarks can be affected by some of the most trivial, subtle differences, which makes them not an aid but a hindrance. For one, VillageMark, which was promoted by PowerVR for their first Kyro chip, is used by some hardware review sites to test video cards. I see the Radeon get 40FPS on the default test, which beats the GF2 Pro and GF2 and is slightly slower than the GF2 Ultra. I already know that the rest of their benchmarks are flawed. Why? Because the Radeon will only do 40FPS in VillageMark with an 800MHz-plus processor if Hierarchical Z is turned off; one registry switch makes that same Radeon do 53-58FPS on the exact same benchmark, a whopping 33%-plus change. So the rest of their benchmarks will be hindered because HyperZ is not fully used. This is an extreme example, since VillageMark uses 3 textures, which fits nicely into the Radeon's 3 texture units, and it also renders front to back, which works rather well with the Hierarchical Z portion of the Radeon's HyperZ. In most games Hierarchical Z may add 3-5%. So the bottom line is: will the card do everything I need and want, for as long as I want, at the right price? Each card has benchmarks that it does better than any other card, but is that the program you use?

    I am glad you have an open mind and are seriously checking into the facts. The Kyro2 may be a better solution for you; it is also a fine card, and I wouldn't rule it out either.
  46. Here is a link comparing a number of cards in Giants, from Ace's Hardware.

    <A HREF="" target="_new"></A>

    Wow, the Radeon is as fast as a GF3 in 32-bit; it must be the better card, and all the other ones are trash :mad: . Well, that is how some people think, and it shows how misleading benchmarks can be. Especially if you compare one site to another, you will see some pretty big variations in numbers with supposedly similar setups. Just something for you to consider.
  47. I admit that I have already been attracted by the Kyro 2, but after reflection, I don't think it is a card that will age well compared with the GTS and Radeon. Indeed, it lacks some hardware features (the obvious one is T&L). Plus, the image quality may not be as good as the RADEON's, for sure.
    I have to say that I'm not a 3D-shooter gamer. I played Half-Life, but that's it. Quake and co. give me a headache!!! Especially when we are in a labyrinth. Anyway, that is not the point; the point is that I mostly play games where you command and control from above, like C&C, Steel Panthers, and Close Combat 3... The latest is Dune: Emperor, and even though I'm not attracted to this game, those are the kinds of games I will play. So why buy a powerful 64MB video card? Because I may change my mind and play 3D shooters :-) or, more probably, because I want a long-lasting video card.
    It's because when I bought my last computer (a PII 300) I was not really aware of video cards, and I ended up with a 4MB ATI Xpert 98. LoL! It wasn't long before I shopped for a new one and bought the best available then: a Riva TNT, which I still have.
    In conclusion, I want a card that will play everything on the market for at least a year, except for extreme computer-killers like Aquanox, and I want good image quality. I think that either the 64MB Radeon or the 32MB GeForce2 GTS will do that. I'm now leaning more toward the RADEON...

    Dasdrasvyet Sovyetskikh Soyuz! (not true)
  48. From experience in game programming, I can tell you that the speed of a game on an nVidia TNT card will be the same as with a GeForce2 or Radeon card. All games use "time scaling" (not the correct term, just what I call it). When running a game, the FPS is never constant. To keep the game running at a constant speed, the distance everything travels per second is divided by the FPS for each frame rendered. The rendering time is returned by DirectX in the function call; I'm not sure about OpenGL, though. What you get is this.
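    The idea the post describes can be sketched like this (all names here are illustrative, not from any real engine):

    ```python
    # Frame-rate-independent movement ("time scaling", as the post calls it):
    # each frame, movement is scaled by how long that frame took to render,
    # so the distance covered per second of game time is the same on any card.

    SPEED = 100.0  # game units per second (the "Constant" the post mentions)

    def advance(position, frame_time):
        """Move by SPEED scaled by this frame's render time, in seconds."""
        return position + SPEED * frame_time

    # A fast card rendering 100 frames and a slow card rendering 20 frames
    # cover the same distance over the same one second of game time.
    fast = slow = 0.0
    for _ in range(100):      # 100 FPS card: each frame is 1/100 s
        fast = advance(fast, 1 / 100)
    for _ in range(20):       # 20 FPS card: each frame is 1/20 s
        slow = advance(slow, 1 / 20)

    print(fast, slow)  # both end up at ~100.0 game units
    ```

    Both cards arrive at the same position after one second; the slow card just shows fewer, coarser steps along the way, which is the fluidity difference the post goes on to describe.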


    Unless you change the constant (which can be done in the options of some games), the game speed never changes.
    However, you are right in saying that this affects fluidity, but there is a limit to this too: firstly, as said before, the refresh rate of the monitor, and secondly, the limit of the human eye. It is generally accepted that a refresh rate above 70Hz may be termed "flicker-free": the screen is then being drawn fast enough that the human eye will always perceive a still, comfortable image.
    It has more to do with the difference in speed between the eye and the monitor.

    <A HREF="" target="_new"></A>

    Did you go by the name <font color=red>bud</font color=red> on <font color=purple>Novaworld</font color=purple> too? I remember playing against you last year on DF2. You were pretty good and beat me a couple of times, but I clearly remember wasting your ass consistently. I went by the name <font color=green>Black Muther</font color=green>.
    Haven't played on Novaworld since though. I barely have time to type this.

    <font color=blue>Smoke me a Kipper, I'll be back for breakfast!</font color=blue>
  49. "Time Scaling"

    What we are talking about is the game's physics engine. For each character (or car, plane, whatever) in a game, the physics (max speed, acceleration, deceleration, gravitational effects, etc.) are mapped out in the game's physics engine. In modern games these things remain constant irrespective of frame rate. Old games were coded to get the maximum out of slow machines and therefore did not use constant physics engines.

    As pointed out earlier, higher frame rates mean that when the game bogs down it remains playable rather than suffering from flicker or jerks. Try it: grab an early Pentium with an old card and Quake, choose a big level, and watch the screen. The time to run down a corridor will remain the same irrespective of the machine, but the old machine will drop frames, and the characters will jerk from spot to spot rather than moving fluidly. As long as your card never drops below about 30fps, the game will appear relatively smooth.

    Out of interest, the human eye scans at a lower rate than 30 fps (when I was at college I was told approximately 12 times per second). For an image to appear smooth you need a frame rate that will not flicker in that period, roughly a minimum of twice the optical sensory rate, hence a minimum of 30 fps should be OK.


    Look at the size of that thing!