Doom3 Benchmarks out!

Archived from groups: comp.sys.ibm.pc.games.action,alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia

http://www2.hardocp.com/article.html?art=NjQy

Looks like the 6800GT is the sweet spot, if you can get it cheap (my pny
version was $345 shipped from provantage).

rms
  1. Archived from groups: comp.sys.ibm.pc.games.action,alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia

    "rms" <rsquires@flashREMOVE.net> wrote in message
    news:tVDLc.60$or1.20@newssvr19.news.prodigy.com...
    > http://www2.hardocp.com/article.html?art=NjQy
    >
    > Looks like the 6800GT is the sweet spot, if you can get it cheap (my pny
    > version was $345 shipped from provantage).
    >
    > rms
    >


    I thought it was a good article and it makes me happy I have a 9800 Pro
    video card. However, I can't wait to see how Doom 3 plays on systems that
    are a little more "real world". For example, I hope they bench it on
    processors 1.5GHz and up with GeForce4 MX and GeForce3 cards and up. I'd
    like to see an all-round comparison with as many combinations of CPU and
    video cards as possible.

    Thanks for posting that link!
  2. Archived from groups: comp.sys.ibm.pc.games.action,alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia

    On Thu, 22 Jul 2004 00:31:21 GMT, "rms" <rsquires@flashREMOVE.net>
    wrote:

    >http://www2.hardocp.com/article.html?art=NjQy
    >
    >Looks like the 6800GT is the sweet spot, if you can get it cheap (my pny
    >version was $345 shipped from provantage).

    Yes, 6800GT seems to be a great card to buy. The only difference
    between this card and 6800Ultra is the clock speed. Reminds me of
    Ti4200 in some regards.

    Since I have every intention of keeping my 9800 (overclocked past Pro
    speeds) at least till the end of next year, I find the Fx5950 and
    9800XT scores very encouraging. At 1024x768 with very high settings
    (4xAA, 16xAF) they are close to 30 fps and 45+ (with no AA and 8xAF).

    I think my graphics card should be able to hit an average of 30 fps at
    1024x768 with 2xAA and 8xAF. That's all I need for Doom3 and the games that
    will be based on its engine, for now.

    As far as pricing of new graphics cards goes, the next few months will be
    very interesting.
    --
    Noman, happy with his 9800 (Pro)
  3. Archived from groups: comp.sys.ibm.pc.games.action,alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia

    > As far as pricing of new graphics cards goes, the next few months will be
    > very interesting.
    > --
    > Noman, happy with his 9800 (Pro)

    The last couple months saw some good price drops when the 6800 and x800
    became available. Now you can get a 9800 PRO for under $200. I'm still
    clinging to my 5200 ultra until I am forced to part with it :)
  4. Archived from groups: comp.sys.ibm.pc.games.action,alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia

    "David Besack" <daveREMOVEbesack@mac.com> wrote in message
    news:cdn4hg$drl6$1@netnews.upenn.edu...
    > > As far as pricing of new graphics cards goes, the next few months will be
    > > very interesting.
    > > --
    > > Noman, happy with his 9800 (Pro)
    >
    > The last couple months saw some good price drops when the 6800 and x800
    > became available. Now you can get a 9800 PRO for under $200. I'm still
    > clinging to my 5200 ultra until I am forced to part with it :)

    I follow this rule (Humga's 1st Law of Graphics Card Upgrade):

    Buy the new card when the performance (frame rate usually being a good
    measure) drops to roughly half that of the new card. Then you must get at
    least **some** cash back for the 'old' card.

    This will ensure that you'll be able to play both your old and new games
    with decent performance without it costing you too much :D

    Please note that the 'new' card isn't necessarily the fastest card on the
    market...think about it.
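
    In code form, the rule might look something like this. This is only a rough
    sketch of what the post describes; the function name and the sample numbers
    are made up for illustration, not part of the 'Law':

        # Rough sketch of Humga's 1st Law, assuming average frame rate is
        # the performance measure. All names and numbers are hypothetical.
        def should_upgrade(old_card_fps, new_card_fps):
            # Upgrade once the old card manages only about half of what the
            # new card does, while the old card still has some resale value.
            return old_card_fps <= new_card_fps / 2.0

        print should_upgrade(45.0, 80.0)  # False: 45 > 40, keep the old card
        print should_upgrade(30.0, 80.0)  # True: 30 <= 40, time to upgrade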
  5. Archived from groups: comp.sys.ibm.pc.games.action,alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia

    "Humga" <Humga@no-spam.com> wrote in message
    news:bc-dnfpBMrZzhWLd4p2dnA@eclipse.net.uk...

    >
    > I follow this rule (Humga's 1st Law of Graphics Card Upgrade):
    >
    >

    Pretty cool having a law named after you. ;-)
  6. Archived from groups: comp.sys.ibm.pc.games.action,alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia

    "NightSky 421" <nightsky421@reply-to-group.com> wrote:
    > "rms" <rsquires@flashREMOVE.net> wrote:
    > > http://www2.hardocp.com/article.html?art=NjQy
    > >
    > > Looks like the 6800GT is the sweet spot, if you can get it cheap (my pny
    > > version was $345 shipped from provantage).
    > >
    > > rms
    > >
    >
    >
    > I thought it was a good article and it makes me happy I have a 9800 Pro
    > video card. However, I can't wait to see how Doom 3 plays on systems that
    > are a little more "real world". For example, I hope they bench it on
    > processors 1.5GHz and up with GeForce4 MX and GeForce3 cards and up. I'd
    > like to see an all-round comparison with as many combinations of CPU and
    > video cards as possible.

    GeForce 4 MX will perform like a turd stuck in a toilet seat. Heck,
    even GeForce 3 will drown in the quicksand. I have no idea how much
    difference there is between the "medium detail" mode and the "high
    detail" mode, but I just refuse to believe that a GeForce 3 would
    surf through the game with high details. I couldn't even turn on all the
    details in "Unreal 2" without diving to the bottom of the chart.
  7. Archived from groups: comp.sys.ibm.pc.games.action,alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia

    ATI's OpenGL drivers aren't so great. They are workable but not great.

    The only thing impressive about the new Geforce cards is instancing
    support in Vertex Shader 3.0. And so far it's been used in exactly one
    game, and I don't expect that to change much for a long time.

    ATI had their cards out first. Unlike NVidia, they don't need to cook
    their drivers. NVidia will have to work very hard to earn back my trust.
  8. Archived from groups: comp.sys.ibm.pc.games.action,alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia

    "NightSky 421" <nightsky421@reply-to-group.com> writes:

    > For example, I hope they bench it on processors 1.5GHz and up with
    > GeForce4 MX and GeForce3 cards and up.

    From the article:

    "As of this afternoon we were playing DOOM 3 on a 1.5GHz Pentium 4
    box with a GeForce 4 MX440 video card and having a surprisingly good
    gaming experience. Even a subtle jump to an AMD 2500+ with a GeForce
    3 video card that is two years old will deliver a solid gaming
    experience that will let you enjoy the game the way id Software
    designed it to be."

    Not a benchmark, but at least it's positive (if subjective).

    Nick


    --
    # sigmask || 0.2 || 20030107 || public domain || feed this to a python
    print reduce(lambda x,y:x+chr(ord(y)-1),' Ojdl!Wbshjti!=obwAcboefstobudi/psh?')
  9. Archived from groups: comp.sys.ibm.pc.games.action,alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia

    "Nada" <nada_says@hotmail.com> wrote in message
    news:b9c228ae.0407220517.66f1e6e0@posting.google.com...
    >
    > GeForce 4 MX will perform like a turd stuck in a toilet seat.


    LOL, I love that description!


    > Heck,
    > even GeForce 3 will drown in the quicksand. I have no idea how much
    > difference there is between the "medium detail" mode and the "high
    > detail" mode, but I just refuse to believe that a GeForce 3 would
    > surf through the game with high details. I couldn't even turn on all the
    > details in "Unreal 2" without diving to the bottom of the chart.


    Well when I read the article, I was under the impression myself that the
    game details would have to be turned down in order to get a decent playing
    experience with GeForce3 and Radeon 8500 cards. As to what low detail
    will actually look like, we will see. Not that I'm immediately inclined
    to find out myself, of course. :-)

    As the release date for Doom 3 draws nearer, I for whatever reason find
    myself willing to loosen up the purse strings somewhat. Still, I'm going
    to wait and see if there are any technical or driver issues before taking
    the plunge. I very much look forward to seeing this newsgroup next week!
  10. Archived from groups: comp.sys.ibm.pc.games.action,alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia

    "NightSky 421" <nightsky421@reply-to-group.com> wrote in message news:<10fu4eb889gdk53@corp.supernews.com>...
    > "rms" <rsquires@flashREMOVE.net> wrote in message
    > news:tVDLc.60$or1.20@newssvr19.news.prodigy.com...
    > > http://www2.hardocp.com/article.html?art=NjQy
    > >
    > > Looks like the 6800GT is the sweet spot, if you can get it cheap (my pny
    > > version was $345 shipped from provantage).
    > >
    > > rms
    > >
    >
    >
    > I thought it was a good article and it makes me happy I have a 9800 Pro
    > video card. However, I can't wait to see how Doom 3 plays on systems that
    > are a little more "real world". For example, I hope they bench it on
    > processors 1.5GHz and up with GeForce4 MX and GeForce3 cards and up. I'd
    > like to see an all-round comparison with as many combinations of CPU and
    > video cards as possible.
    >
    > Thanks for posting that link!


    According to the article, Doom3 will come with a time demo, so just run
    the time demo with your card and start a thread with your hardware.
    Then after a couple of weeks someone can put all the data in a
    spreadsheet and give an accounting of the cards that are listed. What
    gets me is that there is no mention of multiplayer gameplay anywhere.
    When I get the game this will be one of the first things I will check
    out, because it will determine the longevity of the game.

    Gnu_Raiz
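
    A minimal sketch of that collation step, assuming people post simple
    card/fps pairs (the file name and column names here are hypothetical,
    not anything from the article):

        # Sketch: collate posted timedemo results into per-card averages.
        # The CSV file and its card,fps columns are hypothetical.
        import csv
        from collections import defaultdict

        scores = defaultdict(list)
        with open('doom3_timedemos.csv') as f:
            for row in csv.DictReader(f):
                scores[row['card']].append(float(row['fps']))

        for card in sorted(scores):
            print card, sum(scores[card]) / len(scores[card])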
  11. Archived from groups: comp.sys.ibm.pc.games.action,alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia

    "magnulus" <magnulus@bellsouth.net> wrote in message
    news:gMMLc.6214$yF.4333@bignews2.bellsouth.net...
    > ATI's OpenGL drivers aren't so great. They are workable but not great.
    >
    > The only thing impressive about the new Geforce cards is instancing
    > support in Vertex Shader 3.0. And so far it's been used in exactly one
    > game, and I don't expect that to change much for a long time.
    >
    > ATI had their cards out first. Unlike NVidia, they don't need to cook
    > their drivers. NVidia will have to work very hard to earn back my trust.
    >
    Sour grapes?


    ---
    Outgoing mail is certified Virus Free.
    Checked by AVG anti-virus system (http://www.grisoft.com).
    Version: 6.0.725 / Virus Database: 480 - Release Date: 7/19/2004
  12. Archived from groups: comp.sys.ibm.pc.games.action,alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia

    > At 1024x768 with very high settings
    > (4xAA, 16xAF) they are close to 30 fps and 45+ (with no AA and 8xAF).

    Those numbers were timedemos, not actual in-game framerates,
    which would be much lower.

    Jeff B
  13. Archived from groups: alt.comp.periphs.videocards.ati

    >
    > ATI had their cards out first.

    Neither ATI nor nVidia has their top-of-the-line cards out.

    Jeff B
  14. Archived from groups: comp.sys.ibm.pc.games.action,alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia

    "magnulus" <magnulus@bellsouth.net> wrote in message
    news:4EPLc.8235$yF.5657@bignews2.bellsouth.net...
    >
    > If you go out and buy a GeForce FX 6800 just because it runs faster in
    > Doom III, you're a fool. End of line.

    Couldn't agree more, especially when you consider that in three years
    it's going to be selling on eBay for 40 bucks. Video cards have very,
    very short life-cycles.


    RayO
  15. Archived from groups: comp.sys.ibm.pc.games.action,alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia

    "rms" <rsquires@flashREMOVE.net> wrote in message
    news:tVDLc.60$or1.20@newssvr19.news.prodigy.com...
    > http://www2.hardocp.com/article.html?art=NjQy
    >
    > Looks like the 6800GT is the sweet spot, if you can get it cheap (my pny
    > version was $345 shipped from provantage).

    They didn't bench anything older than 5950... what a bunch of clowns.
  16. Archived from groups: comp.sys.ibm.pc.games.action,alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia

    > They didn't bench anything older than 5950... what a bunch of clowns.

    It says in the article that a broader range of CPUs and GFX cards will be
    checked in a future feature.

    --
    Toby
  17. Archived from groups: comp.sys.ibm.pc.games.action,alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia

    On Thu, 22 Jul 2004 16:57:10 +1000, "Darkfalz" <darkfalz@xis.com.au>
    wrote:

    >"rms" <rsquires@flashREMOVE.net> wrote in message
    >news:tVDLc.60$or1.20@newssvr19.news.prodigy.com...
    >> http://www2.hardocp.com/article.html?art=NjQy
    >>
    >> Looks like the 6800GT is the sweet spot, if you can get it cheap (my pny
    >> version was $345 shipped from provantage).
    >
    >They didn't bench anything older than 5950... what a bunch of clowns.
    >

    I'm wondering if the low benchmark scores on those cards are
    because of the heavy DX9 shader use. Since a card such as the GF4 Ti
    doesn't support DX9, I'm guessing the game will either switch to a DX8
    code path for effects, or just omit the shaders altogether. If so, would
    that mean that a GF4 Ti might get framerates that are about the same
    as the DX9 cards, but just not look as good?
  18. Archived from groups: comp.sys.ibm.pc.games.action,alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia

    "Darkfalz" <darkfalz@xis.com.au> wrote in message
    news:2m96qiFj9hfbU1@uni-berlin.de...
    > "rms" <rsquires@flashREMOVE.net> wrote in message
    > news:tVDLc.60$or1.20@newssvr19.news.prodigy.com...
    > > http://www2.hardocp.com/article.html?art=NjQy
    > >
    > > Looks like the 6800GT is the sweet spot, if you can get it cheap (my pny
    > > version was $345 shipped from provantage).
    >
    > They didn't bench anything older than 5950... what a bunch of clowns.

    My thoughts exactly...
  19. Archived from groups: comp.sys.ibm.pc.games.action,alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia

    "Darkfalz" <darkfalz@xis.com.au> wrote:
    > "rms" <rsquires@flashREMOVE.net> wrote:
    > > http://www2.hardocp.com/article.html?art=NjQy
    > >
    > > Looks like the 6800GT is the sweet spot, if you can get it cheap (my pny
    > > version was $345 shipped from provantage).
    >
    > They didn't bench anything older than 5950... what a bunch of clowns.


    I thought it was an okay preview benchmarking article, and I'm pretty
    sure that once the game is out, we'll see plenty of good
    benchmarks. Keep an eye on www.xbitlabs.com in the upcoming weeks.
    I'd say that if those of us with average graphics cards cut out the
    anisotropic filtering seen on the 5950 Ultra benchmark table, the
    framerate will most likely stay around the same speeds with 9800 Pros
    and 5900 XTs. As far as the engine's flexibility goes, I'd take that
    with a grain of ginger when it comes to the "high detail" modes. I
    personally won't consider playing the game on anything less than a Radeon
    9800 or GeForce 5900. Will GeForce 3 be able to swoop it with high
    details? Hell, no. That dog won't hunt.
  20. Archived from groups: comp.sys.ibm.pc.games.action,alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia

    On Thu, 22 Jul 2004 04:46:24 -0500, Larry Roberts <skin-e@juno.com>
    wrote:

    > I'm wondering if the low benchmark scores on those cards are
    >because of the heavy DX9 shader use.

    iD games are OpenGL, not D3D.
    --
    Andrew. To email unscramble nrc@gurjevgrzrboivbhf.pbz & remove spamtrap.
    Help make Usenet a better place: English is read downwards,
    please don't top post. Trim messages to quote only relevant text.
    Check groups.google.com before asking a question.
  21. Archived from groups: comp.sys.ibm.pc.games.action,alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia

    Agreed.......seriously, they couldn't test a Radeon 9800 Pro?? Which was the
    definitive ATI card to buy for more than a year's time........Another thing:
    Is there a particular reason why these guys claim to be "Just publishing
    straight up FPS numbers", and yet they don't test with AA and Filtering OFF?

    Those last batches of tests leave 8x AF *ON*.......seriously, there are
    plenty of gamers out there (like...ME) who never turn on AA or AF.....AF
    puts more of a hit on framerates than low-level AA does.....I'm guessing
    those Radeon XT tests would be higher if you turned off that 8x AF...


    "GuitarMan" <usa@yourface.com> wrote in message
    news:xXNLc.16656$W86.18@nwrdny03.gnilink.net...
    >
    > "Darkfalz" <darkfalz@xis.com.au> wrote in message
    > news:2m96qiFj9hfbU1@uni-berlin.de...
    > > "rms" <rsquires@flashREMOVE.net> wrote in message
    > > news:tVDLc.60$or1.20@newssvr19.news.prodigy.com...
    > > > http://www2.hardocp.com/article.html?art=NjQy
    > > >
    > > > Looks like the 6800GT is the sweet spot, if you can get it cheap (my pny
    > > > version was $345 shipped from provantage).
    > >
    > > They didn't bench anything older than 5950... what a bunch of clowns.
    >
    > My thoughts exactly...
    >
    >
  22. Archived from groups: comp.sys.ibm.pc.games.action,alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia

    On Thu, 22 Jul 2004 09:52:06 -0400, "magnulus"
    <magnulus@bellsouth.net> wrote:
    >
    > If you go out and buy a GeForce FX 6800 just because it runs faster in
    >Doom III, you're a fool. End of line.
    >

    The GeForce 6800 line works fine in other games too. They do trail behind
    the X800XT-PE in some DX9 games but not by much. Granted, ATI still has to
    optimise their memory controller (which, I read somewhere, is running
    at 60-70% efficiency) and they are also rewriting their OpenGL drivers
    from scratch. You can expect more optimisations from nVidia as well.

    IMO, the X800XT-PE is a better choice (if you can find it, that is) than
    the 6800Ultra, and the 6800GT is better than the X800Pro, given their MSRPs
    and also the power requirements.

    The bottom line is that these are all great cards and should run most
    of the Source/Doom3/CryEngine/UT based games without any problems.
    --
    Noman
  23. Archived from groups: comp.sys.ibm.pc.games.action,alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia

    On 22 Jul 2004 07:46:18 -0400, Nick Vargish
    <nav+posts@bandersnatch.org> wrote:

    >"NightSky 421" <nightsky421@reply-to-group.com> writes:
    >
    >> For example, I hope they bench it on processors 1.5GHz and up with
    >> GeForce4 MX and GeForce3 cards and up.
    >
    >From the article:
    >
    > "As of this afternoon we were playing DOOM 3 on a 1.5GHz Pentium 4
    > box with a GeForce 4 MX440 video card and having a surprisingly good
    > gaming experience. Even a subtle jump to an AMD 2500+ with a GeForce
    > 3 video card that is two years old will deliver a solid gaming
    > experience that will let you enjoy the game the way id Software
    > designed it to be."
    >
    >Not a benchmark, but at least it's positive (if subjective).
    >
    >Nick

    Fingers crossed then.

    --

    Bunnies aren't just cute like everybody supposes !
    They got them hoppy legs and twitchy little noses !
    And what's with all the carrots ?
    What do they need such good eyesight for anyway ?
    Bunnies ! Bunnies ! It must be BUNNIES !
  24. Archived from groups: comp.sys.ibm.pc.games.action,alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia

    "magnulus" <magnulus@bellsouth.net> wrote in message
    news:gMMLc.6214$yF.4333@bignews2.bellsouth.net...
    > ATI's OpenGL drivers aren't so great. They are workable but not great.
    >
    > The only thing impressive about the new Geforce cards is instancing
    > support in Vertex Shader 3.0. And so far it's been used in exactly one
    > game, and I don't expect that to change much for a long time.
    >
    > ATI had their cards out first. Unlike NVidia, they don't need to cook
    > their drivers. NVidia will have to work very hard to earn back my trust.
    >

    The funny thing is, ATI is the company that gets caught "optimizing" their
    drivers in this article. Give it a close read.

    NVidia made some unwise design decisions in the last round of cards. As
    such, they had to make some tradeoffs in image quality to get the
    performance up, basically making the best of a bad situation.

    It's funny how different people can interpret the same data differently.
    I've had an ATI card in my box for quite some time but I feel that NVidia
    has the better product this round. If you feel the need to "punish" NVidia
    for the FX series this go around I guess you can do that but I think it's
    your loss.

    B
  25. Archived from groups: comp.sys.ibm.pc.games.action,alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia

    Looks like PCIe 6800GT SLI may be the sweet spot for this game.

    --
    "War is the continuation of politics by other means.
    It can therefore be said that politics is war without
    bloodshed while war is politics with bloodshed."


    "rms" <rsquires@flashREMOVE.net> wrote in message
    news:tVDLc.60$or1.20@newssvr19.news.prodigy.com...
    > http://www2.hardocp.com/article.html?art=NjQy
    >
    > Looks like the 6800GT is the sweet spot, if you can get it cheap (my pny
    > version was $345 shipped from provantage).
    >
    > rms
    >
    >
  26. Archived from groups: comp.sys.ibm.pc.games.action,alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia

    "NightSky 421" <nightsky421@reply-to-group.com> wrote:
    > "Nada" <nada_says@hotmail.com> wrote:
    > >
    > > GeForce 4 MX will perform like a turd stuck in a toilet seat.
    >
    >
    > LOL, I love that description!

    My younger cousins have a GeForce 4 MX and I'm expecting a few dozen
    panic calls at midnight.


    > > Heck,
    > > even GeForce 3 will drown in the quicksand. I have no idea how much
    > > difference there is between the "medium detail" mode and the "high
    > > detail" mode, but I just refuse to believe that a GeForce 3 would
    > > surf through the game with high details. I couldn't even turn on all the
    > > details in "Unreal 2" without diving to the bottom of the chart.
    >
    >
    > Well when I read the article, I was under the impression myself that the
    > game details would have to be turned down in order to get a decent playing
    > experience with GeForce3 and Radeon 8500 cards. As to what low detail
    > will actually look like, we will see. Not that I'm immediately inclined
    > to find out myself, of course. :-)
    >
    > As the release date for Doom 3 draws nearer, I for whatever reason find
    > myself willing to loosen up the purse strings somewhat. Still, I'm going
    > to wait and see if there are any technical or driver issues before taking
    > the plunge. I very much look forward to seeing this newsgroup next week!

    I'm sure the game will still look better than most average FPS games
    with medium details, but to me "Doom 3" is one of those games where I
    couldn't turn the details down if my life depended on it. It's meant to
    be played in full regalia. I might have to crush my piggy bank as
    well to purchase a new monitor, which is as expensive as getting a new
    graphics card.
  27. Archived from groups: comp.sys.ibm.pc.games.action,alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia

    "HeadRusch" <HeadRusch1@NO_SPAM_comcast.net> wrote:
    > Agreed.......seriously, they couldn't test a Radeon 9800 Pro?? Which was the
    > definitive ATI card to buy for more than a year's time........Another thing:
    > Is there a particular reason why these guys claim to be "Just publishing
    > straight up FPS numbers", and yet they don't test with AA and Filtering OFF?
    >
    > Those last batches of tests leave 8x AF *ON*.......seriously, there are
    > plenty of gamers out there (like...ME) who never turn on AA or AF.....AF
    > puts more of a hit on framerates than low-level AA does.....I'm guessing
    > those Radeon XT tests would be higher if you turned off that 8x AF...

    HardOCP does benchmark tests in a different way. In most cases they
    will choose a "sweet spot" for each card where the performance won't
    drop into the early teens. My guess is that with the 5900 XT and 9800 Pro
    we have to turn AF off, but can still play "Doom 3" with maximum graphic
    effects. That was just a preview test, and I'm sure the web will be
    flooded with "Doom 3" benchmarks once the game is installed in most
    homes. If anything, "Doom 3" will become the most-used benchmark of the
    next two years, just like Quake III was at the time of its release.
  28. Archived from groups: comp.sys.ibm.pc.games.action,alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia

    ZZZYYno_m_anZZZYY@yahoo.com (noman) wrote:
    > On Thu, 22 Jul 2004 00:31:21 GMT, "rms" <rsquires@flashREMOVE.net>
    > wrote:
    >
    > >http://www2.hardocp.com/article.html?art=NjQy
    > >
    > >Looks like the 6800GT is the sweet spot, if you can get it cheap (my pny
    > >version was $345 shipped from provantage).
    >
    > Yes, 6800GT seems to be a great card to buy. The only difference
    > between this card and 6800Ultra is the clock speed. Reminds me of
    > Ti4200 in some regards.
    >
    > Since I have every intention of keeping my 9800 (overclocked past Pro
    > speeds) at least till the end of next year, I find the Fx5950 and
    > 9800XT scores very encouraging. At 1024x768 with very high settings
    > (4xAA, 16xAF) they are close to 30 fps and 45+ (with no AA and 8xAF).
    >
    > I think my graphics card should be able to hit an average of 30 fps at
    > 1024x768 with 2xAA and 8xAF. That's all I need for Doom3 and the games that
    > will be based on its engine, for now.

    It'll do pretty well for the next six months and perhaps can stretch
    its life to late spring 2005.

    > As far as pricing of new graphics cards goes, the next few months will be
    > very interesting.

    It's been very harsh when it comes to the prices. I don't know how
    it is in Canada and the USA at the moment, but here in Europe we're seeing
    prices of 500 euros for Nvidia's biggest guns, and ATI's top of the
    line cards aren't too cheap either. I read somewhere on the
    internet that the Ultra was priced at 800 dollars max, which is
    absolutely insane! I've never had a problem giving up 230 euros,
    but over 500 euros is just way too damn much even for the performance
    these new cards have to offer. I remember 1994, when we'd struggle
    to run "Doom" on 386s, so maybe there's a way to get through the
    autumn without a panic inside the piggy bank.
  29. Archived from groups: comp.sys.ibm.pc.games.action,alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia

    On 7/22/2004 9:56 PM BRanger brightened our day with:

    >"magnulus" <magnulus@bellsouth.net> wrote in message
    >news:gMMLc.6214$yF.4333@bignews2.bellsouth.net...
    >
    >
    >> ATI's OpenGL drivers aren't so great. They are workable but not great.
    >>
    >> The only thing impressive about the new Geforce cards is instancing
    >>support in Vertex Shader 3.0. And so far it's been used in exactly one
    >>game, and I don't expect that to change much for a long time.
    >>
    >> ATI had their cards out first. Unlike NVidia, they don't need to cook
    >>their drivers. NVidia will have to work very hard to earn back my trust.
    >>
    >>
    >>
    >
    >The funny thing is, ATI is the company that gets caught "optimizing" their
    >drivers in this article. Give it a close read.
    >
    >NVidia made some unwise design decisions in the last round of cards. As
    >such, they had to make some tradeoffs in image quality to get the
    >performance up, basically making the best of a bad situation.
    >
    >It's funny how different people can interpret the same data differently.
    >I've had an ATI card in my box for quite some time but I feel that NVidia
    >has the better product this round. If you feel the need to "punish" NVidia
    >for the FX series this go around I guess you can do that but I think it's
    >your loss.
    >
    >B
    As consumers we should be pleased that these two big video card
    companies are engaged in quality competition. nVidia catching and
    perhaps surpassing ATI on this round of card releases should lead to
    further innovation by ATI, and better products all around for us in the
    future.

    --
    Steve [Inglo]
  30. Archived from groups: comp.sys.ibm.pc.games.action,alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia

    On 22 Jul 2004 23:56:24 -0500, "BRanger" <noone@nowhere.net> wrote:


    >It's funny how different people can interpret the same data differently.
    >I've had an ATI card in my box for quite some time but I feel that NVidia
    >has the better product this round. If you feel the need to "punish" NVidia
    >for the FX series this go around I guess you can do that but I think it's
    >your loss.
    >
    >B
    >

    Why do you feel they have the better product? The ATI X800PE still
    beats the 6800U in many benchmarks and doesn't require a beefed-up PSU
    or two power connectors.
  31. Archived from groups: comp.sys.ibm.pc.games.action,alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia

    Andrew <spamtrap@localhost> wrote in message news:<g34vf0tg207imi8bjpkdqi31fldam9frbn@4ax.com>...
    > On Thu, 22 Jul 2004 04:46:24 -0500, Larry Roberts <skin-e@juno.com>
    > wrote:
    >
    > > I'm wondering if the low benchmark scores on those cards are
    > >because of the heavy DX9 shader use.
    >
    > iD games are OpenGL, not D3D.

    This is urban legend bullshit. iD (Carmack) prefers OpenGL but
    market sensibilities require them to use M$-Direct3D. A bit of
    clarification: M$-Direct3D is a subset of M$-DirectX. DirectX contains
    Direct3D which is the primary graphics handling portion of DirectX.
    OpenGL, on the other hand, is its own API.
    A game developer, or any graphics rendering programmer, can choose
    whether to call the DirectX (Direct3D) or OpenGL APIs. And if
    they want to be totally safe they can program separate modules that
    let you choose which API to use. Anybody who remembers the original
    Half-Life would know that it had an option to choose between OpenGL,
    Direct3D, or Software. Most games are programmed for both OpenGL and
    is not trivial are the pipelines afterwards. OpenGL does not have all
    the features of Direct3D and Direct3D does not have some of the
    performance that OpenGL does. Also, providing modules for both
    increases the programming effort.
    The difference these days is that a lot of games are trying to
    decide on their own which API to use based on what hardware is being
    used, sometimes with mixed results.
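
    The selection-module idea is simple enough to sketch. A toy illustration
    only; the class and function names are made up, not any real engine's API:

        # Toy sketch of renderer selection behind one common interface,
        # as in the original Half-Life's OpenGL/Direct3D/Software option.
        # All names here are hypothetical.
        class OpenGLRenderer(object):
            def draw_frame(self):
                print "drawing via OpenGL"

        class Direct3DRenderer(object):
            def draw_frame(self):
                print "drawing via Direct3D"

        class SoftwareRenderer(object):
            def draw_frame(self):
                print "drawing via the software rasterizer"

        RENDERERS = {'opengl': OpenGLRenderer,
                     'direct3d': Direct3DRenderer,
                     'software': SoftwareRenderer}

        def make_renderer(name):
            # The rest of the game code only ever sees draw_frame(),
            # so swapping APIs is a one-line change at startup.
            return RENDERERS[name]()

        make_renderer('opengl').draw_frame()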
  32. Archived from groups: comp.sys.ibm.pc.games.action,alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia

    On 22 Jul 2004 23:56:24 -0500, "BRanger" <noone@nowhere.net> wrote:

    >"magnulus" <magnulus@bellsouth.net> wrote in message
    >news:gMMLc.6214$yF.4333@bignews2.bellsouth.net...
    >> ATI's OpenGL drivers aren't so great. They are workable but not great.
    >>
    >> The only thing impressive about the new Geforce cards is instancing
    >> support in Vertex Shader 3.0. And so far it's been used in exactly one
    >> game, and I don't expect that to change much for a long time.
    >>
    >> ATI had their cards out first. Unlike NVidia, they don't need to cook
    >> their drivers. NVidia will have to work very hard to earn back my trust.
    >>
    >
    >The funny thing is, ATI is the company that gets caught "optimizing" their
    >drivers in this article. Give it a close read.

    Here's what John Carmack said,

    "On the other hand, the Nvidia drivers have been tuned for Doom's
    primary light/surface interaction fragment program, and innocuous code
    changes can "fall off the fast path" and cause significant performance
    impacts, especially on NV30 class cards."

    It may be that the 'fast path' is the way shaders are compiled to get
    around the NV3x series restrictions.

    Both cards have optimizations. The valid ones are good for everybody.
    I'd be worried if ATI and nVidia had given up on them and were just
    relying on brute force to solve all the issues.

    >It's funny how different people can interpret the same data differently.
    >I've had an ATI card in my box for quite some time but I feel that NVidia
    >has the better product this round. If you feel the need to "punish" NVidia
    >for the FX series this go around I guess you can do that but I think it's
    >your loss.

    Good thing about this generation is that both series of cards are
    equally capable and you can't have a wrong choice.

    X800 is still ahead in shader-heavy DX9 games. The new FarCry
    benchmarks (using SM2.0b on X800) show X800PE to be 15-20% ahead of
    6800Ultra (which is using SM3.0). This should be good news for X800
    owners who are waiting for STALKER or Half Life 2. 6800 is clearly
    ahead in DOOM3 and it's likely that the lead will be carried over to
    other DOOM3 engine games. However, to me the more important thing is
    that nVidia in DX9 and ATI in OpenGL are competitive enough that most
    people would not regret purchasing either of the 6800 or X800 cards.

    It comes down to price then. It's hard to beat the 6800GT if you can get
    it for $300-340, and the X800XT-PE is great in the $400-450 range..........
    (says the person who doesn't buy graphics cards over $200 :) )
    --
    Noman
  33. Archived from groups: comp.sys.ibm.pc.games.action,alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia

    On 23 Jul 2004 12:57:46 -0700, blog_smirk@yahoo.com (Blig Merk) wrote:

    > This is urban legend bullshit. iD (Carmack) prefers OpenGL but
    >market sensibilities require them to use M$-Direct3D. A bit of
    >clarification: M$-Direct3D is a subset of M$-DirectX. DirectX contains
    >Direct3D which is the primary graphics handling portion of DirectX.
    >OpenGL, on the other hand, is its own API.

    I don't take tech lessons from clueless trolls.
    --
    Andrew. To email unscramble nrc@gurjevgrzrboivbhf.pbz & remove spamtrap.
    Help make Usenet a better place: English is read downwards,
    please don't top post. Trim messages to quote only relevant text.
    Check groups.google.com before asking a question.
  34. Archived from groups: comp.sys.ibm.pc.games.action,alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia

    "JB" <fake@addy.com> wrote in message news:EgQLc.4994$eM2.1630@attbi_s51...
    > > At 1024x768 with very high settings
    > > (4xAA, 16xAF) they are close to 30 fps and 45+ (with no AA and 8xAF).
    >
    > Those numbers were timedemos, not actual in-game framerates,
    > which would be much lower.
    >

    ?

    K
  35. Archived from groups: comp.sys.ibm.pc.games.action,alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia

    On 23 Jul 2004 04:49:43 -0700, nada_says@hotmail.com (Nada) scribbled:

    [snip]

    >It's been very harsh when it comes to the prices. I don't know how
    >it is in Canada and the USA at the moment, but here in Europe we're seeing
    >prices of 500 euros for Nvidia's biggest guns, and ATI's top of the
    >line cards aren't too cheap either. I read somewhere on the
    >internet that the Ultra was priced at 800 dollars max, which is
    >absolutely insane! I've never had a problem giving up 230 euros,
    >but over 500 euros is just way too damn much even for the performance
    >these new cards have to offer. I remember 1994, when we'd struggle
    >to run "Doom" on 386s, so maybe there's a way to get through the
    >autumn without a panic inside the piggy bank.

    Well, the difference of course is that back then, the jump from a 386 to a
    486 (or even a 486SX25 to a DX2/66, etc.) was huge. It'd be like going
    to a P4 8GHz. Moore's Law or no Moore's Law, the speed jumps made now
    aren't nearly as dramatic, simply because there are more incremental
    releases.

    But yeah, Doom 3 will be good, and it will cause many people to spend
    a lot of money. ;)

    -Slash
    --
    "Ebert Victorious"
    -The Onion
  36. Archived from groups: comp.sys.ibm.pc.games.action,alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia

    No, it means they won't bench well.

    This is the funniest benchmark I've
    ever seen, one where 3.2GHz/2GB/5950
    is the low end.

    Hahaha, good one! <LOL>

    RayO

    "Toby Newman" <google@asktoby.com> wrote in message
    news:MPG.1b697cc68f36128f9897bf@localhost...
    >
    > > They didn't bench anything older than 5950... what a bunch of clowns.
    >
    > It says in the article that a broader range of CPUs and GFX cards will be
    > checked in a future feature.
    >
    > --
    > Toby
  37. Archived from groups: comp.sys.ibm.pc.games.action,alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia

    "Slash" <slash3@geocities.com> wrote in message
    news:4kv5g0lgivp35b4oqedoigk4cjhsk59hk2@4ax.com...
    > On 23 Jul 2004 04:49:43 -0700, nada_says@hotmail.com (Nada) scribbled:
    >
    > [snip]
    >
    > >It's been very harsh when it comes to the prices. I don't know how
    > >it is in Canada and the USA at the moment, but here in Europe we're seeing
    > >prices of 500 euros for Nvidia's biggest guns, and ATI's top of the
    > >line cards aren't too cheap either. I read somewhere on the
    > >internet that the Ultra was priced at 800 dollars max, which is
    > >absolutely insane! I've never had a problem giving up 230 euros,
    > >but over 500 euros is just way too damn much even for the performance
    > >these new cards have to offer. I remember 1994, when we'd struggle
    > >to run "Doom" on 386s, so maybe there's a way to get through the
    > >autumn without a panic inside the piggy bank.
    >
    > Well, the difference of course is that back then, the jump from a 386 to a
    > 486 (or even a 486SX25 to a DX2/66, etc.) was huge. It'd be like going
    > to a P4 8GHz. Moore's Law or no Moore's Law, the speed jumps made now
    > aren't nearly as dramatic, simply because there are more incremental
    > releases.
    >
    > But yeah, Doom 3 will be good, and it will cause many people to spend
    > a lot of money. ;)

    Some corrections:
    1) Doom I was released in '93, not '94.

    2) The 486 was nothing new even in '93, let alone '94. Many people
    had started upgrading to 486s from about late '91 and through '92 and
    '93. I believe Intel had even stopped making 386s by '92 to focus on
    the 486, to make AMD go away with its 40MHz 386. So when
    Doom I was released, its system requirements were
    nothing very special at all, and neither were Wolf3D's the previous year,
    which played nicely on a good 386. Both these games on their release
    played very well on the bulk of the installed base. They were both also
    sharewared, but that's another story.


    RayO
  38. Archived from groups: comp.sys.ibm.pc.games.action,alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia

    In <41014386.771070150@news.individual.net> ZZZYYno_m_anZZZYY@yahoo.com (noman) writes:

    >X800 is still ahead in shader-heavy DX9 games. The new FarCry
    >benchmarks (using SM2.0b on X800) show X800PE to be 15-20% ahead of
    >6800Ultra (which is using SM3.0). This should be good news for X800
    >owners who are waiting for STALKER or Half Life 2. 6800 is clearly
    >ahead in DOOM3 and it's likely that the lead will be carried over to
    >other DOOM3 engine games. However, to me the more important thing is
    >that nVidia in DX9 and ATI in OpenGL are competitive enough that most
    >people would not regret purchasing either of the 6800 or X800 cards.

    Maybe this will be the incentive ATI needs to finally make an OpenGL
    driver that's not a slug.

    --
    Artificial Intelligence stands no chance against Natural Stupidity.
    GAT d- -p+(--) c++++ l++ u++ t- m--- W--- !v
    b+++ e* s-/+ n-(?) h++ f+g+ w+++ y*
  39. Archived from groups: comp.sys.ibm.pc.games.action,alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia

    In <yv-dnfdhr7iW8ZzcRVn-tA@speakeasy.net> "Eric" <Eric@nospam.com> writes:

    >But what about at high detail with no AF -- that's what I want to see (the
    >hell with medium quality)? I'm hoping my new 5900XT can run doom3 at 40 or
    >above fps at high quality settings, at 1024 x 768 (with no AA and no AF).
    >Note that I have a P4 2.6 (800mhz FSB) and 1 GB of DDR ram.

    >This article claims that there is little visual benefit to AF:

    >http://www.extremetech.com/article2/0,1558,1157434,00.asp

    >So if I can turn off AF and turn off AA at "high quality" doom 3 settings --
    >and run at 1024 x 768 with at least 40 fps -- I'll be happy.

    I don't know how recently you looked at the article, but there's an
    apology at the bottom about having the same pictures for both AF and no
    AF. The new pictures, at least for me, show a distinct difference,
    as he notes, mainly on the floor, where the lines between the stones on
    the floor are much more clear-cut and realistic than the blurred-
    together stones in the no-AF picture. I could easily live with the
    blurry floor, but it certainly isn't as unnoticeable as the article made
    it out to be.

    --
    Artificial Intelligence stands no chance against Natural Stupidity.
    GAT d- -p+(--) c++++ l++ u++ t- m--- W--- !v
    b+++ e* s-/+ n-(?) h++ f+g+ w+++ y*
  40. Archived from groups: comp.sys.ibm.pc.games.action,alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia

    Quoth The Raven "RayO" <Menothere@nohog.net> in
    NwDMc.954$Qm2.769@nwrddc04.gnilink.net
    > "Slash" <slash3@geocities.com> wrote in message
    > news:4kv5g0lgivp35b4oqedoigk4cjhsk59hk2@4ax.com...
    > > On 23 Jul 2004 04:49:43 -0700, nada_says@hotmail.com (Nada)
    > > scribbled:
    > >
    > > [snip]
    > >
    > > >It's been very harsh when it comes to the prices. I don't know how
    > > >it is in Canada and the USA at the moment, but here in Europe we're
    > > >seeing prices of 500 euros for Nvidia's biggest guns, and ATI's top
    > > >of the line cards aren't too cheap either. I read somewhere
    > > >on the internet that the Ultra was priced at 800 dollars max,
    > > >which is absolutely insane! I've never had a problem giving up
    > > >230 euros, but over 500 euros is just way too damn much even for
    > > >the performance these new cards have to offer. I remember
    > > >1994, when we'd struggle to run "Doom" on 386s, so maybe there's a
    > > >way to get through the autumn without a panic inside the piggy
    > > >bank.
    > >
    > > Well, the difference of course is that back then, the jump from a 386 to
    > > a 486 (or even a 486SX25 to a DX2/66, etc.) was huge. It'd be like
    > > going to a P4 8GHz. Moore's Law or no Moore's Law, the speed jumps
    > > made now aren't nearly as dramatic, simply because there are more
    > > incremental releases.
    > >
    > > But yeah, Doom 3 will be good, and it will cause many people to
    > > spend a lot of money. ;)
    >
    > Some corrections:
    > 1) Doom I was released in '93, not '94.
    >
    > 2) The 486 was nothing new even in '93, let alone '94. Many people
    > had started upgrading to 486s from about late '91 and through '92 and
    > '93. I believe Intel had even stopped making 386s by '92 to focus on
    > the 486, to make AMD go away with its 40MHz 386. So when
    > Doom I was released, its system requirements were
    > nothing very special at all, and neither were Wolf3D's the previous year,
    > which played nicely on a good 386. Both these games on their release
    > played very well on the bulk of the installed base. They were both
    > also sharewared, but that's another story.
    >
    >
    > RayO

    I had a 486sx2/50 (no math coprocessor) and doom1 S/W ran like a dog with 2
    legs; it was literally a slide show. I didn't upgrade until the P90 was a
    year old. Who could afford it back then?

    --
    Some do, some don't, some will and some won't ...

    Take out the _CURSEING to reply to me
  41. Archived from groups: comp.sys.ibm.pc.games.action,alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia

    "Highlandish" <ckreskay_CURSEING@dodo.com.au> wrote in message
    news:2mgvn7FmlmlcU2@uni-berlin.de...

    >
    > I had a 486sx2/50 (no math coprocessor) and doom1 S/W ran like a dog with 2
    > legs; it was literally a slide show. I didn't upgrade until the P90 was a
    > year old. Who could afford it back then?

    You probably had a bad chipset; it made a big difference back
    then. Even a 486SX should've done better than that poor dog.
    I ran it on a 386DX/33 and a 486DX2/66. On the 386 it
    chopped, not a complete slide show, but not
    pleasant either. On the 4/66 it rocked.


    RayO
  42. Archived from groups: alt.comp.periphs.videocards.ati

    In article <C_HMc.5999$qT3.4835@nwrddc03.gnilink.net>, Menothere@nohog.net
    (RayO) wrote:

    > On the 4/66 it rocked.

    It's rather sad, but even after all these years the phrase "DX2/66" still
    causes a little twinge of excitement in some deep, dark pleasure centre in
    my brain. It was such an object of unaffordable techno-lust that the
    effects have obviously scarred me for life :-)

    Andrew McP
  43. Archived from groups: alt.comp.periphs.videocards.ati

    >I had a 486sx2/50 (no math coprocessor) and doom1 S/W ran like a dog with 2
    >legs; it was literally a slide show. I didn't upgrade until the P90 was a
    >year old. Who could afford it back then?


    In '93 I ordered a P90 the same week they came out. I was about to
    order a P60 or 486-100, and they started selling the 90, so I took the
    plunge and got it. I saved up a LONG time for it though. I paid 3800
    shipped, Canadian, for a Gateway 2000 P90: 8 megs RAM, 540 meg HD, 17"
    monitor, no sound card, 2x CD-ROM, no speakers, no printer, no modem.

    Crazy.

    Then I upgraded from 8 to 16 megs, and paid almost 400 bucks for that.

    Then again, way before that, I had a 386-25, 40 meg HD, 2 meg RAM, and
    it too was almost 4000 bucks.

    My first PC was a TRS-80. 1000 bucks for an external 5 1/4" drive, LOL.
    I never bought it and got a tape drive instead. Had to use the old-style
    counter to line up the tape before you loaded it.
  44. Archived from groups: comp.sys.ibm.pc.games.action,alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia

    Quoth The Raven "RayO" <Menothere@nohog.net> in
    C_HMc.5999$qT3.4835@nwrddc03.gnilink.net
    > "Highlandish" <ckreskay_CURSEING@dodo.com.au> wrote in message
    > news:2mgvn7FmlmlcU2@uni-berlin.de...
    >
    > >
    > > I had a 486sx2/50 (no math coprocessor) and doom1 S/W ran like a
    > > dog with 2 legs; it was literally a slide show. I didn't upgrade
    > > until the P90 was a year old. Who could afford it back then?
    >
    > > You probably had a bad chipset; it made a big difference back
    > > then. Even a 486SX should've done better than that poor dog.
    > > I ran it on a 386DX/33 and a 486DX2/66. On the 386 it
    > > chopped, not a complete slide show, but not
    > > pleasant either. On the 4/66 it rocked.
    >
    >
    > RayO

    Both your PCs had a math coprocessor in them; that's why they ran better.
    The SX models were severely hampered without one.

    --
    I put tape on my mirrors at my house so I won't accidentally walk
    through them into another dimension. - Steven Wright

    Take out the _CURSEING to reply to me
  45. Archived from groups: comp.sys.ibm.pc.games.action,alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia

    "Highlandish" <ckreskay_CURSEING@dodo.com.au> wrote in message
    news:2mhqdlFmv3trU2@uni-berlin.de...
    > Quoth The Raven "RayO" <Menothere@nohog.net> in
    > C_HMc.5999$qT3.4835@nwrddc03.gnilink.net
    > > "Highlandish" <ckreskay_CURSEING@dodo.com.au> wrote in message
    > > news:2mgvn7FmlmlcU2@uni-berlin.de...
    > >
    > > >
    > > > I had a 486sx2/50 (no math coprocessor) and doom1 S/W ran like a
    > > > dog with 2 legs; it was literally a slide show. I didn't upgrade
    > > > until the P90 was a year old. Who could afford it back then?
    > >
    > > You probably had a bad chipset; it made a big difference back
    > > then. Even a 486SX should've done better than that poor dog.
    > > I ran it on a 386DX/33 and a 486DX2/66. On the 386 it
    > > chopped, not a complete slide show, but not
    > > pleasant either. On the 4/66 it rocked.
    > >
    > >
    > > RayO
    >
    > Both your PCs had a math coprocessor in them; that's why they ran better.
    > The SX models were severely hampered without one.
    >

    Wow, I'm impressed; you are absolutely right. I did have a math
    coprocessor in my 386; I needed it for some other work.

    RayO
  46. Archived from groups: comp.sys.ibm.pc.games.action,alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia

    It is interesting that they left the X800 PE data out of the last
    few benchmarks.

    Why would someone running an unbiased test publish partial results?


    "rms" <rsquires@flashREMOVE.net> wrote in message
    news:tVDLc.60$or1.20@newssvr19.news.prodigy.com...
    > http://www2.hardocp.com/article.html?art=NjQy
    >
    > Looks like the 6800GT is the sweet spot, if you can get it cheap (my pny
    > version was $345 shipped from provantage).
    >
    > rms
    >
    >
  47. Archived from groups: comp.sys.ibm.pc.games.action,alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia

    Blig Merk left a note on my windscreen which said:

    > > iD games are OpenGL, not D3D.
    >
    > This is...

    Didn't get much further than this. Now back into the killfile you go.
    --
    Stoneskin

    [Insert sig text here]
  48. Archived from groups: alt.comp.periphs.videocards.ati

    Quoth The Raven "Newf !!!" <usenet42@hotmail.com> in
    b4k7g0lnuosfvtpsnsltnrbtjq4uk4lj3q@4ax.com
    > >I had a 486sx2/50 (no math coprocessor) and doom1 S/W ran like a dog
    > >with 2 legs; it was literally a slide show. I didn't upgrade until
    > >the P90 was a year old. Who could afford it back then?
    >
    >
    > In '93 I ordered a P90 the same week they came out. I was about to
    > order a P60 or 486-100, and they started selling the 90, so I took the
    > plunge and got it. I saved up a LONG time for it though. I paid 3800
    > shipped, Canadian, for a Gateway 2000 P90: 8 megs RAM, 540 meg HD, 17"
    > monitor, no sound card, 2x CD-ROM, no speakers, no printer, no modem.
    >
    > Crazy.
    >
    > Then I upgraded from 8 to 16 megs, and paid almost 400 bucks for that.
    >
    > Then again, way before that, I had a 386-25, 40 meg HD, 2 meg RAM, and
    > it too was almost 4000 bucks.
    >
    > My first PC was a TRS-80. 1000 bucks for an external 5 1/4" drive, LOL.
    > I never bought it and got a tape drive instead. Had to use the old-style
    > counter to line up the tape before you loaded it.

    By the time I could afford the P90, I was all techy and "look at me, I
    have a Pentium". I visited my mum because she bought a new PC for herself,
    and bugger it all, she had a P2-266, and she didn't even know how to use it;
    all she wanted was to type letters and do the accounting. FFS! I felt so
    small after that.

    --
    Some people practice what they preach, others just practice preaching.

    Take out the _CURSEING to reply to me
  49. Archived from groups: alt.comp.periphs.videocards.ati

    You would have been really, really jealous of my DX4/100 then :)

    --
    Tony DiMarzio
    djtone81@hotmail.com
    djraid@comcast.net
    "Andrew MacPherson" <andrew.mcp@DELETETHISdsl.pipex.com> wrote in message
    news:memo.20040725114640.3812A@address_disguised.address_disguised...
    > In article <C_HMc.5999$qT3.4835@nwrddc03.gnilink.net>, Menothere@nohog.net
    > (RayO) wrote:
    >
    > > On the 4/66 it rocked.
    >
    > It's rather sad, but even after all these years the phrase "DX2/66" still
    > causes a little twinge of excitement in some deep, dark pleasure centre in
    > my brain. It was such an object of unaffordable techno-lust that the
    > effects have obviously scarred me for life :-)
    >
    > Andrew McP