Doom3 Benchmarks out!

Archived from groups: comp.sys.ibm.pc.games.action, alt.comp.periphs.videocards.ati, alt.comp.periphs.videocards.nvidia

Guest

"rms" <rsquires@flashREMOVE.net> wrote in message
news:tVDLc.60$or1.20@newssvr19.news.prodigy.com...
> http://www2.hardocp.com/article.html?art=NjQy
>
> Looks like the 6800GT is the sweet spot, if you can get it cheap (my pny
> version was $345 shipped from provantage).
>
> rms
>


I thought it was a good article and it makes me happy I have a 9800 Pro
video card. However, I can't wait to see how Doom 3 plays on systems that
are a little more "real world". For example, I hope they bench it on
processors 1.5GHz and up with GeForce4 MX and GeForce3 cards and up. I'd
like to see an all-round comparison with as many combinations of CPU and
video cards as possible.

Thanks for posting that link!
 

noman

On Thu, 22 Jul 2004 00:31:21 GMT, "rms" <rsquires@flashREMOVE.net>
wrote:

>http://www2.hardocp.com/article.html?art=NjQy
>
>Looks like the 6800GT is the sweet spot, if you can get it cheap (my pny
>version was $345 shipped from provantage).

Yes, 6800GT seems to be a great card to buy. The only difference
between this card and 6800Ultra is the clock speed. Reminds me of
Ti4200 in some regards.

Since I have every intention of keeping my 9800 (overclocked past Pro
speeds) at least till the end of next year, I find the FX 5950 and
9800XT scores very encouraging. At 1024x768 with very high settings
(4xAA, 16xAF) they are close to 30 fps, and 45+ with no AA and 8xAF.

I think my graphics card should be able to hit an average of 30 fps at
1024x768 with 2xAA and 8xAF. That's all I need for Doom 3 and the games
that will be based on its engine, for now.

As far as the pricing of new graphics cards goes, the next few months
will be very interesting.
--
Noman, happy with his 9800 (Pro)
 
Guest

> As far as the pricing of new graphics cards goes, the next few months
> will be very interesting.
> --
> Noman, happy with his 9800 (Pro)

The last couple of months saw some good price drops when the 6800 and X800
became available. Now you can get a 9800 PRO for under $200. I'm still
clinging to my 5200 ultra until I am forced to part with it :)
 
Guest

"David Besack" <daveREMOVEbesack@mac.com> wrote in message
news:cdn4hg$drl6$1@netnews.upenn.edu...
> > As far as the pricing of new graphics cards goes, the next few months
> > will be very interesting.
> > --
> > Noman, happy with his 9800 (Pro)
>
> The last couple of months saw some good price drops when the 6800 and X800
> became available. Now you can get a 9800 PRO for under $200. I'm still
> clinging to my 5200 ultra until I am forced to part with it :)

I follow this rule (Humga's 1st Law of Graphics Card Upgrade):

Buy the new card when the performance (frame rate usually being a good
measure) drops to roughly half of that of the new card. Then you must get at
least **some** cash back for the 'old' card.

This will ensure that you'll be able to play both the old and new games
with decent performance, without it costing you too much :D

Please note that the 'new' card isn't necessarily the fastest card in the
market...think about it.
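
Put as code, Humga's rule is a one-line predicate. A minimal sketch
(the function name and the sample numbers are illustrative, not from
the thread):

def should_upgrade(old_card_fps: float, new_card_fps: float) -> bool:
    """Humga's 1st Law of Graphics Card Upgrade as a predicate: buy the
    new card once the old one manages only about half the new card's
    frame rate in the games you actually play."""
    return old_card_fps <= new_card_fps / 2.0

# Hypothetical numbers: an ageing card averaging 28 fps vs. a new one at 60.
print(should_upgrade(28.0, 60.0))  # True -- sell while the old card still fetches some cash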
 

Tim

"Humga" <Humga@no-spam.com> wrote in message
news:bc-dnfpBMrZzhWLd4p2dnA@eclipse.net.uk...

>
> I follow this rule (Humga's 1st Law of Graphics Card Upgrade):
>
>

Pretty cool having a law named after you. ;-)
 

NaDa

"NightSky 421" <nightsky421@reply-to-group.com> wrote:
> "rms" <rsquires@flashREMOVE.net> wrote:
> > http://www2.hardocp.com/article.html?art=NjQy
> >
> > Looks like the 6800GT is the sweet spot, if you can get it cheap (my pny
> > version was $345 shipped from provantage).
> >
> > rms
> >
>
>
> I thought it was a good article and it makes me happy I have a 9800 Pro
> video card. However, I can't wait to see how Doom 3 plays on systems that
> are a little more "real world". For example, I hope they bench it on
> processors 1.5GHz and up with GeForce4 MX and GeForce3 cards and up. I'd
> like to see an all-round comparison with as many combinations of CPU and
> video cards as possible.

GeForce 4 MX will perform like a turd stuck in a toilet seat. Heck,
even a GeForce 3 will drown in the quicksand. I have no idea how much
difference there is between the "medium detail" mode and the "high
detail" mode, but I just refuse to believe that a GeForce 3 would
surf the game with high details. I couldn't even turn on all the
details in Unreal 2 without diving to the bottom of the chart.
 
Guest

ATI's OpenGL drivers aren't so great. They are workable but not great.

The only thing impressive about the new Geforce cards is instancing
support in Vertex Shader 3.0. And so far it's been used in exactly one
game, and I don't expect that to change much for a long time.

ATI had their cards out first. Unlike NVidia, they don't need to cook
their drivers. NVidia will have to work very hard to earn back my trust.
 
Guest

"NightSky 421" <nightsky421@reply-to-group.com> writes:

> For example, I hope they bench it on processors 1.5GHz and up with
> GeForce4 MX and GeForce3 cards and up.

From the article:

"As of this afternoon we were playing DOOM 3 on a 1.5GHz Pentium 4
box with a GeForce 4 MX440 video card and having a surprisingly good
gaming experience. Even a subtle jump to an AMD 2500+ with a GeForce
3 video card that is two years old will deliver a solid gaming
experience that will let you enjoy the game the way id Software
designed it to be."

Not a benchmark, but at least it's positive (if subjective).

Nick


--
# sigmask || 0.2 || 20030107 || public domain || feed this to a python
print reduce(lambda x,y:x+chr(ord(y)-1),' Ojdl!Wbshjti!=obwAcboefstobudi/psh?')
 
Guest

"Nada" <nada_says@hotmail.com> wrote in message
news:b9c228ae.0407220517.66f1e6e0@posting.google.com...
>
> GeForce 4 MX will perform like a turd stuck in a toilet seat.


LOL, I love that description!


> Heck,
> even a GeForce 3 will drown in the quicksand. I have no idea how much
> difference there is between the "medium detail" mode and the "high
> detail" mode, but I just refuse to believe that a GeForce 3 would
> surf the game with high details. I couldn't even turn on all the
> details in Unreal 2 without diving to the bottom of the chart.


Well, when I read the article, I was under the impression myself that the
game details would have to be turned down in order to get a decent playing
experience with GeForce3 and Radeon 8500 cards. As to what low detail
will actually look like, we will see. Not that I'm immediately inclined
to find out myself, of course. :)

As the release date for Doom 3 draws nearer, I for whatever reason find
myself willing to loosen up the purse strings somewhat. Still, I'm going
to wait and see if there are any technical or driver issues before taking
the plunge. I very much look forward to seeing this newsgroup next week!
 
Guest

"NightSky 421" <nightsky421@reply-to-group.com> wrote in message news:<10fu4eb889gdk53@corp.supernews.com>...
> "rms" <rsquires@flashREMOVE.net> wrote in message
> news:tVDLc.60$or1.20@newssvr19.news.prodigy.com...
> > http://www2.hardocp.com/article.html?art=NjQy
> >
> > Looks like the 6800GT is the sweet spot, if you can get it cheap (my pny
> > version was $345 shipped from provantage).
> >
> > rms
> >
>
>
> I thought it was a good article and it makes me happy I have a 9800 Pro
> video card. However, I can't wait to see how Doom 3 plays on systems that
> are a little more "real world". For example, I hope they bench it on
> processors 1.5GHz and up with GeForce4 MX and GeForce3 cards and up. I'd
> like to see an all-round comparison with as many combinations of CPU and
> video cards as possible.
>
> Thanks for posting that link!


According to the article, Doom 3 will come with a timedemo, so just run
the timedemo with your card and start a thread with your hardware.
Then after a couple of weeks someone can put all the data in a
spreadsheet and give an accounting of the cards that are listed. What
gets me is that there is no mention of multiplayer gameplay anywhere.
When I get the game, this will be one of the first things I check out,
because it will determine the longevity of the game.

Gnu_Raiz
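
A minimal sketch of that spreadsheet idea, in Python (every poster,
card, and fps figure below is a made-up placeholder, not real thread
data):

import csv

# Hypothetical timedemo reports gathered from a thread like this one.
reports = [
    ("poster_a", "Radeon 9800 Pro", "1024x768, 2xAA/8xAF", 31.4),
    ("poster_b", "GeForce 6800 GT", "1280x1024, 4xAA/8xAF", 58.7),
]

# Write the collected reports to a CSV that any spreadsheet program can open.
with open("doom3_timedemos.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["poster", "card", "settings", "avg_fps"])
    writer.writerows(reports)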
 
Guest

"magnulus" <magnulus@bellsouth.net> wrote in message
news:gMMLc.6214$yF.4333@bignews2.bellsouth.net...
> ATI's OpenGL drivers aren't so great. They are workable but not great.
>
> The only thing impressive about the new Geforce cards is instancing
> support in Vertex Shader 3.0. And so far it's been used in exactly one
> game, and I don't expect that to change much for a long time.
>
> ATI had their cards out first. Unlike NVidia, they don't need to cook
> their drivers. NVidia will have to work very hard to earn back my trust.
>
Sour grapes?


 

JB

> At 1024x768 with very high settings
> (4xAA, 16xAF) they are close to 30 fps, and 45+ with no AA and 8xAF.

Those numbers were from timedemos, not actual in-game framerates,
which would be much lower.

Jeff B
 

JB

> ATI had their cards out first.

Neither ATI nor nvidia has their top-of-the-line cards out.

Jeff B
 
Guest

"magnulus" <magnulus@bellsouth.net> wrote in message
news:4EPLc.8235$yF.5657@bignews2.bellsouth.net...
>
> If you go out and buy a GeForce FX 6800 just because it runs faster in
> Doom III, you're a fool. End of line.

Couldn't agree more, especially when you consider that in three years
it's going to be selling on eBay for 40 bucks. Video cards have very,
very short life-cycles.


RayO
 
Guest

"rms" <rsquires@flashREMOVE.net> wrote in message
news:tVDLc.60$or1.20@newssvr19.news.prodigy.com...
> http://www2.hardocp.com/article.html?art=NjQy
>
> Looks like the 6800GT is the sweet spot, if you can get it cheap (my pny
> version was $345 shipped from provantage).

They didn't bench anything older than 5950... what a bunch of clowns.
 
Guest

> They didn't bench anything older than 5950... what a bunch of clowns.

It says in the article that a broader range of CPUs and graphics cards
will be tested in a future feature.

--
Toby
 
Guest

On Thu, 22 Jul 2004 16:57:10 +1000, "Darkfalz" <darkfalz@xis.com.au>
wrote:

>"rms" <rsquires@flashREMOVE.net> wrote in message
>news:tVDLc.60$or1.20@newssvr19.news.prodigy.com...
>> http://www2.hardocp.com/article.html?art=NjQy
>>
>> Looks like the 6800GT is the sweet spot, if you can get it cheap (my pny
>> version was $345 shipped from provantage).
>
>They didn't bench anything older than 5950... what a bunch of clowns.
>

I'm wondering if the low benchmark scores on those cards are
because of the heavy DX9 shader use. Since a card such as the GF4 Ti
doesn't support DX9, I'm guessing the game will either switch to a DX8
code path for effects, or just omit the shaders altogether. If so, would
that mean that a GF4 Ti might get framerates that are about the same
as the DX9 cards, but just not look as good?
 

Guitarman

"Darkfalz" <darkfalz@xis.com.au> wrote in message
news:2m96qiFj9hfbU1@uni-berlin.de...
> "rms" <rsquires@flashREMOVE.net> wrote in message
> news:tVDLc.60$or1.20@newssvr19.news.prodigy.com...
> > http://www2.hardocp.com/article.html?art=NjQy
> >
> > Looks like the 6800GT is the sweet spot, if you can get it cheap (my pny
> > version was $345 shipped from provantage).
>
> They didn't bench anything older than 5950... what a bunch of clowns.

My thoughts exactly...
 

NaDa

"Darkfalz" <darkfalz@xis.com.au> wrote:
> "rms" <rsquires@flashREMOVE.net> wrote:
> > http://www2.hardocp.com/article.html?art=NjQy
> >
> > Looks like the 6800GT is the sweet spot, if you can get it cheap (my pny
> > version was $345 shipped from provantage).
>
> They didn't bench anything older than 5950... what a bunch of clowns.


I thought it was an okay preview benchmarking article, and I'm pretty
sure that once the game is out, we'll see plenty of good
benchmarks. Keep an eye on www.xbitlabs.com in the upcoming weeks.
I'd say that if those of us with average graphics cards cut out the
anisotropic filtering seen in the 5950 Ultra benchmark table, the
framerate will most likely stay around the same speeds on 9800 Pros
and 5900 XTs. As far as the engine's flexibility goes, I'd take that
with a grain of ginger when it comes to the "high detail" modes. I
personally won't consider playing the game on anything less than a
Radeon 9800 or GeForce 5900. Will GeForce 3 be able to swoop it with
high details? Hell, no. That dog won't hunt.
 

Andrew

On Thu, 22 Jul 2004 04:46:24 -0500, Larry Roberts <skin-e@juno.com>
wrote:

> I'm wondering if the low benchmark scores on those cards are
> because of the heavy DX9 shader use.

iD games are OpenGL, not D3D.
--
Andrew. To email unscramble nrc@gurjevgrzrboivbhf.pbz & remove spamtrap.
Help make Usenet a better place: English is read downwards,
please don't top post. Trim messages to quote only relevant text.
Check groups.google.com before asking a question.
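
For context: Doom 3 is indeed OpenGL, and it reportedly ships several
render back ends (ARB, NV10, NV20, R200, ARB2) chosen by what the
driver advertises, which is why a GF4 Ti falls back to a simpler path
rather than running DX9-class shaders slowly. A toy sketch of that
kind of capability check follows; the extension names are real OpenGL
extensions, but the selection logic is illustrative, not id Software's
actual code:

def pick_render_path(extensions: set) -> str:
    """Choose a shader back end from the GL extensions a driver advertises."""
    if "GL_ARB_fragment_program" in extensions:
        return "ARB2"  # DX9-class fragment programs (Radeon 9500+, GeForce FX/6800)
    if "GL_ATI_fragment_shader" in extensions:
        return "R200"  # Radeon 8500/9000 class
    if "GL_NV_register_combiners" in extensions:
        return "NV20"  # GeForce 3 / GeForce 4 Ti class
    return "ARB"       # basic multitexture fallback, lowest visual quality

# Hypothetical extension set for a GeForce 4 Ti class card:
print(pick_render_path({"GL_ARB_multitexture", "GL_NV_register_combiners"}))  # NV20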
 
Guest

Agreed.......seriously, they couldn't test a Radeon 9800 Pro?? It was the
definitive ATI card to buy for more than a year's time........Another thing:
is there a particular reason why these guys claim to be "just publishing
straight up FPS numbers", and yet they don't test with AA and filtering OFF?

Those last batches of tests leave 8x AF *ON*.......seriously, there are
plenty of gamers out there (like...ME) who never turn on AA or AF.....AF
puts more of a hit on framerates than low-level AA does.....I'm guessing
those Radeon XT scores would be higher if you turned off that 8x AF...



"GuitarMan" <usa@yourface.com> wrote in message
news:xXNLc.16656$W86.18@nwrdny03.gnilink.net...
>
> "Darkfalz" <darkfalz@xis.com.au> wrote in message
> news:2m96qiFj9hfbU1@uni-berlin.de...
> > "rms" <rsquires@flashREMOVE.net> wrote in message
> > news:tVDLc.60$or1.20@newssvr19.news.prodigy.com...
> > > http://www2.hardocp.com/article.html?art=NjQy
> > >
> > > Looks like the 6800GT is the sweet spot, if you can get it cheap (my
pny
> > > version was $345 shipped from provantage).
> >
> > They didn't bench anything older than 5950... what a bunch of clowns.
>
> My thoughts exactly...
>
>
 

noman

On Thu, 22 Jul 2004 09:52:06 -0400, "magnulus"
<magnulus@bellsouth.net> wrote:
>
> If you go out and buy a GeForce FX 6800 just because it runs faster in
>Doom III, you're a fool. End of line.
>

The GeForce 6800 line works fine in other games too. It does trail
behind the X800 XT-PE in some DX9 games, but not by much. Granted, ATI
still has to optimise their memory controller (which, I read somewhere,
is running at 60-70% efficiency), and they are also rewriting their
OpenGL drivers from scratch. You can expect more optimisations from
nVidia as well.

IMO, the X800 XT-PE is a better choice (if you can find it, that is)
than the 6800 Ultra, and the 6800 GT is better than the X800 Pro, given
their MSRPs and also the power requirements.

The bottom line is that these are all great cards and should run most
of the Source/Doom 3/CryEngine/UT-based games without any problems.
--
Noman
 
Guest

On 22 Jul 2004 07:46:18 -0400, Nick Vargish
<nav+posts@bandersnatch.org> wrote:

>"NightSky 421" <nightsky421@reply-to-group.com> writes:
>
>> For example, I hope they bench it on processors 1.5GHz and up with
>> GeForce4 MX and GeForce3 cards and up.
>
>From the article:
>
> "As of this afternoon we were playing DOOM 3 on a 1.5GHz Pentium 4
> box with a GeForce 4 MX440 video card and having a surprisingly good
> gaming experience. Even a subtle jump to an AMD 2500+ with a GeForce
> 3 video card that is two years old will deliver a solid gaming
> experience that will let you enjoy the game the way id Software
> designed it to be."
>
>Not a benchmark, but at least it's positive (if subjective).
>
>Nick

Fingers crossed then.

--

Bunnies aren't just cute like everybody supposes !
They got them hoppy legs and twitchy little noses !
And what's with all the carrots ?
What do they need such good eyesight for anyway ?
Bunnies ! Bunnies ! It must be BUNNIES !
 
Guest

"magnulus" <magnulus@bellsouth.net> wrote in message
news:gMMLc.6214$yF.4333@bignews2.bellsouth.net...
> ATI's OpenGL drivers aren't so great. They are workable but not great.
>
> The only thing impressive about the new Geforce cards is instancing
> support in Vertex Shader 3.0. And so far it's been used in exactly one
> game, and I don't expect that to change much for a long time.
>
> ATI had their cards out first. Unlike NVidia, they don't need to cook
> their drivers. NVidia will have to work very hard to earn back my trust.
>

The funny thing is, ATI is the company that gets caught "optimizing" their
drivers in this article. Give it a close read.

NVidia made some unwise design decisions in the last round of cards. As
such, they had to make some tradeoffs in image quality to get the
performance up, basically making the best of a bad situation.

It's funny how different people can interpret the same data differently.
I've had an ATI card in my box for quite some time, but I feel that NVidia
has the better product this round. If you feel the need to "punish" NVidia
for the FX series this go-around, I guess you can do that, but I think it's
your loss.

B