6800GT vs. X800Pro...with an eye to the future

Dookie

Archived from groups: alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia (More info?)

Hey y'all,

I'm reentering the gaming world after a long hiatus. How long? I'm
replacing a 2xP2/300, 384mb, Voodoo3, AWE64 rig! I'm going to ask the same
question that everyone is these days, but hopefully a little more
intelligently than "DudE! My mOm say she'll h00k me up with eItheR. Which
iZ da bizzy-b0mB?" I've been reading everything I can, and I have some very
specific questions (the answer to many of which will be "only time will
tell" I suspect). I'd appreciate logical and informed responses (what? On
Usenet?). The email address herein is legit (after you remove the obvious),
if you prefer to stay out of the fray.

The new rig is an Athlon XP 3200+ with 1gb DDR400. This is not up for
debate. The price was *very* right and it's already purchased (~$225 for
CPU, cooler, case, motherboard, 400w power supply, tax and shipping). I'm
not very interested in overclocking anything. The question is which $400
GPU to put in it, the 6800GT or the X800Pro, if I'm planning to have this
box as long as I did my last. Availability is not an issue...I happen to
have both cards right here in front of me (an ATI and a PNY, both still in
cellophane). Yes, I *am* a bitch.

So, with *only* the X800Pro and 6800GT in mind...

Performance:
We've all seen the Doom3 benchmarks. Big whoop...this is not the only game
I'll be playing. On the other hand, a great engine will get a lot of reuse.
Is it realistic to believe that ATI will a) be able to, and b) choose to fix
the OpenGL performance of the X800Pro? Or is it a) crippled by its
12-pipeline architecture and lack of Shader 3.0 support, and/or b) doomed at
birth by the promise of a near-term downclocked 16-pipe card (the so-called
X800GT)?
And in the other camp, plenty of benchmarks show the two cards pretty much
neck and neck in DirectX games today, with perhaps a slight advantage to
ATI. Will 9.0c (and its Shader 3.0 support) change much? How important is
Shader 3.0 support really?

Noise:
Anybody with real world experience with both? I understand the 6800GT is
loud. I spend my days in climate-controlled server rooms, so a little
machine whirr ain't no big thing. On the other hand, the rig will be left
on pretty much all the time in a very open-architecture house. Will I hear
it in the next room?

Hacks:
Not that I'll be jacking around with my $400 toy any time soon, but it's
widely reported that BIOS flashes are a poor man's upgrade. As I understand
it, the chipsets that don't pass muster to be part of an XT / Ultra PCB are
then tested to lower (ie: Pro / GT) standards. So the probability of
flashing actually improving anything depends on how 'broken' the individual
GPU is? Furthermore, my X800 is probably not a VIVO version, which I
understand means it is not flashable to an XT regardless? Whereas all GT's
are capable? Has anyone actually performed a flash on either of these
cards?

What else bears consideration? I've got a couple weeks to make a decision
and I know they're both great cards. Nor am I particularly loyal to (or
vengeful against) either manufacturer.

Thanks for any and all input,

Dookie
 

JB

Archived from groups: alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia (More info?)

> Performance:
> We've all seen the Doom3 benchmarks. Big whoop...this is not the only game
> I'll be playing. On the other hand, a great engine will get a lot of reuse.
> Is it realistic to believe that ATI will a) be able to, and b) choose to fix
> the OpenGL performance of the X800Pro.

The obvious answer is "no". If they could, they would have long ago.
It's not like ATI is just now finding out they do OGL poorly.


> Noise:
> Anybody with real world experience with both? I understand the 6800GT is
> loud. I spend my days in climate-controlled server rooms, so a little
> machine whirr ain't no big thing. On the other hand, the rig will be left
> on pretty much all the time in a very open-architecture house. Will I hear
> it in the next room?


>
> Hacks:
> Not that I'll be jacking around with my $400 toy any time soon, but it's
> widely reported that BIOS flashes are a poor man's upgrade. As I understand
> it, the chipsets that don't pass muster to be part of an XT / Ultra PCB are
> then tested to lower (ie: Pro / GT) standards. So the probability of
> flashing actually improving anything depends on how 'broken' the individual
> GPU is?

Don't believe everything you hear. The X800 Pro CANNOT be turned into
an X800 XT by the so-called '16-pipe fix' or a BIOS flash. I tried it
on my X800 Pro, so I know what I'm talking about. Aside from adjusting
the clocks with ATITool, however it runs OOTB is the best it will ever
run. Of course you must use ATITool to adjust the clocks so the card
will run at full speed; they are terribly underclocked OOTB.

Jeff B
 

Bean

Archived from groups: alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia (More info?)

"JB" <fake@addy.com> wrote in message
news:fYWLc.163581$XM6.52882@attbi_s53...
>
> > Performance:
> > We've all seen the Doom3 benchmarks. Big whoop...this is not the only
> > game I'll be playing. On the other hand, a great engine will get a lot
> > of reuse. Is it realistic to believe that ATI will a) be able to, and
> > b) choose to fix the OpenGL performance of the X800Pro?
>
> The obvious answer is "no". If they could, they would have long ago.
> It's not like ATI is just now finding out they do OGL poorly.
>
> > Noise:
> > Anybody with real world experience with both? I understand the 6800GT
> > is loud. I spend my days in climate-controlled server rooms, so a
> > little machine whirr ain't no big thing. On the other hand, the rig
> > will be left on pretty much all the time in a very open-architecture
> > house. Will I hear it in the next room?
>
> > Hacks:
> > Not that I'll be jacking around with my $400 toy any time soon, but
> > it's widely reported that BIOS flashes are a poor man's upgrade. As I
> > understand it, the chipsets that don't pass muster to be part of an
> > XT / Ultra PCB are then tested to lower (ie: Pro / GT) standards. So
> > the probability of flashing actually improving anything depends on
> > how 'broken' the individual GPU is?
>
> Don't believe everything you hear. The X800 Pro CANNOT be turned into
> an X800 XT by the so-called '16-pipe fix' or a BIOS flash. I tried it
> on my X800 Pro, so I know what I'm talking about. Aside from adjusting
> the clocks with ATITool, however it runs OOTB is the best it will ever
> run. Of course you must use ATITool to adjust the clocks so the card
> will run at full speed; they are terribly underclocked OOTB.
>
> Jeff B
>

Actually, it is known that ATI is in the process of redoing the OpenGL
drivers. Beta testers have said this, and so have people on the Catalyst team.
Ask around the Rage3D forums for more info. Yes the X800 Pro can be hacked
and then have its BIOS flashed to be an X800 XT, but it won't work on every
card. You were just unlucky. When you hack it you're enabling the extra 4
pipelines. Most likely one or more of those extra 4 pipelines was defective,
and that is why it didn't work for you. The lucky ones with the working 4
pipelines got their hack to work fine. It's more of a 50/50 chance that the
hack will work.
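
For a rough sense of what those extra 4 pipelines are worth on paper, here is
a minimal Python sketch of theoretical pixel fill rate (pipelines x core
clock). The clock figures are just the commonly quoted retail specs, not
measurements from any particular board, so treat the output as illustrative.

  # Back-of-the-envelope pixel fill rate: pipelines x core clock.
  # Clock speeds are commonly quoted retail specs (approximate), not
  # measurements from any particular card.
  cards = {
      "X800 Pro (12 pipes @ 475 MHz)":          (12, 475e6),
      "X800 Pro unlocked (16 pipes @ 475 MHz)": (16, 475e6),
      "X800 XT PE (16 pipes @ 520 MHz)":        (16, 520e6),
  }

  for name, (pipes, clock_hz) in cards.items():
      fill_gpix = pipes * clock_hz / 1e9
      print(f"{name}: {fill_gpix:.2f} Gpixels/s theoretical")

Enabling the four extra pipes alone is worth roughly a third more theoretical
fill rate before any clock increase, which is why people bother with the
flash at all.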

Bean
 

JB

Archived from groups: alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia (More info?)

> Yes the X800 Pro can be hacked and then have its BIOS flashed to be an
> X800 XT, but it won't work on every card.

Believe what you want, I have proof that the hack doesn't work.

Jeff B
 

JB

Archived from groups: alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia (More info?)

> Actually, it is known that ATI is in the process of redoing the OpenGL
> drivers. Beta testers have said this, and so have people on the Catalyst team.

So what are you saying, that Doom3 etc. is about to suddenly run great on ATI
hardware? So the benchmarks run by id Software mean nothing??
LOL! If ATI knew how to fix the problem, they would have done so
long ago. To think otherwise is to set yourself up for a big
disappointment.

Jeff B
 

minotaur

Archived from groups: alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia (More info?)

Bean wrote:

> "JB" <fake@addy.com> wrote in message
> news:fYWLc.163581$XM6.52882@attbi_s53...
>
>>>Performance:
>>>We've all seen the Doom3 benchmarks. Big whoop...this is not the only
>
> game
>
>>>I'll be playing. On the other hand, a great engine will get a lot of
>
> reuse.
>
>>>Is it realistic to believe that ATI will a) be able to, and b) choose to
>
> fix
>
>>>the OpenGL performance of the X800Pro.
>>
>>The obvious answer is "no". If they could, they would have long ago.
>>It's not like ATI is just now finding out they do OGL poorly.
>>
>>
>>
>>>Noise:
>>>Anybody with real world experience with both? I understand the 6800GT
>
> is
>
>>>loud. I spend my days in climate-controlled server rooms, so a little
>>>machine whirr ain't no big thing. On the other hand, the rig will be
>
> left
>
>>>on pretty much all the time in a very open-architecture house. Will I
>
> hear
>
>>>it in the next room?
>>
>>
>>>Hacks:
>>>Not that I'll be jacking around with my $400 toy any time soon, but it's
>>>widely reported that BIOS flashes are a poor man's upgrade. As I
>
> understand
>
>>>it, the chipsets that don't pass muster to be part of an XT / Ultra PCB
>
> are
>
>>>then tested to lower (ie: Pro / GT) standards. So the probability of
>>>flashing actually improving anything depends on how 'broken' the
>
> individual
>
>>>GPU is?
>>
>>Don't believe everything you hear. The x800pro CANNOT be
>>turned into a x800xt by the so-called '16 pipe fix', or BIOS flash. I
>>tried it
>>on my x800pro, so I know what I'm talking about. Aside from adjusting
>>the clocks with ATI tool, however it runs
>>OOTB is the best it will ever run. Of course you must use ATI tool to
>>adjust the clocks so the card will run at full speed, they are
>>terribly underclocked OOTB.
>>
>>Jeff B
>>
>
>
> Actually is is known that ATI is in the progress of redoing the Opengl
> Dirvers. Betatesters have said this and also people on the Catalyst team.
> Ask around the rage3d forums for more info. Yes the x800 pro can be hacked
> and then have its bios flashed to be a x800xt, but it wont work on every
> card. You where just unclucky. When you hack it your enabling the extra 4
> pipelines. Most likley one or more of those extra 4 pipelines was defective,
> and that is why it didnt work for you. The lucky ones with the working 4
> pipelines got there hack to work fine. Its more of a 50/50 chance that the
> hack will work.
>
> Bean
>
>

Yes, because it only works on VIVO-equipped cards.
I thought that was common knowledge by now?
 

JB

Archived from groups: alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia (More info?)

>
> Yes, because it only works on VIVO-equipped cards.
> I thought that was common knowledge by now?

What do you mean, VIVO-equipped cards? All X800 Pros are the same,
right?

Jeff B
 
Guest
Archived from groups: alt.comp.periphs.videocards.ati (More info?)

In article <H30Mc.164496$Oq2.118493@attbi_s52>, fake@addy.com (JB) wrote:

> What do you mean, VIVO equiped cards? All x800pros are the same,
> right?

VIVO cards have an additional (Rage Theatre?) ATI chip and associated
stuff to allow capture etc., so no, they're not all the same.

Andrew McP
 
Guest
Archived from groups: alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia (More info?)

Uh, no, you have proof that you couldn't do it, either because of an unlucky
card or a lack of knowledge.

Mike

"JB" <fake@addy.com> wrote in message news:920Mc.7939$eM2.5675@attbi_s51...
>
> > Yes the X800 Pro can be hacked and then have its BIOS flashed to be an
> > X800 XT, but it won't work on every card.
>
> Believe what you want, I have proof that the hack doesn't work.
>
> Jeff B
>
 

JB

Archived from groups: alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia (More info?)

Mike P wrote:

> Uh, no, you have proof that you couldn't do it, either because of an unlucky
> card or a lack of knowledge.
>
> Mike
>

No, you are going by "some guy said". I actually did the mod;
you didn't. I have hands-on experience, you have nothing.
Therefore, by definition, I'm right and you're wrong.

Jeff B
 

JB

Archived from groups: alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia (More info?)

Mike P wrote:

> Uh, no, you have proof that you couldn't do it,
>
> Mike

"Jeff, you're wrong because I say you are wrong, and for no other reason".

As long as you keep commenting from a position of
"I have no experience in this matter" and have absolutely
NO knowledge, and can only parrot what others have said,
I will continue to correct you.

So stop falsely accusing me of being a screwup, or don't and
this thread will last forever.

Jeff B
 

minotaur

Archived from groups: alt.comp.periphs.videocards.ati (More info?)

Andrew MacPherson wrote:
> In article <H30Mc.164496$Oq2.118493@attbi_s52>, fake@addy.com (JB) wrote:
>
>>What do you mean, VIVO-equipped cards? All X800 Pros are the same,
>>right?
>
> VIVO cards have an additional (Rage Theatre?) ATI chip and associated
> stuff to allow capture etc., so no, they're not all the same.
>
> Andrew McP

Correct.

From what I have read everywhere, they are the same as the XT PCB,
but have 4 pipes disabled in the BIOS; flashing it is supposed to make
it perform like an XT. Going from the posts I have seen, most have been
100% successful, but I did spot one where a guy had trouble raising his
GPU speed.

I was going to go with one of those VIVO cards or an nVidia GT :|
Tax return time, so I'm deciding in the next few weeks! :) & the GT is ahead
so far in my choice :|

Minotaur *8)
 
Guest
Archived from groups: alt.comp.periphs.videocards.ati (More info?)

The V5's FSAA methods were employed in the GF3 and higher cards. GF3's 2x
FSAA uses rotated-grid multi-sampling just like the V5, while 4x FSAA uses
ordered-grid.

Do you have an AthlonXP/64 or Pentium4? Did the P4's vastly superior Quake
III engine scores play a part in your decision?

--
"War is the continuation of politics by other means.
It can therefore be said that politics is war without
bloodshed while war is politics with bloodshed."


"Andrew MacPherson" <andrew.mcp@DELETETHISdsl.pipex.com> wrote in message
news:memo.20040723013918.3452D@address_disguised.address_disguised...
> If I wasn't already on board the ATI bus I might well have been swayed by
> the Nvidia benchmarks, but I also haven't forgiven them for buying 3dfx
> and not using their vastly superior FSAA methods (slow or not). The V5
> still has a soft spot in my heart... just next to the empty place in my
> wallet ;-)
>
> Andrew McP
 

minotaur

Archived from groups: alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia (More info?)

JB wrote:

>
>> Yes, because it only works on VIVO-equipped cards.
>> I thought that was common knowledge by now?
>
> What do you mean, VIVO-equipped cards? All X800 Pros are the same,
> right?
>
> Jeff B
>

VIVO
Video In Video Out

The Pros don't have the Video In
 

JB

Archived from groups: alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia (More info?)

> VIVO
> Video In Video Out
>
> The Pros don't have the Video In

So, since the Pros don't have the video in, and the mod works
only on cards with video in, the mod doesn't work on Pros, which is
exactly what I proved with my experiment. Thanks for clearing that up.

Jeff B
 

Bean

Archived from groups: alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia (More info?)

"JB" <fake@addy.com> wrote in message
news:aZtMc.19970$eM2.17509@attbi_s51...
>
> > VIVO
> > Video In Video Out
> >
> > The Pros don't have the Video In
>
> So, since the Pros don't have the video in, and the mod works
> only on cards with video in, the mod doesn't work on Pros, which is
> exactly what I proved with my experiment. Thanks for clearing that up.
>
> Jeff B
>

There are Pros with VIVO.
 
Guest
Archived from groups: alt.comp.periphs.videocards.ati (More info?)

In article <%bzMc.2467$03a.1475@news04.bloor.is.net.cable.rogers.com>,
daxinfx@yahoo.com (First of One) wrote:

> The V5's FSAA methods were employed in the GF3 and higher cards.

Really? I had a GF3 and GF4200 and the FSAA on both was poor compared to
the V5. I did side by side visual comparisons with 2 PCs and the V5's x4
mode really was much better. Slow, but better.

The 9700Pro was a big improvement over Nvidia (to my eyes anyway), though
even with the x800 I'm disappointed that there's *still* no option to do
proper FSAA... some elements of the drawn picture still don't get AA'd.
That's a big flaw in the whole DX/FSAA thing at the moment.

> Did the P4's vastly superior Quake III engine scores play a part
> in your decision?

I know nothing about Q4 apart from the fact it's in the pipeline.

Andrew McP
 
Guest
Archived from groups: alt.comp.periphs.videocards.ati (More info?)

I was talking about Q3. The P4 is faster than the AXP in Q3, and in all
Q3-engined games, yet it didn't prevent a lot of gamers from buying AMD
parts. The point is, Doom III is just a single game, and in the words of
John Carmack, "innocuous code changes can 'fall off the fast path' and cause
significant performance impacts, especially on NV30 class cards." It's not
wise to go for a 6800GT over the X800Pro just based on D3 scores, because
it's easy to turn the tables when it comes to other games in the future.

--
"War is the continuation of politics by other means.
It can therefore be said that politics is war without
bloodshed while war is politics with bloodshed."


"Andrew MacPherson" <andrew.mcp@DELETETHISdsl.pipex.com> wrote in message
news:memo.20040724213042.2792B@address_disguised.address_disguised...
> > Did the P4's vastly superior Quake III engine scores play a part
> > in your decision?
>
> I know nothing about Q4 apart from the fact it's in the pipeline.
>
> Andrew McP
 
Guest
Archived from groups: alt.comp.periphs.videocards.ati (More info?)

"Andrew MacPherson" <andrew.mcp@DELETETHISdsl.pipex.com> wrote in message
news:memo.20040724213042.2792B@address_disguised.address_disguised...
> Really? I had a GF3 and GF4200 and the FSAA on both was poor compared to
> the V5. I did side by side visual comparisons with 2 PCs and the V5's x4
> mode really was much better. Slow, but better.

Have you compared the 2xFSAA modes? Like I said in a previous post, the
GF3's 2x mode uses a rotated-grid, just like the V5. The GF3's 4x mode uses
ordered grid, while the V5 uses rotated-grid on 4x. Rotated-grid looks
better for lines at most angles.
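
To see why the grid layout matters, here is a small illustrative Python
sketch. The sample offsets below are textbook examples of ordered-grid vs.
rotated-grid 4x patterns, not the exact positions the V5 or GF3 hardware
uses; the point is how many coverage levels a nearly vertical edge can show.

  # Illustrative 4x sample positions inside one pixel (0..1 on each axis).
  # These offsets are textbook examples, not the exact hardware patterns.
  ordered_grid = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]
  rotated_grid = [(0.375, 0.125), (0.875, 0.375), (0.625, 0.875), (0.125, 0.625)]

  def edge_shades(samples):
      # A near-vertical edge sweeping across the pixel covers samples one
      # distinct x-coordinate at a time, so the number of distinct x values
      # is the number of coverage levels the edge can produce.
      return len({x for x, _ in samples})

  print("ordered grid:", edge_shades(ordered_grid), "levels")  # 2
  print("rotated grid:", edge_shades(rotated_grid), "levels")  # 4

Four levels instead of two at the same sample count is why rotated-grid looks
noticeably smoother on near-horizontal and near-vertical edges.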

Then there's the additional Quincunx mode which is blurry as hell.

--
"War is the continuation of politics by other means.
It can therefore be said that politics is war without
bloodshed while war is politics with bloodshed."
 
Guest
Archived from groups: alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia (More info?)

On Fri, 23 Jul 2004 04:13:39 GMT, JB <fake@addy.com> wrote:

>
>> Actually, it is known that ATI is in the process of redoing the OpenGL
>> drivers. Beta testers have said this, and so have people on the Catalyst team.
>
>So what are you saying, that Doom3 etc. is about to suddenly run great on ATI
>hardware? So the benchmarks run by id Software mean nothing??
>LOL! If ATI knew how to fix the problem, they would have done so
>long ago. To think otherwise is to set yourself up for a big
>disappointment.

I am a proud owner of the 9800Pro... but OpenGL has always been the
weakness of the ATI line... which is not a problem when Quake3 is
running at 200+ fps... They had access to Doom3 for months...

What I've seen so far from BETA enhancements is about 5~8 fps faster...
which is still NOTHING when the 6800GT is 65 fps and the X800Pro is
40 fps!! (Important note: unless your CPU is an AMD64 or P4-EE, none
of this MATTERS, as your CPU is the bottleneck to the GPU.)
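
A crude way to picture that bottleneck, as a minimal Python sketch: the frame
rate you actually see is roughly capped by whichever of the CPU or GPU is
slower for the scene. The GPU figures are the rough Doom3 numbers quoted
above; the CPU limits are hypothetical, chosen only to show the effect.

  # Crude bottleneck model: observed fps ~= min(CPU-limited, GPU-limited).
  # GPU figures are the rough Doom3 numbers quoted above; the CPU limits
  # are hypothetical, chosen only to illustrate the effect.
  def effective_fps(cpu_limit, gpu_limit):
      return min(cpu_limit, gpu_limit)

  slow_cpu = 45   # assumed fps limit on an older/midrange CPU
  fast_cpu = 90   # assumed fps limit on a high-end AMD64 / P4-EE

  for gpu_name, gpu_fps in [("X800Pro", 40), ("6800GT", 65)]:
      print(f"{gpu_name}: slow CPU -> {effective_fps(slow_cpu, gpu_fps)} fps, "
            f"fast CPU -> {effective_fps(fast_cpu, gpu_fps)} fps")

On the slow CPU the gap shrinks from 25 fps to about 5 fps, so the 6800GT's
advantage only really shows once the CPU stops being the limit.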

It's a problem when the $300 6800 is up there with the $500 ATI XT
version!!

I hope ATI does better on the next round... more competition!



- - - - -
Remember: In the USA - it is dangerous to draw or write about Herr Bush in a negative way. The police or SS are called, people threaten to kill you. (What country is this again?)

- Fahrenheit 9/11 - Unless you see it for yourself, don't call it "a bunch of lies"... that would be un-American.
- White House blows cover of an undercover agent because her husband said there were no WMD (before the USA started the war) - her job was finding terrorists.
God bless the land of the free. Where you can burn the Constitution... Ashcroft does it every day.
 
Guest
Archived from groups: alt.comp.periphs.videocards.nvidia (More info?)

On Thu, 26 Aug 2004 18:33:07 GMT "Enormous Genitals" <big@dick.com> meeped :

>That, and the fact that 98% of the PC games on the market use D3D instead of
>OpenGL...


I wonder what the ratio is when you look at online players.

With RTCW, Enemy Territory, Quake 3, Call of Duty, and optionally Half-Life
mods... OpenGL has a stronger online presence than it may appear at first glance.
 

Andrew

Archived from groups: alt.comp.periphs.videocards.ati,alt.comp.periphs.videocards.nvidia (More info?)

On Thu, 02 Sep 2004 00:14:00 GMT, "Mike P" <mike@nbnet.nb.ca> wrote:

>I think he was referring to d3d performance with the hl2 engine

I don't believe he was.

>, and they are close.
>Not that I'm an nvidia fan, but facts are facts.

Not if you look at the CS:Source benchmarks on Firingsquad.
--
Andrew, contact via interpleb.blogspot.com
Help make Usenet a better place: English is read downwards,
please don't top post. Trim replies to quote only relevant text.
Check groups.google.com before asking an obvious question.
 
