
Remove all GF-FX cards from the buyers' guide?

Last response: in Graphics & Displays
September 12, 2003 1:13:16 AM

After looking at HL2 benchmarks, I think we should think about removing all GF-FX cards from our buyers' guide.

What's your opinion?

----------------
<b><A HREF="http://geocities.com/spitfire_x86" target="_new">My Website</A></b>

<b><A HREF="http://geocities.com/spitfire_x86/myrig.html" target="_new">My Rig & 3DMark score</A></b>
September 12, 2003 1:20:58 AM

Actually, I was wondering if those "optimizations" apply to the Ti4200 or not?

If the "optimizations" don't apply to the benchmarks we see, then the Ti4200 still looks good for a few weeks, but it also seems to be disappearing off the retail outlet shelves where I reside. I can only find one, marked "End Of Line". Or was that Out of Luck (hehe).

What's the new low-end recommendation? A barebones 9600 Pro? I guess they are about $150US right now. The very top of your low-end price? Maybe a 9000 Pro at $60US?

Mid-range cards: a 9600 Pro with dressings?

The loving are the daring!<P ID="edit"><FONT SIZE=-1><EM>Edited by Flinx on 09/11/03 09:32 PM.</EM></FONT></P>
September 12, 2003 1:23:04 AM

Wait for the results to be verified

While I don't think nvidia are going to be able to legitimately increase the performance that much just with driver tweaking,

if you change the buyers guide now it's going to look like you're jumping the gun a bit.

At least wait until we are certain this is the final performance for the cards (this really should only take a few days anyway, since most of the hardware review sites have the HL2 benchmark and the Det 50 drivers).

If this is the final performance, then definitely remove them.
September 12, 2003 1:40:09 AM

Myself, I agree completely. We should not be encouraging/endorsing Nvidia's cheating in any way. It's pretty much a given that the new Det. 50s will be laced with cheats and other such stuff, and we should not encourage people to buy such POS. Keep up to the Ti 4800, but exclude the FX line.

As each day goes by, I hug my 9600Pro just a little tighter.
September 12, 2003 1:41:55 AM

Remove them ASAP. Keep the Ti4200 on the recommended budget cards list, considering that the Ti4600 was almost as fast as a 5900 Ultra.

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
September 12, 2003 1:49:16 AM

The only choice so far for me is getting a 9600 Pro or 9800 Pro.

<font color=green>If your nose <b>RUNS</b>, and feet <b>SMELLS</b>.
Then you must be born <b>UP-SIDE DOWN</b>.</font color=green>
September 12, 2003 1:51:16 AM

The design of the NV3x graphics chips sucks at first impression.
Remember the Dustbuster.
GeForce 4 is so far the most mature chipset yet.
GeForce 3 is the second.

<font color=green>If your nose <b>RUNS</b>, and feet <b>SMELLS</b>.
Then you must be born <b>UP-SIDE DOWN</b>.</font color=green>
September 12, 2003 2:23:40 AM

The new low end should probably be the 9600 non-Pro, if the prices differ greatly from the Pro version. The low-end market covers a very wide range of products. It's critical that we have something decent to suggest for sub-$100 buyers. The $100-$150 range is almost a completely separate category from the sub-$100 market. It's really tempting to offer the Radeon 9600 to the low end. With the GeForce FX scores where they are, the Radeon 9600 or 9500 is a MUCH better value and a MUCH better product than the GeForce 5900 Ultra. I know it doesn't make any sense; I'm so bewildered by the fact that cheapo components can completely blow away Nvidia's "bells & whistles" top-notch offering.

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
September 12, 2003 2:57:52 AM

Having seen how the GFFX performs with 3DMark03, Half-Life 2, Halo and the new Tomb Raider game, I don't think anyone can safely recommend it to anyone. Even if it does turn out to be fine with new drivers or whatever, I certainly wouldn't want to risk advising someone to blow money on it right now. I say play it safe and take it down - I don't think the ATi cards are going to disappoint, even if nVidia manage a miracle and pull their technology out of the fire in the future.
September 12, 2003 12:38:11 PM

I would say no, but perhaps make a note of it somewhere in the guide... Also, I'd wait to see what the det 50s actually do once they come out, IQ and speed wise... I'd really like to see performance numbers on SWG/EQ2/AC2... shrug.

Shadus<P ID="edit"><FONT SIZE=-1><EM>Edited by shadus on 09/12/03 08:40 AM.</EM></FONT></P>
September 12, 2003 1:11:59 PM

Spitfire, it would be very easy to edit your post and make a poll out of this. But if you don't want to, or you have reasons for doing otherwise I guess that's fine too.

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
September 12, 2003 1:19:58 PM

But for the time being, the guide is very popular and serves video customers on a daily basis. It might take a few weeks before Detonator 50 is fully benched and thoroughly tested for image quality. I say we should have considered removing them when the Tomb Raider benchmarks came out. Let's not take any chances. If we want this community to be reputable, we have to make sure we recommend something we KNOW works well. Otherwise, taking a risk, crossing our fingers, and holding out might make the THGC Graphics community suffer and lose its currently good rapport. It's very possible that "Cheaptimizer 50" MIGHT fix things up, but should we really take the chance?

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
September 12, 2003 1:25:22 PM

It's only going to get worse as more DX9 titles come out that support all the features properly. The FX line is going to suffer more and more, and if people are not informed about its true performance, then they will be buying hardware that isn't up to the high standard we all expect from our £300+ cards....

<A HREF="http://service.futuremark.com/compare?2k1=6988331" target="_new"> 3D-2001 </A>
<A HREF="http://service.futuremark.com/compare?2k3=1283170 " target="_new"> 3D-03 </A>
<font color=red> 120% overclocker </font color=red> (cheapskate)
September 12, 2003 1:51:35 PM

:redface: I've never spent more than $100 for a video card before. :redface:

Agreed: <i><b><font color=green>THE MASSES MUST BE INFORMED!</i></b></font color=green>

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
September 12, 2003 2:07:03 PM

If it's an unbiased guide, you don't really have a choice... all video cards should be present.

But by the same token, the guide is a living document and should be updated frequently with information. Whether or not the FXs should be recommended as highly should be reconsidered.

If the Det 50s give decent performance... even at a quality loss... then this information should be taken into account. Perhaps some people don't care as much about IQ as they do about framerates, so maybe the Det 50s can deliver that to them. That's not my opinion, but that information should be delivered in the guide.

Keep up the good work,


------------------
Radeon 9500 (modded to PRO w/8 pixel pipelines)
AMD AthlonXP 2000+
3dMark03: 3529
September 12, 2003 2:26:22 PM

Yes, they should be listed in the guide. Slow DX9 support should be listed in the cons for all the FX cards. As far as recommending them goes, that's a whole 'nother ball park. I recommend any of the ATI R300-based chips for most market segments. For the sake of nVidia owners who purchased FX boards a few months ago and can't take them back, the lost IQ in the Cheaptimizer 50 drivers might be worth the performance gain, depending on how big the gain is and how much IQ is lost. If a small difference in IQ means leaps and bounds in performance, that'll be the only worthwhile route current nVidia owners have. Sacrificing IQ to make a game just playable for customers who were ripped off is a little different from sacrificing IQ to gain a lead at the top of the charts, as was the case in UT2K3 and 3DMark.

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
September 12, 2003 3:49:54 PM

Keep them, but with a clear warning that they do not perform well in DX9.

I don't think any FX card is a good buy right now, because even if nVidia boost the performance in Detonator 50, these boosts will probably not apply to all games. It seems that Det 50 will have "optimisations" for the HL2 engine. But will those optimisations affect all the other games? If not, the Det 50s will be worthless for people who don't play games that benefit from these optimisations.

Right now, the FX cards are all overpriced. If they cut prices on them, they might be an interesting alternative in the low/mid market.

But not in HIGH-END, because if you have the money to spend on the best GPU, the only choice is the ATI Radeon 9800 PRO, nothing else!

I hope all the opinions in this thread will improve the buyer's guide!

--
Would you buy a GPS enabled soap bar?
September 12, 2003 4:19:44 PM

If you keep them, I'd put something at the bottom of each FX card that goes something like this: "Card not recommended due to extremely poor DX9 performance. Newest drivers use cheats to increase framerates... yadda yadda and such". Make sure the person looking at this understands very clearly what the situation is.

As each day goes by, I hug my 9600Pro just a little tighter.
September 12, 2003 4:47:04 PM

Just because they're crap doesn't mean they have to be completely removed from the guide. Remember that HL2 isn't the only game out there, and that other games will run much better with nvidia cards.

Try not to base the guide on just the newest game out, but look at how each card performs in all games. Anyway, for all we know the Det 50s may make a MASSIVE improvement in fps (although if they did it would look [-peep-] lol, nvidia's idea of optimisation: cut the detail).

<A HREF="http://service.futuremark.com/compare?2k1=6752830" target="_new">Yay, I Finally broke the 12k barrier!!</A>
September 12, 2003 4:49:52 PM

We're not talking removed from the guide. We're talking removed from the "recommended list".

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
September 12, 2003 4:55:08 PM

The guide should make mention of the poor DX9 performance, the good DX8 performance, and the better Doom ]|[ performance (which runs on OpenGL). However, several others here have mentioned that Carmack is busting his arse to incorporate the Nvidia vendor-specific extensions into the game, a hefty task that most programmers wouldn't bother with. Therefore, I don't really see Doom ]|[ as representative of overall OpenGL performance when compared to ATi's stuff.

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
September 12, 2003 6:20:19 PM

Quote:
As each day goes by, I hug my 9600Pro just a little tighter.

You are so cute! :wink:

--
Would you buy a GPS enabled soap bar?
September 12, 2003 7:26:09 PM

Like I said, make a note of it saying that we're waiting to find out what exactly is going on, but I would leave things as they presently stand, with just an addendum. If someone can't pay attention to a guide that short plus a note or two with it, they deserve the 5200 non-ultra or 420mx they get...

Shadus
September 12, 2003 7:33:55 PM

> Therefore, I don't really see DOOM ]|[ as a
> representative of overall OpenGL performance
> when compared to ATi's stuff.

Thing is, most games are ENGINE based; they aren't coded from the ground up. If the D3 engine is optimized for NV extremely well, then anything using that engine will be fine with NV. There are only a few major engines in the industry. If more optimize well for NV than for ATI, it will probably perform better; if they go the more standard DX9 route or optimize more for ATI, ATI will do better.

Basically this whole thing to me is a bunch of crap. Quite literally. Remember when SSE was introduced in processors? Toasted when code wasn't compiled for it, won pretty handily when it was. I think nvidia was trying basically the same thing... and it backfired. Then they tried to cover up with cheats... and got caught. Shrug.


Shadus
September 12, 2003 7:58:18 PM

That's true about the engine, too. id will license the Doom 3 engine to many companies to build games. I'm rather disappointed by all this, because it looks like the Source engine won't get all the attention from the mod community. Would the mod community develop a mod like CS2 or DOD2 or TF2 that will only run well on ATi hardware? It's not Valve's fault that id's Carmack worked countless hours optimizing the engine to use nvidia's vendor-specific extension paths instead of the standard GL drivers. I was really hoping the Source engine would get the attention of the mod community just like the original Half-Life did. I still have hope, though.

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
September 12, 2003 10:38:10 PM

More time. I'd wait until at least NV gets their Detonator 50s out to the public.

I think the whole thing smells; funny how NV got their rel. 50s out to reviewers already, but Valve insists that they not be used for testing.
Regardless of what anyone says, I think there's some kind of ATI/Valve connection at work here.

But it should be noted in the guide how important shader performance is for DX9 games.

When the 9600 Pro is beating a 5900 Ultra in pure DX9 mode, you know that NV's design is lackluster, yet not flawed.
My vote is yes, remove them from the recommended list, but only after the rel. 50s, and only if those turn out to be a complete joke or cost a noticeable in-game loss of IQ. With the last round of NV 'unfair modifications', I feel that if you need Photoshop to tell the difference, the complaint really isn't valid.
If they can get performance up with no noticeable difference in IQ, give 'em their due! :smile:

Athlon 1700+, Epox 8RDA (NForce2), Maxtor Diamondmax Plus 9 80GB 8MB cache, 2x256mb Crucial PC2100 in Dual DDR, Geforce 3, Audigy, Z560s, MX500
September 13, 2003 12:49:40 AM

When should Cheaptimizer 50 hit the public?

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
September 13, 2003 1:34:20 AM

Without consulting me? Tsk tsk tsk. Just because I'm extremely depressed cause I got a defective P4 and a video card on back order doesn't mean I can't make decisions. I'm disappointed :D

j/k man~~~

Once the Det 50s come out, I'll revamp the entire guide if necessary. Spitfire, this is what u need to do... in big writing on top of the FAQ:

DUE TO THE DX9 PROBLEMS NVIDIA IS FACING, THE BUYING GUIDE IS NO LONGER ACCURATE. PLEASE WAIT UNTIL THERE IS CONFIRMATION ON THE PERFORMANCE OF THE GEFORCE FX SERIES. THANK YOU AND COME AGAIN ;)

Proud Owner the Block Heater
120% nVidia Fanboy
PROUD OWNER OF THE GEFORCE FX 5900ULTRA <-- I wish this was me
I'd get a nVidia GeForce FX 5900Ultra... if THEY WOULD CHANGE THAT #()#@ HSF
September 13, 2003 1:40:04 AM

I'm one minute late -_-

Do as I say lolz



Proud Owner the Block Heater
120% nVidia Fanboy
PROUD OWNER OF THE GEFORCE FX 5900ULTRA <-- I wish this was me
I'd get a nVidia GeForce FX 5900Ultra... if THEY WOULD CHANGE THAT #()#@ HSF
September 13, 2003 3:33:21 AM

THANK YOU Spitfire_x86 & CoolSquirtle!

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
September 13, 2003 3:41:44 AM

Quote:
some kind of ATI/Valve connection

Um, yeah, they are bundling their game with OEM cards; logically there is a connection.

Quote:
Regardless of what anyone says, I think theres some kind of ATI/Valve connection working out here.


Gabe Newell said:
Quote:
Ask Microsoft if they think we have cooked these numbers or failed to invest in optimizations for all hardware.


So now I tell you, ASK MICROSOFT!

--
<A HREF="http://www.lochel.com/THGC/album.html" target="_new"><font color=blue><b>Are you ugly and looking into showing your mug? Then the THGC Album is the right place for you!</b></font color=blue></A>
September 13, 2003 6:02:20 AM

Quote:
So now I tell you, ASK MICROSOFT!


Should I ask Microsoft why Valve STRONGLY did not want the Det. 50s to be used by the media for testing?
:tongue:

I don't think any #s were cooked n booked. Please refer to my above statement on the strange circumstances involving these benchmarks.

Athlon 1700+, Epox 8RDA (NForce2), Maxtor Diamondmax Plus 9 80GB 8MB cache, 2x256mb Crucial PC2100 in Dual DDR, Geforce 3, Audigy, Z560s, MX500
September 13, 2003 6:07:04 AM

LOL.
NV said that before HL2 is released, they will be publicly available.

I hope you guys don't get too silly acting on this thing. You're going to look goofy when NV returns.

They are a proud company, and these internal mistakes are probably going to make a grade-A company like NV come back incredibly strong.

They're probably the last company I'd hedge my future bets against.

Athlon 1700+, Epox 8RDA (NForce2), Maxtor Diamondmax Plus 9 80GB 8MB cache, 2x256mb Crucial PC2100 in Dual DDR, Geforce 3, Audigy, Z560s, MX500
September 13, 2003 6:11:16 AM

I hope that very little IQ is lost if there are tremendous boosts in performance in Cheaptimizer 50. If Cheaptimizer 50 can't save nVidia in this product line, then probably nothing can.

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
September 13, 2003 6:18:52 AM

Unfortunately, I agree.
You know every trick they have is going to be on the table now.

/sadface


On the brighter side, what if NV drops prices way below ATI's?
The line would be killer if the price was right.

/happyface

:tongue:

Athlon 1700+, Epox 8RDA (NForce2), Maxtor Diamondmax Plus 9 80GB 8MB cache, 2x256mb Crucial PC2100 in Dual DDR, Radeon 9800NP, Audigy, Z560s, MX500
September 13, 2003 6:22:53 AM

That's silly considering the timeframe in which the Det 50s are going to be available.

But whatever, it's your guys' buyer's guide!

Athlon 1700+, Epox 8RDA (NForce2), Maxtor Diamondmax Plus 9 80GB 8MB cache, 2x256mb Crucial PC2100 in Dual DDR, Radeon 9800NP, Audigy, Z560s, MX500
September 13, 2003 6:24:13 AM

Yes, as an uber-fast DX8 part, the 5900 Ultra should sell well in the $100 - $130 range. That'd be a reasonable price, don't you think, considering how the 9600 Pro blows the 5900 Ultra out of the water.

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
September 13, 2003 6:41:27 AM

Quote:
Should I ask microsoft why Valve STRONGLY did not want the det.50s to be used by the media for testing?

Actually, a recently released statement by Gabe affirmed that these drivers removed fog effects and supposedly had (extremely) unfair optimizations. Removing effects warrants denial of usage any day for me!

--
<A HREF="http://www.lochel.com/THGC/album.html" target="_new"><font color=blue><b>Are you ugly and looking into showing your mug? Then the THGC Album is the right place for you!</b></font color=blue></A>
September 13, 2003 6:44:24 AM

Kinney, I think Eden's got a point there this time.

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
September 13, 2003 7:10:46 AM

Haha, I knew you guys couldn't resist getting carried away..
At $100, the 5900 Ultra would make even blinking before purchase the dumbest mistake of your life.

Before you guys swagger away with ATI pride... I'm interested in seeing how the sequel to the #1 most influential game in the history of the PC, Doom 3, treats your ATI 9600 Pros.

NV is probably virtually in bed with Carmack by now.
And I'm hoping my 9800NP isn't as deflated on D3's release as the NV cards are with HL2.

/shrug

Athlon 1700+, Epox 8RDA (NForce2), Maxtor Diamondmax Plus 9 80GB 8MB cache, 2x256mb Crucial PC2100 in Dual DDR, Radeon 9800NP, Audigy, Z560s, MX500
September 13, 2003 7:12:47 AM

Oh, well then that's pretty crappy of NV.
Edit - It's not so much the magnificent point he made; he just read a bit of news I missed. So the score's still Kinney 2, Eden 0.
Just playin', Eden. :tongue:

I'm sure the public release will have fog enabled. It's pretty hard to optimize fog; what the hell are they going to do?
Dither it? LOL.

Athlon 1700+, Epox 8RDA (NForce2), Maxtor Diamondmax Plus 9 80GB 8MB cache, 2x256mb Crucial PC2100 in Dual DDR, Radeon 9800NP, Audigy, Z560s, MX500<P ID="edit"><FONT SIZE=-1><EM>Edited by kinney on 09/13/03 02:16 AM.</EM></FONT></P>
September 13, 2003 7:27:01 AM

So, you finally got your 2nd 9800 Non-pro ordered. Excellent!

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
September 13, 2003 12:49:47 PM

It seems to me that if HL2 doesn't turn out well for NV after the new drivers are out, how well Doom III plays may be a moot point. Between what we enthusiasts buy, and perhaps more importantly what we/tech sites recommend to the people we know, the FX line could be in serious trouble. Even if everything works out OK, I'll still be annoyed with NV for delaying my games. HL2 should be in my hands right now if it wasn't for all this FX mumbo jumbo, and Carmack is in bed with NV because he wants the game to run on as many computers as possible; he currently doesn't have to get in bed with ATI to get that. Come to think of it, if NV didn't require all that extra work, Doom III would be in my hands as well. Damn them again.
September 13, 2003 2:52:37 PM

Last night, I was considering the same possibility, that the nVidia troubles were at least partly responsible for the delay of the upcoming games. I would love to see HL2 & Doom 3 in my own hands as well. :frown:

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
September 13, 2003 9:21:06 PM

Teehee,
Quote:
NV is probably virtually in bed with Carmack by now.
And I'm hoping my 9800NP isnt deflated on D3 release as the NV are with HL2.

I found another real fact I can prove you wrong on, this one too... *jumps around on my toes, lalalal*

The fact is that nVidia provided the Doom III testbed, and ATi publicly stated they were never told about this, so they couldn't at least prepare their cards for it (though I must wonder why they should; I thought game-specific optimization isn't a good thing, though I am sure NV did it), while nVidia had every single chance on that system to rig it up for their cards.
These D3 benches are absolutely grains of sand in the Egyptian desert! They hold no meaning. When THG tests Doom III with their own system, with true public drivers, chances are things are gonna change. PLUS there's the fact that Carmack is optimizing for both, not just NV. He is making sure each gets the best path. But I bet it also takes him 5 times longer to code for NV cards.
So really, anything goes. And chances are, Doom III will play at 16-bit precision (with visual banding) on FX cards, hence boosting them.

--
<A HREF="http://www.lochel.com/THGC/album.html" target="_new"><font color=blue><b>Are you ugly and looking into showing your mug? Then the THGC Album is the right place for you!</b></font color=blue></A>
September 14, 2003 3:43:07 AM

$100-130??? WAY overboard; not EVERYTHING revolves around Half-Life 2. I personally am an ATI fan and would probably never go nvidia, but I guess I'll wait and see. It's not like any new drivers or whatever will influence any purchases I make anyway.

Oh, and anyone who says Valve and ATI have a deal is a dumba$$. Valve is a business, and they aren't going to intentionally prevent the majority of gamers from being able to use their product.
September 14, 2003 4:12:42 AM

But it's more than just Half-Life 2 now. Half-Life 2 is a very non-biased, representative example (as is Tomb Raider) of how DirectX 9 games are going to run on these cards. The only reason Nvidia has done so well with Doom 3 is that John Carmack uses specialized, proprietary vendor-specific code paths for nVidia's instruction sets. Otherwise Doom 3 would run like crap on Nvidia cards. Carmack has spent countless hours tackling the chore of programming with these vendor-specific paths, which take far longer to work with than standard OpenGL. ATi can use the standard GL extensions and run like a champ. Carmack has also had to sacrifice some image quality for Nvidia cards, like lowering the floating-point precision to 12-bit, among other things. He has stated that Nvidia's FX line struggles with standard OpenGL. Gabe Newell has clearly demonstrated that Nvidia's FX cards can't do DX9 well at all. IMO the 5900 is inferior technology to the 9600 Pro. I didn't used to think this way, but now Nvidia's shortcomings have been revealed and the truth uncovered - R3xx squashes NV3x <b>ACROSS THE BOARD</b>. Nvidia used to have such good products in the past, too. It's really a shame that they would so blatantly set out to cheat their customers. I never imagined Nvidia would pull such a scandal. I don't trust them anymore. I'm not a fanboy either; I'm just representing the facts as I perceive them.



My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!