Remove all GF-FX cards from the buyers' guide?

Spitfire_x86

Splendid
Jun 26, 2002
7,248
0
25,780
After looking at the HL2 benchmarks, I think we should consider removing all GF-FX cards from our buyers' guide.

What's your opinion?

----------------
<b><A HREF="http://geocities.com/spitfire_x86" target="_new">My Website</A></b>

<b><A HREF="http://geocities.com/spitfire_x86/myrig.html" target="_new">My Rig & 3DMark score</A></b>
 

Flinx

Distinguished
Jun 8, 2001
1,910
0
19,780
Actually, I was wondering whether those "optimizations" apply to the Ti4200 or not?

If the "optimizations" don't apply to the benchmarks we see, then the Ti4200 still looks good for a few weeks, but it also seems to be disappearing off the retail shelves where I reside. I can only find one, marked "End Of Line". Or was that Out of Luck (hehe).

What's the new low-end recommendation? A barebones 9600 Pro? I guess they're about $150US right now. The very top of your low-end price? Maybe a 9000 Pro at $60US?

Mid-range cards: a 9600 Pro with dressings?

The loving are the daring!
Edited by Flinx on 09/11/03 09:32 PM.
 

Cassius105

Distinguished
Sep 6, 2003
25
0
18,530
Wait for the results to be verified.

While I don't think Nvidia are going to be able to legitimately increase the performance that much just with driver tweaking,

if you change the buyers' guide now it's going to look like you're jumping the gun a bit.

At least wait until we're certain this is the final performance for the cards (this really should only take a few days anyway, since most of the hardware review sites have the HL2 benchmark and the Det 50 drivers).

If this is the final performance, then definitely remove them.
 

sargeduck

Distinguished
Aug 27, 2002
407
0
18,780
Myself, I agree completely. We should not be encouraging/endorsing Nvidia's cheating in any way. It's pretty much a given that the new Det 50s will be laced with cheats and other such stuff, and we should not encourage people to buy such a POS. Keep cards up to the Ti4800, but exclude the FX line.

As each day goes by, I hug my 9600Pro just a little tighter.
 

ufo_warviper

Distinguished
Dec 30, 2001
3,033
0
20,780
Remove them ASAP. Keep the Ti4200 on the recommended budget cards list, considering that the Ti4600 was almost as fast as a 5900 Ultra.

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
 

jmecor

Distinguished
Jul 7, 2003
2,332
0
19,780
The only choice so far from me is getting a 9600 Pro or a 9800 Pro.

<font color=green>If your nose <b>RUNS</b>, and feet <b>SMELLS</b>.
Then you must be born <b>UP-SIDE DOWN</b>.</font color=green>
 

jmecor

Distinguished
Jul 7, 2003
2,332
0
19,780
The design of the NV3x graphics chips sucks at first impression.
Remember the Dustbuster.
The GeForce 4 is so far the most mature chipset yet.
The GeForce 3 is the second.

<font color=green>If your nose <b>RUNS</b>, and feet <b>SMELLS</b>.
Then you must be born <b>UP-SIDE DOWN</b>.</font color=green>
 

ufo_warviper

Distinguished
Dec 30, 2001
3,033
0
20,780
The new low end should probably be the 9600 non-Pro, if its price differs greatly from the Pro version. The low-end market covers a very wide range of products. It's critical that we have something decent to suggest for the sub-$100 buyers. The $100-$150 range is almost a completely different category from the sub-$100 market. It's really tempting to offer the Radeon 9600 to the low end. With the GeForce FX scores where they are, the Radeon 9600 or 9500 is a MUCH better value and a MUCH better product than the GeForce 5900 Ultra. I know it doesn't make any sense; I'm bewildered by the fact that cheapo components can completely blow away Nvidia's "bells & whistles" top-notch offering.

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
 

Spitfire_x86

Splendid
Jun 26, 2002
7,248
0
25,780
Though the current support for removing GF-FX cards is very strong, I'm waiting for more votes to make the final decision.

I think I'll make the final decision by tomorrow.

----------------
<b><A HREF="http://geocities.com/spitfire_x86" target="_new">My Website</A></b>

<b><A HREF="http://geocities.com/spitfire_x86/myrig.html" target="_new">My Rig & 3DMark score</A></b>
 

rain_king_uk

Distinguished
Nov 5, 2002
229
0
18,680
Having seen how GFFX performs with 3dmark03, Half-life 2, Halo and the new Tomb Raider game I don't think anyone can safely recommend it to anyone. Even if it does turn out to be fine with new drivers or whatever, I certainly wouldn't want to risk advising someone to blow money on it right now. I say play it safe, and take it down - I don't think the ATi cards are going to disappoint even if nVidia manage a miracle and pull their technology out of the fire in the future.
 

shadus

Distinguished
Apr 16, 2003
2,067
0
19,790
I would say no, but perhaps make a note of it somewhere in the guide... Also, I'd wait to see what the Det 50s actually do once they come out, IQ- and speed-wise... I'd really like to see performance numbers for SWG/EQ2/AC2... shrug.

Shadus
Edited by shadus on 09/12/03 08:40 AM.
 

ufo_warviper

Distinguished
Dec 30, 2001
3,033
0
20,780
Spitfire, it would be very easy to edit your post and make a poll out of this. But if you don't want to, or you have reasons for doing otherwise, I guess that's fine too.

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
 

ufo_warviper

Distinguished
Dec 30, 2001
3,033
0
20,780
But for the time being, the guide is very popular and serves video customers on a daily basis. It might take a few weeks before Detonator 50 is fully benched and thoroughly tested for image quality. I say we should have considered removing them when the Tomb Raider benchmarks came out. Let's not take any chances. If we want this community to be reputable, we have to make sure we recommend something we KNOW works well. Otherwise, taking a risk, crossing our fingers, and holding out might make the THGC Graphics community suffer and lose its currently good rapport. It's very possible that "Cheaptimizer 50" MIGHT fix things up, but should we really take the chance?

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
 

speeduk

Distinguished
Feb 20, 2003
1,476
0
19,280
It's only going to get worse as more DX9 titles come out that support all the features properly. The FX line is going to suffer more and more, and if people are not informed about its true performance, then they will be buying hardware that isn't up to the high standard we all expect from our £300+ cards....

<A HREF="http://service.futuremark.com/compare?2k1=6988331" target="_new"> 3D-2001 </A>
<A HREF="http://service.futuremark.com/compare?2k3=1283170 " target="_new"> 3D-03 </A>
<font color=red> 120% overclocker </font color=red> (cheapskate)
 

ufo_warviper

Distinguished
Dec 30, 2001
3,033
0
20,780
:redface: I've never spent more than $100 for a video card before. :redface:

Agreed: <i><b><font color=green>THE MASSES MUST BE INFORMED!</i></b></font color=green>

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
 

cleeve

Illustrious
If it's an unbiased guide, you don't really have a choice... all video cards should be present.

But by the same token, the guide is a living document and should be updated frequently with new information. Whether or not the FXs should be recommended as highly should be reconsidered.

If the Det 50s give decent performance... even at a quality loss... then this information should be taken into account. Perhaps some people don't care as much about IQ as they do about framerates, so maybe the Det 50s can deliver that to them. That's not my opinion, but that information should be delivered in the guide.

Keep up the good work,


------------------
Radeon 9500 (modded to PRO w/8 pixel pipelines)
AMD AthlonXP 2000+
3dMark03: 3529
 

ufo_warviper

Distinguished
Dec 30, 2001
3,033
0
20,780
Yes, they should be listed in the guide. Slow DX9 support should be listed in the cons for all the FX cards. As far as recommending them, that's a whole 'nother ball park. I recommend any of the ATI R300-based chips for most market segments. For the sake of nVidia owners who purchased FX boards a few months ago and can't take them back, the lost IQ in the "Cheaptimizer 50" drivers might be worth the performance gain, depending on how big the gain is and how much IQ is lost. If a little difference in IQ means leaps and bounds in performance, that'll be the only worthwhile route for current nVidia owners. Sacrificing IQ to make a game just playable for customers who were ripped off is a little different from sacrificing IQ to gain a lead at the top of the charts, as was the case in UT2K3 and 3DMark.

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
 

TheRod

Distinguished
Aug 2, 2002
2,031
0
19,780
Keep them, but with a clear warning that they do not perform well in DX9.

I don't think any FX card is a good buy right now, because even if nVidia boosts performance in Detonator 50, these boosts will probably not apply to all games. It seems that the Det 50s will have "optimisations" for the HL2 engine. But will those optimisations affect all the other games? If not, the Det 50s will be worthless for people who don't play the games that benefit from them.

Now, the FX cards are all overpriced. If they cut prices on them, they might be an interesting alternative in the low/mid market.

But not in HIGH-END, because if you have the money to spend on the best GPU, the only choice is the ATI Radeon 9800 PRO, nothing else!

I hope all the opinions in this thread will improve the buyer's guide!

--
Would you buy a GPS enabled soap bar?
 

sargeduck

Distinguished
Aug 27, 2002
407
0
18,780
If you keep them, I'd say put something at the bottom of each FX card that goes something like this: "Card not recommended due to extremely poor DX9 performance. Newest drivers use cheats to increase framerates... yadda yadda and such." Make sure the person looking at this understands very clearly what the situation is.

As each day goes by, I hug my 9600Pro just a little tighter.
 

tombance

Distinguished
Jun 16, 2002
1,412
0
19,280
Just because they're crap doesn't mean they have to be completely removed from the guide. Remember that HL2 isn't the only game out there, and that other games will run much better with nVidia cards.

Try not to base the guide on just the newest game out, but look at how each card performs in all games. Anyway, for all we know the Det 50s may make a MASSIVE improvement in fps (although if they did, it would look [-peep-] lol, nVidia's idea of optimisation: cut the detail).

<A HREF="http://service.futuremark.com/compare?2k1=6752830" target="_new">Yay, I Finally broke the 12k barrier!!</A>
 

ufo_warviper

Distinguished
Dec 30, 2001
3,033
0
20,780
We're not talking about removing them from the guide; we're talking about removing them from the "recommended list".

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
 

ufo_warviper

Distinguished
Dec 30, 2001
3,033
0
20,780
The guide should make mention of the poor DX9 performance, the good DX8 performance, and the better Doom ]|[ performance (that game runs on OpenGL). However, several others here have mentioned that Carmack is busting his arse to incorporate the nVidia vendor-specific extensions into the game, a hefty task that most programmers wouldn't bother with. Therefore, I don't really see Doom ]|[ as representative of overall OpenGL performance when compared to ATi's stuff.

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
 

shadus

Distinguished
Apr 16, 2003
2,067
0
19,790
Like I said, make a note of it, and that we're waiting to find out exactly what is going on, but I would leave things as they stand presently, with just an addendum. If someone can't pay attention to a guide that short and a note or two with it, they deserve the 5200 non-Ultra or MX 420 they get...

Shadus
 

shadus

Distinguished
Apr 16, 2003
2,067
0
19,790
> Therefore, I don't really see DOOM ]|[ as a
> representative of overall OpenGL performance
> when compared to ATi's stuff.

Thing is, most games are ENGINE-based; they aren't coded from the ground up. If the D3 engine is optimised extremely well for NV, then anything using that engine will be fine with NV. There are only a few major engines in the industry. If more of them optimise well for NV than for ATI, NV will probably perform better; if they go the more standard DX9 route or optimise for ATI, ATI will do better.

Basically this whole thing to me is a bunch of crap. Quite literally. Remember when SSE was introduced in processors? Toasted when code wasn't compiled for it, won pretty handily when it was. I think nVidia was trying basically the same thing... and it backfired. Then they tried to cover up with cheats... and got caught. Shrug.


Shadus