Geforce FX-The P4 of graphic cards?

January 28, 2003 8:36:02 PM

I read the article about the GeForce FX and I was very disappointed in the performance gain the FX had over the 9700 Pro. It ultimately proved only slightly better than the 9700 Pro, and its GPU has a whopping 175 MHz lead. Now this sounds very familiar. Sounds like the good ole AMD vs. Intel performance-per-clock-cycle debate. It seems to me that NVIDIA has gone the way of raw GPU power (like Intel) rather than architectural superiority (like AMD). But then again, if you take a look back you will see that when a new generation of technology emerges from a graphics company, driver problems and weaker performance usually come with it. Take the GeForce 3, for instance. When it was first released, it was actually being beaten in several benchmarks by its little brother, the GeForce 2 Ultra. I'm hoping for NVIDIA's sake that this is the case for the FX. If so, then we have yet to see what the FX can really do. Any comments?


January 28, 2003 8:52:35 PM

Comments? Like the P4 Willamette, it appears to be a dead end from the outset, but will likely contribute to a decent product in the future.

<font color=blue>You're posting in a forum with class. It may be third class, but it's still class!</font color=blue>
January 28, 2003 9:00:45 PM

nVidia has gone beyond performance and benchmarks; this should be followed by others.

Just next to the lab and the bunker you will find the marketing departement.
January 28, 2003 9:02:08 PM

What are you saying?

<font color=red>The solution may be obvious, but I can't see it for the smoke coming off my processor.</font color=red>
January 28, 2003 9:21:36 PM

I would say nVidia did indeed make a "P4". They concentrated too much on including fancy features and not enough on performance in modern software. They went above and beyond the call of duty for DX9: instead of 96-bit FP precision, 128-bit; instead of the minimal programmability, well beyond what the DX9 specification calls for. What does this mean for modern games? Absolutely nil. In the future, as software comes out that fully uses everything on the NV30, it will look better than it would on the 9700 Pro and programmers will be able to do more. However, the downside is that we'll probably never see any of these features see the light of day because, well, they are above the lowest common denominator, and you know how programmers feel about going above the lowest common denominator.

"We are Microsoft, resistance is futile." - Bill Gates, 2015.
January 28, 2003 9:26:12 PM

The way you pose it, NV did a smart thing. As long as the card runs current games well, it's good enough. Then, because of its features, it'll be able to last some time. People are just too nearsighted to recognize this, sadly.

"If everything seems under control, you're just not going fast enough."
- Mario Andretti
January 28, 2003 9:48:37 PM

What? It is loud, and when you crank up anti-aliasing and anisotropic filtering the 9700 Pro beats the FX in many benchmarks.

I think nVidia developed the FX and then realized that ATI had something really good in the 9700Pro. nVidia had no choice but to overclock the FX to get close to competing. This of course meant they had to incorporate the noisy cooling system.

<font color=red>The solution may be obvious, but I can't see it for the smoke coming off my processor.</font color=red>
January 28, 2003 11:33:20 PM

I equate gameplay to not only graphics, but also to sound. The FX will drown out my speakers.

<font color=red>
<A HREF="http://kevan.org/brain.cgi?dhlucke" target="_new">Introducing the NVIDIA GeForceFX: The first Videocard designed exclusively for deaf people!</A></font color=red>
January 29, 2003 12:16:59 AM

The problem is that you simply cannot do raw power with GPUs the way you can with CPUs.
They are not hyperpipelined like the P4 at all!
GPUs have been around for 3 years now, and in those years they have climbed to only 500 MHz, and at what, 0.13 micron! To say they can take this route is unrealistic, as you KNOW they cannot go on like this.
The NV30 went this way and is at a heat output that makes AMD feel like a cherry next to the tomato that nVidia is.

According to davepermen on the graphics board, he seems pretty confident the NV30's "new" features are not all that. The NV30 is also equipped very similarly to the R300, and in fact borrows some of its performance-enhancing features. Both are so on par technology-wise that for nVidia to gain much ground through any image-enhancing technique, bandwidth saving, or graphics-enhancing technology is not too realistic.
It will most likely be most successful as a Quadro FX, as its performance on nVidia's website seems promising in that professional segment.

Besides, why are we really discussing this here? We're only relating it to the Willamette and then beginning a graphics card discussion...

--
This post is brought to you by Eden, on a Via Eden, in the garden of Eden. :smile:
January 29, 2003 1:28:09 AM

Have fun with that idea; it appears it's all you care about, and not the actual features of the card. As Eden has wisely realized too, the FX holds much promise in the professional area.

"If everything seems under control, you're just not going fast enough."
- Mario Andretti
January 29, 2003 1:38:28 AM

Nobody is complaining that the card isn't fast or lacks features. Everybody is complaining about the noise and heat it generates. The card's features look nice and all, but the noise/heat it generates isn't so nice. That, and the card doesn't even blow away its nearest competitor. Something nice might evolve out of the card, but the product itself is bad. The Northwood evolved out of the Willy, but I don't think people were too happy with the Willy when it came out. When the bad factors outweigh the good factors, the card is bad.
January 29, 2003 1:48:11 AM

Well, I would agree that the GeForce FX is comparable to a P4 in theory, but in the real world there is one difference...

the p4 doesn't suck.

"There is no dark or light side, only power and those too weak to seek it."
January 29, 2003 2:17:13 AM

The professional area has nothing to do with it. This is a consumer website with consumer reviews. The fact is that the FX holds very similar features as the 9700 pro and very little performance gain in the consumer market.
January 29, 2003 2:22:50 AM

Oh damn, wonder what those reviews of the glx1 under Linux and the 980xgl were about?

"If everything seems under control, you're just not going fast enough."
- Mario Andretti
January 29, 2003 2:23:12 AM

Yes, comparing it to the P4 was just theoretical. I was simply stating that a 325 MHz ATI chip had practically the same performance as a 500 MHz NVIDIA chip. That's all. And you're right, I wasn't downplaying the P4; I was just stating that NVIDIA is obviously going a different route.
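The per-clock point above reduces to simple arithmetic. Here's a rough sketch; the fps figures are made up for illustration, and only the 325 MHz and 500 MHz core clocks correspond to the cards in this thread:

```python
# Hypothetical sketch of "performance per clock cycle".
# The fps numbers are invented; only the core clocks match the
# cards being discussed.

def perf_per_mhz(fps: float, core_mhz: float) -> float:
    """Frames per second delivered per MHz of core clock."""
    return fps / core_mhz

r300_efficiency = perf_per_mhz(fps=150.0, core_mhz=325.0)  # ~0.46 fps/MHz
nv30_efficiency = perf_per_mhz(fps=155.0, core_mhz=500.0)  # ~0.31 fps/MHz

# Near-identical fps despite a 175 MHz clock deficit means the
# slower-clocked chip does roughly half again as much work per cycle.
print(round(r300_efficiency / nv30_efficiency, 2))  # 1.49
```

With these made-up numbers, the lower-clocked chip is about 1.5x as efficient per MHz, which is the whole "P4 of graphics cards" argument in one line.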
January 29, 2003 1:27:19 PM

It is so poetic though... Nvidia starts using 3dfx technology and look where they are. The GFX is the Voodoo 5 of the GeForce series...

I'm sorry folks, I have been an Nvidia fan since they beat 3dfx back in the day with the GeForce DDR, but this new card is plain and simply pathetic. They need to go back to the drawing board.

"There is no dark or light side, only power and those too weak to seek it."
January 29, 2003 2:51:11 PM

On paper, the FX is a knockout. Getting it laid out on a PCB is another story. The same can be said for Parhelia.

Give it time to mature some; when it hits the street I'm sure it will give a significant performance increase. I expect Nvidia to hold back on total performance, like with the Det 4s when ATI released the 9700 Pro.

I can see Nvidia playing possum and showing a low hand to get ATI to set its sights lower on the release of the R350.

<b>"Granted I dont own a P4. But I read enough stuff and waste enough time on forums newsgroups IRC and computer news sites that I proberly know more then if I DID own a P4." -vk2amv</b>
January 29, 2003 3:20:37 PM

Quote:
NVIDIA has went the way of raw GPU power (like Intel) rather than architectural superiority (like AMD).

Sorry to tell you, but saying that the Athlon XP is architecturally superior to the P4 is fanboyism in its purest form and has nothing to do with objectivity.

This statement is total nonsense because one really can't compare current CPUs anymore the way it was possible with the K6 and older Pentiums. The reason a 1 GHz Itanium beats a 1.6 GHz Opteron without problems is not that it has a superior architecture (is there even such a thing as a superior architecture?), but simply that the Itanium has a different architecture than the Opteron. A different architecture with different methods to reach its goal(s)! But maybe the development process of the P4 applies to the GF FX (low performance at the initial release, but over time Intel tweaked it heavily). For the time being we don't know in which domains the NV30 core will have its advantages over other chips, or when... the card is too new to know that.

Considering the above, I don't share the opinion of others that a 256-bit bus might give the GF FX a big performance increase, nor do I think that new and improved drivers will change the performance of the NV30 dramatically (5% is realistic, but not until the summer). But I really hope that nvidia has designed this card with the next generation of games in mind, which are not yet available (DX 9.x and extensive use of shaders). Maybe only games like Doom III (not Unreal 2, because it's a game of the current graphics generation) or benchmarks like 3DMark 2003, coming out any time soon, will show the real performance of the GF FX... There must be a reason why the shader part of the GF FX is twice as complex as the one on a Radeon 9700...
January 29, 2003 4:13:56 PM

I'm not impressed with the FX at all... it is more comparable to the release of the ATI 8500 back when the GF3 Ti 500 was king (since it is a late release, doesn't offer much of an advantage over the current competition, and will be more expensive)... though I really disagree with the P4 comparison, since the P4 is a very cool-running processor compared to the turbine-driven cooling system of the FX :smile: ... The big positive aspect of the GeForce line of cards is the superior drivers released for the hardware, unlike the Radeon's trouble-plagued driver set. Hopefully ATI will straighten their drivers out.
January 29, 2003 4:53:27 PM

A different architecture? Of course it has a different architecture. It has an architecture that performs better per clock cycle, which in my opinion makes it the better architecture. I think we can all agree that if something performs better than something else, that something is superior... right? I thought this was common sense. Are you feeling OK? It seems to me that not only did you judge me incorrectly, but you also informed us that you are indeed an Intel fanboy yourself. In my opinion, with all those high stats, the FX should perform better than it does. It was rather disappointing; I was really hoping for a major comeback from NVIDIA.
January 29, 2003 5:01:19 PM

Just to refresh your memory, the 8500 came out barely before the Ti 500. And the 8500 was doing very well until a week after its well-known bumps, such as its Quake III trick of getting higher frame rates by lowering quality, and NVIDIA's release of a new set of drivers. Basically, in a matter of a couple of weeks the 8500 was down to the same level as the GeForce 3 and stayed there until the release of the Ti 500, when it was beaten; of course the gap between the two cards slowly got smaller as ATI's drivers improved. Just a little history for ya.

Just to let you know, Tom has been using the 9700 in almost all of his CPU benchmark sets because of its excellent driver compatibility. The driver problems of ATI are basically a thing of the past.
January 29, 2003 5:42:40 PM

Quote:
I'm not impressed with the FX at all... it is more comparable to the release of the ATI 8500 back when the GF3 Ti 500 was king (since it is a late release, doesn't offer much of an advantage over the current competition, and will be more expensive)... though I really disagree with the P4 comparison, since the P4 is a very cool-running processor...

For the record: I'm very disappointed in the GF FX! Not that I wanted to buy one; I'm perfectly happy with my Ti 4600, which will smoothly play all games coming out this year with all features enabled. Nevertheless, I'm almost shocked that the GF FX cannot live up to any of the hype that has surrounded it since its initial announcement: same performance as the R300, hot-running, expensive, over-the-top power consumption, noisy, not able to keep up with ATI's AA/AF quality, and so on. Nvidia has much explaining to do in the upcoming weeks/months.

Concerning the P4 - GF FX comparison: yes, one can't compare the heat generation of the two chips with each other, but the comparison could work because of the bad performance of the first P4s and the later great-performing Northwoods. Maybe nvidia will succeed too, who knows?

Quote:
It has an architecture that performs better per clock cycle, which in my opinion has a better architecture.

Taylanator, if you are only considering IPC, then yes, AMD has the better architecture. But considering many other factors, AMD loses big time: heat generation, clock-speed throttling, SSE2 and QDR, but above all longevity. Intel just confirmed today that the P4 architecture will eventually end up running at 10.2 GHz in 2005! Now compare this with the Athlon XP/Barton's 2.0-2.2 GHz (2800+ - 3200+) end-of-cycle dead end. Of course MHz isn't everything (does one still need to repeat this?), but 10.2 GHz speaks for itself in terms of architecture design (remember the P4 started at 1.4 GHz!).

Concerning the Intel fanboy thingy, if AMD gets the Hammer out anytime this year and it really rocks, I will praise AMD... but for the moment I see no reason for this.

I will never root for the underdog, but only for the best solution available at that time :) 

Edit: Maybe the reason why the GF FX performs so on par with the Radeon 9700 is simply that most games/benchmarks are more CPU-limited than GPU-limited... We'll see about this one; faster CPUs are arriving :)
January 29, 2003 5:56:51 PM

Word has it that the R350 is coming out along with the FX. If Nvidia is actually trying to play a low hand, then they need to fire their entire marketing dept along with the PR people and the people responsible for the cooling.

Many people aren't about to wait another couple of months for Nvidia to get their act together with drivers when ATI has a solid card on the market. I would say that unless Nvidia releases details about a quiet FX, it'll fail miserably. 100,000 are supposedly going to be for sale initially. 100,000 suckers are going to buy this thing?!

I've already abandoned Nvidia for my in-progress upgrade.

<font color=red>
<A HREF="http://kevan.org/brain.cgi?dhlucke" target="_new">Introducing the NVIDIA GeForceFX: The first Videocard designed exclusively for deaf people!</A></font color=red>
January 29, 2003 8:03:03 PM

What will be next, a 16-layer card after 24-layer boards with a huge fan? They would have to move to something socket-like if they want to be able to produce faster VPUs, but that would make the AGP port obsolete and require big changes to motherboards and controllers.


Quote:
Give it time to mature some, when it hits the street Im sure it will give significant performance increase. I expect Nvidia to hold back on total performance like the det 4's when ATI released the 9700pro.

The NV30 is about quality and immersion, unlike the RADEON 9700; that's the reason I have never liked the RADEON 9700. It's cool to have 256-bit DDR, but does my screen look better? No.


Just next to the lab and the bunker you will find the marketing departement.
January 29, 2003 10:47:21 PM

Quote:
A different architecture? Of course it has a different architecture. It has an architecture that performs better per clock cycle, which in my opinion makes it the better architecture. I think we can all agree that if something performs better than something else, that something is superior... right? I thought this was common sense. Are you feeling OK? It seems to me that not only did you judge me incorrectly, but you also informed us that you are indeed an Intel fanboy yourself. In my opinion, with all those high stats, the FX should perform better than it does. It was rather disappointing; I was really hoping for a major comeback from NVIDIA.


You seem to have this notion that maximum clock speed is independent of the architecture of an MPU. This is not true. If an architecture allows for really high per-clock performance but severely limits scalability, it isn't necessarily superior to an architecture with low per-clock performance that is highly scalable, given the same process technology.
Considering that the current P4 is able to perform better at its top clock using a less expensive and less drastic process (a 6-metal-layer design vs. 9 for the Athlon T-bred, both 0.13 micron with copper interconnects), I'd call that architectural superiority.
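The scalability trade-off above is just the identity performance = IPC x clock. A toy sketch with invented IPC figures (not measured values for any real chip), showing how a deep-pipeline design can lose at launch clocks yet win once its clock headroom is used:

```python
# Toy model: delivered performance = work per clock (IPC) x clock speed.
# The IPC values below are invented for illustration only.

def relative_perf(ipc: float, clock_ghz: float) -> float:
    return ipc * clock_ghz

# At launch, the deep-pipeline chip's clock is too low to offset
# its lower IPC, so the high-IPC design wins.
launch_deep = relative_perf(ipc=1.0, clock_ghz=1.4)   # 1.4
launch_wide = relative_perf(ipc=1.3, clock_ghz=1.2)   # 1.56

# At maturity, the deep pipeline scales far higher and wins overall,
# despite still doing less work per cycle.
mature_deep = relative_perf(ipc=1.0, clock_ghz=3.06)  # 3.06
mature_wide = relative_perf(ipc=1.3, clock_ghz=2.2)   # 2.86

print(launch_deep < launch_wide, mature_deep > mature_wide)  # True True
```

Which design is "superior" depends entirely on where each one tops out, not on the per-clock number alone.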

"We are Microsoft, resistance is futile." - Bill Gates, 2015.
January 30, 2003 12:14:27 AM

I suppose people shouldn't have bought the GF3 then? Funny how nv released drivers several times that gave a considerable performance boost, up to 25% in some cases. I'm sure the same will happen with the FX... if I weren't happy with my Quadro DCC, I'd go out and buy the Quadro version, hook it up to a water cooler, and OC the beast.

"If everything seems under control, you're just not going fast enough."
- Mario Andretti
January 30, 2003 1:24:58 AM

Again, who cares about drivers; in this case it is all about reputation. I repeat: ATi released a NEW GENERATION card, unlike the usual trend that nVidia set, AND did not have horrendous performance or compatibility issues. The GF3 was new technology, yet sucked at first due to drivers.

What this shows is that nVidia's driver team is either underpaid as of this year, or not what they were made out to be.

EDIT: Personally I don't see where the advantage of better performance through drivers lies, lol; it will heat up considerably more!

--
This post is brought to you by Eden, on a Via Eden, in the garden of Eden. :smile: <P ID="edit"><FONT SIZE=-1><EM>Edited by Eden on 01/29/03 10:25 PM.</EM></FONT></P>
January 30, 2003 1:30:20 AM

I bought, and still use, a GF3 because it was the cheapest card with full DX8 support, even though the GTS Ultra was faster... the GF3 also happened to get faster, and that's great.

But I base my choice on the DX versions available at the time, simply because (let's be realistic) any current card has plenty of speed for games unless you're really anal about FSAA, tweaking your image quality out, or benching when you should just be playing the damn games.

So the card for me would be the 9500 Pro... until the 9500 Pro was released, I would have to say it would've been the GF3 or the 4200 (the 4200 if the price difference wasn't much).

9500pro=DX9 support on the cheap
hence should run doom3 or DX9 based games fine

But in the FX's case, I was actually holding out on a 9500 Pro for the FX, thinking it would be the end-all card.

Athlon 1700+, Epox 8RDA (NForce2), Maxtor Diamondmax Plus 9 80GB 8MB cache, 2x256mb Crucial PC2100, Geforce 3, Audigy, Z560s
January 30, 2003 4:41:41 AM

You're trying to tell me that people will buy an overclocked dustbuster for a few extra points of performance?

I don't think so. This card needs new cooling and reduced heat. That's the most important thing.

<font color=red>
<A HREF="http://kevan.org/brain.cgi?dhlucke" target="_new">Introducing the NVIDIA GeForceFX: The first Videocard designed exclusively for deaf people!</A></font color=red>
January 30, 2003 4:43:14 AM

In order to run the FX, you need to watercool it, get better drivers, and/or dynamat your case?

Sounds like a plan....

For the rest of us though there are alternatives.

<font color=red>
<A HREF="http://kevan.org/brain.cgi?dhlucke" target="_new">Introducing the NVIDIA GeForceFX: The first Videocard designed exclusively for deaf people!</A></font color=red>
January 30, 2003 4:43:29 AM

:) 

<font color=red>
<A HREF="http://kevan.org/brain.cgi?dhlucke" target="_new">Introducing the NVIDIA GeForceFX: The first Videocard designed exclusively for deaf people!</A></font color=red>
January 30, 2003 8:40:45 AM

::steps to the podium::

COUGH

::reads prepared speach::

I am a man of my beliefs; I am loyal to what I feel works. In the ATI-Nvidia wars of old, I found nVidia's driver stability and ease of use, coupled with good prices, to be a valid reason to recommend a GeForce over any equivalent ATI card (pre-9700, which came out after my departure and was bar none the fastest GPU available until the FX was released).

Having said that, I have a few comments on the NV30:

1: It's faster than the Radeon 9700 (when all benchmarks are averaged it is indeed the performance champ, by a small margin).
2: It has better driver support (but ATI is firm in its driver commitment and improves on a weekly basis).
3: It is, and probably always will be, more expensive than the 9700.
4: It will grow as they optimise the drivers.


Having said all that, I will be buying a 9700 with my tax return. The FX is too loud, it's too expensive, and avoiding ATI's driver issues is not worth the noise and cost to me.

I feel Nvidia has stumbled in the competition; the FX is too little, too late. I respect them as a company and will eagerly await any revisions or upgrades they release for that line, but as of now, and for the foreseeable future, a moderately overclocked Radeon 9700 (read 5-10%, if possible; haven't looked into it yet) is a far better value.

PS: eat that anyone who ever insinuated I was an nvidiot.

::leaves podium::

:wink: Heatsinks, if you dont overclock, use the <b>STOCK!</b> :wink:
January 30, 2003 10:43:53 AM

Booya!

Ladies and gentlemen, Matisaro has left the building!


--
This post is brought to you by Eden, on a Via Eden, in the garden of Eden. :smile:
January 30, 2003 2:07:21 PM

juin, I don't understand what your point is... ATI cards have typically offered better image quality in both 2D and 3D (see Anandtech's review). And a 256-bit interface is integral to offering better image quality in games, because it allows better framerates with AA and AF cranked up.
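The 256-bit point comes down to peak bandwidth arithmetic. A sketch using the commonly quoted memory specs for the two cards (treat the exact figures as assumptions):

```python
# Peak memory bandwidth = bus width in bytes x effective memory clock.
# Commonly quoted specs: Radeon 9700 Pro = 256-bit bus at 620 MHz
# effective DDR; GeForce FX 5800 Ultra = 128-bit bus at 1000 MHz DDR2.

def peak_bandwidth_gb_s(bus_bits: int, effective_mhz: float) -> float:
    """Peak bandwidth in GB/s (1 GB = 1e9 bytes)."""
    return (bus_bits / 8) * effective_mhz * 1e6 / 1e9

r9700_pro = peak_bandwidth_gb_s(256, 620)    # 19.84 GB/s
fx_5800u  = peak_bandwidth_gb_s(128, 1000)   # 16.0 GB/s

# The wider bus wins despite a much slower memory clock, and AA/AF
# are exactly the workloads that eat into this headroom.
print(r9700_pro > fx_5800u)  # True
```

That gap is why cranking up AA and AF hurts the 128-bit card first, even with its faster memory.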

"There is no dark or light side, only power and those too weak to seek it."<P ID="edit"><FONT SIZE=-1><EM>Edited by vegandago on 01/30/03 11:17 AM.</EM></FONT></P>
January 30, 2003 2:14:09 PM

to people touting "wait for driver updates:"

Have you LOOKED at the card??? Allow me to repeat.

HAVE YOU LOOKED AT THE CARD? Be realistic here, folks. The piece of hardware, regardless of whatever performance it may add, is completely and utterly ridiculous. Nvidia has overclocked their own card in order to compete. How can you argue with that? If what Nvidia has done is not overclocking, then please show me what an overclocked GFX is going to look like; that sure will be interesting.

If you want to spend $400 on a GFX, it should only be to rip that cooler off and put it on a 9700... overclock the 9700 the way the GFX is... then let's see the numbers.

"There is no dark or light side, only power and those too weak to seek it."<P ID="edit"><FONT SIZE=-1><EM>Edited by vegandago on 01/30/03 12:31 PM.</EM></FONT></P>
January 30, 2003 4:04:19 PM

Originally, I got the GF3 basically as soon as it came out... I was the first one in my entire area to have it. It was stable, and later got performance boosts. However, it WAS STABLE upon release; I certainly didn't have trouble with it, nor did my friends who got it soon after. Can't say that about the R8500, for sure.

"If everything seems under control, you're just not going fast enough."
- Mario Andretti
January 30, 2003 4:11:08 PM

Eden, I can't believe what you're saying. It was only a little while ago that ATI people were all saying "you have problem X, just wait for the next driver." Drivers do matter, actually, and they'll give the FX sizeable performance boosts later on. I don't see anyone really bitching about the continuation of the .15 process for the R350, but I suppose that's because most of you here are ATI-only.

"If everything seems under control, you're just not going fast enough."
- Mario Andretti
January 30, 2003 4:13:07 PM

dh, that is what I may do, not what is necessary. Water cooling is my choice... if I felt like it I could even buy a million-dollar system to keep liquid oxygen and cool my system that way; what I do is not what others need to do.

"If everything seems under control, you're just not going fast enough."
- Mario Andretti
January 30, 2003 4:23:08 PM

flamethrower, why are you even bringing up the 8500??? Nobody is refuting your arguments about it... By the way, I'm not an ATI person. I'm a game person; I like to play games and I like them to look nice. Over the years I have owned cards from 3dfx, nvidia, and ATI (in that order).

The point of everything said here is not necessarily whether or not the GFX is going to be the best (although that seems arguable), but rather how they got there. They got there the same way the Voodoo 5 "got there." You catch my drift???

I am going to restate

HAVE YOU LOOKED AT THE CARD??? Have you ever, in the history of graphics cards, seen anything that ridiculous come out of a retail box?
"There is no dark or light side, only power and those too weak to seek it."<P ID="edit"><FONT SIZE=-1><EM>Edited by vegandago on 01/30/03 01:25 PM.</EM></FONT></P>
January 30, 2003 5:44:26 PM

Humm, I got called an Intel fanboy for saying they do not run hot compared to the FX?... That would be a first... What relevance that statement has to being a fanboy I don't know... Actually I have only owned one Intel system since I started building systems four years ago...

But BS aside, ATI does still have some driver quirks with the Catalyst 3.0. After I upgraded I would get hangs in OpenGL games, followed by loud screeches from my speakers... I found this to be a common complaint, and their new certified drivers are supposed to correct this problem... humm, I hope they figure it out in their next release. Besides that, I love my ATI card, because the 2.5s work great for me with DirectX 9.
January 30, 2003 7:24:09 PM

Where did I even bring up the R8500?

I am bringing up ATI's R300, with its new technology, overall performance, and stability at debut, versus nVidia's GF3's new tech and overall performance at debut.
nVidia fared worse back then; now they have done the same with the FX, yet another company brought the tech out before them and had no problem squeezing amazing performance out of it (up to 3 times better in image quality tests as well).

--
This post is brought to you by Eden, on a Via Eden, in the garden of Eden. :smile:
January 30, 2003 7:29:28 PM

I was saying that at the current juncture of competition, drivers mean nada; the card either performs well enough to sell right now, or it does not.
The FX did not launch with the right drivers, and because in this case (unlike when the GF3 came out) ATi is dominating the market well, with a big horde waiting to see whether the FX is worth it (and if not, off to ATi), ATi will win even more: their card is well established now, stable, definitely NOT hot and noisy, and performs close to, on par with, or sometimes better than the FX.
At this critical point, which was not the case back at the GF3's launch against ATi's offerings, chances are drivers will not matter; what matters is whether the card from either company is stable, runs games well, and has features and ease of use. Apparently nVidia's new card adds two bad variables: noise and heat. In the end, who do you think wins?

Quote:
I don't see anyone really bitching about the continuation of .15 process for r350, but

I am not, because ATi stated they will have refined its power consumption, and in the end it should not need any noisy cooling. I will wait and see, though, because the FX was REVEALED at 0.13 micron with FX Flow, while ATi has only said they are continuing on 0.15 micron, without yet showing us whether that's bad or not. Why should I complain now?

--
This post is brought to you by Eden, on a Via Eden, in the garden of Eden. :smile:
January 30, 2003 11:45:16 PM

Alright, somebody brought it to my attention in the graphics card section, and I posted a handful of information that I found later.

Results: PNY, BFG, and ABIT all appear to be releasing the dustbuster version.

For an additional cost, i.e. 550-650 euros (normal vs. ultra), you can buy the Gainward FX card that boasts a cooling scheme at 7 (seven) dB. I have no clue how they'll get rid of the heat with a fan at 7 dB. The press release is dated February 1st.

<A HREF="http://www.gainward.de/gwnewsite/gweurope/gwcontent/gwg..." target="_new">Gainward FX 5800</A>
<font color=red>
<A HREF="http://kevan.org/brain.cgi?dhlucke" target="_new">Introducing the NVIDIA GeForceFX: The first Videocard designed exclusively for deaf people!</A></font color=red><P ID="edit"><FONT SIZE=-1><EM>Edited by dhlucke on 01/30/03 05:47 PM.</EM></FONT></P>
January 30, 2003 11:53:52 PM

Look at the specs. It's clocked lower. Hmm, I'm thinking: couldn't one replace the fan with a more efficient one there?

"If everything seems under control, you're just not going fast enough."
- Mario Andretti
January 30, 2003 11:59:50 PM

They clocked it lower? I just read about the cooling. I'll go back and check. Either way, even if it's clocked lower it'll compete better against the dustbusters.

<font color=red>
<A HREF="http://kevan.org/brain.cgi?dhlucke" target="_new">Introducing the NVIDIA GeForceFX: The first Videocard designed exclusively for deaf people!</A></font color=red>
January 31, 2003 12:05:36 AM

No, the ultra is 500/1000 and the standard is 400/800, both boasting the 7 dB cooling.

If you're going to spend that much money on something that doesn't really run better than a 9700 Pro once you use AA and filtering, why should you have to spend more money to make it quiet and cool?

<font color=red>
<A HREF="http://kevan.org/brain.cgi?dhlucke" target="_new">Introducing the NVIDIA GeForceFX: The first Videocard designed exclusively for deaf people!</A></font color=red>
January 31, 2003 12:36:38 AM

It runs better when not using AA and AF, and I can't stand either of them, to be honest (to me they look like a degraded image). Also, if you'll notice, when you run around and objects are moving their position on screen, you don't see the ridges. I'm more interested in its features anyway. Now, do you approve of the card a little better if it has such a quiet fan?

"If everything seems under control, you're just not going fast enough."
- Mario Andretti
January 31, 2003 12:56:18 AM

Of course. I love gainward. I'm just really curious to see it in action.

<font color=red>
<A HREF="http://kevan.org/brain.cgi?dhlucke" target="_new">Introducing the NVIDIA GeForceFX: The first Videocard designed exclusively for deaf people!</A></font color=red>
January 31, 2003 1:45:35 AM

At least Nvidia is trying to do something about their hairdryer. How successful it will be, we'll have to see. I just hope it isn't something stupid, like you having to buy an 800mm fan.
January 31, 2003 1:53:34 AM

Hehe, nv prolly released the orig for the l33xx0r$ g4(V)3R but then realized they weren't buying into it, so they said fine, fine, smaller and quieter. It ain't gonna be an 800mm fan; it's gonna be your very own car engine somehow made quiet... just remember to keep your windows open, 'cause that V16 really uses and spews gas!

"If everything seems under control, you're just not going fast enough."
- Mario Andretti