The stupidest card ever released

January 27, 2003 6:42:46 PM

NVIDIA GEFORCE FX:

Weight: 1.32 lbs (Radeon 9700 Pro is 0.49 lbs)
Temp: 68C on the heatsink OUTSIDE the case
Power draw: 75 watts! (Radeon 9700 Pro is 54 watts)
Noise: 77 dBA (completely out of control)
Overclockable: No
Performance: Negligible gain over the Radeon 9700 Pro


For those of you not sure what 77 dBA is, here's a practical comparison: it's like a car passing you at 50 mph, a telephone dial tone, or chamber music in a small auditorium; or a garbage disposal, a dishwasher, an average factory, or a freight train at 15 meters.
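To put rough numbers on why 77 dBA stands out, here is a minimal sketch of the usual decibel arithmetic; the 30 dBA reference point for a quiet case fan is an assumption for illustration, not a figure from the review:

```python
# Rough decibel arithmetic comparing the reported 77 dBA cooler to an
# assumed ~30 dBA quiet case fan (the 30 dBA reference is illustrative).

def intensity_ratio(db_high, db_low):
    """Ratio of sound intensities for two levels given in dB."""
    return 10 ** ((db_high - db_low) / 10)

def perceived_loudness_ratio(db_high, db_low):
    """Approximate perceived-loudness ratio (roughly doubles per +10 dB)."""
    return 2 ** ((db_high - db_low) / 10)

fx_cooler, quiet_fan = 77.0, 30.0
print(f"Intensity ratio: ~{intensity_ratio(fx_cooler, quiet_fan):,.0f}x")                  # ~50,000x
print(f"Perceived loudness: ~{perceived_loudness_ratio(fx_cooler, quiet_fan):.0f}x louder")  # ~26x
```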

Introducing the NVIDIA GeForceFX: The first Videocard designed exclusively for deaf people! (http://kevan.org/brain.cgi?dhlucke)
January 27, 2003 6:47:31 PM

Perhaps when the retail version is released you can make judgements, but give Nvidia the benefit of the doubt; they have never done anything stupid before. I REALLY doubt that they will strap a 77 dBA fan on the retail version, and if they do then I will want it even more...

:evil:  Wow, if he's here who's running hell? :evil: 
January 27, 2003 6:53:46 PM

No, no, no, I don't buy that. There's no positive reason to come out EARLY with this cooling system on there. It's just BAD press. If they were planning to ship something different, it would have been better to wait before letting people review the card. I think they're in trouble.

Introducing the NVIDIA GeForceFX: The first Videocard designed exclusively for deaf people! (http://kevan.org/brain.cgi?dhlucke)
January 27, 2003 7:02:18 PM

True, they made a mistake letting reviewers get their hands on those noisy little buggers, but it is more than possible to put a quieter fan on the retail versions. It may cost more, but they will NOT release a card that puts out 77 dBA; it won't happen. I think the whole cooling solution is flawed though: the thing puts too much heat into the case, and having the fan in a circuit like that limits airflow and adds noise. Then again, if it wasn't in a circuit, all that heat would go into the case, and some people's computers just couldn't handle the extra heat. Mine could, but the average user would be out of luck. Those chips have to run pretty hot to be able to compete, and that RAM is new stuff. I'm sure they will get all the kinks worked out though; with every batch of those chips the quality improves, temperatures will go down and frequencies will increase. I am holding out hope for Nvidia. They have a couple of weeks before they have to launch them, and maybe, just maybe, they will pull through.

:evil:  Wow, if he's here who's running hell? :evil: 
January 27, 2003 9:00:44 PM

I think you're a little short-sighted there. First, NV will probably do something about the noise level and cooling. Next, in terms of performance, I'm excited about this card for several reasons. First, IIRC, the GF3 wasn't too great when it first came out either in terms of performance in CURRENT games, but then again those run just fine with a GF3 anyway. So we should see a considerable increase in performance later, when its features are used more and also as NV develops better drivers (you surely remember some of the drivers that came out for the GF3 that gave it sizeable performance boosts of 25% or more). Second, it has very high accuracy, so I think renderers will finally make use of the video card, and possibly use the video card and CPU together, or the video card alone, to render more efficiently.
That said, I just listened to the 9700 vs. the GF FX on my speakers (both set to the same sound level) and honestly, it's not that bad. I've got fans in my comp that make a lot more noise, but because of my case you can barely hear them. It's definitely got some potential... one of Tom's benches showed the FX doubling the performance of the 9700. What I hope they do, though, is increase bandwidth... 256-bit DDR2 would add considerably more performance, IMO.

Edit: Does weight really matter that much? I mean measures can be taken to attach it to the case if necessary. I doubt people hold their computer up while they use it... mine weighs around 80 pounds and it's fine.

"If everything seems under control, you're just not going fast enough."
- Mario Andretti
January 27, 2003 11:20:03 PM

I hope both you and Papasmurf are right. I think a better FX will lower prices more as they fight it out with ATI.

Unfortunately, I think you are both dead wrong.

Have you read the reviews at other sites? Tom's is overly optimistic if not biased. Almost nobody is optimistic.

To actually claim that the card isn't that noisy is ridiculous. Did you calibrate the recordings to the level you would normally hear from your computer? That card is a vacuum cleaner. I refuse to accept this kind of noise, and I have a feeling most people would agree.

Did you see the temps it was reporting? That's with the vacuum cleaner attached, and the setup on a test bench outside of a case: 68C! How do you propose they drop temps without lowering clock speeds? Maybe they'll lower the clock speed and improve some of the temperature, noise, and weight issues, but at what expense? I think it'll be very interesting to see how Nvidia works its way out of this one.

I didn't think it was as easy as attaching another cooler when you have temps that high. I hope I'm wrong.

Introducing the NVIDIA GeForceFX: The first Videocard designed exclusively for deaf people! (http://kevan.org/brain.cgi?dhlucke)
January 27, 2003 11:36:53 PM

Another thing, what happened to the 48GB/sec of bandwidth? Can someone explain to me how this really works?

The GeForceFX actually has 16 GB/sec
The 9700 Pro has 19.8 GB/sec.
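For anyone wondering where those figures come from, here is a minimal back-of-the-envelope sketch (bus width times effective memory clock); the 500 MHz DDR-II and 310 MHz DDR clocks are the commonly quoted specs for these cards, used here as assumptions rather than numbers from the review:

```python
# Theoretical memory bandwidth = (bus width in bytes) x (effective data rate).
# The memory clocks below are the commonly quoted specs for these two cards,
# used as assumptions rather than figures taken from the review itself.

def bandwidth_gb_per_s(bus_width_bits, mem_clock_mhz, transfers_per_clock=2):
    bytes_per_transfer = bus_width_bits / 8
    effective_rate_hz = mem_clock_mhz * 1e6 * transfers_per_clock  # DDR: 2 transfers/clock
    return bytes_per_transfer * effective_rate_hz / 1e9

print(f"GeForce FX, 128-bit @ 500 MHz DDR-II: {bandwidth_gb_per_s(128, 500):.1f} GB/s")    # 16.0
print(f"Radeon 9700 Pro, 256-bit @ 310 MHz DDR: {bandwidth_gb_per_s(256, 310):.1f} GB/s")  # 19.8
```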

And I still can't believe that you don't think that card is a noise maker.

Introducing the NVIDIA GeForceFX: The first Videocard designed exclusively for deaf people! (http://kevan.org/brain.cgi?dhlucke)
January 27, 2003 11:41:30 PM

I'm saying that in my case, which one would think is bulletproof because of the way it's built, it won't be heard. There's really good insulation in mine. In terms of the bandwidth, the FX supposedly has 4:1 compression, so it can effectively send 4x16 GB/s (although 48 GB/s is 3x16...). To be honest I am pretty disappointed about this... if NV would just put 256-bit DDR2 on it, that would really let the FX perform, IMO.
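As a quick sketch of how a compression ratio turns the raw number into an "effective" one (the 3:1 and 4:1 ratios are the marketing claims being discussed, not measured values):

```python
# "Effective" bandwidth under lossless color/Z compression: the 128-bit bus
# still moves 16 GB/s of data, but compressed data represents more pixels.
# The ratios below are the marketing claims under discussion, not measurements.
raw_bandwidth_gb_s = 16.0

for ratio in (3, 4):
    effective = raw_bandwidth_gb_s * ratio
    print(f"{ratio}:1 compression -> effective {effective:.0f} GB/s")
# 3:1 -> 48 GB/s (the advertised figure); 4:1 -> 64 GB/s
```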

"If everything seems under control, you're just not going fast enough."
- Mario Andretti
January 27, 2003 11:53:53 PM

Where is the MP3 at?
I can't find it anywhere.

:) 
January 28, 2003 12:09:13 AM

I've been telling everyone that this thing is factory overclocked, but they keep telling me I'm wrong, that it's part of the "design". Well, Intel raised the core voltage of the PIII from 1.65v to 1.75v to gain stability at high clock speeds. They also released a larger cooler to compensate. And what do we call raising the core voltage to increase stability at higher clock speeds, and increasing cooling to compensate? Overclocking! Now, most AMD guys will admit that this is true of the PIII, but won't admit that it's true of the FX. Why? Probably dislike for Intel as a company slanted their viewpoints. But I'd just like to say: my PIII WAS factory overclocked by a small amount, and this FX IS factory overclocked by a HUGE amount.
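As a side note on why a voltage-plus-clock bump has such an outsized effect on heat, the usual first-order model for CMOS dynamic power is P ∝ C·V²·f. A minimal sketch, using the 1.65 V to 1.75 V figures above and an assumed, purely illustrative 15% clock increase:

```python
# First-order CMOS dynamic power model: P is proportional to C * V^2 * f.
# The 1.65 V -> 1.75 V bump is from the post above; the 15% clock increase
# is an illustrative assumption, not an actual PIII or GeForce FX figure.

def relative_power(v_old, f_old, v_new, f_new):
    """Power at the new operating point relative to the old one."""
    return (v_new / v_old) ** 2 * (f_new / f_old)

factor = relative_power(v_old=1.65, f_old=1.00, v_new=1.75, f_new=1.15)
print(f"Roughly {factor:.2f}x the original heat output")  # ~1.29x
```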

You're posting in a forum with class. It may be third class, but it's still class!
January 28, 2003 12:25:10 AM

Crashman, do you know where the MP3 is?
Or a link to the download site on Tom's Hardware to download other stuff from?

:) 
January 28, 2003 12:36:20 AM

It's in the review under the cooling section. I don't think it's on the first page of the cooling section though so go through them all.

Introducing the NVIDIA GeForceFX: The first Videocard designed exclusively for deaf people! (http://kevan.org/brain.cgi?dhlucke)
January 28, 2003 12:37:28 AM

I have the SX1040, which is pretty solid, but nothing short of Dynamat would save my ears. I don't think that's practical.

Introducing the NVIDIA GeForceFX: The first Videocard designed exclusively for deaf people! (http://kevan.org/brain.cgi?dhlucke)
January 28, 2003 12:46:26 AM

Dude, you can't be this blind. Just skim the article, if you even read it!
It was clearly written and easy to see and notice when reading the article.

--
This post is brought to you by Eden, on a Via Eden, in the garden of Eden. :smile:
January 28, 2003 1:01:33 AM

Just like Dh, while I hope you are right, I think you are dead wrong.

I mean dude, do you hear yourself?
This is a psychologist's dream, to barge in and rake in some cash off you; you're in the clearest state of denial!

Quote:
it may cost more, but they will NOT release a card that puts out 77 dBA; it won't happen

So sue the companies who already have taken pre-orders and the many who foolishly ordered the cards.

Quote:
but it is more than possible to put a quieter fan on the retail versions,

Of course it is, just like it's possible to put a Pentium 133 cooler on an Athlon. But do you think it's a sane choice when it already hits 68°C outside a case? The thing could likely burn a PCI card next to it.

Quote:
I think the whole cooling solution is flawed though: the thing puts too much heat into the case, and having the fan in a circuit like that limits airflow and adds noise

It has TWO BACK-PANEL air holes, man; this is a first for any expansion card ever, IIRC!
It has an exhaust and an intake that take air from OUTSIDE, pure and fresh, not hot recirculated heatsink air. The cooling solution may be noisy, but from the looks of it, I think this is the best of the best of air cooling, and in fact probably the graphics card overclocker's dream if it were used on an R300 today.

Quote:
those chips have to run pretty hot to be able to compete

I wish I could agree, but the 0.13-micron process should say otherwise.

Quote:
that RAM is new stuff

So what? nVidia opted to try to max out DDR-II's potential right away, with an insane clock bump of 190 MHz DDR over the R300, for an effective 1 GHz. There is very little left to improve there. The core clock itself was bumped insanely, and for what, 25 million more triangles per second? That's about 0.7 triangles per hertz, WEAK compared to the R300's 1 triangle per hertz. If nVidia had done this right, they would have gone with ATi's 256-bit memory bus (yet another advantage ATi has, and YET their card doesn't draw space-shuttle levels of power or require a vacuum cleaner), clocked the DDR-II much lower, and kept that potential and headroom for later, for the sake of competition.
Again, the RAM has nothing to do with your argument for optimism.
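To make the per-clock comparison concrete, here is the arithmetic implied above (roughly 350 vs. 325 million triangles per second at 500 vs. 325 MHz core clocks; these peak setup rates are the commonly quoted marketing figures, treated here as assumptions):

```python
# Peak geometry throughput per clock, using the commonly quoted peak setup
# rates and core clocks (marketing figures, treated here as assumptions).
cards = {
    "GeForce FX":      {"triangles_per_sec": 350e6, "core_clock_mhz": 500},
    "Radeon 9700 Pro": {"triangles_per_sec": 325e6, "core_clock_mhz": 325},
}

for name, spec in cards.items():
    per_clock = spec["triangles_per_sec"] / (spec["core_clock_mhz"] * 1e6)
    print(f"{name}: {per_clock:.1f} triangles per clock")
# GeForce FX: 0.7, Radeon 9700 Pro: 1.0 -- the gap the post is pointing at
```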

Quote:
I'm sure they will get all the kinks worked out though; with every batch of those chips the quality improves, temperatures will go down and frequencies will increase

It's not about whether they can do it (I'm sure they can); it's about how much further it can really go.
If the R300 tops out around a 400 MHz core clock at 0.15 micron, and nVidia's 0.13-micron part started at 500 MHz with insane heat output and temperatures, just how far can it still go? Anandtech was able to overclock it by about 8%, up to 540 MHz, before the card actually THROTTLED. Sorry, but temps will definitely not come down much. The board already uses an insane number of PCB layers, and unless they do what AMD did with the Thoroughbred-B revision and add an extra metal layer, I just don't see where they can improve their process enough to cut the current heat and temperatures by more than 30%. I believe the 0.13-micron process can go as far as a 600 MHz core.

Quote:
they have a couple of weeks before they have to launch them,

As I said, I think you're simply off the wall, though I will hold you to your small hope.

--
This post is brought to you by Eden, on a Via Eden, in the garden of Eden. :smile:
January 28, 2003 1:17:55 AM

Quote:
also as NV develops better drivers (you surely remember some of the drivers that came out for the GF3 that gave it sizeable performance boosts of 25% or more).

That's where I get pessimistic. ATi's R300, when it came out, was perfectly stable and performed extremely well. And on top of all this, like the GeForce 3, it brought a new generation of technology with it, YET ATi did not make it run at or below the level of current cards.
If nVidia's driver team were still as good as before, they would have worked on the drivers before they released the FX, since there is already technology on the market built from the same things the FX is made of.
Quote:
Second, it has very high accuracy

Last I checked, the R300 and DirectX 9 already gave you all that.

Quote:
it's not that bad

I am with Dh: you are simply deaf and blind (figuratively), blinded by your fanaticism. I can imagine the noise, and believe me, even an insulated case will do nothing. The retail AMD fan on an Athlon XP 2000+, inside a very nice Antec SLK3700 case with air holes only at the back, could still be heard fairly well.

Quote:
one of Tom's benches showed the FX doubling the performance of the 9700.

In a synthetic benchmark, which in my book is not going to represent reality any time soon. In fact, the FX lost to the R300 in MANY of the 3DMark 2001 tests.
Quote:
256-bit DDR2 would add considerably more performance, IMO

Bluntly put, nVidia were IGNORANT to actually go with 16 GB/sec of bandwidth. No company in their right mind would have done this at this critical point in the competition, especially now that they have only just caught up to ATi's bandwidth-saving technology, putting them nearly on par per byte.
Quote:
Does weight really matter that much? I mean measures can be taken to attach it to the case if necessary.

You are thinking selfishly and not for the masses, boy. Look at the rest of the forum: no one is willing to put up with something that hangs that much weight off the mainboard (as if high-end HSFs aren't already doing that), NOR willing to put up with the noise.

Yes, there is room for optimism about the card, but it lies not in any improvement on the scalar side of things, i.e. higher clocks. It lies in EFFICIENCY, as Poobaa said in the OTHER thread: lower noise, lower clock speeds, more efficient IPC.
Even drivers that make the card significantly better will not make it 60% more worth buying when its current heat output and noise are simply unbearable.

--
This post is brought to you by Eden, on a Via Eden, in the garden of Eden. :smile:
January 28, 2003 1:19:32 AM

Quote:
which one would think is bulletproof because of the way it's built, it won't be heard

Yeah right, have you read what they said?
It was HEARD FROM ANOTHER ROOM!
Be it inside an insulated case, be it bulletproof, it CAN and WILL be heard.

--
This post is brought to you by Eden, on a Via Eden, in the garden of Eden. :smile:
January 28, 2003 2:17:27 AM

Speaking of overclocking the FX, I wonder if the card they sent Tom's wasn't specially picked from the batch. I suspect this because if you look at Anandtech's benchmarks, or the German site's, the FX doesn't perform nearly as well as the one Tom had. I am not accusing Tom of anything; I am just wondering if nVidia sent a faster card to one of the most respected hardware sites.

Anyone else notice that?

Just because I like AMD or Intel more at a given time because of one product compared to another does not make me a fanboy; it makes me a person who is able to make a decision for myself.
January 28, 2003 2:36:01 AM

I think we're dealing with a Willamette-style fiasco, and I'm not so sure THG can write reviews anymore. It's been a while since a good one.

<A HREF="http://www.tomshardware.com/graphic/20030127/images/too..." target="_new">I like this</A>

I don't like the conclusion. How can we call the FX a "giant step forward"? How can they claim that "enthusiasts will, without a doubt, love the GeForceFX 5800 Ultra"? Only the biggest fanboys are enthusiastic. Everyone else thinks it's a major disappointment.

Introducing the NVIDIA GeForceFX: The first Videocard designed exclusively for deaf people! (http://kevan.org/brain.cgi?dhlucke)
January 28, 2003 2:52:39 AM

Sorry Eden, but I didn't stick around to find the MP3 or read the rest of the review after seeing the benchmarks, the heat and power consumption, and how heavy it'll be (nearly 2 lbs.) hanging off an AGP slot.


:) 
January 28, 2003 2:55:27 AM

I got soundproofing in my case too.

"If everything seems under control, you're just not going fast enough."
- Mario Andretti
January 28, 2003 3:46:44 AM

I don't think they picked a "ringer" for Tom's, as it would have shown up when they tested the clock speeds. I suspect it's simply the way it was set up: the board they used, the memory, the CPU, the driver revision; any of those could explain the better results seen in Tom's review.

You're posting in a forum with class. It may be third class, but it's still class!
January 28, 2003 4:47:46 AM

I would have made it suck air from inside the case and blow it out the back. Since the intake and exhaust are side by side, you end up sucking back in some of the hot air anyway. If you have your case like most people, about a foot or two away from the wall on the floor, then you suck back in even more of that hot air. More and more cases coming on the market have a built-in fan on the side panel, so that should be enough to supply the card with fresh air. Also, I hope you don't have pets, or fur will clog that thing up pretty fast; even faster if "Felix" rubs against it just as you start playing and the fan speeds up (so that's what a furless cat looks like :lol: ).

Since this thing generates loads of heat, even a dual-fan setup on it most likely wouldn't work, though it would be quieter.

I'm happy that I got this 9700 PRO instead of waiting for the FX, aka "Hoover on crack." The performance gain is so small that it's not worth putting up with, not to me anyway. If this is the best thing nVIDIA was able to come up with six months after the 9700 PRO's July 17, 2002 release, then I feel sorry for them.


p.s. My box is still running fine, even though I can't see the fins of the CPU heatsink anymore. No clue how this CPU is getting cooled, but somehow it manages (Battlefield 1942 runs fine, with no throttling at all during play). Got compressed air yesterday; going to do some much-needed cleaning of this setup (should I take pictures just to show?) and, if I don't forget, order some filters.

Press 1 if you want to be on hold, 2 for disconnect, 3 for a representative who will put you on hold before disconnecting.
January 28, 2003 5:09:18 AM

Your cat would be seriously upset if it rubbed up against the case. The temperature at the exhaust is pretty high.

I have noticed that both PNY and BFG are allowing pre-ordering of GeForce FX's that are identical to the reviewed designs. So anybody who was hoping they were going to fix the card before release will have to think again.

Introducing the NVIDIA GeForceFX: The first Videocard designed exclusively for deaf people! (http://kevan.org/brain.cgi?dhlucke)
January 28, 2003 5:55:10 AM

Flame, I know you have had good experiences with nVidia. The same applies to me. But I must say, like the original poster, that nVidia has released the stupidest card ever.

<b> "You can put lipstick on a pig, but hey, it's still a pig!" - RobD </b>
January 28, 2003 6:41:27 PM

Current FX cards:

<A HREF="http://www.pny.com/home/products/Vcard_fx.cfm" target="_new">PNY</A>

<A HREF="http://www.bfgtech.com/fx_product_gfx.html" target="_new">BFG Tech</A>

Note the different PSU requirements.


<A HREF="http://news.zdnet.co.uk/story/0,,t269-s2126140,00.html" target="_new">Here's an article where Nvidia claims the cooling is an innovation:</A>

Quote:
The cooling system is a crucial innovation, said Nvidia senior product manager Geoff Ballew, allowing the 125-million-transistor chip to run at full throttle without burning a hole in the PC's casing.
[...]
He noted that PC processors such as the Pentium 4 are getting hotter with each new generation and will need similar cooling mechanisms before long.




Introducing the NVIDIA GeForceFX: The first Videocard designed exclusively for deaf people! (http://kevan.org/brain.cgi?dhlucke)
January 28, 2003 8:09:23 PM

The FX has better color precision, and to be honest, the FireGL X1 sucks performance-wise, especially if it's bested by a previous-generation card... Anyway, I still see quite a bit of potential in this card and will remain optimistic about it and its features.

"If everything seems under control, you're just not going fast enough."
- Mario Andretti
January 28, 2003 8:31:46 PM

The color precision is not really much better than the Radeon 9700 Pro's.

128-bit color against 96-bit color. Both have floating point with enough precision (more than Shrek used).
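(For the record, those totals are just per-channel float widths times four RGBA channels; the 32-bit and 24-bit per-component figures are the commonly cited precisions for these chips, treated here as assumptions:)

```python
# Color precision totals = bits per component x RGBA channels.
# 32-bit (NV30) and 24-bit (R300) per component are the commonly cited
# precisions, used here as assumptions rather than review data.

def total_color_bits(bits_per_component, channels=4):
    return bits_per_component * channels

print(f"GeForce FX:      {total_color_bits(32)}-bit color")  # 128-bit
print(f"Radeon 9700 Pro: {total_color_bits(24)}-bit color")  # 96-bit
```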

Stop believing nVidia's hype about new technology. They just have more pixel shader instructions, and real branches in the vertex shaders. That's _ALL_ that's new. _REALLY_.

Oh, I forgot: they also have the highest voltage draw, the most heat-producing thingy ever, the biggest cooler, the fattest board, the heaviest board, and simply the most stupid design ever.

"take a look around" - limp bizkit

www.google.com
January 28, 2003 8:43:55 PM

It'll also render faster, considering the GF4 Quadros can best the FireGL X1...

"If everything seems under control, you're just not going fast enough."
- Mario Andretti
January 28, 2003 9:28:52 PM

Indeed. I have been waiting for this board anxiously, saving up my money in anticipation. This is pure and utter pullshit. That's right, PULLSHIT. If you are impressed by this board from the demos alone, take a look at the DirectX 9 demos for the ATI 9700 PRO. My roommate has it and they are stunning. Just as sexy and pretty as the FuX.

Now, before the fanboys utter another word of praise for this monstrosity, I would like said fanboys to suck the exhaust of this board if they love it so much.

"My case is insulated, it's not that bad." RIIIIIIIGHT. I would be afraid to place my tower next to a wall for fear of peeling the plaster off. Judging by the racket, I would have to encase my whole computer in a brick wall just to get it down to manageable volumes.

This is a disgrace, folks. There is no excuse for such a monster. ATI has a quieter, lighter, more efficient product for a much better price, and it's made by Canadians.

I think nVidia felt the squeeze from ATI, and the only way they can compete is by overclocking their product to the limit and attaching a vacuum to it. The product is rushed, just like that 1.13 GHz Pentium III Intel released under pressure from AMD.

I think today should be a new holiday, the day of the underdog. AMD and ATI score big, and I'm smart enough to take advantage of that (and you should be too).

And if ATI is smart, they will reduce the prices of their Radeons right about NOW. They already have a large inventory, and right now there is a huge incentive for users to clear that inventory out. Some price cuts would kick off a huge shopping spree.

It seems to me nVidia is heading down the same path 3dfx took not long ago, if they think users will put up with idiotic product features for the sake of buying nVidia products.

Damn it, I want to use HTML here.
Man....damn....may rambus take your soul.
January 28, 2003 10:11:02 PM

First, let me confess to being a bit of an NV fanboy, mainly stemming from crappy ATI products. However, I have to agree with the sentiment that this FX is the dumbest card ever. If this card runs this hot outside of a case, what will happen when it is placed in a small ATX case? I can just picture Dell dropping this card into one of their systems and then getting sued for frying someone's dog or cat. The sound issue may be fixable via a larger heatsink or heatpipe, but the dissipated heat still has to go somewhere.

Profit-wise, NV has a serious problem on their hands, with a 12-layer board design and an elaborate cooling solution. Personally, I think NV should have focused on improving their memory bandwidth rather than increasing the clock speed, but what do I know, I am just an ignorant consumer.


If I paid as much attention to people as I do computers I would be married.
January 28, 2003 10:17:42 PM

I have purchased the following graphics cards from nVidia...
TNT
TNT2-32
GeForce2 GTS
GeForce3 Ti500.

I have never owned an ATI card, other than the one that came stock in my old Sony Vaio laptop. I have been holding off on purchasing a new graphics card in anticipation of the release of the FX. I do believe it is feasible for the nVidia driver team to eke out another 5-10% performance from this card by tweaking the immature drivers (and perhaps a 15-20% increase in the woeful performance at memory-intensive AA and AF settings; note that the FX is much farther from its theoretical fill rate than the 9700 Pro, which means there's room for improvement). However, I must say I am going to hold off a little longer on buying my next graphics card. I am not going to buy the FX unless nVidia pulls a rather large rabbit out of their collective hat by the time of the retail release of this card.

I'm not a fanboy of either company, but I've always been more than satisfied with nVidia's quality and driver support. If I am satisfied with a product, especially in terms of performance, stability, and driver support, I will stay with the known entity until something makes me change my mind. Believe it or not, the excessive noise and power consumption of this card bother me much more than the disappointing performance. To me, benchmarks are useful, but they are just theoretical indicators of how well your card CAN perform. I want my graphics card to perform well in the games that I play, normally FPS games like Unreal, Battlefield 1942, and the forthcoming Doom 3 (of course). I might yet consider buying the FX simply based upon nVidia's history of extracting every last FPS from their products, but the high power consumption and ungainly proportions of the part are deal-breakers for me. I'm now thinking my next graphics card will be ATI's R350.

That being said, I have a theory. I believe nVidia designed the FX under a false assumption: that the Radeon 9700 Pro would be 10-15% slower than it actually turned out to be. nVidia believed that the FX would be the next logical performance leader on their development roadmap, despite its outdated 128-bit memory interface. If I have any gripe with nVidia as a company, it is this: like Intel, nVidia likes to milk the market by releasing products that are only as fast as is absolutely necessary to claim performance dominance. Well, that tactic may bite them this time. ATI is game. nVidia should have known, since ATI is a much more formidable opponent to nVidia than AMD is to Intel. nVidia's roadmap is covered with coffee stains. It's time for nVidia to expedite (or revamp if necessary) the path to a 256-bit memory interface for the FX. They could have released FX at a lower clock and memory speed, eliminating the need for exotic cooling solutions, and the FX would likely still have beaten 9700 Pro handily.

As it is, nVidia did not do this, so we are left with a very 3dfx-esque launch of a product that has barely superior performance, and certainly not superior enough to justify the hype.

Sorry, nVidia. The glitz has started to wear off.
January 28, 2003 10:37:32 PM

I'm finding it hilarious that you don't get what I'm saying about why I like the card. You guys fail to realize that there is no need to make a card that runs current games at a billion fps; why the hell would one need that? My practically two-year-old Quadro DCC serves me well and runs everything fine! Instead NV has loaded it with stuff that'll be used in the future, and that is where the true performance gain lies. Once again, as stated previously, the GF3 wasn't that great when it came out, but two years later it's still a good card. FYI, my case is not the way it was when it initially came; in other words, it's modified. I have put soundproofing in it. Yes, it blows hot air out, but you're a real moron if you stick your head there anyway... my current comp will make your face red if you put your head behind the exhaust fans.

"If everything seems under control, you're just not going fast enough."
- Mario Andretti
January 28, 2003 10:56:08 PM

What features are worth all the problems with the card?

Introducing the NVIDIA GeForceFX: The first Videocard designed exclusively for deaf people! (http://kevan.org/brain.cgi?dhlucke)
January 28, 2003 11:09:02 PM

Quote:
They could have released FX at a lower clock and memory speed, eliminating the need for exotic cooling solutions, and the FX would likely still have beaten 9700 Pro handily.


Yeah, right, and I'll be using my dick as a pogo stick to get to work tomorrow.

But yes, if they had sold GeForce FXs at the originally planned clock speeds LAST November, it would have been received far better by reviewers and the public. It is a disappointment compared to the 9700 Pro, but it's not a bad card; with the overclocking and insane cooling, though, it BECAME a bad card. They should have swallowed their pride on this one a long time ago.
January 28, 2003 11:25:41 PM

I consider myself an Nvidia fan as well, having purchased a TNT, a GeForce, a GeForce3 Ti500, and an Asus A7N8X Deluxe with the Nvidia nForce2. I also feel that the new GeForce FX is not a good product. But the reason I feel it is not a good product is purely the fact that it produces WAAAAAAYYYYY too much heat. Noise, size, and power usage (although related to heat) I can deal with, but heat I cannot. Using the 0.13-micron process was probably not a mistake, but not doing what AMD did (add an extra metal layer) definitely was. The crazy number of PCB layers is kind of dumb as well. All in all, my two biggest gripes are the lack of a 256-bit memory bus and the poor heat dissipation (which are most people's gripes).

However, unlike other people, I will not get an ATI card unless I absolutely have to or one is given to me. I have faith that Nvidia will pick themselves back up after this obvious disaster and put out something that the fans and everyone else will like. We should give Nvidia some benefit of the doubt, but not too much. The product does appear to perform well despite the flaws in the design, and will hopefully be a stepping stone and a wake-up call. Basically, if they don't fix this, I will move away from their chipsets, even though those are good, and go to Athlon MP based systems with an ATI video card. I am sure there are many more fans out there who feel the same way but won't hesitate to move to ATI if things don't get better. I for one am truly hoping the community's feedback is reaching them, for the sake of their company. Well, there's another one of my rants and babblings... lol

One mans throw-away is another mans god-box. Help friends in need, I always do!!
Then again, having extra parts are great for making dedicated servers for LAN parties!!!!
January 28, 2003 11:45:46 PM

This is not a brand-name war. I think the Nforce2 is a great chipset for example, but the FX is moronic. Did you listen to the MP3s? I personally find the noise the most annoying thing, not heat.

Introducing the NVIDIA GeForceFX: The first Videocard designed exclusively for deaf people! (http://kevan.org/brain.cgi?dhlucke)
January 29, 2003 12:21:56 AM

I did, and I have a VERY VERY loud case, but I use headphones so I would not be bothered. I do admit that they could have come up with something MUCH quieter, like some kind of tipmatic system. But in my opinion HEAT is the worst part. The temps on that thing are NUTS!!!

One mans throw-away is another mans god-box. Help friends in need, I always do!!
Then again, having extra parts are great for making dedicated servers for LAN parties!!!!
January 29, 2003 12:27:56 AM

" Instead nv has loaded it with stuff that'll be used in the future"

In the future there will be way better cards than this POS. When I took a look at this card I just busted up laughing. So this is what everyone has been waiting for, LOL. I feel sorry for the cables that have to sit where the heat comes out. Can't wait to get my hands on the Tyan 9700 Pro :) .
January 29, 2003 12:57:37 AM

Again, as Dave said, the features on it are not that big a deal, and are in fact too proprietary to be used in games. Chances are they won't be used; it's still DX9.
I expect the card to end up as a professional card like the Quadro FX, rather than a gamer's card for the "future".

--
This post is brought to you by Eden, on a Via Eden, in the garden of Eden. :smile:
January 29, 2003 1:08:59 AM

The Quadro FX is actually released. I'm waiting to see how it performs, though I think we'll see its true power maybe with a newer version of Max or other 3D software. Storm, I find it pretty dumb to buy a card every few months, so when I buy a video card, it lasts ~2 years. I still have my Quadro DCC, and doubt I'll upgrade till GF7 times.

"If everything seems under control, you're just not going fast enough."
- Mario Andretti
January 29, 2003 1:19:23 AM

I think they'll come up with a new name other than GeForce7 by that time, flamethrower205.


:) 
January 29, 2003 1:19:58 AM

You can buy the midline card every few months and you'll still be saving more money than if you buy the top-of-the-line card when it comes out. Right now I don't think anybody really needs more than a GeForce4 Ti 4200. Buy that for about $120 and you can upgrade again in 6 months or a year. Why the hell would you spend $400+ on a card so it will be "future-proof"? What's that card going to be worth in a year? Maybe half the price. Your "future-proof" theory doesn't seem very optimal.
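As a rough sketch of that upgrade math (the $120 and $400 prices and the "worth maybe half in a year" assumption come from this post; extending it to two years is purely illustrative):

```python
# Rough two-year cost comparison: a ~$120 midline card bought each year
# versus a $400 flagship bought once, assuming (as the post does) that a
# card is worth about half its price after a year. Illustrative only.
midline_price = 120
flagship_price = 400

midline_spend = midline_price * 2                # one midline card per year
flagship_resale = flagship_price * 0.5 ** 2      # halves in value each year
flagship_net = flagship_price - flagship_resale  # net cost if sold after two years

print(f"Midline route:  ${midline_spend} spent over two years")
print(f"Flagship route: ${flagship_price} spent, ~${flagship_net:.0f} net after resale")
```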
January 29, 2003 1:37:08 AM

Depends on what games you play. The GF4200 didn't impress me and I returned it.

Introducing the NVIDIA GeForceFX: The first Videocard designed exclusively for deaf people! (http://kevan.org/brain.cgi?dhlucke)
January 29, 2003 1:44:45 AM

Well, you can always OC it to Ti 4400 speeds. And if not, I still think trying to future-proof a PC is stupid.
January 29, 2003 1:51:32 AM

I also do 3D work, so we have to factor that in: $120 x 4 = $480 vs. $350. I save $130.

"If everything seems under control, you're just not going fast enough."
- Mario Andretti
January 29, 2003 1:59:46 AM

Well, you're in a different category then. It is worth it to get an expensive card like the Quadro if you do lots of 3D work or the like. But for a gamer, much of that power is wasted.
January 29, 2003 7:57:57 AM

Did you see the Anandtech review? Quite cutting, especially when they matched up the AA/FSAA quality levels for benchmarking games. OUCH.

My Computer is so powerful Sauron Desires it and mortal men Covet it, My Precioussssssss