Nvidia's Giant Gamble (no pun intended)

gal128

Distinguished
May 14, 2002
I don't know about you guys but personally I think Nvidia did the right thing by choosing the smaller core. It seems somewhat reminiscent of the Intel/AMD battle. When the P4 was released, everyone (even loyal Intel fanboys) was disappointed with its performance for one simple reason ... it sucked. Now as the P4 matures it is pounding the AMD performance numbers (no, I didn't mention anything about price). That could all change in the near future, but that remains to be seen.

I think Nvidia is looking farther down the road with this current GPU, and that would be to their advantage. I also think they have put themselves in a position to squeeze ATI with the Ti4600 on the low end and the NV30/31 on the high end. I know ATI has to have something in their back pocket for a situation like this beyond a more mature driver set (maybe a 9xxx with a 400MHz core). Again, time will tell.

I think this leap, coupled with DX9, will produce some very happy gamers for the next few years. Hopefully DOOM III will start things off right.

Also, I think both companies would be able to enhance their driver sets a lot faster if they got someone along the lines of Terry Tate - Office Linebacker in those cube farms and labs.



HULK SMASH!!!
 

eden

Champion
Sure, maybe it will get better later, but if it already runs at such peak core and memory speeds, where exactly is the room to improve, when its temperatures and cooling requirements are already insanely strict?

That is the problem here: the card starts off on a better process technology, on an already existing and mastered (and mastered WELL, mind you) API, DirectX 9, yet it manages so little. The Pentium 4 Willamette was more understandable; it was an ENTIRELY new core, with NO residue of the past, no (major) borrowed technology from its previous processors or the competitor's. It was to be expected that its early yields would be weak. But the main problem with it was of course competition, as Intel cut the original Pentium 4 core's development nearly two years short, and therefore ended up with a very young core design, with very little performance to show.
Then Intel proceeded to 0.13 micron and mastered it as well, like ATi mastered the R300 + DirectX 9 core design.

Conclusion? nVidia slipped, did not fix the core, and has limited scaling for the future.
This is one of the reasons I just can't see the same scenario playing out here as with Intel's P4, or even ATi.

--
This post is brought to you by Eden, on a Via Eden, in the garden of Eden. :smile:
 

gal128

Distinguished
May 14, 2002
With those things taken into consideration, it leads me to believe that Nvidia is just overjuicing us. They have to know that we as consumers aren't going to stand for such a lackluster product (are we?). When I say lackluster I mean that huge-ass fan that makes the card look like the Hunchback of Notre Dame. And this can't be the best production run of that core, because the NV31/34 would then have to have giant 4-slot fans to cool them accordingly.

We know that driver sets are coming that will, conservatively, crank the NV30 up another 10-20%, and we all know that the mega fan has to go as well. With these two things in mind, I feel like Nvidia is squeezing me down to the pennies and pickles right now.

HULK SMASH!!!
 

vacs

Distinguished
Aug 30, 2002
"Conclusion? nVidia slipped, did not fix the core, and has limited scaling for the future."
That was exactly gal128's point. Nvidia can still fix the core, and they certainly will. It's an all-new technology, and they now just have to learn how to use it in the best way possible...

I'm pretty sure that nvidia could also have made a faster GF5 based on the 0.15 micron manufacturing process, but they knew that after that it would not be possible to squeeze more performance out of that generation. So now they have the "next generation"; they just need to fine-tune the technology...

I'm not saying that this will happen, though; I'm just saying that nvidia has the resources to get the nv30 technology under control and to optimize its performance...
 

Crashman

Polypheme
Former Staff
"I think Nvidia did the right thing by choosing the smaller core"

What are you on? That core is HUGE! Or do you mean they did the right thing by moving to a .13 micron process? Nobody would argue against that, because if they hadn't, the thing would put out even more heat!

You seem to be confused by the fact that they did a die process shrink and heat went up simultaneously. The "die shrink" has nothing to do with the heat going up; it's the millions of extra transistors, the high voltage, and the high clock speed that are causing the heat problem. This chip wouldn't even have survived on .15 microns. You see, the "die shrink" was NEEDED to make it this "cool" and this "quiet".
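To put rough numbers on that argument, here is a quick back-of-the-envelope sketch using the textbook CMOS dynamic-power relation, P ∝ N·C·V²·f. The transistor counts (63M for the NV25, 125M for the NV30) and clocks (300 vs 500 MHz) are the published specs; the core voltages and the per-transistor capacitance scaling are assumed purely for illustration:

```python
# Back-of-the-envelope sketch of why the heat went UP despite the die
# shrink, using the standard CMOS dynamic-power relation:
#   P ~ N * C * V^2 * f
# (N = switching transistors, C = capacitance per transistor,
#  V = core voltage, f = clock frequency).
# Transistor counts and clocks are the published figures; the voltages
# and the capacitance scaling are hypothetical illustrations only.

def relative_dynamic_power(transistors_m, cap_scale, voltage, clock_mhz):
    """Unitless figure proportional to dynamic power."""
    return transistors_m * cap_scale * voltage**2 * clock_mhz

# GeForce4 Ti 4600 (NV25): 63M transistors, 0.15 micron, 300 MHz.
# Capacitance scale normalized to 1.0; 1.5 V is an assumed voltage.
nv25 = relative_dynamic_power(63, 1.0, 1.5, 300)

# GeForce FX (NV30): 125M transistors, 0.13 micron, 500 MHz.
# Assume per-transistor capacitance shrinks with the process
# (~0.13/0.15) and an assumed 1.4 V core voltage.
nv30 = relative_dynamic_power(125, 0.13 / 0.15, 1.4, 500)

print(f"NV30 / NV25 dynamic power ratio: {nv30 / nv25:.1f}x")
# -> roughly 2.5x: doubling the transistor count and raising the clock
#    66% swamps whatever the shrink gives back, so the chip runs hotter
#    even on the smaller process.
```

Under those assumptions the NV30 lands at roughly 2.5x the dynamic power of the Ti 4600, which is why the smaller process alone couldn't keep it cool.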

So you'll never see anyone arguing against the "die shrink". Instead you'll see them saying "WTF, they shrunk the die and it's STILL this hot?"

You're posting in a forum with class. It may be third class, but it's still class!
 
