Nvidia made a good decision to jump to .13?

Boxical

Distinguished
Mar 15, 2002
6
0
18,510
I'm no genius, and I do realize that ATI has the better card out right now. What I'm wondering about is this:
Even though Nvidia is experiencing delays as a result of deciding to use the .13 micron process for their new card, once they get it out they will have a much easier time making newer and faster cards (raising GPU core speeds) on .13 micron.
ATI, on the other hand, is going to hit a roadblock. Sure, by not switching to .13 they got a faster card out now, but won't they run into problems when trying to push GPU core speeds even higher while they're still on .15? Eventually they will be forced to switch, and then they will be behind Nvidia again.
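
Just to put rough numbers on the shrink itself, here's a quick back-of-the-envelope sketch (Python; idealized first-order scaling only, the figures are illustrative and not actual NVidia/ATI specs):

```python
# Idealized, first-order process-shrink math. This is an illustration,
# not real NVidia/ATI figures.

old_node = 0.15   # micron (R300's process)
new_node = 0.13   # micron (NV30's process)

linear = new_node / old_node   # ~0.87 linear shrink
area = linear ** 2             # ~0.75, so the same design fits a ~25% smaller die
clock = 1 / linear             # ~1.15, so roughly 15% clock headroom in the ideal case

print(f"same design uses about {area:.0%} of the old die area")
print(f"idealized clock headroom: about {clock - 1:.0%}")
```

In practice, yields and early-process bugs eat into those ideal numbers, which is exactly where the delays come from.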

Any thoughts? Rebuttals? Complaints?
 

Crashman

Polypheme
Former Staff
Dude, you're so full of crap your eyes are brown! ATI is also moving to .13 micron. They simply released the card "early" on the .15 micron process while the foundries they work with continue to work the bugs out of their .13 micron manufacturing process. In fact, they plan on moving their own 9700 chips to the .13 micron process whenever these foundries are ready.

<font color=blue>You're posting in a forum with class. It may be third class, but it's still class!</font color=blue>
 

letdown

Distinguished
Jul 11, 2002
66
0
18,630
If NV30 had made it out on time, allowing it to go head to head with the 9700, we'd all be praising nVidia.

Obviously it didn't make it, and that's because of the gamble on 0.13. Even though it seems stupid for nVidia to take the risk when they're already the market leader, what other card would they have released?

In other words, where could they have made more money on another 0.15 GF4? They have everything covered, and even though ATI just undercut them at every level, nVidia can still hang on to those markets on brand name alone. For a while...

Keeping the top spot among enthusiast cards isn't really that important, so there's no need to spend a bunch of time on a GF4 5000 that can just eke past a 9700.

The place they really got aced out is in the notebook market, which WILL cost them. But that's not NV30's fault.
 

vacs

Distinguished
Aug 30, 2002
239
0
18,680
Boxical, what you are saying is absolutely right.
If nvidia gets a stable nv30 out by the end of December (all available information confirms this), then nvidia will have a big lead for quite some time.

Of course ATI is also releasing a 9700 supporting DDR II, but that one won't be available before May/June. So concerning DDR II, nvidia will have a lead of 5-6 months.

The 0.13 micron issue is even worse for ATI, since they won't be able to release an r300 or r350 core on a 0.13 micron manufacturing process before Q4 2003 at the earliest; they will hit major problems doing this die shrink (the same ones nvidia had, and probably others).

So nvidia will again be ahead in the 3D market for maybe a whole year, if not more, since they are already preparing a 0.09 micron die for their next graphics card after the nv30...

Information about when ATI releases their DDR II chip can be found here: http://www.eet.com/semi/news/OEG20021024S0043

On a side note:
History repeats itself. What is happening now between nvidia and ATI already played out between Intel and AMD 2-3 years ago. Back then AMD released the Athlon XP based on the stable K6 core, which of course performed very well, since the K6 technology had been available for years and had already been fine-tuned. Intel, on the other side, had major problems breaking the 1GHz barrier with their PIII, and their new-generation P4 core didn't work or perform the way Intel had hoped in the beginning. So AMD was very successful because the competition had nothing to offer during that weak phase. Over the years the P4 has been optimized and enhanced, and it is now performing the way Intel always hoped it would, beating AMD again, since AMD concentrated so much on their now-dated, K6-based Athlon XPs, which are no longer scalable. On top of that, Intel has a nearly ready P4 on 0.09 micron and is already designing a CPU on 0.064 micron! AMD, meanwhile, is still having trouble getting their 0.13 micron dies to run cool, fast and stable...

Back to ATI and nvidia: ATI shamelessly took advantage of a situation where nvidia had nothing to offer because they were concentrating on cutting-edge technology. So ATI released the r300, which beats the GF4, but unfortunately the 9700 is not ready to keep up with the technology nvidia will release in two months' time (the nv30 showcase is already in 3 weeks!). ATI now has the performance lead over every competitor but will again be second by the end of this year. As with AMD, it will take ATI a lot of time (at least 9 months) to release a die shrink of their r300. Of course many ATI fans will now be shouting that the nv30 can't beat their 9700, but that behavior is normal when the underdog gets out a product that can temporarily beat the competition. Just wait for the nv30 release and it will all be calm again here in these forums :)

Have a nice day!

PS: I'm solely talking about performance and technology advantages, not pricing advantages which the r300 will surely have over the nv30!
 

Ghostdog

Distinguished
May 28, 2002
702
0
18,980
The only thing that is for sure about the NV30 is the 0.13 micron process. Do you know for sure, vacs, that the new nVidia chip will beat the r300? OK, so obviously it will. It is almost impossible for nVidia to f*** things up that badly. But when people say "oh yeah, the NV30 will most likely make cinematic rendering possible..." it's disturbing, since they hardly know [-peep-] in reality.

Right now ATI is doing very well, they've even promised a competitor for the NV30, but eventually they will have to move to 0.13. And if you think about the trouble that nVidia has had... Will they be able to patch the situation somehow? It could happen if they enslave the engineers and play it right.

I'm starting to feel like a real computer consultant.
 

vacs

Distinguished
Aug 30, 2002
239
0
18,680
nvidia has stated that they have the situation under control and that all past problems have been solved. We'll see about that in 3 weeks, for sure. I believe the question is no longer whether nvidia can patch it up (I'm sure they have); the question is how many problems ATI will have in the transition to 0.13 micron! Don't be so naive as to think ATI will have no problems whatsoever when doing the die shrink. Actually, it's more likely that ATI will have even more trouble than nvidia, since they don't have the resources and knowledge nvidia currently has! Mind you, I'm not saying ATI has mediocre engineers, because they have some of the best in the world, but that won't necessarily save them from die shrink issues...

Concerning "cinematic rendering": yes, it will be possible for the nv30 to do cinema-like CG rendering! Pay attention, I said "cinema-like". Graphics like those seen in "Final Fantasy: The Spirits Within" won't be available for the PC market anytime soon, even with the best shaders on any upcoming graphics card. Visuals like those in the Final Fantasy movie demand everything from rendering workstations with several gigs of RAM each. Some scenes in FF had to be divided into an average of 10 layers, rendered separately, and then composited together at the end to handle that massive amount of data.
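
To make the layering idea concrete, here's a minimal sketch of the standard Porter-Duff "over" step used to merge separately rendered passes like that (Python with NumPy; the layer data is completely made up, and this is a generic illustration, not Square's actual pipeline):

```python
import numpy as np

def over(front, back):
    """Porter-Duff 'over' for premultiplied RGBA: front covers back."""
    alpha = front[..., 3:4]
    return front + back * (1.0 - alpha)

# Stand-ins for separately rendered passes (H x W x RGBA, premultiplied).
rng = np.random.default_rng(0)

def fake_layer(h, w):
    rgba = rng.random((h, w, 4))
    rgba[..., :3] *= rgba[..., 3:4]   # premultiply color by alpha
    return rgba

layers = [fake_layer(4, 4) for _ in range(10)]   # ordered back-to-front

frame = np.zeros((4, 4, 4))
for layer in layers:   # composite each nearer layer over the running result
    frame = over(layer, frame)
```

The point is that each pass only has to fit in memory on its own; the final frame is just the accumulation.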

What we will be seeing with the nv30 is cinema-LIKE graphics, which can be nearly indistinguishable from the cinema counterpart if used correctly. Many tricks will have to be used and many details neglected, but in the end, realtime graphics like those we know from current CG movies will be possible. There is no doubt about it. The GF3 is already fast enough to render an FF scene in realtime; of course Aki's hair didn't have the same detail as in the movie, but in the end the cinema and GF3 scenes looked identical to the viewer. Don't forget that the shader performance and flexibility of the nv30 are light-years ahead of the GF3. You'll see when the first nv30-optimized demos come out; I can hardly wait :)

In brief, cinema-like graphics will be possible next year if developers can adopt the new shader possibilities of the nv30 quickly enough, although it's very likely that the first nv30-optimized games will unfortunately ship after the nv35 has been released.

The nv30 only opens the door to a new graphics revolution; the revolution itself will come later...
 

eden

Champion
This all looks interesting indeed, but it's the end result we want.

I'd like to correct you on the K6 thing. AMD did not build the K7 off the K6; the K6 was a separate design that came from NexGen. If you look at the K6 pipeline and core, it is most definitely NOT the same as the K7, far from it.
Please provide proof of what you claimed, though.

--
I guess I just see the world from a fisheye. -Eden
 

Rubberbband

Distinguished
Jul 9, 2001
867
1
18,985
A lot of people will wait for Nvidia's card anyway. Let's face it, most people can't afford the price of a 9700. Once Nvidia releases the GF5, they'll immediately release a budget version (Ti 5200?) and I'll buy that one. I also trust Nvidia's products more than Ati's. I've run two Ati video cards (8500LE and Radeon 64MB VIVO) and both were a pain to set up.

The Men Behind the GUNS!

MY SYSTEM: http://www.btvillarin.com/phpBB/viewtopic.php?t=327
 

vacs

Distinguished
Aug 30, 2002
239
0
18,680
Eden, would I shock you if I said that the Hammer core is basically an advanced K6 core? I can't find the article where the author explains this, but I found another article (I'd been looking hard to track it down again) which explains why the Athlon XP is just a higher-clocked K6 CPU:

http://www.emulators.com/docs/pentium_4.htm
Scroll down until you find the headline "Failed to Meet Expectations", then read the next paragraph.

In fact, I can only suggest that everyone read the complete article (it may take several hours) to understand the current situation between AMD and Intel much better. It's very, very interesting (especially Rounds 3 and 4) and not too difficult to understand:

http://www.emulators.com/pentium4.htm

BTW, the author of that article previously preferred and recommended AMD until... well, read the article ;)
 

Ghostdog

Distinguished
May 28, 2002
702
0
18,980
No, you misunderstood. I agree that nVidia must have already solved the previous issues, since the release is under a month away.

I never said I doubted that ATI will have 0.13 problems. I'm sure they will, maybe even bigger ones than nVidia's (since, as you stated, nVidia is a bigger company with more experience). I only wondered if they could have some replacement products to carry them through the manufacturing change.
nVidia has had a gap between products, while ATI gave us the 9700 Pro and for the first time truly took over the throne. Maybe they couldn't push the GF4s any higher and focused all their energy on making their next chip something big. I really hope so.

As already discussed in another thread, even though Cg would make all kinds of cool stuff possible, it will take some time before we see Cg games.
For one thing, most gamers have "outdated" hardware, and it takes some time to get used to new technology. Maybe there are a few game developers who want to make the coolest graphics out there, but since DX8.1 is only now going mainstream, we will have to wait a bit.
And how about compatibility? I'm not too clear on this issue. Is the NV30 the only chip that currently supports Cg, or is it that Cg is compatible with any DirectX 9 hardware?

Could you give me some links to these "Cg movies"? I would love to see what my new GC will be able to do.

I'm starting to feel like a real computer consultant.
 

Ghostdog

Distinguished
May 28, 2002
702
0
18,980
When the NV30 comes out, the Radeons will have to cost less, while the NV30, as a new product, will cost about the same as the first Radeon 9700 Pros did. The budget version will cost less, but will it be worth it?

I'm starting to feel like a real computer consultant.
 

Crashman

Polypheme
Former Staff
ATI ALREADY reported problems with the die shrink, which would indicate they are ALREADY working on it.

Shameless? You mean like how nVidia held back good drivers for the GeForce3 until the day before ATI released the 8500?

You're posting in a forum with class. It may be third class, but it's still class!
 

eden

Champion
I know that article, and you're far from right about the K6-K7 thing; nothing even backs up your claim!

I can name you many things, or THG themselves can show you, how far the K7 is from the K6 (read the old articles about the K7's introduction), and how the K8 is a souped-up K7 with 2 extra pipeline stages, NOT a K6.

--
I guess I just see the world from a fisheye. -Eden
 

chuck232

Distinguished
Mar 25, 2002
3,430
0
20,780
Hmmmm... how did this thread get off into Hammer stuff? Let's try to get it back on course.

Here's my thoughts on this matter:

NVidia made a good choice in moving to .13u, but their 2 problems were that it took them too long to get the card out and that it wasn't as good as the R9700 Pro. If they had been able to transition to .13u earlier and get their card up to R9700 Pro performance, that would've been great. Unfortunately, their card had to be modified to outperform the R9700 Pro, and I think after that, DDRII pushed them further behind schedule than the .13u problems did.

This way, ATi will have a .13u card out before the middle of next year, and nVidia won't have much of a window before then. ATi basically one-upped nVidia in terms of product cycles, and unless ATi misses one, nVidia may never get that advantage back.

...And all the King's horses and all the King's men couldn't put my computer back together again...
 

Makaveli

Splendid
I agree with you, chuck. Ati is aware of the 0.13 issues nvidia had; I seriously doubt they've been sitting on their ass. Ati might have problems going to 0.13, and they might not! Companies learn from other companies' mistakes. And I believe the problems nvidia was having were due to an immature manufacturing process and low yields. I think ATI is already working on a DDR II Radeon on 0.13!
Don't get me wrong, nvidia is a great company with a good track record. But Ati has been doing this just as long, if not longer. Everyone doubted they could recover after the 8500 screw-up at release, and look at where they are now.

In the end this is all just speculation and none of us know what is gonna happen, so until I see an NV30 production board all bets are off!
 

eden

Champion
I was simply countering a rather false claim that the K7 is just a souped-up, clock-ramped K6.

I think vacs is mixing up the core names.
The K6 is the old 1997-98 AMD CPU; it had three versions (K6, K6-2 and K6-III) and could run on Intel motherboards and sockets. The K7 came out in 1999 under the name Athlon and used Slot A and, later, Socket A (462).
Palomino is a K7 on a refined process with core advancements.

I'm fairly sure vacs is confusing core architectures with core names.

--
I guess I just see the world from a fisheye. -Eden