Could it really have happened? Is it true?

vegandago

Distinguished
Nov 9, 2002
Disclaimer: In all honesty, by writing this editorial I in no way intend to start a flame war. Rather, I intend to open the door to an open and objective discussion of where the graphics card industry is heading.

Furthermore, by writing this, I am in no way claiming or implying that I am an expert on graphics cards. I am simply an end user who, through the years (beginning with Diamond's Monster 3D), has watched the industry grow and develop.

Now with that said, let me move on to my topic for discussion:

Has ATI done to Nvidia what Nvidia did to 3dfx?

WAIT!!! DO NOT HIT THE REPLY BUTTON YET!!!

Geez, at least hear me out first.

OK, where to begin??? Well, let's start with the timeframe here. As we all should know, ATI released the Radeon 9700 Pro in August, featuring their much-hyped R300 chip. This chip has proven to be amazing in terms of what is available today. As anyone with a computer can see, the 9700 easily performs 2–4x better than a Ti 4600 when the graphics options are cranked up.

So what about Nvidia? Well, as we all also should know, Nvidia plans on counter-attacking with their ultra-hyped NV30 chip. Unfortunately for us gamers, Nvidia has hit serious production delays, and we likely won't see the NV30 until February at the earliest.

So what kind of gap does that give us??? [Aug, Sept, Oct, Nov, Dec, Jan, Feb…]
OK, I've got it: about 6–7 months!

So, what does this really mean? Well, it means that unless ATI has been too busy partying over the success of the 9700 Pro for the last six months, they most likely have something up their sleeve for February.



Let's objectively analyze the GeForce FX as it has recently been revealed, and I apologize ahead of time if this seems overly negative, but that is the way I see it as a longtime (and current) user of Nvidia cards.

Nvidia chose to use the .13-micron process to help boost clock speeds and keep temperatures down, correct? Well, I do believe that when I saw the pictures of the GeForce FX last week, it had an absolutely ridiculous cooling mechanism attached to it. And come on, folks, let's be realistic here. THAT FAN IS RIDICULOUS FOR A STOCK CARD.

What does this indicate to me? It indicates that Nvidia is in fact having a lot of difficulty keeping the NV30 at a safe working temperature. This is something all manufacturers face at some point, but the fact that they need such an absolutely ridiculous cooler for this card shows me that the NV30 is already near its limit. What next? Include water-based cooling out of the box? If we put one of those huge coolers on a Radeon 9700, how far could we overclock it? Would the NV30 be worth it then?

If I remember correctly, Nvidia is claiming, and correct me if I am wrong, a 25–50% increase in performance over the 9700 Pro.

Let’s face it, those numbers are probably BS… That is what these companies do… They put out BS numbers.

But let’s assume that Nvidia decided to be accurate (in real-time performance)…

What are we expected to do with a $400 card that is already near its limit and that only offers a 25–50% performance boost over a card that was available six months ago? This of course doesn't take into account that the 9700 Pro gets by with a much more conservative heatsink/fan combo… What happens when we put a slot cooler with heat pipes on a 9700 Pro? Could we see equal or better performance out of it?



Of course, does any of this really matter? Because ATI surely has not been sitting on their asses for the last few months.

Well, that is a good starting point… Please give me your feedback, and please try to keep it civil and objective. Like I said, I apologize if it seems very negative, but I really am looking at this scenario objectively. I am a longtime user of Nvidia cards, but I just don't feel it anymore. I think their time at the top may be coming to an end.



"There is no dark or light side, only power and those too weak to seek it."
 

TheGame

Distinguished
Sep 8, 2002
I hope not, because as much as I like ATI, I don't want to see them put nVidia out of business. The reason being, who will be the competition for ATI? And no competition means higher prices for consumers.

-TheGame-
 

vegandago

Distinguished
Nov 9, 2002
I'm sorry, I should have clarified... When I made my opening statement I didn't mean to imply that Nvidia could go out of business, just that they could potentially have lost their position as king.

"There is no dark or light side, only power and those too weak to seek it."
 

HolyShiznit

Distinguished
Nov 21, 2002
Yeah ATI has the best card out now, and has for a while. But at that high of a price, I wouldn't think Radeon 9700 sales would kill off NVidia's business.
 

vegandago

Distinguished
Nov 9, 2002
"Yeah ATI has the best card out now, and has for a while. But at that high of a price, I wouldn't think Radeon 9700 sales would kill off NVidia's business."

When have you ever seen the top of the line graphics card sell for less than $300?


"There is no dark or light side, only power and those too weak to seek it."
 

knowan

Distinguished
Aug 20, 2001
There's a rumor that the R350 (successor to the 9700) has just taped out. I haven't heard any news yet on features, speed, or even die size.

Oh, and the heat pipe on the GeForce FX has liquid inside it, so in effect it is liquid-cooled already. I don't think it will overclock very well.

--------------
Knowan likes you. Knowan is your friend. Knowan thinks you're great.
 

bloaty

Distinguished
Sep 25, 2002
That thing is liquid-cooled as well? Haha, and I thought it just had a big-ass cooling system for show. But that is a good point then, since the Radeon chip runs relatively cool (can't say the same for the RAM though). Perhaps Nvidia is clocking the hell out of their card just to match or beat the R300?
And I don't just hate Nvidia either, I just hate everything in general.
 

Crashman

Polypheme
Former Staff
Didn't ATI say they wouldn't be ready for .13 micron until mid-February? And that the R300 had been PLANNED for .13 micron but moved to .15 micron when they confronted this delay? It sounds like nVidia couldn't make that concession because the NV30 already produces too much heat! Anyhow, even if the NV30 performs as expected, and even if the R350 stomps it, nVidia will be around long enough to recover the lead eventually, because of brand loyalty. Remember the stubborn 3dfx users who continued to buy those products even after there had been no new technology for two years? Well, a delay of six months or even a year for nVidia to regain a good and lasting lead would be completely acceptable to their disciples! Don't forget that most of the people in this forum still recommend the Ti4200 over the 8500 even though it is the same speed, with fewer features and lower-quality TV output, and costs much more. It's ATI's challenge to gather those who ride the borderlines right now.

Also remember that the 1337 G4YM3RZ won't settle for anything less than an nVidia/AMD solution, many going so far as to claim a six-month-old Ti4200/XP2100+ system would beat anything from ATI or Intel, which would include a 9700/2.8GHz combo. Such buyers will continue to influence the idiots they know, ensuring nVidia sales will remain strong no matter what happens.

You're posting in a forum with class. It may be third class, but it's still class!
 

LtBlue14

Distinguished
Sep 18, 2002
To be fair, the 4200 DOES outperform the 8500 in all of Tom's VGA tests; I just went back to make sure. In some benchmarks the difference is negligible (Q3A), but the 4200 also overclocks better than the 8500, if I'm not mistaken. People have also preferred Nvidia for a while because of ATI's past driver problems; they want to recommend something that's tried and true. Supposedly ATI is releasing good drivers now, but past issues will still have their effect on the mentality of the buyer or advisor. I DO think that there are many Nvidia fans who would not think to recommend the 8500 to the average user looking to save money and still get a decent card. It depends on what you want for your money (namely, OC-ability and a good reputation for drivers).
 

Spitfire_x86

Splendid
Jun 26, 2002
"I hope not, because as much as I like ATI, I don't want to see them put nVidia out of business. The reason being, who will be the competition for ATI? And no competition means higher prices for consumers."

Trident may emerge as a competitor to ATI if Nvidia dies. SiS could also do it if they give up cheating with their graphics chips (the Turbo Filtering mode of the Xabre 400, for example). They are planning to launch the Xabre II in Q1 2003; according to SiS, it should perform like a Radeon 9500 Pro.

Let us know: What file compression format do you use? (http://forumz.tomshardware.com/community/modules.php?name=Forums&file=viewtopic&p=25703#25703)
 

ckal

Distinguished
Nov 26, 2002
"As anyone with a computer can see the 9700 easily performs 2 – 4x better than a Ti-4600 when the graphics options are cranked up. "

From the benchmarks on this site:
http://www6.tomshardware.com/graphic/02q3/020819/radeon9700-11.html

I only see a 25–30% increase in most places by the Radeon 9700 Pro over the Ti 4600... not 200–400%.

Also, this six-month gap is nothing new in the video card industry. In fact, let's look at the last round of the ATI vs. Nvidia war.

So, it wasn't until August that ATI took back the title of having the fastest video card. That's roughly six months after this review, where the Ti 4600 clearly dominated the market: http://www6.tomshardware.com/graphic/02q1/020304/geforce4-09.html

And it's not as if Nvidia was even really losing at that point; as we can see, the GeForce3 Ti 500 could still outmuscle the ATI 8500 (the best card they had then) in certain areas as far back as Nov 7th (almost an entire year ago): http://www6.tomshardware.com/graphic/01q4/011107/radeon-05.html

And although the ATI Radeon 9700 Pro is the overall fastest card out right now, it's still not untouchable.
http://www6.tomshardware.com/graphic/02q3/020819/images/image049.gif
 

HolyShiznit

Distinguished
Nov 21, 2002
"When have you ever seen the top of the line graphics card sell for less than $300?"

Oh I agree completely; I wouldn't expect a top card to be cheap. My point was that, in terms of the big picture, the big sales are mostly of the cheaper cards, right? Doesn't NVidia stack up pretty well against the mid-range ATI cards?
 

Crashman

Polypheme
Former Staff
I thought those tests were old? You should talk to AMD_Man, he knows what's happened since. I'm fairly certain he pointed out benchmarks since then showing that newer driver revisions put the 8500 on top about half the time.

You're posting in a forum with class. It may be third class, but it's still class!
 

vegandago

Distinguished
Nov 9, 2002
"As anyone with a computer can see the 9700 easily performs 2 – 4x better than a Ti-4600 when the graphics options are cranked up. "

from the benchmarks on this site
http://www6.tomshardware.com/graphic/02q3/020819/radeon9700-11.html

I only see a 25-30% increase in most place by the radeon 9700 PRO over the ti 4600 .. not 200-400%""




To clarify, I was talking about when we turn anti-aliasing and anisotropic filtering on, considering that without those options both cards are largely CPU-limited.

<A HREF="http://www.tomshardware.com/graphic/02q4/021104/r9700pro-cards-22.html" target="_new">http://www.tomshardware.com/graphic/02q4/021104/r9700pro-cards-22.html</A>

<A HREF="http://www.anandtech.com/showdoc.html?i=1683&p=10" target="_new">http://www.anandtech.com/showdoc.html?i=1683&p=10</A>

just some examples


"also this 6 month gap is nothing new in the video card industry, in fact lets look at the last part of the ATI vs. Nvidia war

So, it wasn't until august that ATI took back the title of having the fastest video card. That's 6 months after this Nov 7th review were the TI 4600 clearly dominated the market : http://www6.tomshardware.com/graphic/02q1/020304/geforce4-09.html"


This is true, but it's different from what I was indicating. Yes, the Ti 4600 was the best for six months... but wait, when has ATI EVER been better than Nvidia prior to the 9700??? I don't think they ever were. The simple fact of the matter is that Nvidia has not only lost but MISSED an entire product cycle. This favors ATI greatly. Not to mention that, based on what I see with the GeForce FX, Nvidia is struggling to make it a viable competitor... reread my original post about the cooler.


"There is no dark or light side, only power and those too weak to seek it."
 

LtBlue14

Distinguished
Sep 18, 2002
The 200–400% remark was accompanied by a "with graphics options cranked up".
You want this page for those benches: http://www6.tomshardware.com/graphic/02q3/020819/radeon9700-23.html

<P ID="edit"><FONT SIZE=-1><EM>Edited by ltblue14 on 11/25/02 07:14 PM.</EM></FONT></P>
 

LtBlue14

Distinguished
Sep 18, 2002
Whoops, I see that you already addressed that, and neither the edit post nor delete post functions seem to be working properly right now...
 

eden

Champion
Use [quote] and [/quote] (square brackets, not parentheses) to quote stuff; I could not tell what you were replying to or where your reply began!

I agree with you, however; you're spot on.
As for ckal, there's a good example of the traditional nVidia zealot.

--
*You can do anything you set your mind to man. -Eminem
 

andrew632

Distinguished
Nov 26, 2002
The fact of the matter is, since ATI released the 9700 Pro, nVidia has gained about 1.5% more market share while ATI lost about 1% (these are conservative estimates). While ATI might have the "performance crown," nVidia just has an assload of "budget" and midrange cards that flood the market and are the mainstream graphics cards (GF4 MX440, Ti 4200) that most people tend to buy. ATI released the 9000 and other low-end models a little too late.
 

Ghostdog

Distinguished
May 28, 2002
Generally I think ATI is on a roll right now, but that's not enough to kill nVidia as a company.

nVidia did mess up a bit, however: the NV30 should have used a 256-bit memory interface, then it would be future-proof as well. The 0.13-micron process should have kept nVidia's production solid for a good while, but looking at the GeForce FX, they might have to start considering 0.09 micron or similar within a few generations. But then again, the GFfx is, according to nVidia, a new architecture altogether.

Also, apart from the PS/VS 2.0+ support, the features don't seem that much different from the R300. But I guess you would need to work for nVidia to really know what makes it special.

The main point is, nVidia is behind schedule, and it might be enough for ATI to get their counter-attack out to battle the GeForce FX.

I'm starting to feel like a real computer consultant.
 

davepermen

Distinguished
Sep 7, 2002
"nVidia did mess up a bit, however: the NV30 should have used a 256-bit memory interface, then it would be future-proof as well. The 0.13-micron process should have kept nVidia's production solid for a good while, but looking at the GeForce FX, they might have to start considering 0.09 micron or similar within a few generations. But then again, the GFfx is, according to nVidia, a new architecture altogether."
Their new card is now more brute force than cleverness. Just think of the power being used there, judging from the technical specs alone, and still it does not look like it will beat the R300 by much. And the cooler, on top of the .13 process, clearly shows they are going to the max of the power right from the start.
That's not a good start. They should beat the R300 without being faster by force, and _then_ crank up the force even more. But it looks like the chip itself is slower; it just runs at a higher clock. So once ATI gets a speed boost on their card (.13 micron or not), it'll look bad.
I'm just thinking about getting an NV30 just to rip off the cooler and put it on my Radeon :D

"Also, apart from the PS/VS 2.0+ support, the features don't seem that much different from the R300. But I guess you would need to work for nVidia to really know what makes it special."
Yep, nothing new except those +'s. That's _NO_ new generation, no _NEW_ big step, _NO_ big advantage or move into the CineFX world. Not at all. It's what the R300 has had for half a year, except more instructions in the pixel shader and branching/conditionals on variables in the vertex shaders.
_BUT_ those features are not standard, and developers will have to write two entirely different paths to get the most out of both the standard and the nVidia proprietary extensions. Guess why nVidia does so much marketing, even on the developer side? If developers don't support the proprietary features, and chances are they won't, nVidia loses quite a lot: there is power in their card which no one can use, but everyone pays for. If you code for the nVidia proprietary path, on the other hand, it's difficult to get standard code working, so you need to code twice, and that is not efficient.
So those in fact tiny additional features don't help much, but force developers, once again, to make a choice and draw a line, where in fact only the standard should be there to use.
I wonder if I will ever see a card from nVidia which has _ONLY_ the standard and _NOTHING_ more in it.
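As a minimal sketch of what that dual-path burden looks like in practice (the function and file names below are hypothetical, purely for illustration, and do not correspond to any real graphics API), a renderer ends up having to pick a shader path at runtime and the developer has to author and maintain both:

```python
# Hypothetical helpers and capability flag, illustrative only; not a real API.
def load_shader(path: str) -> str:
    # Stand-in for compiling/loading a shader file.
    return f"compiled({path})"

def pick_shader_path(supports_vendor_extensions: bool) -> str:
    if supports_vendor_extensions:
        # Vendor-specific path: can use the extra instructions and branching,
        # but must be written, tested, and shipped separately.
        return load_shader("water_vendor_ext.shader")
    # Standard path: runs on every DX9-class card, ignores the extra features.
    return load_shader("water_standard.shader")

# The same effect has to exist twice, once per path.
print(pick_shader_path(True))
print(pick_shader_path(False))
```

That duplication, multiplied across every effect in a game, is the "code twice" cost being described above.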

"The main point is, nVidia is behind schedule, and it might be enough for ATI to get their counter-attack out to battle the GeForce FX."
Heh, once they wanted to launch at the same time as the R300...
Yes, they are quite behind schedule :D If only ATI can crank theirs up to about the same speed, but cheaper, just before launch... that'll be so fun :D (And ATI is able to produce slower, cheap cards for the standard end user, while nVidia isn't yet.)

"take a look around" - limp bizkit

www.google.com
 

hartski

Distinguished
Feb 12, 2002
I agree. If it has been six or more months since the Ti4600 was king, Nvidia should have had the NV30 out no less than two months prior to the release of the R9700.

When it comes to "loyalty," I still see a lot of misinformed consumers who just go with the most popular brand, Nvidia. That is, however, for the mainstream cards, namely the GF4 MX. Most people would buy a GF4 MX over a Radeon 8500, even though both are close in price. Although now ATI has the spotlight to itself, it has attracted a lot of consumers of all levels, and ATI's name is ever more exposed.

It took a long time for ATI to take the throne and not even the R8500 was able to do that.

All I hope is that come NV30 release ATI has R350 out too or so close. I would like to see ATI and Nvidia really going at each other, same with AMD and Intel.

*Off topic* --- is there a connection? Intel = Pentium Classic, P2, P3, P4 ... Nvidia = GF Classic, GF2, GF3, GF4
... AMD = Athlon, Athlon XP, Athlon 64 ... ATI = Fury, Rage, Radeon ... Also, AMD and ATI are both acronyms.
 

phial

Splendid
Oct 29, 2002
Yes, he's right, the 4200 DOES outperform the 8500. They do come quite close in some benchmarks, but on the whole the 4200 is faster (which doesn't say much, because the 4200 is a generation newer than the 8500, and really the price difference doesn't justify the performance difference... I'd go for an 8500 any day).

And as far as the .13 process... going with this manufacturing process actually produces less heat along with allowing faster clock speeds. This doesn't mean that the chip will run cooler; it will just run a few hundred MHz higher at the same temperature (my K6-2 550 is hotter than my Duron 1.3GHz, for example).

IMO, I think you're right about ATI having something up its sleeve... and there are so many other things to look at. If ATI were to release a .13-micron version of the R300, they could easily raise it to clock speeds comparable to the NV30. Pair that with DDR2 and you've got a card that will easily compete with NVIDIA, I think.

Consider this:
The NV30 @ 500MHz does 350 million vertices a second.
The R300 @ 325MHz does 325 million vertices a second.

So... what will the R300 do at speeds equal to the NV30? Outperform it =). Looks like the ATI chip is more efficient to me. I've said this before...
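To make the per-clock comparison concrete, here is a minimal sketch (Python, taking the peak figures quoted above at face value and assuming throughput scales linearly with core clock, which real cards only approximate) of the vertices-per-clock math:

```python
# Peak figures quoted in this thread (assumption: true peaks, linear scaling with clock).
cards = {
    "NV30": {"clock_hz": 500e6, "vertices_per_sec": 350e6},
    "R300": {"clock_hz": 325e6, "vertices_per_sec": 325e6},
}

for name, spec in cards.items():
    per_clock = spec["vertices_per_sec"] / spec["clock_hz"]
    print(f"{name}: {per_clock:.2f} vertices per clock")

# Scale the R300 to the NV30's clock under the linear-scaling assumption.
r300, nv30 = cards["R300"], cards["NV30"]
scaled = (r300["vertices_per_sec"] / r300["clock_hz"]) * nv30["clock_hz"]
print(f"R300 at NV30 clock (linear scaling): {scaled / 1e6:.0f} million vertices/sec")
```

On those assumptions the R300 handles about 1.0 vertex per clock against roughly 0.7 for the NV30, and would hit around 500 million vertices/sec at 500MHz, which is the basis of the "more efficient" claim.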

And also there are the NV30's features that the R300 doesn't have, like the alleged partial DX10 support. Well, by the time DX10 comes around, both companies will have several new cards out each... so DX9 seems fine for me now. Heck, I'm satisfied with DX8.1 until the games come out =)
 

bikeman

Distinguished
Jan 16, 2002
A possible explanation for the huge cooling requirements is found in the same area as the reason an Athlon XP runs hotter than a P4 even when both dissipate the same wattage: die size. The Radeon 9700 could be dissipating more heat, but it has a larger surface to do so. Maybe the GeForce FX dissipates less power, but needs a more... 'robust' cooling solution to get rid of the heat. So, in theory, nVidia could quite easily solve this problem by increasing the die size in their next stepping, and who knows how far it would be able to ramp up in speed then.
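As a rough sketch of that die-size argument, with made-up wattage and area numbers purely for illustration (neither chip's real figures appear in this thread), the quantity a heatsink has to cope with is power per unit of die area:

```python
# Hypothetical numbers, illustrative only; not the real specs of the R300 or NV30.
chips = {
    "larger die (think .15-micron part)":  {"watts": 70.0, "die_mm2": 215.0},
    "smaller die (think .13-micron part)": {"watts": 60.0, "die_mm2": 125.0},
}

for name, spec in chips.items():
    density = spec["watts"] / spec["die_mm2"]  # watts the cooler must remove per mm^2 of die
    print(f"{name}: {density:.2f} W/mm^2")

# Even with fewer total watts, the smaller die concentrates its heat on less
# area, so it is the harder chip to cool.
```

Under those assumed numbers the smaller die ends up around 0.48 W/mm^2 against roughly 0.33 W/mm^2 for the larger one, which would be consistent with a physically smaller chip needing the more aggressive cooler.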
BTW, can I note that I don't like the totally negative attitude being taken towards this new product? The starting post of this thread quite explicitly asked people to answer in a civilised manner, which is not happening, apparently. Some people like to make themselves heard not by what they say, but by how they say it. But I'm pointing at _NOBODY_, really...

Greetz,
Bikeman
PS: Nobody, I respect what you say and the seeming wisdom behind it, but I really get annoyed by your writing style. On the other hand, respecting doesn't mean I agree with you, but that's something else...

Then again, that's just my opinion
 

bikeman

Distinguished
Jan 16, 2002
"Consider this:
The NV30 @ 500MHz does 350 million vertices a second.
The R300 @ 325MHz does 325 million vertices a second."
Just a thought I had: could it be that these vertex pipelines are getting more and more similar to a CPU? That could mean that there, too, you have IPC considerations coming into play. I mean, maybe the nVidia core is better at some instructions than the Radeon is, and ATI's 325 million figure is a maximum. It's just a thought, trying to explain the rather low vertex count the GeForce FX is capable of handling. (How dare I say 'rather low', actually...)

Greetz again,
Bikeman

Then again, that's just my opinion