
ATi vs. nVidia - An Interesting Read

Last response: in Graphics & Displays
November 1, 2003 5:18:30 PM

<A HREF="http://www.penstarsys.com/editor/tech/graphics/nv_ati_d..." target="_new">http://www.penstarsys.com/editor/tech/graphics/nv_ati_d...</A>

----------------------------------------------
this signature area is too dern short, no space to put your system specs
November 1, 2003 6:00:32 PM

Thx, that was interesting to read and gives some hope to those who already have FX cards :)  Although I'd rather wait for the second part of that article before drawing any conclusions.
November 1, 2003 6:59:06 PM

Damn it I just sent that to GW (read it yesterday morning at work).

I was going to post it later, since the end says there are lots of errors and he's working on a rewrite (I thought GW could give him a more nV-centric view).

Here's the segment:
<i> This is not to say that everything here is wrong, but new information is certainly changing my perception of the situation. I would still welcome any feedback that readers would like to give. I am getting a lot of ATI-centric information, but I would like to also get a lot more NVIDIA info regarding their current technology and philosophy.</i>


Oh well, the cat's out of the bag.

It was a nice breakdown. I do think he was a little soft on the nV 'cheating' issue, and that using it as a 'saving the shareholders money' defense was schtoopid. What about ATI's shareholders? What about consumers? Would defrauding the government (the accounting issue nV faced) be an acceptable method of protecting the shareholders' investments?

I don't buy that excuse. Reviewers and gamers don't/shouldn't care about the shareholders; they should simply care about trying to be objective and doing faithful comparisons. 'In a blind taste test more people preferred the taste of nV's green PCB to that of ATI's red one.' Now stuff like THAT is important to know! Not floptimizations.

Anywhoo, nice grab. I saved it and was going to check on it periodically for updates. Well, I'll still do that, but at least now there's something to talk about.

Nice grab LtB.



- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! <A HREF="http://www.redgreen.com" target="_new"><font color=green>RED</font color=green> <font color=red>GREEN</font color=red></A> GA to SK :evil: 
November 1, 2003 7:08:31 PM

I think the one thing you can draw from the article is that there is NO hope for the current FX cards. Sure, they will be able to display VS/PS 3.0 stuff while the Radeons will not, but just like the FX5200 with PS 2.0 stuff, they won't be able to perform well enough for any game using them. Meaning that the current FX line will simply be good at showing demos, and maybe Longhorn (what I've always advocated for the FX5200 line :wink: ).

The main thing that should console nV fans is that they have EXperience in 32-bit architecture, and therefore have a leg up on ATI in that respect, so FUTURE products will likely benefit. Speaking of experience benefits, I don't think nV's experience with the 0.13 micron process gives them any advantage, as it's been a bad/troubled experience. ATI reached low-k first, and made their part work with GOOD yields. Sure, ramping up more speed/transistors might seem like a big issue, if it weren't for the fact that nV isn't doing it well either. IBM MAY be able to help with future processes, but then again so may Intel if they decide to fab ATI's next next-gen piece, the R500 at 0.09 (which seems to be the buzz, does make a lot of sense, and is where they have a lot more experience), but it does seem that they are with TSMC for the foreseeable future, even heading into the <A HREF="http://www.xbitlabs.com/news/video/display/200309290047..." target="_new">0.11 process</A> for a refresh.

Needless to say, that's all speculation, but it is the only bright hope for nV, NOT for its current customers (unless nV gives them loyalty credit towards their next purchase of the cards that SHOULD benefit consumers).

Once again the future will reveal all!

Damn Future! I want it Now!


November 1, 2003 8:39:59 PM

Jeebus - is the author of that article an nVidia employee? It's so one-sided it's unbelievable. Excuse after excuse as to why the GFFX sucks in the real world but is supposedly actually a much better overall product. What a load of crap. The way they put it ATi/MS conspired against nVidia when really what happened is nVidia thought it was bigger than the rest of the industry and tried to do their own thing to corner the entire market.


Also, nVidia never in a million years thought ATi was going to come out with something so good, so it was ramping things up slowly to keep bleeding the market without having to bring too much innovation to the table. When the 9700 Pro came out, nVidia [-peep-] their pants and had to overclock their weak technology through the roof to try to compete: hence the extra delays and the dustbuster attached to the side, not to mention all the driver cheats.

The biggest joke though is trying to excuse nVidia cheating their customers because they have a responsibility to their shareholders! Weak argument doesn't even begin to describe that.

I wonder what card the author has in his machine...
<P ID="edit"><FONT SIZE=-1><EM>Edited by RAIN_KING_UK on 11/01/03 05:44 PM.</EM></FONT></P>
November 1, 2003 11:27:15 PM

Quote:
Damn it I just sent that to GW (read it yesterday morning at work).

I was going to post it later, since the end says there are lots of errors and he's working on a rewrite (I thought GW could give him a more nV-centric view).

Here's the segment:
This is not to say that everything here is wrong, but new information is certainly changing my perception of the situation. I would still welcome any feedback that readers would like to give. I am getting a lot of ATI-centric information, but I would like to also get a lot more NVIDIA info regarding their current technology and philosophy.


Oh well, the cat's out of the bag.

Yeah, sorry! I am looking forward to the follow-up on this article, and we have to remember that there's more to the story than what he has there... so no DEFINITE conclusions yet.

Quote:
I think the one thing you can draw from the article is that there is NO hope for the current FX cards. Sure, they will be able to display VS/PS 3.0 stuff while the Radeons will not, but just like the FX5200 with PS 2.0 stuff, they won't be able to perform well enough for any game using them. Meaning that the current FX line will simply be good at showing demos, and maybe Longhorn (what I've always advocated for the FX5200 line ).

Agreed.

Quote:
Jeebus - is the author of that article an nVidia employee? It's so one-sided it's unbelievable. Excuse after excuse as to why the GFFX sucks in the real world but is supposedly actually a much better overall product. What a load of crap. The way they put it ATi/MS conspired against nVidia when really what happened is nVidia thought it was bigger than the rest of the industry and tried to do their own thing to corner the entire market.

He was saying how the NV3x series were designed and why they aren't doing well compared to the ATi cards. They were basically designed against a spec that was never adopted (though in the future more of the features will be supported), and that's why they perform so poorly. He does say that nVidia went their own route, and got burned for it in the end. He doesn't describe it as an ATi/MS conspiracy; he says that most of the industry thought the ATi/MS route was a better one to follow than the nVidia one, and ATi was just smart to stick with them. I can see how you could think it's biased, but in fact I think it's a very good explanation for why nVidia has done as poorly as they have. They DID, after all, own the industry for a long time (and it can be argued that they still do; they're still much bigger than ATi), and they HAVE designed very good cards in the past. It raises the question of why their cards were so BAD this time around, and this article really gives a good answer for it. It doesn't make <i>excuses</i>, it just says that nVidia screwed up by going their own way.

EDIT: I wanted to add that one reason it could seem nVidia-biased is just that there is so much info on nVidia and hardly any on ATi, which he admitted. Now he's being flooded with ATi info, so the next article may focus on their experience a bit more.

<P ID="edit"><FONT SIZE=-1><EM>Edited by LtBlue14 on 11/02/03 00:17 AM.</EM></FONT></P>
November 1, 2003 11:47:20 PM

Jesus, guys, would you all just stop! Put your handbags away and have a sip of tea! Look, it's no good arguing who's the better; they both have good products. There need to be more than one manufacturer so they keep pushing each other to make better products! If there were only one, they'd get lazy and we would have DX9 for years and not have DX10, DX11, DX12...
November 2, 2003 2:14:22 AM

stfu u blinded fanATic, can't you accept the truth of why nVidia lost to ATi, and that it's possible for nVidia to waste ATi completely in the future? If you don't know anything, don't say anything. The only reason the dustbuster was as loud as hell is that the ABit people left nVidia to finish off the design on their own instead of finishing it for them. Driver cheats? That's a thing of the past; have you seen how good the 52.16s are? You probably haven't, 'cause you're still freaking blinded by ATi's high scores.

pssssss

RIP Block Heater....HELLO P4~~~~~
120% nVidia Fanboy
bring it fanATics~~ nVidia PWNS all!
SCREW aBOX! LONG LIVE nBOX!!
November 2, 2003 4:08:14 AM

I disagree. nVidia managed without ATi's competition and released the DX8 NV20. They never needed to, now did they?
But they did!
Only after that did the R200 come out.

In fact, was it even nVidia who conceived Pixel and Vertex Shaders? Wasn't it Microsoft who did, for the DX8 standard?

--
<A HREF="http://www.lochel.com/THGC/album.html" target="_new"><font color=blue><b>This just in, over 56 no-lifers have their pics up on THGC's Photo Album! </b></font color=blue></A> :lol: 
November 2, 2003 7:12:36 AM

Yeah, no need to apologize. I just tend to think, 'Ahh, cool, but I'll wait to share it; no one else knows about this, it's not on the big sites, and I'll wait for the update.' Of course, little did I know that about 5+ hrs later it appeared on all the major sites. :frown: Oh well, so be it.

It's interesting that in ATI's recent analysts' meeting they mentioned how important it was that they hit the DX9 standards, that that's where their focus will remain with DX9.1, and that they will be willing to sacrifice margin (not IQ or integrity) to gain performance, because they know that nV will be targeting that standard as well.

I'm not sold on the production advantages of nV. I think they squandered their potential lead. A bunch of underperforming parts does not make experience, and ATI is the only one with a LOW-K process part, not nV. So despite the help of IBM, they won't gain anything short term.

I have to disagree and say that, intentionally or not, there appears to be SOME nV-centric bend in his statements. Perhaps it's simply trying to defend them so as NOT to look like too much of a FanATIc, who knows. However, I disagree strongly with his take on their questionable 'floptimizations' being warranted or excusable. I think that's just rationalizing after the fact. Cheating for self-interest or cheating for the shareholder is still cheating, and I'm certain a large number of nV head honchos were more worried about THEIR bottom line than the shareholder's. These actions could have GRAVELY hurt the company and shareholders if word had gotten out even to the n00bs and knobs out there. They were lucky, I'd say (either that, or very savvy about the ignorance of their target FX5200/MX market).

I'm very interested in seeing how the adjustments/rewrite look. I even saved the original back at work so I could reference it later.


November 2, 2003 7:42:28 AM

DX10 WILL come With or Without competition.

After that who knows.

Supposedly DX10 will be incorporated into Longhorn because there are some MS effects that will require it (which actually makes me shiver; anyone remember the security hole in DX9? :eek: ).

There will always be competition, even if there isn't a competitor. That may seem to make no sense, but when development lags, and the barrier to entry is low enough to show a good return, you will see new entrants who will take market share. Sure, it's easier for incumbents to defend their position, but should they sit back, then there's nothing.

I think what we are more likely to see is that regardless of which companies compete, the hardware will outdo the software by leaps and bounds. We may have DX10 parts long before the release of DX10 itself, and that would be WAY before games actually used it. Just look at the DX9 situation: it's been more than a year, and how many games are out there that one could really call DX9? Maybe a handful?

The fact is that nV decided to go its own way, to go AGAINST the DX9 and HLSL standard and try to forge their own exclusive 'cult' where only THEIR parts got THEIR benefits. Hopefully they learned from this, because if nV HAD succeeded, then you WOULD see less competition, because one company would hold the key to the kingdom (whereas MS controls the software but not the hardware, and even then OpenGL is still around, just in case :wink: ).

The main thing is that while there is no use in arguing 'who's the better', there is a point in arguing who's done people wrong, so that hopefully they DON'T do it again, and other companies learn by example too. This wasn't some innocent design error, or nV trying to give their customers way better visuals; this was a play for power that didn't work out, and their reaction to that misstep was to add one more insult by 'floptimizing'.

Competition is good, but this actually describes the OPPOSITE of what you are saying.


November 2, 2003 3:12:14 PM

Yet another piece of Grape's infinite wisdom.

<b>nVidia cheated on me so I left her for ATi. ATi's hotter anyway...</b>
November 2, 2003 9:18:50 PM

Looking at your sig, I do so hope you were being ironic on purpose. ;) 
November 3, 2003 12:47:38 PM

Excellent point, TGGA. The FX line is a dead end going forward. The question I'd ask is: are the R350 and R360 set to fail on PS/VS 3.0 as well? We know the FX cards perhaps will have the ability but not the performance. What of the RADEONs?

I think the question the article didn't answer is what the REAL difference between PS2.0 and PS3.0 is. Not necessarily the 'on paper' definition, but what it entails as far as coding and instructions go, and how the VPU, pipeline, registers, drivers, etc. will deal with it, especially on ATI's front. Anyways, that's the info I'd like to know.
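For what it's worth, the headline 'on paper' gap, as I recall the DX9 shader-model specs (exact figures worth double-checking), is instruction count and flow control: PS 2.0 caps a program at 64 arithmetic plus 32 texture instructions with no dynamic branching, while PS 3.0 guarantees at least 512 instruction slots and lets a shader branch and loop on per-pixel data. A minimal Python sketch of that capability table:

```python
# Headline DX9 shader-model limits, from memory of the specs; treat
# the exact numbers as approximate. The practical coding difference:
# a ps_3_0 shader can branch and loop on per-pixel data, a ps_2_0
# shader cannot (every path must be computed and blended instead).
SHADER_MODELS = {
    "ps_2_0": {
        "arith_instructions": 64,
        "texture_instructions": 32,
        "dynamic_flow_control": False,
    },
    "ps_3_0": {
        "min_instruction_slots": 512,
        "dynamic_flow_control": True,
    },
}

def supports_branching(model):
    """True if the given shader model allows data-dependent branches."""
    return SHADER_MODELS[model]["dynamic_flow_control"]
```

That branching capability is exactly the kind of thing that hits the pipeline, registers, and drivers differently depending on the hardware underneath.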

----------
<b>Got any of that beer that has candy floating in it? You know, Skittlebrau? </b> <i>Homer Simpson</i>

TKS
November 3, 2003 1:19:07 PM

Good read. I have to admit that Nvidia is making some real steps in the right direction with their new Forceware drivers and custom compiler therein.

The 5900 and 5950 are, at the very least, not a complete joke anymore. I'd even go as far as saying respectable and competitive.
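As an aside on what a driver-side 'custom compiler' can do: one classic trick is re-scheduling shader instructions so texture fetches issue as early as their dependencies allow, overlapping fetch latency with arithmetic (NV3x was notoriously sensitive to instruction order). A toy Python sketch of the idea, with made-up instruction names; a real driver compiler does far more:

```python
# Toy shader instruction scheduler: hoist each 'tex' (texture fetch)
# as early as its register dependencies allow, so fetch latency can
# overlap with the arithmetic that follows.

def schedule(instrs):
    """Stable reorder of (op, dest, sources) tuples, moving texture
    fetches earlier when no read/write dependency is violated."""
    result = []
    for instr in instrs:
        op, dst, srcs = instr
        pos = len(result)
        if op == "tex":
            # Walk backwards past instructions we don't depend on.
            while pos > 0:
                p_op, p_dst, p_srcs = result[pos - 1]
                if p_dst in srcs or dst in p_srcs or dst == p_dst:
                    break  # dependency: cannot move any earlier
                pos -= 1
        result.insert(pos, instr)
    return result

# (op, destination register, source registers)
prog = [
    ("mul", "r0", ["v0", "c0"]),
    ("add", "r1", ["r0", "c1"]),
    ("tex", "r2", ["t0"]),              # independent fetch
    ("mad", "r3", ["r1", "r2", "c2"]),  # needs both results
]

scheduled = schedule(prog)
```

Here the independent fetch into r2 gets hoisted to the front of the program, while the dependent mad stays last; same results, better latency hiding.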

Having said that, I think the author's conclusions are... well... kind of stupid. Seems Nvidiotish, and here's why:

MAKING HARDWARE "FOR THE FUTURE" AND NOT FOR APPS USED CURRENTLY AND IN THE NEXT TWO YEARS IS COMPLETELY USELESS

This was Nvidia's biggest mistake. Call it like it is.

Learn a lesson from Itanium: advanced technology that can't run things has no practical reason to exist.

No one cares if the FX line is more advanced than its competitors, because it's all about the bottom line. And the bottom line is performance.

By the time 32-bit precision is needed for everything, no one will be using an FX card anymore, because they won't be fast enough to do it properly anyway. So the entire "advanced technology" argument stares back at us from the toilet.
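On the precision angle: NV3x ran PS 2.0 math at either FP16 (fast) or full FP32 (slow), while ATi's R3xx used FP24 throughout. Python's struct module can round-trip a value through IEEE half ('e') and single ('f') storage to show the size of that precision gap; FP24 has no standard struct format, so it's left out of this sketch:

```python
import struct

def round_trip(value, fmt):
    """Store `value` at the given binary precision and read it back."""
    return struct.unpack(fmt, struct.pack(fmt, value))[0]

x = 0.1
as_fp16 = round_trip(x, "<e")  # IEEE half: ~3 decimal digits
as_fp32 = round_trip(x, "<f")  # IEEE single: ~7 decimal digits

err16 = abs(as_fp16 - x)  # roughly 2e-5 for this input
err32 = abs(as_fp32 - x)  # roughly 1e-9 for this input
```

A per-channel error that size is invisible after one operation, but long shader programs accumulate it, which is why partial precision was a real image-quality trade-off and not just a benchmark knob.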

Nvidia, I think, has learned a couple hard lessons in the past couple years:

1. Don't ignore the industry. You aren't powerful enough to dictate standards. (Their ignorant handling of the DX9 spec)

2. Early adoption is not always the best choice (the 0.13 micron fiasco, and the FX line)

Having learned this (I don't think that they are complete idiots, quite the contrary) I am optimistic that Nvidia's next product will be quite a bit more competitive. Ati's R420 should be a monster, too...

The only real winners here are us, I reckon.

------------------
Radeon 9500 (hardmodded to PRO, o/c to 322/322)
AMD AthlonXP 2400+ (o/c to 2600+ with 143 fsb)
3dMark03: 4055
November 3, 2003 1:46:27 PM

And what if PS3.0 requires some features the FX line doesn't have? I think it's worthless to talk about those features in the existing product line. Look at what happened to Matrox: they retracted DX9 support for the Parhelia.

Claiming "future proof" in a technology-driven world is not an option. You never know what can happen.

Maybe nVidia prepared for the future with the FX line, but so did ATI! ATI chose to focus on performance and nVidia chose to focus on features. And claiming that nVidia has the advantage because they use a more up-to-date manufacturing process is not convincing, because ATI will benefit from it anyway: when they switch to the smaller process, they will be using "mature" production facilities.

I think next summer will be HOT; we will have a lot of fun with HL2, DOOM 3 and next-gen GPUs!

--
Would you buy a GPS enabled soap bar?
November 3, 2003 3:33:33 PM

> we would have dx9 for years and not have dx10, dx11, dx12...

Just as a point, MS said DX9 is the last one for a few years...

Shadus
November 3, 2003 3:58:20 PM

Bring Back The VOODOO!!!!



It's all Bush's fault...all of it...
November 3, 2003 6:29:03 PM

Nog, my Matrox Millennium II 8MB 2D card + 2 Voodoo2s in SLI mode... hmmm, 3D lovin'.

Shadus