GT200 performance analysis + RV770 revised

marvelous211

I don't understand why Nvidia didn't up the texture address and filtering units when they reworked the SPs for GT200. The old G92 cores had 8 texture address/filter units for every 16 SPs, but GT200 has 8 for every 24 SPs. In modern games those texture units did a whole lot more for G92 than its higher SP clocks did.

GT200 has the same 8-address by 8-filter texturing per cluster as G92, but 10 clusters of 24 SPs instead of 8 clusters of 16 SPs, which works out to 80 TMUs. Texture fillrate was the biggest difference between G92 and G80, and it's why G92 was able to beat G80 at lower resolutions, or get very close at high resolutions, with much less memory bandwidth and fewer ROPs. If GT200 had 12 by 12 texture units per cluster, exactly the same SP-to-texture ratio as G92, it would have 120 TMUs instead of 80. As far as texturing per SP goes, GT200 is inferior to G92.
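
To put numbers on that ratio, here's a minimal sketch of the arithmetic (the per-cluster figures are the commonly published ones, and the 12-TMU-per-cluster configuration is purely hypothetical):

# Rough SP/TMU comparison for G92 vs GT200.
# Per-cluster figures are the commonly published ones (assumed): 8 TMUs per
# cluster on both chips, 16 SPs per G92 cluster, 24 SPs per GT200 cluster.
def totals(clusters, sp_per_cluster, tmu_per_cluster):
    sps = clusters * sp_per_cluster
    tmus = clusters * tmu_per_cluster
    return sps, tmus, sps / tmus

configs = {
    "G92 (8 clusters x 16 SP, 8 TMU)": (8, 16, 8),
    "GT200 (10 clusters x 24 SP, 8 TMU)": (10, 24, 8),
    "Hypothetical GT200 at G92 ratio (12 TMU/cluster)": (10, 24, 12),
}

for name, cfg in configs.items():
    sps, tmus, ratio = totals(*cfg)
    print(f"{name}: {sps} SP, {tmus} TMU, {ratio:.1f} SP per TMU")

# G92:          128 SP,  64 TMU, 2.0 SP per TMU
# GT200:        240 SP,  80 TMU, 3.0 SP per TMU
# Hypothetical: 240 SP, 120 TMU, 2.0 SP per TMU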

Card              Pixel fill (Gpix/s)  Bilinear fill (Gtex/s)  FP16 fill (Gtex/s)  Bandwidth (GB/s)
GeForce 9800 GTX  10.8                 43.2                    21.6                70.4
GeForce GTX 260   16.1                 41.5                    20.7                111.9
GeForce GTX 280   19.3                 48.2                    24.1                141.7
Radeon HD 4850    10.0                 25.0                    25.0                64.0
Radeon HD 4870    12.0                 30.0                    30.0                115.2

Games don't need all that shader processing power yet. Most games offload to textures and back to memory for the most part; that's straight from Nvidia's own nRollo. So having more fillrate makes the biggest difference when you want performance NOW, as long as you aren't shader limited. Sure, the GTX 280 has more fillrate than the 9800 GTX, but in reality not that much more, and the GTX 260 actually has less than the 9800 GTX. This is where bandwidth comes into play: GT200 isn't nearly as bandwidth limited as the 9800 GTX, and that's where the performance gains in most games come from. Just look at any of the reviews. The GTX 260 isn't far off from the 9800 GTX; only when AA is applied at some ridiculously high resolution does it seem a lot faster, thanks to its bandwidth advantage. Nvidia built a forward-looking product, like the 2900 XT tried to be. But that future still isn't here.
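
As a sanity check on the bilinear numbers above: bilinear fillrate is just texture units times core clock. A minimal sketch, assuming the commonly quoted reference specs (64 TMUs at 675 MHz for the 9800 GTX, 80 TMUs at 602 MHz for the GTX 280):

# Bilinear texture fillrate = texture units * core clock.
# TMU counts and clocks below are the commonly quoted reference specs (assumed).
cards = {
    "GeForce 9800 GTX": {"tmus": 64, "core_mhz": 675},  # G92
    "GeForce GTX 280":  {"tmus": 80, "core_mhz": 602},  # GT200
}

for name, spec in cards.items():
    gtexels = spec["tmus"] * spec["core_mhz"] / 1000.0  # Gtexels/s
    print(f"{name}: {gtexels:.1f} Gtexels/s bilinear")

# 9800 GTX -> 43.2 Gtexels/s, GTX 280 -> 48.2 Gtexels/s:
# only about 12% more texture fillrate for the much bigger chip.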
 

hughyhunter

A lot of that I really didn't understand, but I agree with the GTX 200s not living up to the price! I'm disappointed in Nvidia for flooding the market with GPUs that, one, aren't much faster than previous-gen GPUs and, two, are way too expensive.
You're right about games not really needing all that power, too. The only game that stresses my setup is Crysis... I'm glad these cards were released, though. Now it makes the GX2 cheaper and more affordable. It would be even nicer if the GX2 got down into the $300s, making it that much more affordable.
I just don't understand why Nvidia would even release the GTX 260. It's going to perform on a similar level to the GX2 and 9800 GTX... and cost much more. It's so stupid IMO. They should have waited a good few more months, let everyone continue to buy 8-series and 9-series cards, and introduced the GTX 200s with more performance this coming Christmas. That would have been better for marketing than flooding the market with GPU after GPU. All these cards Nvidia released (8800 GT, 8800 GTS 512, 9800 GTX, 9800 GX2, GTX 200) perform rather similarly from a price/performance standpoint, and the naming and series designations just don't make much sense.
Alright... I've ranted too much.
 

marvelous211

Even Crysis doesn't need that much processing power if you look at any of the reviews. A GTX 260 is a whole 3-5 fps faster than a 9800 GTX that has much less memory bandwidth and a whole lot fewer ROPs.

GeForce 7 to GeForce 8 was a time when we needed more processing power than GeForce 7 could dish out. But those days are past and gone; GeForce 8 has enough processing power.

Fact is, fillrate is king when combined with the right amount of memory bandwidth.
 

SpinachEater

Maziar!!! I haven't seen you in a while. Where have you been?




NEVERMIND!!! I see what you are saying now.
 
TY Marv, I digested most of it, then reread it and got the rest. The advantage nVidia had over ATI has been lessened by... nVidia. So you're saying future games will require higher pixel rates. Could this be so forward-looking that the newer DX model will make texturing that much easier? And like ATI, which did the shader AA thing, nVidia is doing the pixel/processing thing?
 

marvelous211



For instant gratification you need more texture fillrate combined with the right amount of bandwidth, and GT200 doesn't deliver much more of that than G92. Not more SPs. The bandwidth bottleneck is gone for the most part. Nvidia wants maximum performance in DX10 and future games, which will require more shading power than G80's ratio provides. In those kinds of games GT200 performs well, and its per-clock increase over G80 is close to its per-clock increase in ALU performance.
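
To put a rough number on that per-clock claim, here's a minimal sketch assuming the public SP counts for the two chips:

# Per-clock ALU scaling from G80 to GT200, using the public SP counts
# (assumed: 128 SPs on G80, 240 on GT200). This ignores GT200's other
# per-SP tweaks (bigger register file, better dual issue), so it's a floor.
g80_sps, gt200_sps = 128, 240
print(f"GT200 / G80 ALUs per clock: {gt200_sps / g80_sps:.2f}x")  # ~1.88x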

[benchmark chart: Assassin's Creed]


This is what Nvidia is aiming for: the GTX 260 beating out the GX2. But that's really up to the developers, not Nvidia, and what Nvidia actually got is slightly better performance than a single 9800 GTX in most current titles. Only in some ridiculous bandwidth-starved situations does GT200 prevail. Do you see what ATI was aiming at when they made the 2900 XT? Nvidia is just following suit and catching up in that department.
 

marvelous211



With a refresh, Nvidia needs to double the texture ratio. Then you get instant gratification in old and new games alike, just like the 8800 GT was able to beat down the G80 GTS with less memory bandwidth and fewer ROPs. More shaders are about longevity and the future, not the present.

If you look at my review of my 8800 GS, I talk about its texture fillrate compared to the 9600 GT. It's true that more SPs help these low-end cards to a certain extent, but much of that performance came from having more texture fillrate than the 9600 GT; with fewer ROPs and less memory bandwidth it still delivered comparable performance.
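
The same arithmetic applies to that comparison. A minimal sketch, assuming the commonly listed reference specs for both cards (clocks, TMU/ROP counts, and memory bandwidth):

# Texture fillrate vs ROP/bandwidth comparison for the 8800 GS and 9600 GT.
# All numbers below are reference specs as commonly listed (assumed).
cards = {
    "GeForce 8800 GS": {"tmus": 48, "core_mhz": 550, "rops": 12, "bw_gbs": 38.4},
    "GeForce 9600 GT": {"tmus": 32, "core_mhz": 650, "rops": 16, "bw_gbs": 57.6},
}

for name, s in cards.items():
    fill = s["tmus"] * s["core_mhz"] / 1000.0  # Gtexels/s bilinear
    print(f"{name}: {fill:.1f} Gtexels/s, {s['rops']} ROPs, {s['bw_gbs']} GB/s")

# 8800 GS: ~26.4 Gtexels/s with fewer ROPs and less bandwidth
# 9600 GT: ~20.8 Gtexels/s with more ROPs and more bandwidth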
 

hughyhunter

Doesn't it make more sense for GPU engineers/manufacturers to develop the cards around bad-a$$ games rather than for game developers to develop games around the GPU?
 

njalterio




What you are saying makes sense if the SP/texture ratio is as relevant as you think.

However, I think you are missing the larger picture. I am not an electrical engineer, but I know there has to be a lot more to designing a graphics card than hitting a magical ratio of shaders and TMUs.
I will be the first to agree that the GTX 200 was a huge letdown for its price, but I would not make technical assumptions about the GTX 200 when I have only had access to third-party benchmarks for two days, while teams of engineers at Nvidia have been working on this for over a year.

I have tremendous respect for the design of this card. It is the idiots in marketing and whoever made the decision to charge $650 for the card that I criticize.





 

marvelous211



If you have a strong understanding of math, computers aren't far off; it's all numbers anyway. In the real world, yes, there are variables, but performance isn't far off from the specs and how coders program the games.
 

arrpeegeer

I was just playing Crysis last night, and in 3 hours, not a single hiccup or glitch. It ran like a DVD movie, it was so beautiful, and my rig only has:

9800gtx
E8500
4 gigs on an X48 board

Even enthusiasts learn to stay away from the bleeding edge and go for the best price point versus proven performance and stability.

At least I do :)

Also, nowadays, my criteria right along with performance are heat and noise. Unfortunately, the 280 (saw one today) fails on both counts, if those matter to you.
 

arrpeegeer

Oh yeah, forgot to say: I've read in multiple places that Nvidia is coming out with a die shrink (55nm or 50nm?) in a few months or so for the GT200 series.

Maybe then you'll see the heat, quietness, and performance worthy of the price.

For now, buh-bye
 

njalterio




I am not understanding what you are saying. There are variables, but they are not far off from the specs? What variables are you talking about, and how do they relate to specifications or to coding? How does math tie into your original post?

I agree that math can be used to describe how a computer (or any system) fundamentally works, but that is not what we are talking about.

The aim of this thread is "performance analysis". Well, what we saw from various benchmarks is that the GTX 280 is a pretty heavy-duty card. It can play every game on intense settings. It slightly edges out the 9800 GX2, which has two GPUs. Considering this card has only one GPU, that is a pretty huge leg up from past single-GPU cards. There's definitely an increase in the sophistication of the technology, which makes me wonder: why are you criticizing the design?

The fault of this card is timing and pricing. If Nvidia wanted to sell this card at $650, then they should not have released the 9800 GX2.
 

yipsl



Coming out with the 65nm version a few months before the 55nm is pulling a fast one. Anyone buying one now will be like someone who bought a 125 watt Phenom 9850 a few months before the 95 watt version arrives. Anyone buying that will be just a few months shy of Deneb.

Nvidia's getting into AMD territory here. Putting out stopgap products to improve performance just a smidgen rather than holding off until the real part is ready. Sad really.
 
I'll remind everyone, the only reason nVidia released the GX2 was that the 3870 X2 was the fastest card out at the time. Competition will make you do things in business; sometimes smart, generous, and forward looking, other times not.
 

marvelous211



Performance-wise in graphics, math like this isn't quite destiny, but it's close. If you are good with numbers, you are also good at analyzing performance. The specs are basically numbers; the variable is how coders program the games. One game might be texture heavy or pixel heavy, another might be shader heavy or CPU heavy, etc. You do realize coding is also number crunching.



The GTX 280 is not really faster than the GX2, or even close to GX2 performance; I don't know what review you've been reading. The GX2 is much more powerful than GT200. The only reason GT200 stretches its legs at 2560x1600 with AA is that it has 1 GB of VRAM and a lot more bandwidth, not because the GX2 is lacking in fillrate or processing power compared to GT200.

[benchmark charts at 1920x1200: Call of Duty 4, Half-Life 2, Enemy Territory: Quake Wars, Crysis (High), Crysis (Very High), GRID]


Instant gratification comes from fillrate, as long as you're not shader limited; AA performance also derives from it. The fact that there are a couple of shader-heavy games doesn't mean all games are like that. Over the last few generations shader counts have been rising, but not to the point where a 128-SP G92 is shader limited.

I don't know what you are trying to get at, but I'm analyzing its performance, not saying GT200 sucks, although I do think it sucks for what Nvidia is trying to sell it for. I think you have it backwards, thinking I'm dissing GT200 and you need to stick up for it. This is a discussion of improvements to its architecture and why Nvidia didn't add more texture fillrate, when an Nvidia spokesman specifically said that most games offload to textures and back to memory, which has to do with fillrate, not the shaders GT200 emphasizes.

Just look at the GTX 260 vs the 9800 GTX. The GTX 260 has a whole lot more ROPs, shaders, and bandwidth, but it isn't all that much faster than the 9800 GTX in most of the current crop of games. 3 fps faster in Crysis? That's not exactly what I call a step up. Now reread what I'm saying in the first post, and if you have questions I'm here to respond.
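
To make that concrete, here's a small sketch comparing the two cards spec by spec. The fillrate and bandwidth figures come from the table in the first post; the SP and ROP counts are the usual reference specs (assumed):

# Where the GTX 260 leads the 9800 GTX, and by how much.
# Fillrate/bandwidth values are from the table in the first post;
# SP and ROP counts are the commonly quoted reference specs (assumed).
specs = {
    # name:                   (9800 GTX, GTX 260)
    "SPs":                    (128,      192),
    "ROPs":                   (16,       28),
    "Bandwidth (GB/s)":       (70.4,     111.9),
    "Bilinear fill (Gtex/s)": (43.2,     41.5),
}

for name, (g92, gt200) in specs.items():
    delta = (gt200 / g92 - 1) * 100
    print(f"{name}: {delta:+.0f}%")

# SPs +50%, ROPs +75%, bandwidth +59%, but bilinear fillrate -4%:
# texturing is the one resource the GTX 260 doesn't add over the 9800 GTX.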
 

hughyhunter

Marve... I totally agree...!

It's funny how Nvidia, in order to market this card, only lets the websites doing reviews post frame rates at the high 2560x1600 resolution... that might not actually be the case, but I'd bet Tom's did the same. I've only come across reviews at this resolution, http://www.tomshardware.com/reviews/nvidia-gtx-280,1953-18.html well, and at 1920x1200.

Most people run at 1680x1050, fewer run at 1920, and even fewer above that. So why not do reviews at those resolutions??? Because Nvidia knows its own GX2 can keep up!

This card would be more pleasing to the purchaser and the enthusiast if it were priced accordingly. With the price drops that have been going around, they should price it close to the GX2, if not very slightly above. Or they should have held off for a couple of months, done the die shrink, and released them with higher clocks at a somewhat lower price.

That's way too much to ask! That's why we should boycott Nvidia! :kaola:
 

yipsl



No, it does not, because those "b-a" games you're talking about get beaten by twitch gamers and then get dumped, never to be replayed again. Many old-school gamers like myself go back to games we enjoyed years ago (I've been known to replay Daggerfall under DOSBox). If a card were developed entirely around a limited-replay FPS like Crysis, it wouldn't be capable of meeting the demands of games that fully exploit DX10.1 in the future.

Besides, not everyone who plays games spends $450 (I did last February, but that was a first for me), let alone $650, just to get an extra 10 fps in a particular "b-a" game. At any rate, I don't play FPS, so while Crysis's visuals impress me, its gameplay does not. What I want are open-world fantasy and SF CRPGs, fully DX10.1 (patched or from the ground up), that I can enjoy for years.
 

njalterio

@ Marvelous: I am not trying to stick up for Nvidia, or for the GT200 (currently I prefer ATI). The main reason I am skeptical of what you are saying is that I find it hard to believe teams of engineers at Nvidia made such an obvious mistake as adding more shaders than necessary at the cost of architecture improvements elsewhere, especially when they have been working on this for so long.

Why didn't Nvidia up their texture address and filtering? Who knows. There must be some technical reason for it. No disrespect intended towards you of course, but I doubt the people at Nvidia are reading this thread and thinking "Why didn't we think of that?"

Basically what I am getting at is that the rabbit hole must go deeper than that when there are millions of dollars on the line.