ATI doesn't like shader model 3.0??

e_engineer

Distinguished
Jan 27, 2005
29
0
18,530
When are ATI cards going to support pixel shader 3.0?? The X850 XT is a big disappointment in this respect.

POWER ELECTRONICS
 

TheRod

Distinguished
Aug 2, 2002
2,031
0
19,780
Humm... Why a big disappointment??? Hardly any games use PS3.0 except a few (Far Cry...), and it doesn't give any significant performance boost. If you REALLY want SM3.0 now, buy an nVidia 6000-series card or wait 3-6 months for the next-generation Radeon, which will support SM3.0.
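If you want to see for yourself what your card claims, a rough Direct3D 9 caps check is enough. Untested sketch, nothing official:

#include <d3d9.h>
#include <cstdio>

// Ask the D3D9 runtime what shader profiles the installed card's driver
// reports. Build against d3d9.lib.
int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DCAPS9 caps;
    if (SUCCEEDED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
    {
        // X800/X850-class parts report a 2.0 pixel shader version here (plus
        // caps bits for the 2.0b profile); GeForce 6 parts report 3.0.
        bool ps30 = caps.PixelShaderVersion  >= D3DPS_VERSION(3, 0);
        bool vs30 = caps.VertexShaderVersion >= D3DVS_VERSION(3, 0);
        printf("PS3.0: %s  VS3.0: %s\n", ps30 ? "yes" : "no", vs30 ? "yes" : "no");
    }

    d3d->Release();
    return 0;
}

All it tells you is what the driver exposes, but that's the whole story until games actually require the 3.0 profile.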

-
A7N8X / <font color=green><b>Sempron 2800+</b></font color=green> <- <i>Is this affecting my credibility?</i>
Kingston DDR333 2x256Megs
<font color=red>Radeon 8500 128Megs</font color=red> @ C:275/M:290 <- <i>It's enough for WoW!</i>
 

cleeve

Illustrious
Your thoughts are without merit, sir.

Show me a single title where shader model 3.0 even makes a crucial difference. Even Far Cry, which is probably the most prominent shader 3.0 game out there, hardly shows a difference between 2.0 and 3.0.

ATI is coming out with the R520 in a matter of months; it is shader model 3.0 capable and will *still* be available long before shader model 3.0 is a must-have...

________________
<b>Radeon <font color=red>9700 PRO</b></font color=red> <i>(o/c 332/345)</i>
<b>AthlonXP <font color=red>3200+</b></font color=red> <i>(Barton 2500+ o/c 400 FSB)</i>
<b>3dMark03: <font color=red>5,354</b>
 

pauldh

Illustrious
Not only is there very limited performance gain in Far Cry with SM3, there is also no visual quality benefit, since patch 1.3 with SM3 only helped the GF6800 series match the IQ of the X800 series. [H] did a piece on that where they basically were not impressed with patch 1.3 and SM3; they liked getting to play with HDR as a taste of the future, not as something realistically usable, since even the 6800U is too weak for it.

I'd have to agree with your comment about no merit. SM3 simply is not currently worth anything going by today's games. And ATI seemed to know what they were doing, as the lack of SM3 hasn't hurt them so far; only the lack of availability has. When it's needed, they will have it.

And despite HDR looking pretty neat, HDR on a 6800 Ultra is about like DX9 on an FX5200: the technology is there, but the raw power needed is not. So what is HDR worth to the GF6800GT or 6800U owner? And how much is SM3 on a 6800-series card really going to do to make it outlast the X800 series? Less than I would have thought, it seems.





<A HREF="http://service.futuremark.com/compare?2k3=3400555" target="_new"> My</A>
<A HREF="http://service.futuremark.com/compare?2k1=8268935" target="_new">Gamer</A>
 
Remember that the ATI cards are actually FASTER than the nV cards in SM3.0-'aware' FartCry. Any performance benefit isn't worth mentioning, because geometric instancing is supported on both architectures, and the SM2.0+ card outperforms the SM3.0 card with the same IQ.
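For anyone wondering, the instancing part is just a stream-frequency setting on the Direct3D 9 side, which is roughly what both camps ended up exposing. Rough, untested sketch; DrawMeshInstanced is my own made-up helper name, and it assumes the mesh, the per-instance buffer, the vertex declaration and the shaders are already set up:

#include <d3d9.h>

// Sketch only (untested): one instanced draw call in Direct3D 9. Assumes the
// caller has already bound the mesh to stream 0, a per-instance buffer (e.g.
// one world matrix per instance) to stream 1, set a vertex declaration that
// reads both streams, and loaded shaders that use the instance data.
void DrawMeshInstanced(IDirect3DDevice9* device, UINT instanceCount,
                       UINT vertexCount, UINT triangleCount)
{
    // Walk stream 0 over the index buffer once per instance...
    device->SetStreamSourceFreq(0, D3DSTREAMSOURCE_INDEXEDDATA | instanceCount);
    // ...and advance stream 1 by one element per instance.
    device->SetStreamSourceFreq(1, D3DSTREAMSOURCE_INSTANCEDATA | 1u);

    device->DrawIndexedPrimitive(D3DPT_TRIANGLELIST, 0, 0,
                                 vertexCount, 0, triangleCount);

    // Restore the default frequencies so ordinary draws aren't affected.
    device->SetStreamSourceFreq(0, 1);
    device->SetStreamSourceFreq(1, 1);
}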

And the one truly different feature, Crytek's HDR implementation, doesn't show off SM3.0 as a must-have either; it takes such a performance hit on this generation of cards that it simply proves SM3.0 isn't ready for prime time yet. ATI and Masa's implementation of HDR is far, far more efficient and equally effective, and it performs significantly better on ATI hardware, showing it to be the better method than floating point for this generation.

Just like in the past, nV has cards with some of the new features, but none of those cards has the ability to actually exploit those features.

Sounds like ATI timed it just right, bringing their parts to market well in advance of any true need; heck, they can even fit a refresh and a mid-level release in between the R520 and the release of things like UE3.

Seriously, if you wanted to pick on something, you should have focused more on the architectural benefits, which I won't elaborate on so you'll have something to keep you busy. But bringing up SM3.0 for this generation is like mentioning 32-bit for the last generation: irrelevant in all but demos.


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <font color=green>RED </font color=green> <font color=red> GREEN</font color=red> GA to SK :evil:
 
The funniest part of patch 1.3 + SM3.0 + Crytek's HDR is that it didn't expose a shortcoming of SM/PS2.0 so much as show how Crytek's choice of floating-point HDR was a very poor one compared to other methods that achieve nearly the same effect at nowhere near the penalty. If anything, the push to SM3.0 probably caused a lot of developers to rethink their implementation of that effect based on those results. I doubt anyone thought, "Geez, we should add that to our engine because it'd bring every card to a halt!"

Probably even Crytek looks at it as a worthy experiment, but somewhat of a failed one; or at least one they learned from but won't be quick to repeat in other applications, given all the effort it took and its poor reception.


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <font color=green>RED </font color=green> <font color=red> GREEN</font color=red> GA to SK :evil:
 

sweatlaserxp

Distinguished
Sep 7, 2003
965
0
18,980
ATI and Masa's implementation of HDR lighting is far, far more efficient and equally effective, and it performs significantly better on ATI hardware, showing it to be the better method than floating point for this generation.
But still: a <i>massive</i> performance hit. HDR lighting is one of the critical elements that will push 3D games into the next era, and right now developers can only implement it in small doses. Cards need ridiculous power to render it the way the rthdribl demo does, only in a real-world polygonal environment. I suspect even the next generation of cards will not be able to pull it off at higher resolutions. But we'll see :tongue:

<A HREF="http://atomfilms.shockwave.com/landing/landingIndex.jsp?id=dumb01&mature=accept" target="_new">DumbLand</A>
 
HDR lighting is one of the critical elements that will push 3D games into the next era
That, and half a dozen other things neither card does well (displacement mapping [still virtual]).

Not only are the cards not ready for HDR, neither are the monitors, so the FULL extent of HDR that Crytek and nV are talking about won't even be visible; talking about the absolute limits is a waste right now, IMO.

HL2 has HDR effects that aren't as 'dramatic' as FartCry's level 7-11, but are close to level 3, and at a far less princely sum. Rthdribl actually works very, VERY well on ATI's hardware, but not so much on the nV gear. Being a TWIMTBP game, it's not surprising that Crytek used the FP-blending method requiring SM3.0, but really, is it that much more impressive? I don't think so.

You say doing it the rthdribl way would require ridiculous amounts of power, but it's more efficient than the full floating-point rw-HDR seen in FartCry. And if the performance penalty is that ATI would require 3 passes to do what nV does with one, well, the 60% drop seen on the nV cards in FartCry almost evens out the equation; and with efficiencies in each render pass, the delta is probably even smaller. The biggest problem is that the length of the instructions using the Crytek method makes the implementation slow on this generation, so the net gain is negligible. Really, for the near future, developers need to simplify, rthdribl-style, to achieve the effect without the same level of penalty.

And it's funny, because the GF6800 actually doesn't do 32-bit FP blending for HDR, only 16-bit. So by the same token, nV still isn't fully capable of matching those 3 passes on the ATI card, which could render a better pseudo-FP32 (FP24+) image at only a small 10-20% performance hit; the nV card could then do the same at full FP32 with yet another hit on top. But would we even notice ANY of those IQ differences?
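And that 16-bit vs. 32-bit blending point isn't hand-waving; you can query it straight from the API. Rough, untested sketch; BlendableRenderTarget is just my own helper name, and it assumes a 32-bit desktop mode:

#include <d3d9.h>
#include <cstdio>

// Ask D3D9 whether a floating-point render target supports post-pixel-shader
// blending, the capability Crytek-style FP HDR leans on. The expectation is
// that an NV40-class card passes the FP16 query but not the FP32 one, while
// the X800 series passes neither.
static bool BlendableRenderTarget(IDirect3D9* d3d, D3DFORMAT fmt)
{
    return SUCCEEDED(d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8,
        D3DUSAGE_RENDERTARGET | D3DUSAGE_QUERY_POSTPIXELSHADER_BLENDING,
        D3DRTYPE_TEXTURE, fmt));
}

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    printf("FP16 blendable RT: %s\n",
           BlendableRenderTarget(d3d, D3DFMT_A16B16G16R16F) ? "yes" : "no");
    printf("FP32 blendable RT: %s\n",
           BlendableRenderTarget(d3d, D3DFMT_A32B32G32R32F) ? "yes" : "no");

    d3d->Release();
    return 0;
}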

It's not that it's not possible on one card or the other; it's more that it's not practical for EITHER card right now, and like you say, maybe the next generation won't be fully up to snuff for what we're being led to expect by all the SM3.0 boosters out there. The only good news is that ATI may be on to a new architecture, and who knows what benefits that may bring to even rudimentary branching and such.
Heck, a lot of the PS2.0 spec isn't fully realizable on current hardware, so it's not surprising its superset would fare worse.

We need MORE POWER! :lol:


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <font color=green>RED </font color=green> <font color=red> GREEN</font color=red> GA to SK :evil:
 

e_engineer

Distinguished
Jan 27, 2005
29
0
18,530
Your words frustrated me, sir!! I never tried to create flamebait between two rival companies... I just wondered whether shader model 3.0 is necessary for ATI cards today, and the other folks answered me respectfully. :-(
I'm a stranger here, a new member... sorry if I broke anybody's heart!!

POWER ELECTRONICS
 

cleeve

Illustrious
My apologies if your questions were innocent and I mistook them for something else.

________________
<b>Radeon <font color=red>9700 PRO</b></font color=red> <i>(o/c 332/345)</i>
<b>AthlonXP <font color=red>3200+</b></font color=red> <i>(Barton 2500+ o/c 400 FSB)</i>
<b>3dMark03: <font color=red>5,354</b>
 

KCjoker

Distinguished
Jun 10, 2002
273
0
18,780
Well, here's a link from last August that shows some games that use 3.0, and of course there will be more. I think ATI will have it eventually, but Nvidia has a jump on them for driver support.

<A HREF="http://www.theinquirer.net/?article=18134" target="_new">http://www.theinquirer.net/?article=18134</A>
 

coolsquirtle

Distinguished
Jan 4, 2003
2,717
0
20,780
That's just fanATic talk~~~ :p

Six months ago they were complaining about how the GeForce FX didn't have SM2.0.

These fanATics are never satisfied :D

RIP Block Heater....HELLO P4~~~~~
120% nVidia Fanboy+119% Money Fanboy
GeForce 6800 Ultra--> The Way we thought FX 5800Ultra is meant to be played
THGC's resident Asian and nVboy :D
 

sweatlaserxp

Distinguished
Sep 7, 2003
965
0
18,980
I don't care about feature support; there will always be some graphics feature in some game that one of nVidia or ATi supports and the other doesn't. Only NV40 can do HDR lighting in Far Cry, but who cares if it's a f@cking slideshow? Read Grape's explanation. In the end, both nVidia and ATi will keep pace with the new specs in OpenGL and Direct3D, as they have been doing for quite a while, even if one of them decides that immediate support is unnecessary for the end gamer. If that approach were foolish, one of them would have gone under a while ago.

Either way, it's been proven again and again that real-world performance has nothing to do with support for bleeding-edge features. If I were a developer I'd be using a 6800 Ultra, no question, but for everybody else SM3.0 really is not important. ATi and nVidia have created a sort of permanent rivalry, and there will always be one camp arguing about how their new architecture is better and more <i>future-proof</i>, but in the end it's usually a stalemate. I prefer price-to-performance: the best benchmark.

<A HREF="http://atomfilms.shockwave.com/landing/landingIndex.jsp?id=dumb01&mature=accept" target="_new">DumbLand</A>
 

coolsquirtle

Distinguished
Jan 4, 2003
2,717
0
20,780
shut up you fanATic!



RIP Block Heater....HELLO P4~~~~~
120% nVidia Fanboy+119% Money Fanboy
GeForce 6800 Ultra--> The Way we thought FX 5800Ultra is meant to be played
THGC's resident Asian and nVboy :D
 

e_engineer

Distinguished
Jan 27, 2005
29
0
18,530
Hey coolman! It's not fanatic talk! It's just a forum thread about whether new features are needed or not! :) I used to have nVidia cards too!

POWER ELECTRONICS
 
You must notice his sig,

<i><font color=green>120% nVidia Fanboy+119% Money Fanboy</i></font color=green>

He is our little nV-sponsored court jester. We can dismiss his posts as the light-hearted fun they are; yours, on the other hand, seemed fishy.
I'm still of the position that your wording needs refinement, because with the comments about competence it looks like baiting, but I'll take your word that it was just a question. However, don't be surprised if wording that may seem harmless in your first language comes off as something else when translated and gets you the reaction you got. When making sweeping statements, maybe preface open-ended comments like 'ATI doesn't <i>LIKE</i> SM3.0?' with something to let people know your wording probably isn't the best and isn't meant to incite.


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <font color=green>RED </font color=green> <font color=red> GREEN</font color=red> GA to SK :evil:
 

e_engineer

Distinguished
Jan 27, 2005
29
0
18,530
You are right, sir! Our language, of course, is different from yours, and that may cause misunderstandings... From now on I'll be more careful! Thanks... and sorry, everybody.

POWER ELECTRONICS
 

coolsquirtle

Distinguished
Jan 4, 2003
2,717
0
20,780
Dr. Zoidberg: "FINALLY! RECOGNITION!"

Hey, I make educated, thoughtful posts............... once in a while :p

Call me an nVidiot again, Grape, and I'll show you what nVidiots can do to fanATics

*pulls down pants*

RIP Block Heater....HELLO P4~~~~~
120% nVidia Fanboy+119% Money Fanboy
GeForce 6800 Ultra--> The Way we thought FX 5800Ultra is meant to be played
THGC's resident Asian and nVboy :D
 

cleeve

Illustrious
Your lowered pants are no match for my loaded handgun.

Advantage: Ati

________________
<b>Radeon <font color=red>9700 PRO</b></font color=red> <i>(o/c 332/345)</i>
<b>AthlonXP <font color=red>3200+</b></font color=red> <i>(Barton 2500+ o/c 400 FSB)</i>
<b>3dMark03: <font color=red>5,354</b>
 

coolsquirtle

Distinguished
Jan 4, 2003
2,717
0
20,780
*optimizes source code*

You're nothing compared to me now!

RIP Block Heater....HELLO P4~~~~~
120% nVidia Fanboy+119% Money Fanboy
GeForce 6800 Ultra--> The Way we thought FX 5800Ultra is meant to be played
THGC's resident Asian and nVboy :D
 

eden

Champion
*Watches, strangely aroused.*

--
The <b><A HREF="http://snipurl.com/bl3t" target="_new"><font color=red>THGC Photo Album</font color=red></A></b>, send in your pics, get your own webpage and view other members' sites.
 

coolsquirtle

Distinguished
Jan 4, 2003
2,717
0
20,780
I told you I optimized my source code.

My ass is exit only now.

and Eden

YOU KEEP OUT OF THIS

RIP Block Heater....HELLO P4~~~~~
120% nVidia Fanboy+119% Money Fanboy
GeForce 6800 Ultra--> The Way we thought FX 5800Ultra is meant to be played
THGC's resident Asian and nVboy :D