Shader Model 3.0

eagles20015

Distinguished
Mar 20, 2006
I want to upgrade my video card for Oblivion, and I wanted to get an X800 GTO, but I'm not sure about it because it only has SM2.0.

Should I go with an Nvidia card to get SM3.0, or would 2.0 be OK?
 

Heyyou27

Splendid
Jan 4, 2006
I don't think games "need" SM3.0. Some games take advantage of it. Both the X800 GTO and the 6800GS are very nice cards for the price.
He won't need SM3.0 unless he wants to play at much higher quality; something the 6800GS can't do even with its Shader Model 3.0 support.
 

soeddy

Distinguished
Mar 20, 2006
I think you could try an Nvidia card because they have SM3.0 and the most recent games use that technology, and the price of a 6800GS or GT should not be very far from that of an X800 GTO...

P4 2.8GHz HT
GPU: 6600GT 550/1000
1024MB RAM
 

hergieburbur

Distinguished
Dec 19, 2005
I think you could try an Nvidia card because they have SM3.0 and the most recent games use that technology, and the price of a 6800GS or GT should not be very far from that of an X800 GTO...

P4 2.8GHz HT
GPU: 6600GT 550/1000
1024MB RAM

Yes, but as stated, the lower end 6800 cards can't really run SM3.0 well anyway, so I say get the better card.
 
Just wait until after Oblivion comes out.

Agreed, it comes out tomorrow.

And while I appreciate the idea of buying something so you can play right away, as I'll be playing it day 1, I'm not deciding on what to do about building until I've played it and seen the benchmarks as to what's an advantage and what can and cannot be used. I'll play it at low settings on days 1-11 until I feel comfortable with my decision on what to buy for it. In the first few days it's likely a lot of menial stuff anyway, so make sure that when you do go into the nice battles and head off to the far-off beautiful places, you have the best setup for those situations.

Of course I'd sure hate to be killed by a diseased rat just because my framerate stuttered. :x
 

malphadour

Distinguished
Mar 3, 2006
Ignore any statements about the 6800GS not being able to run SM3.0 games. People have some misguided idea that SM3.0 imposes some massive performance hit on the card. SM3.0 is a code path, not just a quality model, so as well as increasing model quality it also brings performance enhancements, meaning SM3.0 at absolute worst causes an 8% performance hit, and that is in a scene chock-full of complex models. Under normal circumstances the framerate hit is minimal, and in a lot of instances SM3.0 actually improves the framerate.
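
For what it's worth, here's roughly what I mean by "code path" in practice: a D3D9 game queries the card's caps at startup and picks the best shader profile it reports. A minimal sketch, assuming an initialized IDirect3D9 pointer (the function name is mine, purely for illustration):

```cpp
#include <d3d9.h>

// Pick the highest pixel shader code path the installed card reports.
// ChooseShaderProfile is a hypothetical name, not from any real engine.
const char* ChooseShaderProfile(IDirect3D9* d3d)
{
    D3DCAPS9 caps;
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
        return "none";

    if (caps.PixelShaderVersion >= D3DPS_VERSION(3, 0))
        return "ps_3_0";  // SM3.0 path: longer programs, dynamic branching
    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
        return "ps_2_0";  // SM2.0 path: same effects, sometimes in more passes
    return "ps_1_x";      // legacy fallback
}
```

Because the ps_3_0 path can fold what SM2.0 does in several passes into fewer, longer shaders, it can genuinely come out faster, not just prettier.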

Will you guys please go and read some SM2 vs SM3 benchmark tests before you make false statements about it.

Regarding Oblivion itself, you really want at least a half-decent card with SM2 or 3 to be able to enjoy the game's yumminess (Yummy Model 1.0).
 
Will you guys please go and read some SM2 vs SM3 benchmark tests before you make false statements about it.

Perhaps you should first. You're talking to some people, like Cleeve, who know from first-hand experience on various SM3.0 and SM2.0+ cards what each has to offer.

But since you question my learned colleagues, how about you supply a few benchmarks that show the advantages of SM3.0 (as implemented in the GF6 series) versus SM2.0b (as implemented in the X800 series) where the advantage is anything more than a 5-10% difference, whereas the penalty of enabling the truly differentiating features like OpenEXR HDR is a performance hit of 30+% (far more than your "at worst 8%" comment). Remember, geometric instancing is supported on both models, so that great benefit is moot. The only other major benefit (which conveniently enough also appears in FartCry) once again offers only a minimal difference, not worth mentioning other than as a tech demo / proof of concept. Sure it's a benefit, but a very small one, and in only very few games.
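
And to be clear about why the instancing point is moot: the D3D9 call sequence is identical on both families; ATI simply exposes it on the X800 series through its drivers even though the spec only mandates it for SM3.0 parts. A rough sketch (vertex/index buffers and stream bindings assumed to be set up already; the function name is made up):

```cpp
#include <d3d9.h>

// Draw numTrees copies of one mesh in a single call via D3D9 instancing.
// Assumes the mesh VB is bound to stream 0, per-instance data (e.g. a
// world transform) to stream 1, and an index buffer is already set.
void DrawTreesInstanced(IDirect3DDevice9* dev, UINT numTrees,
                        UINT numVerts, UINT numTris)
{
    // Stream 0 geometry is replayed once per instance.
    dev->SetStreamSourceFreq(0, D3DSTREAMSOURCE_INDEXEDDATA | numTrees);
    // Stream 1 advances one element per instance.
    dev->SetStreamSourceFreq(1, D3DSTREAMSOURCE_INSTANCEDATA | 1u);

    // One call instead of numTrees separate draw calls.
    dev->DrawIndexedPrimitive(D3DPT_TRIANGLELIST, 0, 0, numVerts, 0, numTris);

    // Restore the default (non-instanced) stream frequencies.
    dev->SetStreamSourceFreq(0, 1);
    dev->SetStreamSourceFreq(1, 1);
}
```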

Regarding Oblivion itself, you really want at least a half-decent card with SM2 or 3 to be able to enjoy the game's yumminess (Yummy Model 1.0).

Well, SM2.0 is the bare minimum, period (I think it's the first game ever to REQUIRE it); SM3.0 would be nice, but you also have to take into account the branching abilities of each card. I have a feeling that once again the GF6 is going to prove incapable of truly exploiting SM3.0, whereas the GF7 and X1x00 series cards will shine. IMO the GF6 will run best with SM3.0-centric features turned down or off.
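
To put the branching point in concrete terms: a per-pixel 'if' compiled for ps_2_0 has to be flattened (both sides evaluated, one result picked), while ps_3_0 may emit a true dynamic branch, and how cheaply the card takes that branch is exactly where the GF6, GF7, and X1x00 differ. A hedged sketch using the DX9 SDK's D3DX compiler (the shader and the function name are mine):

```cpp
#include <d3dx9.h>

// The same HLSL source compiled for two profiles. Under ps_2_0 the
// compiler must flatten the branch; under ps_3_0 it may emit a real one.
static const char kShadowShader[] =
    "sampler2D s;\n"
    "float inShadow;\n"
    "float4 main(float2 uv : TEXCOORD0) : COLOR\n"
    "{\n"
    "    if (inShadow > 0.5)             // dynamic branch on SM3.0 hardware\n"
    "        return 0.25 * tex2D(s, uv); // shadowed path\n"
    "    return tex2D(s, uv);            // lit path\n"
    "}\n";

// profile is "ps_2_0" or "ps_3_0".
HRESULT CompileFor(const char* profile, LPD3DXBUFFER* outCode)
{
    LPD3DXBUFFER errors = NULL;
    HRESULT hr = D3DXCompileShader(kShadowShader, sizeof(kShadowShader) - 1,
                                   NULL, NULL, "main", profile, 0,
                                   outCode, &errors, NULL);
    if (errors) errors->Release(); // compile log, if any, lives here
    return hr;
}
```

Whether that branch actually skips work, or the card quietly evaluates both sides anyway, is a hardware question the caps bits don't answer.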

I could be wrong, but I guess we'll know in a few hours' time.
 
:tongue:


Hey, I'm still recovering from a chest cold; I'm bound to be grumpy.

*grumble* spent the whole weekend indoors *grumble* JackA$$ "friends" call me from the hill to rub it in *grumble* gonna show them *grumble* gonna set the chalet on fire *mumble - grumble* yep, now gonna go get my stapler. *mumble* (trails off in the voice of Milton from Office Space) :twisted:
 
Oh, PS:
There is one interesting 'new' benefit of SM3.0 that MAY offer some nice unforeseen benefit (well, as far as being easily exploited; the concept's been around for a while) in the Havok FX engine:

http://www.firingsquad.com/features/havok_fx_interview/

But we'll have to wait and see whether even here the GF6 series has much extra to offer as an SM3.0 card, especially when they are talking about SLI'd GF7900s as their demo platform.
 

raven_87

Distinguished
Dec 29, 2005
Well, Oblivion was released by GameStop in Ohio at 7pm tonight.
A friend of mine has it. SM3.0 is an option, and if your GPU doesn't support HDR, then it uses a lot of bloom effects and tone mapping to top it off. His system is a 2200+ (I know, but he hasn't had much $$), an X800XT, and a gig and a half of PC2100.

He would have no reason to BS me, but he's saying that even though he's fighting multiple targets (the most has been 3), he's netting playable frame rates.

I get my copy Friday, so I'm very excited.
 
What resolution was he running at?

The CPU figures are promising, though.

Well, it'll be almost 18 hours 'til I can get my copy, which is waiting for me. Probably by the end of the night I'll be able to get a feel for the game, and I may have some benchies by mid-week or the weekend.
 

raven_87

Distinguished
Dec 29, 2005
Grape: 1024x768; it auto-detected his system to medium settings.
He's a wood elf, and I must say, the game looks freaking sweet.
Especially the opening cinematics. He sent me a couple screenies.
Maybe I could post them later....
 

pauldh

Illustrious
I'll play it at low settings on days 1-11 until I feel comfortable with my decision on what to buy for it.
Could it be Mr. Anti-Luddite is going to be grabbing a 7800GS (for his VAXP) by day 4? :tongue:

:roll: I don't think that's gonna happen.
 
I'll play it at low settings on days 1-11 until I feel comfortable with my decision on what to buy for it.
Could it be Mr. Anti-Luddite is going to be grabbing a 7800GS (for his VAXP) by day 4? :tongue:


Nah, first off, the GF7800GS on this side of the border is ridiculously expensive, so it would be unwise.

Second, in order to be even reasonably worthwhile I'd need to upgrade the CPU and the memory (still single-channel on the VAX, though), so for that kind of outlay I really might as well do a fresh build. But I'd prefer doing that when I have the time and know what the best path is (AM2? DDR2? X1700, AIW X1900, or G80?).
 

cleeve

Illustrious
Ignore any statements about the 6800GS not being able to run SM3.0 games.

Well, the spiffiest feature that SM3.0-compatible cards bring to the table is OpenEXR. True, it's not strictly an SM3.0 feature per se, but it's honestly the only thing I've seen out of an SM3.0 card that offers a tangible visible difference over an SM2.0 card.

Now, I can tell you from experience, because my last card was a 6800 Ultra... it is a useless feature on the 6800 series. Don't get me wrong, 6800s are great cards, but OpenEXR slowed my card to a crawl in Far Cry. It just doesn't have the horsepower.

So any 6800 owner who thinks their checkbox SM3.0-enabled card will let them use OpenEXR HDR happily is going to be sorely disappointed.

My X1800 XL, on the other hand, FLIES in Far Cry with BOTH OpenEXR and AA enabled... which is nice. :)
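
For anyone wondering why this is a hardware question rather than a checkbox question: the FP16 render-target support that OpenEXR-style HDR needs is probed separately from the shader version. A rough sketch of that check, assuming a D3D9 app (the helper name is made up):

```cpp
#include <d3d9.h>

// Returns true if the card can both render to and blend into a 64-bit
// floating-point (FP16) surface, the baseline for OpenEXR-style HDR.
bool SupportsFP16HDR(IDirect3D9* d3d, D3DFORMAT displayFmt)
{
    // Can we render into an FP16 target at all?
    HRESULT rt = d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, displayFmt,
        D3DUSAGE_RENDERTARGET, D3DRTYPE_TEXTURE, D3DFMT_A16B16G16R16F);

    // Can we alpha-blend into it (needed to accumulate HDR lighting)?
    HRESULT blend = d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, displayFmt,
        D3DUSAGE_QUERY_POSTPIXELSHADER_BLENDING, D3DRTYPE_TEXTURE,
        D3DFMT_A16B16G16R16F);

    return SUCCEEDED(rt) && SUCCEEDED(blend);
}
```

Both the 6800 and my X1800 pass this check; the difference is raw throughput once you turn it on, plus the fact that the X1x00 parts can antialias an FP16 target while the GF6/GF7 can't, which is how I get HDR and AA together.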
 

raven_87

Distinguished
Dec 29, 2005
Cleeve:

An X1800XL, eh? I'm ordering two GPUs to tinker with. One will be a 7900GT. But the X1800XL is coming next Tuesday; can't wait to see what it will do.