Will you guys please go and read some SM2 vs SM3 benchmark tests before you make false statements about it.
Perhaps you should first. You're talking to people, like Cleeve, who know from first-hand experience on various SM3.0 and SM2.0+ cards what each has to offer.
But since you question my learned colleagues, how about you supply a few benchmarks that show the advantages of SM3.0 (as implemented in the GF6 series) versus SM2.0b (as implemented in the X8 series) where the advantage is anything more than a 5-10% difference. Meanwhile, enabling the truly differentiating features like OpenEXR HDR carries a performance hit of 30+% (far more than your "at worst 8%" comment). Remember that geometric instancing is supported on both models, so that great benefit is moot. The only other major benefit (which conveniently enough also appears in FartCry) once again offers only a minimal difference, not worth mentioning other than as a tech demo / proof of concept. Sure it's a benefit, but it's very small and shows up in only a very few games.
Regarding Oblivion itself, you really want at least a half-decent card with SM2 or 3 to be able to enjoy the game's yumminess (Yummy Model 1.0).
Well, SM2.0 is the bare minimum, period (I think it's the first game ever to REQUIRE it). SM3.0 would be nice, but you also have to take into account the dynamic branching abilities of each card. I have a feeling that once again the GF6 is going to prove incapable of truly exploiting SM3.0, whereas the GF7 and X1x00 series cards will shine. IMO the GF6 will run best with SM3.0-centric features turned down or off.
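To spell out why branching ability matters and not just the SM3.0 checkbox: SM2.0 has no real dynamic flow control in pixel shaders, so a shader typically computes both sides of a condition and picks a result, while SM3.0 lets it actually skip work per-pixel. Whether that skip is a win depends on the hardware's branching granularity, which is where the GF6 falls down compared to GF7/X1x00. A rough HLSL-style sketch (shader and function names are purely illustrative, not from Oblivion or any real engine):

```hlsl
// SM2.0 style: no dynamic branching, so both lighting paths are
// evaluated for every pixel and the result is selected afterwards.
float4 PS_SM20(PSInput i) : COLOR
{
    float4 cheap  = CheapLighting(i);     // hypothetical helper
    float4 costly = CostlyLighting(i);    // hypothetical helper, always paid for
    float inShadow = tex2D(ShadowMap, i.uv).r;
    return lerp(costly, cheap, inShadow); // select after computing both
}

// SM3.0 style: a true branch lets shadowed pixels skip the costly path.
// On hardware with coarse branch granularity (GF6), whole pixel blocks
// must agree on the branch before anything is actually skipped, so the
// overhead can eat the savings; finer-grained cards benefit far more.
float4 PS_SM30(PSInput i) : COLOR
{
    if (tex2D(ShadowMap, i.uv).r > 0.5)
        return CheapLighting(i);
    return CostlyLighting(i);
}
```

The point being: both shaders are "SM3.0 capable" games on paper, but only cards that branch cheaply see the second version pay off.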
I could be wrong, but I guess we'll know in a few hours' time.