I shouldn't be this picky - but bloom and HDR are quite different things.
I know; I was just detailing the two 'sweet spots' I found. I did mention 1024x768 with HDR+2xAA first, and then 1280x1024 with bloom and no AA, depending on which you value more (HDR or resolution). I prefer the HDR since it's a good implementation in Oblivion, especially if you know how to tweak a little more out of it. And the main difference is the 'dynamic' part, which is truly impressive and immersive.
Even a GeForce FX can do bloom without sweating (not for Oblivion, but generally)
Actually, it can't do it without sweating in general either; just check out rthdribl on an FX - it gets hammered compared to the other cards, and that's about the halfway point and the best implementation of basic bloom, IMO.
- it is just an image space technique and you just need fill-rate power for it. HDR lighting on the other hand requires higher precision calculation support in the pipeline - you need a gen-3 card for this.
You don't 'NEED' a gen-3 card; it could be done with three passes and dithering on an R300+, but no one bothers with that implementation. It's a question of ease of use and how anal-retentive people get about things. Just like the G7 series can do OpenEXR FP16 HDR + FSAA, but only through multiple loops, applying AA to the int8 results after the first ROP blend (whereas the X1K is able to maintain FP16 throughout, which isn't a spec requirement, just a nice feature that makes it dang efficient too).
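To illustrate the 'image space technique' point, here's a rough toy sketch of basic bloom on the CPU (my own sketch, not what any engine actually ships): threshold the bright pixels, blur them, add the blur back. Every step is just a per-pixel loop over the frame, which is why fill-rate is all bloom really asks for, and why even old hardware can manage it at low precision:

[code]
// Toy CPU bloom: bright-pass -> blur -> additive composite.
// Grayscale floats for simplicity; a real engine runs the same three
// steps as fullscreen shader passes, usually at reduced resolution.
#include <algorithm>
#include <vector>

struct Image {
    int w, h;
    std::vector<float> pix; // one brightness value per pixel
    float at(int x, int y) const {
        x = std::clamp(x, 0, w - 1);
        y = std::clamp(y, 0, h - 1);
        return pix[y * w + x];
    }
};

Image bloom(const Image& frame, float threshold) {
    // Pass 1: bright-pass - keep only what's above the threshold.
    Image bright = frame;
    for (float& p : bright.pix)
        p = std::max(p - threshold, 0.0f);

    // Pass 2: cheap 5x5 box blur to spread the bright spots out.
    Image blurred = bright;
    for (int y = 0; y < frame.h; ++y)
        for (int x = 0; x < frame.w; ++x) {
            float sum = 0.0f;
            for (int dy = -2; dy <= 2; ++dy)
                for (int dx = -2; dx <= 2; ++dx)
                    sum += bright.at(x + dx, y + dy);
            blurred.pix[y * frame.w + x] = sum / 25.0f;
        }

    // Pass 3: add the blur back on top of the original frame.
    Image out = frame;
    for (size_t i = 0; i < out.pix.size(); ++i)
        out.pix[i] = std::min(out.pix[i] + blurred.pix[i], 1.0f);
    return out;
}
[/code]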
True - I should have been clearer here. But complete SM 3.0 compliance is yet to appear on cards - even the drivers for the better ones unroll the loops during shader compilation and convert branched code into something more tractable - there is no "going back" through the pipeline like on a CPU. You could say all these cards merely emulate SM 3.0 characteristics.
I wouldn't say it's emulation, because it's not a set requirement of the DX spec. It's more of a supported feature, or a superset beyond strict compliance with the minimum requirements, which is the usual standard for compliance. And this sort of leads into other debates of compliant vs. capable vs. supported, which to me would describe the X1300/GF7300, the X1600XT/GF7600, and the GF7900/X1900 respectively. Heck, FP16 blending isn't even an SM3.0 requirement, but the support is laid out in the spec (I can't remember if it appeared at all earlier in DX with SM2.0/2.0A; no point adding 2.0B beyond 2.0A, IMO).
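To picture the dithering half of that R300 trick, here's a toy sketch (mine, not anything a real driver does) of tone-mapping an HDR float down to int8 with an ordered-dither offset, so the drop in precision doesn't band as visibly:

[code]
// Quantize HDR floats to 8 bits with 4x4 ordered dithering.
#include <algorithm>
#include <cstdint>

// Classic 4x4 Bayer matrix, scaled to 0..1.
static const float kBayer4[4][4] = {
    { 0/16.f,  8/16.f,  2/16.f, 10/16.f},
    {12/16.f,  4/16.f, 14/16.f,  6/16.f},
    { 3/16.f, 11/16.f,  1/16.f,  9/16.f},
    {15/16.f,  7/16.f, 13/16.f,  5/16.f},
};

uint8_t hdrToByte(float hdr, int x, int y) {
    // Simple Reinhard-style tone map: [0, inf) -> [0, 1).
    float ldr = hdr / (1.0f + hdr);
    // Add a sub-LSB offset that depends on screen position, then
    // truncate. Neighbouring pixels round in different directions,
    // so smooth HDR gradients survive the trip down to 8 bits
    // without obvious banding.
    float dithered = ldr * 255.0f + kBayer4[y & 3][x & 3];
    return (uint8_t)std::clamp(dithered, 0.0f, 255.0f);
}
[/code]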
This is not actually true - again it boils down to how you define playability. Things like draw-distance, distant landscapes etc. do come into the picture.
All of which are actually very easily done on the X1600P; heck, on my Mobility X700 my draw distance is near max, and distant landscapes are on.
The raw vertex and fragment processing power does matter regardless of what shader model your card supports.
I know, and I do address that in my mention of grass, since it's a huge hit; and since Oblivion renders everything (no early-Z occlusion culling), it's a very important issue.
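For anyone unclear on why that hurts so much with grass, here's a hypothetical toy rasterizer loop showing what early-Z rejection buys: with it, hidden fragments get thrown out before the expensive shader runs; without it, every overlapping grass blade pays the full shading cost and only then loses the depth test:

[code]
// Toy fragment loop with and without early-Z (invented numbers, just
// to show the mechanics). Feed it several fragments per pixel, as
// overlapping grass produces, and compare the shader invocation counts.
#include <cstdio>
#include <vector>

float expensiveShade() { return 0.5f; } // stand-in for a long grass shader

void rasterize(const std::vector<float>& fragDepths, bool earlyZ) {
    // Pretend 4 fragments land on each pixel (heavy overdraw).
    std::vector<float> zbuf(fragDepths.size() / 4, 1.0f);
    int shaded = 0;
    for (size_t i = 0; i < fragDepths.size(); ++i) {
        size_t px = i % zbuf.size();
        if (earlyZ && fragDepths[i] >= zbuf[px])
            continue;                    // rejected before shading: cheap
        float color = expensiveShade();  // full per-fragment cost
        (void)color;
        ++shaded;
        if (fragDepths[i] < zbuf[px])
            zbuf[px] = fragDepths[i];    // late depth test / write
    }
    std::printf("%s early-Z: shader ran %d times\n",
                earlyZ ? "with" : "without", shaded);
}
[/code]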
What the x1600 gains in features, it loses somewhat in speed (is the transistor count the same as the x850/800? I don't know).
I agree with that; the X800GTO/Pro, the X850Pro, or anything above those two would clobber the X1600P in raw framerates. However, for this game and this game alone, the benefits of the added features are strong, and it's one of the few games where it holds up well against a GF6800GS or GF7600-series card. Oblivion, IMO, is the X1600P's sweet spot, because even the GF6800GS is more expensive, and the X800GTO more expensive still, by large enough margins to matter. Move to PCIe and the story is completely different - there are far better choices. Even sticking with just this game, the X1800GTO has so much more to bring to performance that it makes the game enough better an experience to be worth the premium over both an X1600XT and a GF7600GT.
That's why your statement about grass holds true.
Yeah, and it's the only thing I'd say is a killer for the X1600P. The difference between the default and a grass size of 100-110 is huge, IMO: visually it's very close, but performance-wise it's almost night and day.
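For anyone who wants that tweak, it's the iMinGrassSize entry in Oblivion.ini; the exact numbers here are from memory, so treat them as a starting point:

[code]
; Oblivion.ini - a bigger value means sparser grass and a big framerate win
[Grass]
iMinGrassSize=110 ; default is 80; 100-110 still looks close to stock
[/code]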
[quote]However, most implementations of the x1600 are underclocked and have lots of headroom for overclocking - people have pushed the core speed to 600MHz and found it performing almost as well as cards whose GPU cores normally run at that clock speed.[/quote]
Agreed, the core offers a lot of headroom; I got my friend's up to 570MHz before chickening out (or wisely stopping), since it was someone else's gear. He wasn't too keen from the start because it's new. Truly, the only thing holding the AGP Pro back from being a bit better is the crummy memory; if it had the XT's memory (or they sold an AGP XT), it'd be a much stronger card for this game, IMO.
Couldn't have put it better myself. Can't wait for DX10 - a pity it runs only on Vista - ohhh! - the Halo3 trailer! 8O 8O
LOL! I put that trailer on my PSP to show a friend at work. It's a nice feature, and I like Halo, but remember it's Halo2+ for the PC and Halo3 for the Xbox360. I just hope they add co-op play in the PC version.
For me, Vista and DX10 offer a lot of nice things, but having recently converted to the temple of 'only laptops', I might have to wait until fall 2007 for an appropriate solution to come around for me: X1700 (i.e. X1800GTO+) performance at a mid-range selection/price.
We'll see. I think DX10 will initially be like the DX9/SM2.0 to SM3.0/DX9.0c transition, in that it'll have really cool demos at first, but the 'need' for gaming will come a short while later. I suspect that by that fall '07 it'll start making sense. The two titles of interest for me are UT2K7 (love Unreal) and Crysis (FartCry was OK, but I think I need to embrace this one, because I obviously missed the multiplayer benefit of FC that many people here enjoyed). Crysis looks the most impressive in the early tech demo, so I am interested in DX10, but as with DX9.0c, I'm not convinced it's a killer app until later. Oblivion is the first title that made me wish I had an SM3.0 card, and even though it might have meant a somewhat underwhelming Mobility X1600, it would be nice to have those features now.
I suspect that by the time Vista is out, and old enough to make the general population feel it's 'stable & able', that will be when we see more of a push toward getting those DX10 games and features to market ASAP. Until that point we may have this old-school resistance of 'why bother?' from both light-to-medium gamers and the developers who know that's where the bulk of the money is. The main titles will always push the envelope, and until they truly demo (playably) on DX10 hardware, I think most of us will be skeptical of the 'need'.