Sapphire ATI X1600 Pro 512MB AGP - good enough for Oblivion?

cutter

Distinguished
Sep 14, 2004
199
0
18,680
Basically, I'm just trying to figure out if my rig will run Oblivion on high settings with this video card. I'm planning on building a new one eventually, but for now my main goal is to experience Oblivion as it should be.

Will the above video card be enough with the following specs???:

AMD Athlon XP 3200+
2.2GHz
1 gig RAM

Thanks in advance.
 

cleeve

Illustrious
High settings? Sorry, not a chance. Even an X1800 struggles with high settings on Oblivion.

An X1600 Pro might be able to handle HDR in Oblivion with a lot of the other details set low, specifically grass distance.

Without HDR enabled it'd be able to handle more of the other details.

In AGP, your best bet is to forget HDR and buy a used X800 XL for $150-ish. That'd give you much better framerates than an X1600 Pro, although you wouldn't get to see it 'as it should be', which is with HDR enabled.

But if you want to "experience Oblivion as it should be", you need an X1800 GTO at the very least, and that means a PCI Express platform.
 

HYST3R

Distinguished
Feb 27, 2006
463
0
18,780
What up, cleeve! Once again the X1600 brings us together.

But yes, he's right. The X1600 is a lil weak for high settings in Oblivion, and if you really want HDR plus high settings you're gonna need at least an X1800.

An X1900 would be your best bet.
 

cutter

Distinguished
Sep 14, 2004
199
0
18,680
Will the x1600 Pro at least make the game playable? Will the game still look pretty good or is it not even worth it without the x1900 you mentioned?
 

superbrett2000

Distinguished
Mar 30, 2006
53
1
18,535
I'd get an X850 Pro, mod it to an X850 XT PE, and upgrade the heatsink/fan on it. You will wind up with one of the best AGP cards available. Sure, you won't be able to use HDR, but you will be able to run the game with higher settings.

I'm using an X850 Pro modded to an X850 XT PE and I'm running Oblivion fairly smoothly with many options turned on or at/near full tilt.
 

ivoryjohn

Distinguished
Sep 20, 2001
174
0
18,680
From what I have read, the 1600 is supposed to be decent at shader work but weak at raw geometry. So you could probably get by with medium-high settings (and all the oblivion.ini tweaks to improve performance).
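
If it helps, the biggest .ini win is grass. Going from memory here, so treat the exact key names and defaults as approximate and check a tweak guide before editing - the relevant lines live in Oblivion.ini under My Documents\My Games\Oblivion:

[Grass]
iMinGrassSize=120 (default is 80; higher values thin the grass out for a big framerate gain)
fGrassEndDistance=3000 (default is higher; lower values stop drawing grass closer to the camera)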

You would have to live with some occasional sluggish performance, but since it is an RPG, I can live with framerates down to about 20.

Visually, the 1600 would probably give you a good taste of what you can expect when you upgrade. And when you upgrade, you can look forward to better resolutions than the 1600 will provide.

I use an X1800 XL and I am pleased with its performance, and I play Oblivion a lot.
 

HYST3R

Distinguished
Feb 27, 2006
463
0
18,780
Yes, it will be very playable with an X1600.

It just doesn't have the horsepower to turn on all the goodies at the same time, which is what I found out first hand.
 
Will the x1600 Pro at least make the game playable? Will the game still look pretty good or is it not even worth it without the x1900 you mentioned?

From my experience with a friend's PC (dual MP2000+), you can run with large textures, then change the grass size to 110 in the .ini file, with everything else at default (some sliders halfway, some all the way up).
With that I played for a while at 1280x1024 with bloom (no HDR), at 1024x768 with bloom and 2xAA, and with just HDR. The second time I played I tried 1024x768 with HDR and 2xAA, and it was very playable with only a few hiccups in the hours I spent playing around with those settings. I found the slowdown was really noticeable only indoors in complex areas with high numbers of NPCs; the Oblivion gates were noticeably slower at 1280x1024, but OK at 1024x768 even with HDR/AA.
 

bobbydamm

Distinguished
Nov 21, 2005
143
0
18,680
That's like asking "Can my Ti4600 play Doom 3?".
Frame Rate Juggernaut!
The X1600 may do well at lower res with the goodies on, but...
as someone mentioned here, the X1800 GTO should be the one if you're on a budget.
 

Kholonar

Distinguished
May 7, 2006
215
0
18,680
The truth is, there are two things going on here.

"Is my video card good enough to play Oblivion" - It could be argued that no video card can play oblivion right.

"I want to experience Oblivion as it should be" - Well, the landscape textures are really low resolution and can look really ugly. I personally don't think the HDR is done particularly well (compared to HL2).

Oblivion is a beautiful game in its model detail, textures, and lighting. However, the best part of Oblivion is the gameplay, and I would definitely recommend buying the game even if your graphics card can only scrape through. Oblivion is a next-generation game that will only look right later on, but it plays well now.

If your question is whether to buy the X1600 Pro AGP, then I'd probably agree with some of the other people and buy a cheaper last-generation (or is it last-last-generation?) card. Then you can save up for a better, PCI Express-based motherboard.
 

cleeve

Illustrious
It could be argued that no video card can play Oblivion right.

I don't know about that... I'm running Oblivion on an X1900 XTX right now, and let me tell you... it certainly seems all right to me. :)

It's funny though, it goes to show people's tastes differ greatly, but I think Oblivion's HDR is one of the best implementations of HDR to date... I'd put it above HL2: Lost Coast in my book.
 

bobbydamm

Distinguished
Nov 21, 2005
143
0
18,680
I don't know about that... I'm running Oblivion on an X1900 XTX right now,

Are these gifts you get because you're So Damn Good (!!) or are you just... able to afford it? Then again, you probably work someplace that gives you access to such goodies? Nonetheless... an X1900 XTX is pretty cool.
 

silentcoercion

Distinguished
Apr 18, 2006
90
0
18,630
Or there's always the possibility that he's someone like me... a college student working part time somewhere, but with no real expenses, thus being able to afford stuff like that when it shouldn't really be possible. Though I'm just using a lowly X1800 XT, because I like stockpiling money. *shrug*
 

photon3d

Distinguished
Jun 14, 2006
2
0
18,510
Playability is a subjective thing - it is up to you whether you want extremely good-looking graphics or very smooth framerates. As far as the X1600 Pro is concerned - it will be able to render all the stuff that Oblivion demands, including HDR, AA and other expensive stuff. What may (will) happen is that you may end up with very cheesy framerates.

But be assured - things with the X1600 Pro would not be like what they are for people with SM 2.0/2.0a cards (X800/850). The X1600 Pro is SM 3.0 compliant (though not fully - but no card out there can fully claim this). Theoretically, it supports everything, but practically, you may find it does not support some things too well. And the X1600 Pro is as good at number-crunching for SM 2.0 shaders as any gen-2 card out there. On top of that you get neat stuff like HDR and SM 3.0 features - so it won't be a bad choice - and a cheap one too!

So if you are content to play even with "toned-down" features (and by that I mean toned-down - not totally removed) - then the ATI X1600 Pro is the best AGP card that you can get.

One warning - the X1600 Pro will become obsolete very soon. ATI and nVidia both have their gen-4 (i.e. SM 4.0 - a quantum leap actually) cards in the making and they should be out by next year. Expect the X1900 series to become what the X1600 is today - a mid-range card (price-wise) - so if you can wait a bit, go for a PCI-e motherboard and believe me, you will get really good value for money even if you buy an X1900/GeForce 7900. On the other hand, Oblivion is too good a game to wait that long!
 
Playability is a subjective thing - it is up to you whether you want extremely good-looking graphics or very smooth framerates. As far as the X1600 Pro is concerned - it will be able to render all the stuff that Oblivion demands, including HDR, AA and other expensive stuff. What may (will) happen is that you may end up with very cheesy framerates.

However, it can handle the frame rates at a reasonable level. I have played it with and without HDR+AA at 1024x768 and it's quite playable (of course it suffers the same momentary chugging even the X1800/X1900 experience at their 'optimal' settings, but those moments are brief). I even found 1280x1024 with no AA, using bloom instead of HDR, to be very playable. The most important thing is to reduce grass quality (or increase grass size) because it's very limiting, especially on the X1600 (due to its design). But it's quite playable and a good buy compared to its price rivals.

But be assured - things with the X1600 Pro would not be like what they are for people with SM 2.0/2.0a cards (X800/850). The X1600 Pro is SM 3.0 compliant (though not fully - but no card out there can fully claim this).

That's kind of a loaded statement, because it is about what it means to be compliant/compatible/capable; however, it does have IHV-specific methods that surpass the norm, just like nV. That whole 'done right' debate is funny, because both have flung dung in the past, but in reality every feature of SM3.0 can be exposed in the cards, same with nV's cards (remember FP16 HDR+AA is not an SM3.0 issue). The DCT test alone doesn't mean much.

Theoretically, it supports everything, but practically, you may find it does not support some things too well.

The only thing it supports outside the normal methodology would be R2VB, and when programmed for it, it can perform better than the norm. Otherwise it does very well with its typical SM3.0 stuff, especially things like dynamic branching, which it does better than a GF7900.

So if you are content to play even with "toned-down" features (and by that I mean toned-down - not totally removed) - then the ATI X1600 Pro is the best AGP card that you can get.

Actually I'd say 'toned down' in Oblivion is a falsehood. For this game the X1600P can play with all features enabled, whereas other solutions don't have them on medium, they have them OFF. It's a demanding game and no system can play it all maxed, period. But it does well at giving people a taste of what Oblivion looks like with the features enabled.

One warning - the X1600 Pro will become obsolete very soon. ATI and nVidia both have their gen-4 (i.e. SM 4.0 - a quantum leap actually) cards in the making and they should be out by next year.

True, but waiting sucks, and for AGP this is really it for choice for this game. There will not likely be an X1700 AGP, and the GF7600GS isn't an improvement for this game. So really, for the price, this is the best option for now.

Expect the X1900 series to become what the X1600 is today - a mid-range card (price-wise) -

Well, that is the rumour about the RV570, but in AGP, unlikely... but possible, I guess.

so if you can wait a bit, go for a PCI-e motherboard and believe me, you will get really good value for money even if you buy an X1900/GeForce 7900. On the other hand, Oblivion is too good a game to wait that long!

Exactly - get the X1600P for now to tide you over to a good whole-system upgrade, play it with everything on, and enjoy the game. Then take your relative savings and put them into a mid-range G8x/R6xx card, which should be a great point to jump to from an X1600P, and by then Vista/DX10/WGFx.x/WTF! should be clearer, etc.
 

photon3d

Distinguished
Jun 14, 2006
2
0
18,510
I even found 1280x1024 with no AA, using bloom instead of HDR, to be very playable. The most important thing is to reduce grass quality (or increase grass size) because it's very limiting, especially on the X1600 (due to its design).
I shouldn't be this picky - but bloom and HDR are quite different things. Even a GeForce FX can do bloom without sweating (not for Oblivion, but generally) - it is just an image-space technique and you just need fill-rate power for it. HDR lighting, on the other hand, requires higher-precision calculation support in the pipeline - you need a gen-3 card for this.
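A quick made-up example to see why (the numbers are purely illustrative): say a sunlit pixel shades out to an intensity of 4.0. In a regular 8-bit integer framebuffer that value gets clamped to 1.0 before any post-processing, so a bloom pass is just blurring the already-clipped result - cheap, fill-rate bound, runs on anything. With an FP16 render target the 4.0 survives the whole pipeline, and a final tonemap pass - e.g. Reinhard's c/(1+c), which maps 4.0 to 0.8 while a 0.2 midtone only moves to ~0.17 - compresses it into displayable range. That's why you need the higher-precision pipeline, and why the effect can react dynamically to scene brightness.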
That's kind of a loaded statement, because it is about what it means to be compliant/compatible/capable; however, it does have IHV-specific methods that surpass the norm, just like nV. That whole 'done right' debate is funny, because both have flung dung in the past, but in reality every feature of SM3.0 can be exposed in the cards, same with nV's cards (remember FP16 HDR+AA is not an SM3.0 issue).
True - I should have been clearer here. But complete SM 3.0 compliance is yet to appear on cards - even the drivers for the better ones unroll the loops during compilation of shaders and convert branched code to something more understandable - there is no "going back" through the pipeline like for the CPU. You can say all these cards are able to emulate SM 3.0 characteristics.
The only thing it supports outside the normal methodology would be R2VB, and when programmed for it, it can perform better than the norm. Otherwise it does very well with its typical SM3.0 stuff, especially things like dynamic branching, which it does better than a GF7900.
True - no debate.
Actually I'd say 'toned down' in Oblivion is a falsehood. For this game the X1600P can play with all features enabled, whereas other solutions don't have them on medium, they have them OFF.
This is not actually true - again, it boils down to how you define playability. Things like draw distance, distant landscapes, etc. do come into the picture. The raw vertex and fragment processing power does matter regardless of what shader model your card supports. What the X1600 gains in features, it loses somewhat in speed (is the transistor count the same as the X850/800? I don't know). That's why your statement about grass holds true. However, most implementations of the X1600 are underclocked and have lots of headroom for overclocking - people have pushed the core speed to 600MHz and have found it performing almost as well as cards with normal GPU cores running at the same clock speed.
Exactly - get the X1600P for now to tide you over to a good whole-system upgrade, play it with everything on, and enjoy the game. Then take your relative savings and put them into a mid-range G8x/R6xx card, which should be a great point to jump to from an X1600P, and by then Vista/DX10/WGFx.x/WTF! should be clearer, etc.
Couldn't have put it better myself. Can't wait for DX10 - a pity it runs only on Vista - ohhh! - the Halo3 trailer! 8O 8O
 
I shouldn't be this picky - but bloom and HDR are quite different things.

I know that; I was just detailing the two 'sweet spots' I found. I did mention 1024x768 w/ HDR+2xAA first, and then 1280x1024 with bloom and no AA, depending on which you value more (HDR or resolution). I prefer the HDR, as it's a good implementation in Oblivion, especially if you know how to tweak a little more out of it. And the main difference is the 'dynamic' part, which is truly impressive and immersive.

Even a GeForce FX can do bloom without sweating (not for Oblivion, but generally)

Actually, it can't do it without sweating in general either - just check out rthdribl on an FX: it gets hammered compared to the other cards, and that would be the halfway point and the best implementation of basic bloom IMO.

- it is just an image-space technique and you just need fill-rate power for it. HDR lighting, on the other hand, requires higher-precision calculation support in the pipeline - you need a gen-3 card for this.

You don't 'NEED' a gen-3 card; it could be done with 3 passes and dithering on an R300+, but no one bothers with that implementation. It's a question of ease of use and how anal-retentive people get about things, just like how the G7 series can do OpenEXR FP16 HDR+FSAA, but only through multiple loops and applying AA to the int8 resultants after the first ROP blend (whereas the X1K is able to maintain FP16 throughout [which isn't a spec, just a nice feature that makes it dang efficient too]).

True - I should have been clearer here. But complete SM 3.0 compliance is yet to appear on cards - even the drivers for the better ones unroll the loops during compilation of shaders and convert branched code to something more understandable - there is no "going back" through the pipeline like for the CPU. You can say all these cards are able to emulate SM 3.0 characteristics.

I wouldn't say it's emulation, because it's not a set requirement within the DX spec. It's more of a supported feature, or a superset beyond just compliance with the strict minimum requirements, which is the usual standard for compliance. And this sorta leads into other debates of compliant, capable and supported, which to me would describe X1300/GF7300, X1600XT/GF7600, and GF7900/X1900. Heck, FP16 blending isn't even an SM3.0 requirement, but the support is laid out in the spec (can't remember if it appeared at all earlier in DX with SM2.0/2.0A; no point adding in 2.0B beyond 2.0A IMO).

This is not actually true - again, it boils down to how you define playability. Things like draw distance, distant landscapes, etc. do come into the picture.

All of which are actually very easily done on the X1600P; heck, on my Mobility X700 my draw distance is near max, and distant landscapes are on.

The raw vertex and fragment processing power does matter regardless of what shader model your card supports.

I know, and I do address that in my mention of grass, since it's a huge hit, and since Oblivion renders everything (no early-Z occlusion culling) it's a very important issue.

What the X1600 gains in features, it loses somewhat in speed (is the transistor count the same as the X850/800? I don't know).

I agree with that; the X800 GTO/Pro and up, or X850 Pro and up - anything above those two would clobber the X1600P in raw framerates. However, for this game and this game alone, the benefits of the added features are strong, and it's one of the few games where it holds up strong against a GF6800GS or GF7600-series card. Oblivion IMO is the X1600P's sweet spot, because even the GF6800GS is more expensive, and the X800GTO more expensive, by large enough margins to matter. Move to PCIe and the story is completely different - there are far better choices. Even sticking with just this game, the X1800GTO has so much more performance to bring that it makes the game that much better an experience, and it would be worth the premium over both an X1600XT and a GF7600GT.

That's why your statement about grass holds true.

Yeah, and it's the only thing I'd say is a killer for the X1600P. The difference between the default and grass size = 100-110 is IMO huge: visually it's very close, performance-wise it's almost night and day.

However, most implementations of the X1600 are underclocked and have lots of headroom for overclocking - people have pushed the core speed to 600MHz and have found it performing almost as well as cards with normal GPU cores running at the same clock speed.

Agreed, the core offers a lot of headroom, and I got my friend's up to 570MHz before chickening out (or wisely stopping) with someone else's gear. He wasn't too keen from the start 'cause it's new. Truly the only thing holding the AGP Pro back from being a bit better is the crummy memory; if it had the XT's memory (or they sold an AGP XT) it'd be a much stronger card for this game IMO.

Couldn't have put it better myself. Can't wait for DX10 - a pity it runs only on Vista - ohhh! - the Halo3 trailer! 8O 8O

LOL! I put that trailer on my PSP to show a friend at work. It's a nice feature, and I like Halo, but remember it's Halo 2+ for the PC, and Halo 3 for the Xbox 360. I just hope they add co-op play in the PC version.

For me Vista and DX10 offer a lot of nice things, but having recently converted to the temple of 'only laptops' I might have to wait until fall of 2007 until an appropriate solution comes around for me (X1700 [i.e. X1800GTO+] performance at a mid-range price).

We'll see. I think DX10 will initially be like the DX9/SM2.0-SM3.0/DX9.0c transition, in that it'll have really cool initial demos, but the 'need' for gaming will come a short while later. I suspect by Fall '07 it'll start making sense; the two titles of interest for me are UT2K7 (love Unreal) and Crysis (FartCry was OK, but I think I need to embrace this one, because obviously I missed the multiplayer benefit of FC, which many people here enjoyed). Crysis looks the most impressive in the early tech demo, so I am interested in DX10, but like DX9.0c, I'm not convinced it's a killer app until later. Oblivion is the first title that made me wish I had an SM3.0 card, and even though it might have meant a somewhat underwhelming Mobile X1600, it would be nice to have those features now.

I suspect that by the time Vista is out, and old enough to make the general population feel it's 'stable & able', that will be when we see more push towards getting those DX10 games and features to market ASAP. Until that point we may have this old-school 'why bother' resistance from both light-to-medium gamers and the developers, who know that's where the bulk of the money is. The main titles will always push the envelope, and until they truly demo (playable demos) on DX10 hardware I think most of us will be sceptical of the 'need'.
 
Hey Ape, I see the new X1600 Pro AGP article has Oblivion scores...

LOL!

Hmm, wonder where that came from. Sweet! 8)

Just amazing looking at the difference between my old card and that one. Jeez, I'm glad I didn't have the R9600P to sell him at the time; likely I would've gotten cussed out. Just amazing to me, since I really enjoyed the R9600P while I had it. WOW! 8O

Makes me appreciate the MRX700 in my laptop a little more. Whereas my R9600P got 'ok' or comparable 3DMarks, it's obviously struggling in ESIV, while I feel comfortable gaming even at 1200x800 (a little skittish at times near Oblivion gates at that setting) with the MRX700.