
OBLIVION BENCHMARK ARTICLE!

Last response: in Graphics & Displays
April 4, 2006 2:58:49 PM

At Firingsquad:

http://www.firingsquad.com/hardware/oblivion_high-end_performance/page3.asp

Interesting stuff. Radeons are definitely the cards of choice; the X1800 XTs really show up well, usually besting the 7900 GT and even the 7900 GTX with some options, like foliage.

The 7900 GTX is no slouch though, but the X1900 XT bests it more often than not.

SLI/Crossfire is a different story; it looks almost completely useless for both Nvidia and ATI. Two 7900 GTXs get a measly 6 fps gain in some cases, and some dual-card setups actually perform worse than their single counterparts.


On a side note: once again, OpenEXR HDR looks to be a performance killer.

There has been a lot of talk here lately discussing whether or not the 7600 GT is powerful enough to handle HDR.

I'm not sure how people were claiming 7600 GTs could do it when you look at the 7900 GT numbers below. Indoors and in towns, the 7600 GT might be able to run HDR, but in the forest? I mean, the 7900 GT is brought to its knees:


1024x768, Foliage areas with HDR:
7900 GT: 23 fps
X1800 XT: 35 fps

1024x768, Foliage areas, no HDR, 4xAA 8xAF:
7900 GT: 26 fps
X1800 XT: 37 fps


HDR is hurting these cards more than 4xAA 8xAF!
Even in areas with minimal foliage, the 7900 GT is scraping by at 45 fps with HDR enabled:


1024x768, Mountainous areas with HDR:
7900 GT: 45 fps
X1800 XT: 61 fps

1024x768, Mountainous areas, no HDR, 4xAA 8xAF:
7900 GT: 51 fps
X1800 XT: 64 fps
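To put the "HDR hurts more than 4xAA 8xAF" claim in numbers, here's a quick back-of-the-envelope sketch in Python using only the fps figures quoted above (the percentages are my arithmetic, not from the review):

```python
# fps figures quoted from the FiringSquad charts above, all at 1024x768
results = {
    "7900 GT":  {"foliage_hdr": 23, "foliage_aa": 26, "mountain_hdr": 45, "mountain_aa": 51},
    "X1800 XT": {"foliage_hdr": 35, "foliage_aa": 37, "mountain_hdr": 61, "mountain_aa": 64},
}

for card, fps in results.items():
    # how much slower the HDR mode is than the 4xAA/8xAF mode, as a percentage
    foliage_gap = (1 - fps["foliage_hdr"] / fps["foliage_aa"]) * 100
    mountain_gap = (1 - fps["mountain_hdr"] / fps["mountain_aa"]) * 100
    print(f"{card}: HDR costs {foliage_gap:.0f}% vs AA/AF in foliage, "
          f"{mountain_gap:.0f}% in the mountains")
```

By this rough measure the 7900 GT loses about 12% going from AA/AF to HDR, while the X1800 XT only loses about 5%, which matches the thread's point.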

Is the 7600 GT going to get better scores than its 7900 GT brother? Hard to believe. Once again though, we should see the midrange cards in a future FiringSquad review. Still very interested in seeing exact numbers for midrange cards like the 7600 GT, X1800 GTO, and X850 XT...
April 4, 2006 3:15:40 PM

Thanks for the link and the assessment!
April 4, 2006 3:35:36 PM

Well, all I have to say is... Nvidia's slogan is very ironic for this game, then. TWIMTBP, and yet how come Nvidia boasts that ES IV: Oblivion is optimized for Nvidia cards....
:cry:  :cry:  :cry:  :cry:  :cry:  :cry: 

The X1800XT with only 16 pixel pipes vs. the 7900GT with 24.
April 4, 2006 3:49:51 PM

Only thing they don't say is how they had the game set up, and what mode Crossfire was in for those tests. Did they test all the modes and use the fastest? I know alternate frame rendering is supposedly the fastest, but did they check?

SLI is definitely looking good with that game though. Definitely stirs the pot on whether I should get a 7900GT or an X1900XT. I know the X1900XT is faster, but if I could get a 512MB version of a GT and OC it, maybe that'd even things out and cost less than the XT. Who knows. Of course, then there's the question of whether I'll run dual cards. I mean, if a current-generation game is bringing current-gen cards to their knees, waiting till the new cards come out looks a little better. The question is when those cards will be out. Maybe I could deal with waiting until June, but maybe not unless it's early June. I really don't care about AM2 since it's not really going to make a difference in performance.
April 4, 2006 4:02:47 PM

For Oblivion players - based on this - the best bang-for-your-buck card to get is the X1800 XT.

1024x768 with HDR in heavy foliage brought every card to its knees, but the X1800 XT hung in there with the big boys:

SLI 2x7900 GTs: 41 fps
X1900 XTX: 38 fps
X1800 XT: 36 fps
7900 GTX: 34 fps

Those are sobering numbers and probably indicate a CPU bottleneck, but they still show the X1800XT in a very fine light. In areas with less foliage the X1800 XT still holds its own, right on the heels of the X1900 XTX:

SLI 2x 7900 GTX: 77 fps
7900 GTX: 71 fps
X1900 XTX: 64 fps
X1800 XT: 61 fps


On closer inspection, SLI and Crossfire look almost completely useless in this game. An extra 7900 GTX in SLI gets you a measly 6 frames per second @ 1024x768? Good lord, what a waste!
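For the curious, the scaling math on those numbers works out like this (napkin arithmetic on the fps quoted above, not anything from the review itself):

```python
single_gtx = 71   # 7900 GTX, less-foliage area, from the chart above
sli_gtx = 77      # 2x 7900 GTX in SLI, same area

gain_fps = sli_gtx - single_gtx
gain_pct = (sli_gtx / single_gtx - 1) * 100
print(f"The second GTX adds {gain_fps} fps, "
      f"a {gain_pct:.0f}% gain for 100% more hardware")
```

Roughly an 8% gain for doubling the graphics hardware, which is why the thread calls SLI a waste in this particular game.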
April 4, 2006 4:38:30 PM

Quote:
On closer inspection, SLI and Crossfire look almost completely useless in this game. An extra 7900 GTX in SLI gets you a measly 6 frames per second @ 1024x768? Good lord, what a waste!


while a sli/crossfire setup may not bring in any substantial increase over a single card setup in oblivion, consider the gains this setup would have on past or future games.

purchasing a multi-card setup would be a waste, if it were purchased solely for this game, but judging from my experience, people who do make such purchases have a tendency to play more than just one game.

great post though, it was nice to see some numbers on a few of these cards and how they stacked up.
April 4, 2006 4:42:29 PM

Quote:

purchasing a multi-card setup would be a waste, if it were purchased solely for this game, but judging from my experience, people who do make such purchases have a tendency to play more than just one game.


That's kind of a given, Kumana - I mean, we're only talking about one game here, and that's the context of the remark. There are certainly a lot of other games out there, and SLI/Xfire shows considerable improvement in many of them.

Besides, I said:
"SLI and Crossfire look almost completely useless in this game."
April 4, 2006 5:11:13 PM

Well, my 3200+ AMD (old 754 w/1MB L2) + a 6800GT runs this game at 1280x1024 at 15 FPS w/HDR outside... so there is some kind of issue there. That site tells me I shouldn't be getting higher than 7 FPS on an FX-57 w/2GB of RAM... I don't have grass shadows on, but I do have full water detail; view distance is maxed (for trees and buildings) but halfway for everything else. Oh, and this is with 8xAF too, and on a not-so-fast HDD.

15 FPS is only 7 FPS off from the 7900GT at a lower res and on an FX-57??? I don't think so.
April 4, 2006 5:38:55 PM

It says in the mountainous areas, 1600x1200, a 6800 GT will get 15fps.

In rough foliage is where it drops though. Are you saying your framerate doesn't drop when you turn the foliage up high? Hell, even my X1800 XL will slow down with grass on max.

They probably have settings higher than you... grass shadows, etc.
April 4, 2006 6:25:11 PM

I'm in foliage, yes. No grass shadows though... but seriously, there is no difference IMO. Yes, the gfx will certainly take a large hit with it, but it's certainly not useful for image quality. There is NOTHING in the article that states they went in and turned grass shadows on either. In the test setup they are actually VERY vague as to what is on and what is off in terms of draw distance, shadows, water effects... blah blah.

All I'm saying is, someone that hasn't bought the game looks at that chart and says, well damn, I don't even have an FX-57 and a 6800GT is only running at 7 FPS... not gonna buy that one. I still have a good CPU (granted the memory controller isn't even dual-channel, but 1MB cache is good), 1GB of decent RAM (2-3-3-6 OCZ), and a 6800GT. And at a higher-than-recommended res (it recommended 1024x768 w/HDR) I pushed it to 1280x1024 and am still running with very high settings and getting 15 FPS.

I think the only thing that could be different in those tests is foliage draw distance. Like I said, mine is set to medium distance for foliage/grass. Trees and buildings, however, are at 90% (not max).

Oh, and indoors I'm getting like 40-60 fps, so I'm good there :) 
April 4, 2006 6:58:30 PM

I'm on High Settings with all sliders at half, no AA or AF, and HDR enabled. Don't have fraps running or anything but it hardly ever graphically lags for more than a sec when I first get into the game. Every now and then I'll see a missed frame.
April 4, 2006 7:11:52 PM

The lowest FPS I got with my system (specs in the sig) was about 19 fps, but it was in a room filled with NPCs, so I guessed the bottleneck was the CPU having to deal with lots of AI interacting with each other, not a graphical hold-up. Overall though, my fps is generally high, with AF and HDR on, so I can't complain... actually, I love my card for being able to run Oblivion at all... even on ultra high settings :D 
April 4, 2006 7:23:02 PM

Quote:
I'm on High Settings with all sliders at half, no AA or AF, and HDR enabled. Don't have fraps running or anything but it hardly ever graphically lags for more than a sec when I first get into the game. Every now and then I'll see a missed frame.


I found I had unpleasant lag with grass distance on full. Once I lowered the grass, the game ran much more smoothly.

Indoors it's smooth as silk, the only place I have problems is in the foliage filled forest when my settings are too high. Even then, it's playable, but it's much nicer and smoother with the grass turned down.
April 4, 2006 7:51:19 PM

Good find, Cleave.
Nothing new really. A shader-intensive game like Oblivion will run better on an ATI card, and in fact most newer games that aren't optimized solely for Nvidia will work slightly better on ATI cards due to their high number of pixel shaders.
The 7600GT cannot handle HDR AND deliver a decent framerate (30+ most of the time) in some games, and it will certainly have problems in the future with that enabled.
Yet more proof that the X1800 XT is superior.
I don't care about gfx card brands; I only care about performance. The X1800 XT is clearly the better card. There is no need for discussing it.
People got lost in all the Nvidia hype and got 7900GTs thinking they are the best for the money. Well, peeps, they are not.
The 7900GT is very good now and quite competitive, but it has problems in the long term. Changing frequencies and adding pipes is not always the best way. I expect the difference between the two to increase as time goes by.
April 4, 2006 8:09:37 PM

Quote:
An extra 7900 GTX in SLI gets you a measly 6 frames per second @ 1024x768? Good lord, what a waste!
Most expensive 6 frames per second I've ever seen.
April 4, 2006 8:32:43 PM

It's because the ATI cards have a different architecture than the Nvidia cards; you can't judge performance on pipelines alone anymore. ATI is going for more efficiency per pipe, so they stuck with 16.
April 4, 2006 9:07:58 PM

Great post, man! I read the article and I noticed that the SLI and Crossfire indoor frame rates were insane! If you go by the max framerates, the SLI was besting the ATI all the time, not that it matters. But over 100 fps is, well... GOOD :p  Outside, I have to admit, my system gets kinda freaky with all options on high w/HDR running at 1920x1200. 2 x 7800 GTXs.

Since this is a single-player game, is there a command to make a timedemo that we can play with?
April 4, 2006 9:20:05 PM

I'd just like to point out that this weekend I had a topic up about these two cards, the X1800 XT or the Nvidia counterpart, and went with the X1800! Since Oblivion is like my major game, I'm definitely happy with the results! I got the 512MB version, and hopefully that helps things a bit too!

Thanks, everyone, for your input in helping me get the right card!
April 4, 2006 9:43:09 PM

Quote:
There is NOTHING in the article that states they went in and turned grass shadows on either. In the test setup they are actually VERY vague as to what is on and what is off in terms of draw distance, shadows, water effects... blah blah.

All I'm saying is, someone that hasn't bought the game looks at that chart and says, well damn, I don't even have an FX-57 and a 6800GT is only running at 7 FPS... not gonna buy that one. I still have a good CPU (granted the memory controller isn't even dual-channel, but 1MB cache is good), 1GB of decent RAM (2-3-3-6 OCZ), and a 6800GT. And at a higher-than-recommended res (it recommended 1024x768 w/HDR) I pushed it to 1280x1024 and am still running with very high settings and getting 15 FPS.

I think the only thing that could be different in those tests is foliage draw distance. Like I said, mine is set to medium distance for foliage/grass. Trees and buildings, however, are at 90% (not max).

Oh, and indoors I'm getting like 40-60 fps, so I'm good there :) 


They cranked everything to the max. Quote:

"For our testing, we cranked up all visual settings to their maximums. Of course, we also turned on settings such as self shadows, shadows on grass, tree canopy shadows, water ripples etc."

Obviously, turning down a few of these detail levels could double your framerates.
April 4, 2006 9:44:27 PM

Wow! They test in three areas and average 10 runs. Freaking time consuming to say the least.

Thanks for the link Cleeve.
April 4, 2006 9:45:17 PM

They mention in the benchmark that the X1800XT is a 512MB version. I have seen it mentioned on this forum that with some cards the extra memory is not really useful; however, this benchmark hints that the extra memory on the X1800 XT might help. Any thoughts on this? Thanks.
April 4, 2006 9:58:25 PM

Not dual core optimized. Should be :p 

Would like to see more benchmarks using high end widescreen monitors at higher resolutions too.
April 4, 2006 10:21:35 PM

I think the 7900 is the last on that architecture, and if games keep going ATI's way they will have to release a new GPU before Vista [DX10]. I think ATI will just add an X1950 or something before changing to a new architecture [unified shaders, I hope], if it even has to for Vista.
Nvidia will be back; with the every-other-month releases, no one can ever be on top that long.
This could be somewhat of a driver issue since it's a new game.
April 5, 2006 12:35:52 AM

Quote:

Besides, I said:
"SLI and Crossfire look almost completely useless in this game."


Ummm, you might want to check your Oblivion ReadMe file and see something the reviewer missed. I'd post the section, but I'm still at work (for another 25+ mins).

Crossfire is not supported until new drivers.
April 5, 2006 1:02:44 AM

Can't turn on AA & HDR in Oblivion (on the PC) at the same time, no matter what card you have. Apparently it is hard coded that way.

You can do so on the Xbox (360). There is a conspiracy theory surrounding that.
April 5, 2006 1:27:21 AM

Yeah, and there are two suspects according to the conspirinstigators:

nV - it being a TWIMTBP game, and the nV cards can't do HDR+AA anyway.

M$ - to give the Xbox a stand-out feature.

I suspect it may be added later, but probably after the release of the G80.
April 5, 2006 3:37:25 AM

Well, this topic has made me change my order from a 7900 GT, to a x1900 512....
April 5, 2006 5:36:03 AM

Quote:
I'd post the section, but I'm still at work (for another 25+ mins).


Well here's the section in the ReadMe file;

-ATI Crossfire
Performance on systems using ATI Crossfire will be reduced until ATI releases an updated driver with Oblivion listed as a Crossfire Compatible game.

April 5, 2006 5:37:04 AM

Quote:
Well, this topic has made me change my order from a 7900 GT, to a x1900 512....


Well they're both good cards, they just have different strengths.
April 5, 2006 11:57:35 AM

Man, that new 7800GT of mine that hasn't seen Oblivion yet gets crushed by an X1800XT. The X1800 offers 100% higher framerates. :cry:  I should have waited a month, spent $50 more, and got a real card. I'm just bummed I opened it instead of selling it sealed. The most GPU-stressful game is currently the only game I play, yet I don't have one good card for it. Oh well, it will be better than the 6800U I am using now. Hmm, sell both and buy an X1900XT? :idea:
April 5, 2006 1:03:39 PM

Yeah.. I hate this game.

Having to play at 1024x768 again is most depressing... And here was me thinking my PC would be ok for a couple of years....

I've had far too many late nights lately because of this game....

Looks like I'll have to fork out for a new card sooner than I thought.... :cry: 
*cries*
April 5, 2006 9:09:38 PM

LOL!

You're still running better than my X700, but I'm happy with no graphical issues there.

I do have to uninstall/reinstall my audio drivers every few days though, to stop the Audigy2ZS PCMCIA card from giving me terrible buzzing (like power lines, a kazoo, or a can of bees).

Great game though, and it's damn fine playable for a laptop, just at a lower resolution. :( 
April 5, 2006 9:11:57 PM

I forgot about kazoos :D 
April 5, 2006 9:46:37 PM

Kazoos, Juice/MouthHarps, and SlideWhistles, the Band instruments of our youth! :mrgreen:
April 5, 2006 10:50:06 PM

Wow, all this 7900 bashing over one game. I only play FPSes. Is this game really that good, or are you using it to gauge new game coding on GPUs? I may try it if it's really that kick-ass.
Has anyone posted benches of the PhysX against the SLI? NV posted numbers, but I've yet to see any from Ageia to compare.
April 5, 2006 11:09:52 PM

Quote:
Wow, all this 7900 bashing over one game.


Check the title of the thread, it's not ATi vs nVidia but Oblivion Benchmarking. Simple. Do you forget the D3 threads?

Quote:
I only play FPSes. Is this game really that good, or are you using it to gauge new game coding on GPUs? I may try it if it's really that kick-ass.


People either love it or hate it. Some people who've never liked or played RPGs become converts; see Hanner's review at E.B.:
"In all honesty, trying to review Oblivion is like trying to review the history of the universe - Whatever I do, I'm bound to miss out a whole slew of points which are probably more than worthy of coverage. It's for this reason that I decided to write this piece as I have - As a genuine 'n00b' to the world of role-playing games taking his first tentative steps into a genre that I know next to nothing about. Those first steps were made with trepidation, and I sat down half-expecting to be overcome by boredom or confusion within the first couple of hours of gameplay. But, as I write this, I'm still setting aside every hour I can to continue playing through the game, and am showing no signs of boredom after more hours than I care to mention hacking and slashing through everything I can find in the game. The most frightening part of all is that there's still so much left to do, learn and explore - I'm starting to wonder when, if ever, this game will let me have my life back..

..To sum up - No matter what your favoured genre, if you only buy one game in 2006, make it Oblivion... Indeed, this game could most likely easily see you through to 2007 without even having time to play anything else! It's a PC gaming phenomenon, and quite rightly so - It's a phenomenon that you'd be crazy to miss out on."


Quote:
Has anyone posted benches of the physx against the sli. NV posted numbers but I've yet to see any from ageia to compair.


PhysX is not supported in this game; it uses the Havok engine, which runs physics on the CPU (and it does use multi-threading for that if you have it). If anything, Bethesda will later add support for Havok FX for additional physics work.
April 6, 2006 2:35:35 AM

Quote:
Is this game really that good or are you using it to gauge new game coding on gpus?


It's really that good, but more importantly it's probably the most advanced and graphics-intensive game out there so far: HDR, tons of polys, tons of shader effects, shadows, grass, you name it.

Indeed, I think there is some merit to considering Oblivion a bit of a benchmark for upcoming games. That's one of the reasons there is so much stock going into it.

But aside from that it's a really, really kickass game!
April 6, 2006 3:07:09 AM

Quote:

It's really that good

I agree. I've been running around in God-Mode, shooting peeps with my rusty bow.......

J/K..... I am a bit lost though, and I'm starting to kinda lose interest.
April 6, 2006 4:10:32 AM

Quote:
I am a bit lost though, and I'm starting to kinda lose interest.


I find that the game is much more focused than Morrowind, because your active quest shows a pointer on the compass telling you where you have to go next to finish it.

But at the end of the day, it's like every RPG - or game, for that matter - it's basically a waste of time, raising your avatar's arbitrary point values... Hey! I'm level 12! That's 3 levels more than 9! :p 
April 6, 2006 5:35:17 AM

Yeah pretty much. I went to the Arena and became a gladiator, went through the entire "yellow team" then finished off the current Grand Champion, that was pretty fun.
April 6, 2006 5:53:11 AM

More Benchmarks from Xbit's latest GTX review;

http://www.xbitlabs.com/articles/video/display/geforce7900gtx_13.html

Seems they got an Xfire boost with the renaming of the file (shades of FEER/F.E.A.R.?).

Unfortunately they seem to have used the 84.21 drivers, but since the 84.25 aren't supposed to be performance drivers so much as bug/stability drivers, it wouldn't matter for raw numbers.
April 6, 2006 5:55:38 AM

Quote:

J/K..... I am a bit lost though, and I'm starting to kinda lose interest.


Yeah, I can understand that; getting sidetracked and a little aimless meandering kinda loses the buzz.
April 6, 2006 6:29:08 AM

One thing I saw was that the motherboard they listed for the SLI setup was not the X16 version. That would be the Deluxe, and if they did not use an X16 board, the scores would be off.
April 6, 2006 6:47:43 AM

Quote:
One thing I saw was that the motherboard they listed for the SLI setup was not the X16 version. That would be the Deluxe, and if they did not use an X16 board, the scores would be off.


Doesn't matter for solutions that use the dongle/bridge.
April 6, 2006 10:46:40 AM

Quote:
Just read that page of the review you posted. They say that ATI cards switch off AA when HDR is on; is that true? I thought it was only Nvidia which did that.


Bethesda did that. They decided, IMO, since it's a TWIMTBP game (or because they wanted to give the Xbox a checkbox advantage) to disable the HDR+AA in the ATi cards, just like they disabled Crossfire too (see their change of profile in the game). Strange indeed. It may be activated again when nV releases their G80, but who knows.

Quote:
Oh, and am I an idiot, or are they? They say bloom cannot be enabled when HDR is. To my knowledge, bloom is a minimal implementation of HDR effects, and when full HDR is enabled it includes bloom effects. It kinda sounds like when people wonder why they can't have trilinear and anisotropic filtering at the same time. Are they right?


Well, it's a question of not being able to select one when the other is selected, and it's just because, like you say, it's a subset. Just like if you select 4xAA, the 2xAA box doesn't stay selected even though it's doing more than that. It's just semantics IMO. Poorly worded by them.
April 6, 2006 8:44:17 PM

Quote:

Doesn't matter for solutions that use the dongle/bridge.


Yes it does; it is a matter of 32 lanes vs. 16 in SLI mode. Now in single mode it does not matter, since you only have 16 anyway.
April 6, 2006 9:12:24 PM

Quote:

Yes it does; it is a matter of 32 lanes vs. 16 in SLI mode. Now in single mode it does not matter, since you only have 16 anyway.


If that's what you think then prove it with a reputable review.

You obviously miss the main reason for 32X, which is to get rid of the bridge/dongle that does all the cross-communicating now; because for simple SLi/Xfire that uses the bridge, neither card saturates the 8 lanes.

Of course prove me wrong with a review that shows otherwise.

N.B., something more than a 1-5% margin of error where they actually check the validity.
April 7, 2006 11:56:49 AM

FS updated their Oblivion Crossfire results after renaming oblivion.exe. Crossfire X1800XTs are now beating SLI 7900GTXs outdoors at every setting, and offering a lot more at high res than a single X1900XTX too.

Quote off their home page
"UPDATE 4/6/06: We've just updated the CrossFire vs SLI performance numbers with the proper results after renaming the Oblivion.exe to "AFR-FriendlyD3D.exe". To skip straight to those results..."

http://firingsquad.com/hardware/oblivion_high-end_perfo...
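For anyone wanting to try the same rename workaround, here's a sketch in Python that keeps a backup first. The real install path varies per system, so this demonstrates on a throwaway temp directory instead of a live install (the file names are the ones from FS's update; everything else is illustrative):

```python
import shutil, tempfile
from pathlib import Path

# Stand-in for the Oblivion install folder (the real path varies per system).
game_dir = Path(tempfile.mkdtemp())
exe = game_dir / "Oblivion.exe"
exe.write_bytes(b"placeholder")   # stand-in for the real executable

# Keep a backup, then rename so Crossfire's AFR-friendly profile kicks in.
shutil.copy2(exe, game_dir / "Oblivion.exe.bak")
exe.rename(game_dir / "AFR-FriendlyD3D.exe")

print(sorted(p.name for p in game_dir.iterdir()))
# → ['AFR-FriendlyD3D.exe', 'Oblivion.exe.bak']
```

On a real install you'd just copy and rename the exe in Explorer; the point is only to keep the original around so you can undo it once proper drivers ship.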
April 7, 2006 5:54:46 PM

Yeah the Chuck patch I posted in the other thread and S-Stranger posted here means you don't have to do the renaming. I'd be interested in seeing if it also fixes some of the issues Brandon at Fs was having, but from the looks of the 'known issues' probably not.
April 7, 2006 6:20:13 PM

Quote:
Still something dodgy about that game. I would have thought an X1900XT would beat an X1800XT by a few more FPS than that.


It's not the game, it's the drivers. The current drivers don't fully take advantage of the new power the X1900 has yet. That's the only way I can think the difference would be so little. Besides, it's not like they would rewrite their game to take advantage of a card that was released only a few months before the game.