
Xbitlabs benchmarks - odd results

Last response: in Graphics & Displays
January 27, 2008 11:21:55 AM

http://www.xbitlabs.com/articles/video/display/radeon-h...

I've got a 2900 Pro at stock speeds, and mine sails through S.T.A.L.K.E.R. at 1400x900 (my monitor's max res, a fraction lower than 1280x1024 in pixel count). My minimum is something like 26/27 fps at maximum detail with dynamic lighting, measured at the two points in the game where I found the framerate at its lowest (standing on the edge of the radioactive pond with the hut in the middle, where a soldier is hiding that you can be rewarded for killing, and standing on top of a tower in the Freedom base looking out over the complex). I would just like to point out to anyone with an ATI card, or considering buying one, that Xbitlabs' results are bogus, and this statement:

"The ATI Radeon HD is still plagued with performance-related problems in this game"

is false.

##This rant was brought to you by spoonboy productions plc##
January 27, 2008 11:58:22 AM

Who to believe, the reputable website or the forum poster?

It's a tough question, as always...

To support your claims a little, please post your GPU driver version; maybe Xbitlabs used outdated ones...
January 27, 2008 12:16:04 PM

Well, this is what happens when people take the results from one outlet as if they were gospel.

I think the best way to test something is to throw it in your machine and see how it does, rather than arguing over the internet about someone's benchmark results, framerates, or 3DMark score in a completely different machine.

That crap has no bearing on my decision-making process.

I would rather hear what USERS have to say.

January 27, 2008 12:28:28 PM

righteous said:
That crap has no bearing on my decision-making process.
I would rather hear what USERS have to say.

I'll base my opinions/views on what I gather from all over, but I'm telling you, I would trust a group of review sites way over what a user says. People have different ideas of playability. I'm not lumping Spoonboy in with this, but I have seen too many playability claims I know are false, so I'm not quick to trust what users say. "My 7600GT is smooth as silk in Oblivion with everything on max" comes to mind.


January 27, 2008 12:34:55 PM

@ spoonboy - I can't comment on Stalker myself. But xbit was running Vista 32-bit with Cat 7.12 drivers. They also set the graphics drivers to Quality rather than the default settings, which turns off optimizations and reduces performance. It's good to hear you find it plays well, and it's good info to share with others. Post your driver and OS version so others who read this know it works well for you with those.
January 27, 2008 12:35:43 PM

SirCrono said:
Who to believe, the reputable website or the forum poster?

It's a tough question, as always...

To support your claims a little, please post your GPU driver version; maybe Xbitlabs used outdated ones...


There are a large number of well-known reputable sites.
Find some benchmarks from AnandTech, THG, or a few others.
Please post some results from there and compare them to what Xbitlabs shows.

Check a number of differences:

Drivers (can be a large difference)
OS - Vista vs. XP (can be a large difference)
Game patch level (can be a large difference)

However, I would trust Xbitlabs' results over a poster's.
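The cross-check described above can be sketched as a quick script that lines up two reviews' numbers and flags which setup variables differ. Everything below is a hypothetical placeholder (the fps figures and setup strings are made up for illustration, not real benchmark data):

```python
# A minimal sketch of the cross-check described above: compute the fps gap
# between two reviews and list which setup variables (driver, OS, game patch)
# differ. All figures here are made-up placeholders, not real benchmark data.

def compare_reviews(a, b):
    """Return (relative fps difference of a over b, setup variables that differ)."""
    gap = (a["fps"] - b["fps"]) / b["fps"]
    differing = [key for key in ("driver", "os", "patch") if a[key] != b[key]]
    return gap, differing

site_a = {"fps": 55, "driver": "Cat 7.12", "os": "XP", "patch": "1.0003"}
site_b = {"fps": 28, "driver": "Cat 7.12", "os": "Vista 32-bit", "patch": "1.0003"}

gap, differing = compare_reviews(site_a, site_b)
print(f"Site A is {gap:.0%} faster; setups differ in: {differing}")
```

If the setups differ in OS, driver, or patch level, a large gap is unsurprising; if they match and the gap is still huge, something else is going on.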
January 27, 2008 12:40:42 PM

pauldh said:
I'll base my opinions/views on what I gather from all over, but I'm telling you, I would trust a group of review sites way over what a user says. People have different ideas of playability. I'm not lumping Spoonboy in with this, but I have seen too many playability claims I know are false, so I'm not quick to trust what users say. "My 7600GT is smooth as silk in Oblivion with everything on max" comes to mind.


What?

You would trust professional reviews that publish their methods and related information so their tests can be duplicated under the same conditions?

I find it's much better to just use Google until I find a post that reaffirms what I prefer to believe.
I find the answers I get that way to be much more acceptable.
January 27, 2008 1:04:36 PM

Nah, the source of my rant is that xbit enable non-standard AA: "Also, to ensure maximum image quality, we enabled transparent texture filtering options: Adaptive Anti-Aliasing/Multi-sampling for ATI Catalyst and Antialiasing – Transparency: Multisampling for Nvidia ForceWare." That kills performance, particularly for ATI (yes, I'm a fanboy, lol). Other reviews using normal AA but otherwise identical filter and detail settings get different results. A quick example: check out the difference between the Half-Life 2: Episode Two results:

http://www.hothardware.com/articles/Sapphires_Ultimate_...

http://www.xbitlabs.com/articles/video/display/radeon-h...

or World in Conflict:

http://www.pcper.com/article.php?aid=482&type=expert&pi...

http://www.xbitlabs.com/articles/video/display/radeon-h...

#note, the pcper.com review uses an older driver set.

What I'm getting at is that Xbitlabs make comments derived from these results. That's fair enough. However, the impression the reviews give is that the Radeons are poor for gaming. What bemuses and annoys me is that they come up with terrible S.T.A.L.K.E.R. results and then say "The ATI Radeon HD is still plagued with performance-related problems in this game", which is blatantly untrue, judging not just from benchmarks on other websites - wait, every other website - but also my own experience with an R600 card. The game doesn't support AA, but texture filtering can be enabled. However, even at 16x AF, other sources give wildly different results:

http://www.guru3d.com/article/Videocards/468/12/

http://www.xbitlabs.com/articles/v [...] 50_23.html

Even looking back at older benchmarks, although seemingly without texture filtering enabled, and running on XP:

http://www.legionhardware.com/document.php?id=680&p=7

"Using the maximum quality settings in S.T.A.L.K.E.R, which means we have enabled every possible setting, did not seem to hurt the performance all that much. All three cards suffered similar performance losses, so the margins did not really change here."

Or something a little more recent, again on XP:

http://www.techspot.com/review/74-inno3d-geforce8800gt/...

Or our very own Tom's

http://www.tomshardware.com/2007/10/29/geforce_8800_gt/...

The question is: how do they get such poor S.T.A.L.K.E.R. results?


January 27, 2008 1:06:58 PM

zenmaster said:
There are a large number of well-known reputable sites.
Find some benchmarks from AnandTech, THG, or a few others.
Please post some results from there and compare them to what Xbitlabs shows.

Check a number of differences:

Drivers (can be a large difference)
OS - Vista vs. XP (can be a large difference)
Game patch level (can be a large difference)

However, I would trust Xbitlabs' results over a poster's.


Yeah - unless the circumstances for spoonboy and xbit matched, we have a problem. Otherwise, it's normal to get different results under different circumstances.
January 27, 2008 1:13:25 PM

No offense, but I highly doubt that your 7600GT is "silky smooth" with everything on max in Oblivion.
January 27, 2008 1:31:51 PM

pauldh said:
@ spoonboy - I can't comment on Stalker myself. But xbit was running Vista 32-bit with Cat 7.12 drivers. They also set the graphics drivers to Quality rather than the default settings, which turns off optimizations and reduces performance. It's good to hear you find it plays well, and it's good info to share with others. Post your driver and OS version so others who read this know it works well for you with those.



Yeah, I noted that - cheers anyway. Xbit do nice reviews (for anything but graphics, in my opinion, pls no flames lol), but their ATI results always come out terrible, which is strange because other reviews show higher framerates. I saw that the Catalyst and ForceWare options were adjusted for higher quality, and there are arguments for and against benchmarking with non-default settings, but in a game that doesn't support AA under dynamic lighting (back to S.T.A.L.K.E.R. again, sorry!), I find it hard to see where they are losing performance. I'm currently gaming (well, not right now, as I'm on a work placement abroad, but I gave S.T.A.L.K.E.R. a thorough going-over with my Pro since I just upgraded) with a Sapphire 2900 Pro at stock, a Core 2 Duo E6300 @ 2.6GHz, 2GB DDR2-800, Catalyst options at default, Catalyst 7.12, all options at maximum in S.T.A.L.K.E.R., and full dynamic lighting at 1400x900. I believe my version of the game is 1.0003.

I have two saved games at points where my last card (an OC'd 7900GT) chugged badly, named after the fps at that point, such as "19fps" and "22fps". I kept them because overclocking the GT by 120MHz had almost no effect on those minimums, which amazed me, so I saved them for future reference for when I decided to upgrade again. Upon loading up S.T.A.L.K.E.R. with the newly installed Pro - sorry to be vague here, I wasn't paying a huge amount of attention, just noting whether these points, the most demanding I had seen in my two playthroughs, were now smooth - I got something like 27+ fps. That means the 5% of the game world that chugged a bit with the 7900GT is now as smooth with the Pro as the other 95% was with the GT.

I also took a walk around the village area where the game starts: poking around the buildings, doing some shooting, throwing grenades, then walking up the slope to the road and turning left and right to take in views with long draw distances. Framerates would occasionally dip then recover as new parts of the game world loaded (side panel off, HD audible), which I verified by retracing my steps over the same areas - they didn't suffer a sharp drop a second time. I then tried similar messing about from a save at the north entrance to the Garbage area and at the Bar area, going inside and back out of the bar, walking around the buildings, etc. Apart from periods of hard disk activity, I never saw frames drop below about 28-30 fps, except at the two points I mentioned right at the start. In short, I cannot understand or replicate the Xbitlabs results.
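The saved-game method above boils down to recording the minimum fps at the same demanding spots on each card and comparing the worst case. A quick sketch - the numbers are illustrative only, loosely based on the figures mentioned in the post, not actual measurements:

```python
# Sketch of the saved-game comparison: minimum fps at the same save points
# on the old and new card, and the worst-case improvement. The numbers are
# illustrative only, loosely based on the figures mentioned in the post.

saves = {
    "radioactive pond": {"7900GT": 19, "2900Pro": 27},
    "freedom tower": {"7900GT": 22, "2900Pro": 28},
}

for point, fps in saves.items():
    gain = (fps["2900Pro"] - fps["7900GT"]) / fps["7900GT"]
    print(f"{point}: {fps['7900GT']} -> {fps['2900Pro']} fps min ({gain:+.0%})")
```

Comparing minimums at fixed save points, rather than averages over a timedemo, is what makes the before/after numbers directly comparable.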
January 27, 2008 1:34:00 PM

eagles453809 said:
No offense, but I highly doubt that your 7600GT is "silky smooth" with everything on max in Oblivion.


No, I think he was joking with that, lol. It was hardly what I would call 'silky' with a 7900GT OC'd to 700/850. Still got pesky minimums of 19-21 fps outdoors.
January 27, 2008 5:19:35 PM

LOL, Zen, I took you seriously for a few seconds there.
January 27, 2008 5:30:03 PM

@Spoonboy - Yep, I was joking about Oblivion, but I have seen claims of the 7600GT and 7800GT maxing Oblivion. Meanwhile, my OC'd 7800GT was awful in Oblivion and was quickly upgraded to an X1800XT that allowed AA+HDR and higher details, and still provided smoother framerates. Like you said, the GF7x00s' framerates tank outdoors in Oblivion.

And again, I can't speak about Stalker myself, but will note that Digit-Life got more than double the performance that Xbitlabs did. Obviously they may have benched different parts of the game, and they turned off the optimizations, but look at Digit-Life with 16x HQ AF: the HD2900XT averages about 60 fps vs. 28 from xbit.
http://www.digit-life.com/articles2/digest3d/1107/itogi...
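A quick check of the arithmetic behind "more than double", using the figures as quoted in the post:

```python
# Arithmetic behind the "more than double" claim above: ~60 fps (Digit-Life)
# vs ~28 fps (xbit) for the HD2900XT, figures as quoted in the post.
digitlife_fps = 60
xbit_fps = 28
ratio = digitlife_fps / xbit_fps
print(f"Digit-Life's figure is {ratio:.2f}x xbit's")
assert ratio > 2  # i.e. more than double
```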
January 28, 2008 9:14:46 AM

Cheers for that. Yep, I had trouble finding another review with Stalker benched at 16x AF. God knows what xbit do, but their ATI scores are just plain odd at times, lol.

(btw, I would have got into Oblivion much more if it didn't tank with my 7900GT; I've just plain lost interest now, though, lol. Other than that it was a great card all round. Crysis instantly put it in another light, though, and I saw the writing on the wall. It said "go on eBay and splurge 140 quid", lol.)

Cheers.
January 31, 2008 10:11:19 AM

I'm sorry, I can't let this one go. I could probably do with a good shake and a slap round the chops, but I nevertheless find this just shocking. I should stop looking at xbit's graphics card reviews, because they just make me angry. They mess around with driver settings and enable super-duper special performance-killing über AA & AF, then find that framerates are poor. Genius. That's their decision, fair enough; I'm just bitter, lol. Then they write rubbish like this (from the Asus top 3870 & 3850 review):

"According to our traditional testing methodology, the drivers were set up to [...] minimize the influence of software optimizations that both – AMD/ATI and Nvidia – enabled by default."

It all sounds peachy until you reach the conclusion:

"Lows:

Insufficient driver optimization;
Texturing and raster processors could be too few;
Almost no overclocking potential;
High price."

"Insufficient driver optimization" - lol, from the people who deliberately disabled optimizations in their test setup. Total monkeys.

Links:

Start of article: http://www.xbitlabs.com/articles/video/display/asus-eax...
Test setup: http://www.xbitlabs.com/articles/video/display/asus-eax...
Conclusion: http://www.xbitlabs.com/articles/video/display/asus-eax...
January 31, 2008 10:55:28 AM

SirCrono said:
Who to believe, the reputable website or the forum poster?

It's a tough question, as always...

To support your claims a little, please post your GPU driver version; maybe Xbitlabs used outdated ones...
Dude, you'd think Tom's is reputable, right? Well, take the classic PC3-2000 motherboard: tech sites said it was the best board they ever tested. A month later the board was causing errors due to the MTH, and RMAs were issued. So I don't believe all the stuff they test. All these sites have brand-name sponsors paying them to say things, so believe the users.
January 31, 2008 12:43:47 PM

I used to go on GameSpot's forums quite a lot, and the general opinion there is that Tom's Hardware isn't any good - although I personally like it. There was even a thread called something like "Tom's Hardware benchmarks: noob bait". It put me off even looking at the site, let alone joining the forums, lol.

#I like Tom's and don't share their opinion, btw#
January 31, 2008 1:34:27 PM

spoonboy said:
Nah, the source of my rant is that xbit enable non-standard AA: "Also, to ensure maximum image quality, we enabled transparent texture filtering options: Adaptive Anti-Aliasing/Multi-sampling for ATI Catalyst and Antialiasing – Transparency: Multisampling for Nvidia ForceWare." That kills performance, particularly for ATI (yes, I'm a fanboy, lol). Other reviews using normal AA but otherwise identical filter and detail settings get different results. A quick example: check out the difference between the Half-Life 2: Episode Two results:

http://www.hothardware.com/articles/Sapphires_Ultimate_...

http://www.xbitlabs.com/articles/video/display/radeon-h...

or World in Conflict:

http://www.pcper.com/article.php?aid=482&type=expert&pi...

http://www.xbitlabs.com/articles/video/display/radeon-h...

#note, the pcper.com review uses an older driver set.

What I'm getting at is that Xbitlabs make comments derived from these results. That's fair enough. However, the impression the reviews give is that the Radeons are poor for gaming. What bemuses and annoys me is that they come up with terrible S.T.A.L.K.E.R. results and then say "The ATI Radeon HD is still plagued with performance-related problems in this game", which is blatantly untrue, judging not just from benchmarks on other websites - wait, every other website - but also my own experience with an R600 card. The game doesn't support AA, but texture filtering can be enabled. However, even at 16x AF, other sources give wildly different results:

http://www.guru3d.com/article/Videocards/468/12/

http://www.xbitlabs.com/articles/v [...] 50_23.html

Even looking back at older benchmarks, although seemingly without texture filtering enabled, and running on XP:

http://www.legionhardware.com/document.php?id=680&p=7

"Using the maximum quality settings in S.T.A.L.K.E.R, which means we have enabled every possible setting, did not seem to hurt the performance all that much. All three cards suffered similar performance losses, so the margins did not really change here."

Or something a little more recent, again on XP:

http://www.techspot.com/review/74-inno3d-geforce8800gt/...

Or our very own Tom's

http://www.tomshardware.com/2007/10/29/geforce_8800_gt/...

The question is: how do they get such poor S.T.A.L.K.E.R. results?



THAT is a much better Post/Rant :>

