Crysis Warhead!! ATI Rocks with 8.9 Driver!

ovaltineplease

Distinguished
May 9, 2008
1,198
0
19,280
The 9800 GTX+, AMD 4850, AMD 3870, AMD 4870, 8800 GT, 8800 GTS, and GTX 260 all have higher minimum framerates than the GTX 280?

These benchmarks are total hogwash. I can appreciate your enthusiasm, but it's pretty apparent they aren't repeating the test runs per card to find an average - which is pretty poor practice.

I'm not trying to be rude to you, but I'm just pointing out that this review is obviously quite biased and unrealistic.
 

LAN_deRf_HA

Distinguished
Nov 24, 2006
492
0
18,780
I have seen other benchmarks that showed the GTX 280 dipping to a much lower minimum fps than those cards, so something is probably going on - just nothing extreme enough to justify those numbers.
 

FrozenGpu

Distinguished
Dec 8, 2007
986
0
18,990


I'm not saying the benchmarks aren't bad, but ovaltine, you are somewhat biased toward the Ovaltine/Nvidia faction conglomerate [read into it, it's a fact! :D ] ...so your comments carry that undertone, no matter how much you try to clear that up. :pt1cable:

It's so bad now that I automatically think Ovaltine is part of "The It's Meant To Be Drunk" camp...DAMN you ovaltineplease, damn you!!!! :lol: :lol: :lol:
 

ovaltineplease

Distinguished
May 9, 2008
1,198
0
19,280
I own an AMD 4850 in my second PC; I don't really know how using Nvidia GTX 260s in my main gaming PC makes me biased.

Anyways, I know you don't mean anything by it - but that website has proven itself amateur. They can post whatever they want, but I would respond no differently if they posted a 4870X2 result with a lower min fps than a GTX 280 in any other game - it would likewise be totally unbelievable nonsense.

ed: one thing I should clarify - I'm only an EVGA fanboy, and only because of their service : O~
 

eklipz330

Distinguished
Jul 7, 2008
3,034
19
20,795
These are in no way biased [unless they're fake]... if they did one run with each card and got these results from that one run, then hey, they've got to be real. It could even be luck. They can't be biased, though.

*edit*
on another note, this site doesn't seem to have any credibility, so my last comment is worthless. lol

*another edit*
my proof: no author. Of course, one of the first things you learn in high school/college is never to believe anything that doesn't carry an author's name... if Psycho's the author, well, that just makes my point a little more substantial.
 

randomizer

Champion
Moderator

Biased, no. Inconclusive, yes. A single benchmark run can't be trusted. I've done them myself with Crysis, and you will often get different framerates the second time round.
 

ovaltineplease

Distinguished
May 9, 2008
1,198
0
19,280
Yea, exactly; as randomizer said, anyone who has done any benching on Crysis (or really any modern game) has to do 2-3 runs in order to get a conclusive result.

I'm not against gameplay benchmarking - HardOCP frequently uses this method - but you have to do a run LONGER than 40 seconds; that is just not long enough to make a conclusive benchmark.
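The multi-run point above can be sketched in a few lines. This is a hypothetical helper, not any site's actual methodology; the run numbers are made up, and real data would come from FRAPS or similar:

```python
# Sketch: summarize several benchmark passes instead of trusting one.
# Each run is a (min_fps, avg_fps, max_fps) tuple from one pass.
def summarize_runs(runs):
    """Aggregate per-pass (min, avg, max) fps tuples into one summary."""
    n = len(runs)
    mins = [r[0] for r in runs]
    avgs = [r[1] for r in runs]
    return {
        "runs": n,
        "min_fps": min(mins),          # worst dip seen across all passes
        "mean_min": sum(mins) / n,     # more stable "typical minimum"
        "mean_avg": sum(avgs) / n,     # headline average across passes
    }

# Three hypothetical 40-second passes on the same card and settings:
result = summarize_runs([(21, 24, 28), (19, 23, 27), (22, 25, 29)])
print(result)
```

The point is visible even in fake numbers: a single pass would report whichever of 19, 21, or 22 it happened to hit as "the" minimum, while the average across passes is far less noisy.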
 

ovaltineplease

Distinguished
May 9, 2008
1,198
0
19,280
If I run my GTX 260 in non-SLI I don't get those min framerates in a FRAPS benchmark - it's not realistic. I could accept a 1 fps difference, but not 5-10 - that's just laughable.

I mean good god the 9800GTX+ framerates:

min 15, avg 17.


lol?
 

FrozenGpu

Distinguished
Dec 8, 2007
986
0
18,990


Um, 5-10 is completely possible given different hardware/software setups; you'd be surprised how the smallest things can affect fps...
 

ovaltineplease

Distinguished
May 9, 2008
1,198
0
19,280
W/e. I don't intend to argue the point, as I already know this website is a bunch of amateurs to begin with; I would suggest waiting for a better website to do a technical review of the game, because this "review" is a jokeshow.
 
I've seen more than a 1 fps difference, but 10? No. I'm thinking they just reported the lowest reading, even if it was a single frame. That doesn't mean it'd be noticeable, though. Could be at startup. All I'm saying is, they aren't the only ones to show a GTX 280 with a lower minimum fps than the 260 or the 4870. As we see more benches, we'll know more.
 

homerdog

Distinguished
Apr 16, 2007
1,700
0
19,780
It's also worth noting that the 4870 isn't producing playable FPS in the enthusiast test either, so its victory there is irrelevant as far as I'm concerned.

Why is it so hard to believe that the 4870 can have higher minimums than the GTX 280? The minimum probably comes when textures are streaming in, and the Radeons are known to have superior memory management.
 

topper743

Distinguished
Dec 6, 2007
407
0
18,790
You might want to consider this benchmark pulled from Tom's.

http://www.tomshardware.com/charts/gaming-graphics-charts-q3-2008/Crysis-v1-21,751.html

It's not exactly the same as PCGames' but very close as a benchmark, and it shows that the PCGames tests are repeatable on other sites and yield similar results. Notice that the GTX 260 is in SLI with 896MB of memory and only does 0.3 fps better? The HD 4870 has 512MB. The GTX 280 with 1024MB only does 1.1 fps better. In this test only two cards beat the HD 4870: the GTX 280 1024MB and the 9800 GX2 512MBx2, in that order. You will notice how the other Nvidia cards are SLI'ed while the ATI is a solo card. As you said, Crysis is optimized for Nvidia. All these top cards seem to be nearly tied, give or take a frame. The highest-performing cards will do better at higher resolutions.

The sorry thing in all of this is that the very best hardware can pull only 26-29 fps at 1680x1050, 0xAA, trilinear, Very High quality. The game coding in Crysis is poor. If you like the game, great, but one shouldn't need a 9800 GX2 tri-SLI system to pull 60-70 fps at 16x10 resolution. I don't think we should treat Crysis as the benchmark of record.
 
Like I've said, you don't have to search far to find this. Slamming this site just doesn't do it for me; looking at real numbers, other sites' numbers back up their findings. Also, for "amateurs" they're getting some nice interviews: http://www.pcgameshardware.com/aid,602522/News/Exclusive_Interview_with_Epics_Tim_Sweeny_on_Unreal-Engine-3_and_UT_3/
 

ovaltineplease

Distinguished
May 9, 2008
1,198
0
19,280


THG got slammed when they posted that article, and you're using it as a source reference?
 

invisik

Distinguished
Mar 27, 2008
2,476
0
19,810


Those charts seem BS to me. The GTX 260s in SLI get 25.20 fps at 1680x1050, no AA, Very High quality.
I have my 2 GTX 260s in SLI and I average about the same fps with the same settings, except I have 16xAA on.
When I OC my cards I get 30-35 fps.
 

ovaltineplease

Distinguished
May 9, 2008
1,198
0
19,280
I'm going to duplicate their benchmark on my system with a GTX 260 in non-SLI mode, overclocked and stock-clocked, with a Core2Duo processor and 4GB RAM. I am using their method, at close to the same point and following the same methodology - starting the benchmark as soon as the game loads, while still loading textures, and stopping at 40 seconds.

http://img401.imageshack.us/my.php?image=warheadsavevy2.jpg



Tests:

GPU: 684/1404/1215, CPU: Core2Duo @ 4.05GHz, RAM: 4GB DDR2-800 unlinked 5-5-5-18

1680x1050, Enthusiast settings, no AA/AF: Min 21, Max 28, Avg 24

http://img235.imageshack.us/my.php?image=desktopnw5.jpg

1680x1050, Enthusiast, 16xAF: Min 14, Max 22, Avg 18

GPU: 576/1242/999, CPU: 3.0GHz, RAM: 800MHz

1680x1050, Enthusiast, 16xAF: Min 14, Max 21, Avg 17

GPU: 684/1404/1215, CPU: 3.0GHz, RAM: 800MHz

1680x1050, Enthusiast, 16xAF: Min 16, Max 24, Avg 19


Suffice to say, I don't expect my benchmarks to be the starting and ending authority on Crysis benchmarking, nor do I find the performance mind-bogglingly impressive with a single GTX 260 at "Enthusiast settings" - it's clearly not playable with AF enabled at stock clocks, any way you cut it.

However, the fact that my single GTX 260 is vastly outperforming their GTX 280 in min fps - with the only major differentiating factor being that I am using 4GB of RAM vs their 2GB - proves one of three things: one, you need more than 2GB of RAM (obvious and proven); two, their testbed is a gongshow; or three, they fail at testing without a timedemo (quite possible too).

I've kept the CSV files from the other runs for the sake of validation if anyone cares to question it - but I thought I'd save myself a ton of trouble uploading that many images.
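For anyone working from logs like those, the min/max/avg figures quoted above can be recomputed from a per-second fps CSV in a few lines. This is a sketch with made-up sample data; the single-column "FPS" layout is an assumption roughly modelled on FRAPS's per-second FPS log, not the exact format of these files:

```python
import csv
import io

# Hypothetical per-second fps log, standing in for a real FRAPS capture.
sample = io.StringIO("FPS\n21\n24\n23\n28\n22\n")

def stats_from_csv(f):
    """Compute (min, max, avg) fps from a CSV with an 'FPS' column."""
    fps = [float(row["FPS"]) for row in csv.DictReader(f)]
    return min(fps), max(fps), sum(fps) / len(fps)

lo, hi, avg = stats_from_csv(sample)
print(f"Min {lo:.0f}, Max {hi:.0f}, Avg {avg:.1f}")
```

With real capture files you would open each CSV on disk instead of the in-memory sample and compare the summaries run by run.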
 


Like randomizer said, not biased so much as unreliable, IMO.

As for the minimum fps issue, remember that an average includes the very high numbers, while the min represents a bottleneck wherever one occurs. It could be a momentary dip; I have seen it many times with many different otherwise powerful cards.

Seeing a card with a high 70 fps average but a low 10 fps min (better to have a median low, IMO) would show variability, including very high highs; another card showing a 40 fps min and a 60 fps average would mean the highs aren't high, but there are also few drops where some area is bottlenecked for whatever reason. It's tough to tell from a benchmark, but they don't look biased or fake just based on that; I have more of a problem with what we've discussed many times - not running multiple runs or resolutions.
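The "median low" point above is easy to demonstrate: a raw minimum can be set by a single hiccup frame, while a percentile low better reflects the worst case you actually experience. The fps samples below are made up for illustration, not real bench data:

```python
# Sketch: raw minimum vs a low-percentile fps figure.
def low_percentile_fps(fps, pct=5.0):
    """Return the fps value at the given low percentile (e.g. 5% low)."""
    s = sorted(fps)
    return s[int(len(s) * pct / 100.0)]

# Steady ~60 fps for 100 seconds, plus one momentary dip to 10 fps.
samples = [60, 58, 62, 59, 61] * 20 + [10]

print("raw min:", min(samples))                 # 10 - set by the one dip
print("5% low :", low_percentile_fps(samples))  # 58 - the typical worst case
```

A review quoting only the raw minimum would report 10 fps for what is, in practice, a locked ~60 fps experience; that is exactly why a single dip (at startup, texture streaming, etc.) can make a min-fps column look absurd.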



That's the DX9 path; the Crysis DX10 path actually favours ATi.

Take note of their comments, which reflect most tests of the current Crysis: "At high details (Gamer mode) the Geforce GTX 280 is up ahead with the GTX 260 and the HD 4870 behind it... At maximal details (Enthusiast mode) the Radeon HD 4870 is the winner.."

Anywhoo, I expect Warhead to run similarly to the original Crysis, with lower shader quality running better on the GTX and the Very High shaders running better on the HD4K series. The differences are minor, and many people favour resolution over effects, so they run DX9 with the quality mods.
 

homerdog

Distinguished
Apr 16, 2007
1,700
0
19,780
^Thanks for that ovaltine. Do note that performance in this game is GREATLY improved by using DX9 mode. As far as I can tell the only thing you'll be missing out on is object motion blur, as it is not present under DX9 even with all 'enthusiast' settings.