Bioshock strongly favors the ATI Radeon

August 28, 2007 11:56:35 PM

See Gamespot's latest benchmark: http://www.gamespot.com/features/6177688/p-6.html

Interesting to see the HD 2900XT win the performance crown, beating the 8800GTX by approximately 10%.

What is much more interesting, though, is how the X1950Pro trumps its direct competitor, the 7900GS. I've always thought that the X1950Pro had more long-term potential, and this 40% victory is proving it.

Note, however, that the HD 2900XT is currently doing much worse in Windows Vista. The wide discrepancy between the XP and Vista results suggests that Vista performance could be improved in a subsequent driver release. Nvidia's drivers look much more mature in this benchmark.

Note to those who've been recommending the 640MB version of the 8800GTS "for high resolutions": it performs significantly worse than an overclocked 320MB at 2048x1536...

Some of the weirdest benchmarks I've seen lately. Thoughts?

August 29, 2007 12:05:09 AM

People on neowin.net told me this is a "The way it's meant to be played" nvidia-sponsored game.

HAHAHAHA!!!!
August 29, 2007 12:20:18 AM

OK... so the 8800 GTX outperforms the 2900 XT only in Vista... no need to act like a fanboy about it.
August 29, 2007 12:25:18 AM

It's because the lower-memory versions of video cards use better RAM. 4 fps isn't a big difference; where do you see ATI whipping Nvidia by 10%??
August 29, 2007 12:48:08 AM

That is pretty amazing, and one of the reasons I quit going to Gamespot years ago. If you go to a more reputable gaming source like FiringSquad:

http://www.firingsquad.com/hardware/bioshock_directx10_...

Isn't it amazing how the ATI card doesn't come in faster than any of the Nvidia cards but the old 7k series?

I wonder what kind of a financial deal ATI reached with gamespot on that one.
August 29, 2007 12:58:14 AM

My question is, why would you have a top-of-the-line graphics card if you're not running DX10? DX10 numbers are all that matter to me. All the other benchmarks I've seen have Nvidia's cards winning too.
August 29, 2007 1:01:44 AM

The FiringSquad tests are under Windows Vista, while Gamespot's are under XP. And Gamespot shows that the HD 2900XT does a LOT worse in Vista than in XP. That wide discrepancy suggests a driver update would help in Vista.
August 29, 2007 1:03:19 AM

cory1234 said:
My question is, why would you have a top-of-the-line graphics card if you're not running DX10? DX10 numbers are all that matter to me. All the other benchmarks I've seen have Nvidia's cards winning too.
Bioshock's DX9 and DX10 modes show only negligible differences, both visually and in framerate, as both FiringSquad's and Gamespot's tests show.
August 29, 2007 1:12:18 AM

Trust Gamespot for all the latest gaming reviews and previews, but not for hardware. For hardware I recommend this site, "TomsHardware.com"; it's really good. In fact, you are posting on it right now.
August 29, 2007 1:41:00 AM

xaositect said:
That is pretty amazing, and one of the reasons I quit going to Gamespot years ago. If you go to a more reputable gaming source like FiringSquad:

http://www.firingsquad.com/hardware/bioshock_directx10_...

Isn't it amazing how the ATI card doesn't come in faster than any of the Nvidia cards but the old 7k series?

I wonder what kind of a financial deal ATI reached with gamespot on that one.


Uh, the tests were done under XP with Gamespot and under Vista Ultimate on FiringSquad. Big difference. It pays to actually read the small print before you start throwing baseless accusations around. I'll give you that FiringSquad is a little bit more reputable when it comes to hardware, but that doesn't mean the writers at Gamespot are getting paid off by anyone. That being said, I wish they had benched with a quad core to see if it offers any advantage, though I seriously doubt it; I don't think the Unreal engine supports more than two cores at this point.

Since most people are using XP at this point, I think this is a rather impressive showing for the 2900XT, although it is only one game and could be a fluke. We'll see when Crysis and some of the other big fall titles start making their appearances.
August 29, 2007 1:52:20 AM

cb62fcni said:
Since most people are using XP at this point, I think this is a rather impressive showing for the 2900XT, although it is only one game and could be a fluke. We'll see when Crysis and some of the other big fall titles start making their appearances.

The HD2900XT also beat the 8800GTX in Rainbow Six Vegas, which uses UE3.
August 29, 2007 2:59:53 AM

I call this bull simply based on the fact that it's from Gamespot. They gave the game a 9.0/10, which is the same rating they gave Perfect Dark Zero.
August 29, 2007 7:34:51 AM

Heyyou27 said:
I call this bull simply based on the fact that it's from Gamespot. They gave the game a 9.0/10, which is the same rating they gave Perfect Dark Zero.



ROFL :lol:  Yea what's up with that ?
August 29, 2007 8:21:08 AM

I saw a review about DX10 on this site that showed ATI doing pretty well in DX10 games. DX10 is still sooo young.
August 29, 2007 9:32:57 AM

"Bioshock strongly favors the ATI Radeon"

4-5 fps is not what I call "strongly", especially when you look at the DX10 scores and when the Ultra is missing.
August 29, 2007 12:20:50 PM

It's been said a million times: the new ATI drivers do make a difference.
August 29, 2007 12:45:16 PM

drakh said:
"Bioshock strongly favors the ATI Radeon"

4-5 fps is not what I call "strongly", especially when you look at the DX10 scores and when the Ultra is missing.
Care to read the previous replies? Even if the HD 2900XT only beat the 8800GTX by 1 fps, it would still show a strong advantage for that card, given that its Nvidia counterpart is the 8800 GTS, not the GTX.

The DX10 scores are under Vista, where the HD 2900XT does a lot worse than under XP, which suggests that a driver update could improve that.
August 29, 2007 1:05:50 PM

Interesting, but doesn't mean much.
I'm enjoying Bioshock just fine on my 8800 GTS 640mb, so no regrets here. :) 
August 29, 2007 1:07:35 PM

Not to diss 8800 owners or anything. I wish I had one of these cards!! :) 
August 29, 2007 1:50:52 PM

Haha, Gamespot benchmarks. Why don't we just post everyone's little sister's benchmarks too; she's just as reliable.
August 29, 2007 2:16:45 PM

Element0f0ne said:
Interesting, but doesn't mean much.
I'm enjoying Bioshock just fine on my 8800 GTS 640mb, so no regrets here. :) 

True... and I'm enjoying the game just fine with my X1950 Pro as well :) 
August 29, 2007 2:31:03 PM

I am playing with the DEMO; I have not bought the full version yet and will wait a bit before I buy. See my sig for specs. I have everything set at max (DX9), and I have not checked what frame rates I am getting, but the game runs and looks just great. Very smooth and quick with my X1950 Pro and overclocked X2.
August 29, 2007 2:39:54 PM

Bioshock, 1920x1200:

8800 gtx sli - 74.9 fps
8800 ultra - 70.1 fps
8800 gtx - 63.6 fps
8800 gts 640 sli - 58.6 fps

:pt1cable:  :pt1cable:  :pt1cable:  :pt1cable: 
August 29, 2007 7:09:07 PM

Wait for the Nvidia drivers to overtake the ATI drivers!
August 30, 2007 8:27:14 AM

itotallybelieveyou said:
OK... so the 8800 GTX outperforms the 2900 XT only in Vista... no need to act like a fanboy about it.


He's being a fanboy? If you want to call me a fanboy because I only like Nvidia and I'll never buy an ATI product, go right ahead, because I am proud of it! He just reported that the 2900 is getting better driver-wise. I'm shocked at the difference between the 8800GTS 640MB and the 320MB; it's just not logical. Something strange is going on here; more memory shouldn't give you worse performance. They both have the same GPU. I believe there's a driver issue with texture management.
August 30, 2007 9:33:17 AM

I think it is because the R600 in the Xbox got a nice workout on Unreal Engine 3 in Gears of War, so it is going to kick *** in UT3 because it is nicely tuned for Unreal Engine 3, and in case you have not noticed, a LOT of games are using Unreal Engine 3. To all of you who said the R600 would suck... like Nelson says: HAAAAAAAAAAA HAAAAAAAAAAAAAAAAA. For now, though, somehow I see Nvidia sorting this out very soon.
August 30, 2007 11:12:37 AM

Hmm, at 1600x1200, Vista DX10:
HD 2900XT : 43fps
Geforce 8800 GTX : 60fps

...
August 30, 2007 11:19:32 AM

Might I also point out the biggest flaw in your analysis: the 2900XT with 512MB of GDDR3 RAM may have less capacity and bandwidth, but its latencies are at least 2x lower than those of the 1GB GDDR4 variant, which would easily account for that difference.
August 30, 2007 11:25:51 AM

Can someone tell me again the stats of the HD 2900XT for DX10, which on paper is supposed to be a lot better than the 8800 GTX...
August 30, 2007 12:25:54 PM

Dr_asik said:
See Gamespot's latest benchmark: http://www.gamespot.com/features/6177688/p-6.html

Interesting to see the HD 2900XT win the performance crown, beating the 8800GTX by approximately 10%...

I looked at it and I think the largest gap is only 8.3%... and the other is about 5%. We are talking about literally 3 or 4 fps more. That is not really going to do anything for the game. Also, were they running the drivers Nvidia came out with for Bioshock? What drivers were they using? What were the video quality settings, what's turned on in the video settings and what's turned off? Also, what version of the 8800 GTX are they using, and same for the HD 2900 XT? I would be shocked to see their best HD 2900XT beat a BFG OC2 or XXX.

Personally, this seems biased and has lost credibility with me. Yes, DX9 is still here, but shouldn't we really only care about DX10? Isn't that the direction we are heading, whether the game developers like it or not?

And for those of you who did not read the "benchmark", the GTX outperforms the XT in DX10 by about 28% (1 - 43/60 = 17/60 ≈ 0.28), or a difference of 17 fps (60 - 43 = 17).
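A quick aside on the percentage math that keeps coming up in this thread: the size of a lead depends on which card you treat as the baseline. Here's a minimal sketch (Python, using the 43 fps / 60 fps Vista DX10 figures quoted above; purely illustrative):

# Two ways to express the same 17 fps gap, depending on the baseline.
xt_fps, gtx_fps = 43.0, 60.0      # HD 2900XT vs 8800 GTX, Vista DX10, 1600x1200 (figures quoted above)

gap = gtx_fps - xt_fps            # 17 fps
deficit = gap / gtx_fps * 100     # XT is ~28.3% slower than the GTX
lead = gap / xt_fps * 100         # GTX is ~39.5% faster than the XT

print(f"gap: {gap:.0f} fps, XT deficit: {deficit:.1f}%, GTX lead: {lead:.1f}%")

So the same numbers can be quoted as roughly "28%" or "40%" depending on the baseline, which is worth keeping in mind when comparing the 8.3% and 10% figures mentioned earlier.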
August 30, 2007 12:32:23 PM

Nvidia stated they're going to release a driver to fix the Bioshock AA issues, so we should see better performance on Nvidia cards in the near future.
August 30, 2007 12:34:49 PM

Rabidpeanut said:
I think it is because the R600 in the Xbox...



Right... you do know that the Xbox 360 does not have an R600 chip in it, right? It's called the Xenos chip, with only 48 unified shaders vs. the R600's 64 unified shaders.
August 30, 2007 1:47:45 PM

bfellow said:
Nvidia stated they're going to release a driver to fix the Bioshock AA issues, so we should see better performance on Nvidia cards in the near future.


Nvidia also has to fix their texture management issues with the 8800 series, where texture data gets stuck in VRAM. The card with the higher amount of VRAM will not notice it as much. Alt-tabbing helps with this, though.
August 30, 2007 3:08:06 PM

spaztic7 said:
Personally, this seems biased and has lost credibility with me. Yes, DX9 is still here, but shouldn't we really only care about DX10? Isn't that the direction we are heading, whether the game developers like it or not?

And for those of you who did not read the "benchmark", the GTX outperforms the XT in DX10 by about 28% (1 - 43/60 = 17/60 ≈ 0.28), or a difference of 17 fps (60 - 43 = 17).


How does it seem biased? There are no opinions put forward here, only numbers, and numbers cannot be biased.

The point here is that the 2900, while being significantly cheaper than the 8800GTX, still manages to beat it in DX9 in this game. And DX9 is still the dominant API and probably will be for the next 18 months. So unfortunately, the argument put forward here just does not fly.

Secondly, whoever said 5 fps is not going to make a difference is wrong. The difference between 25 fps and 30 fps, or even 35 fps and 40 fps, is in my opinion fairly noticeable. It's especially noticeable when you are getting those extra frames out of a card that's about 40% cheaper, haha.
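To put rough numbers on why a fixed fps gap matters more at low framerates, here's a small frame-time sketch (Python; illustrative fps pairs only, not figures from either review):

# Frame time (ms per frame) shows why the same 5 fps gap feels bigger at low framerates.
def frame_time_ms(fps):
    return 1000.0 / fps

for low, high in [(25, 30), (35, 40), (60, 65)]:     # illustrative pairs only
    saved = frame_time_ms(low) - frame_time_ms(high)
    print(f"{low} -> {high} fps: each frame arrives {saved:.1f} ms sooner")

# 25 -> 30 fps saves ~6.7 ms per frame; 60 -> 65 fps saves only ~1.3 ms,
# which is why the same 5 fps is far more noticeable at the low end.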


Bottom line: if I had just bought an 8800GTX I'd be kicking my own ***.
August 30, 2007 3:48:42 PM

spaztic7 said:
I looked at it and I think the largest gap is only 8.3%... and the other is about 5%. We are talking about literally 3 or 4 fps more. That is not really going to do anything for the game. Also, were they running the drivers Nvidia came out with for Bioshock? What drivers were they using? What were the video quality settings, what's turned on in the video settings and what's turned off? Also, what version of the 8800 GTX are they using, and same for the HD 2900 XT? I would be shocked to see their best HD 2900XT beat a BFG OC2 or XXX.
Ugh, I grow tired of repeating stuff. See, the HD 2900XT's direct competitor is the 8800GTS, not the GTX, so even if it were on par with the GTX, that would show a strong advantage for the ATI card. It's as if the X1650XT were on par with the 7900GT. Driver info and the exact system specs used in the benchmark are given at the bottom of the page.

spaztic7 said:
Personally, this seems biased and has lost credibility with me. Yes, DX9 is still here, but shouldn't we really only care about DX10? Isn't that the direction we are heading, whether the game developers like it or not?

And for those of you who did not read the "benchmark", the GTX outperforms the XT in DX10 by about 28% (1 - 43/60 = 17/60 ≈ 0.28), or a difference of 17 fps (60 - 43 = 17).
The HD's poor performance in DX10 seems to be caused not by DX10 itself but by Vista, given the small difference between DX9 and DX10 under Vista and the large difference in DX9 results between XP and Vista. So, as I said, this looks driver-related rather than a DX10 capability issue, and we should expect Vista performance ultimately to be about as good as under XP once the drivers mature.

We're certainly heading for DX10, but we're nowhere near there yet. In the meantime, all we have is DX9 games and sometimes a little extra (more psychological than anything) eye candy labeled "DX10", so that Vista + new GPU owners can feel good.
August 30, 2007 4:03:42 PM

goldenboy said:
Secondly, whoever said 5 fps is not going to make a difference is wrong. The difference between 25 fps and 30 fps, or even 35 fps and 40 fps, is in my opinion fairly noticeable.


Ok, first... We are not talking about 25 to 30 fps, nor are we talking about 35 to 40.

Second... I stand corrected about the drivers. In the small print they do list the drivers. I missed it on my first read through.

Third, half a year later AMD had better have released a good video card. If it were worse than a card that had been out for more than 6 months... that would be the end of them. Sorry if Nvidia releases hardware quicker than AMD can.

And fourth, what's up with this "Intel 975XBX2, eVGA 680i SLI" (scroll to the bottom where it gives the system setup)? They used two different motherboards! From my understanding, when comparing hardware like video cards, you want all the same hardware other than the video cards being compared. All this shows is that one configuration or the other may do better or worse. Which benchmarks go with which motherboard?
August 30, 2007 4:05:40 PM

Dr_asik said:
We're certainly heading for DX10, but we're nowhere near there yet. In the meantime, all we have is DX9 games and sometimes a little extra (more psychological than anything) eye candy labeled "DX10", so that Vista + new GPU owners can feel good.


You're right; when buying top-end or high mid-end, we care nothing for this silly eye candy.
August 30, 2007 4:47:59 PM

You're obviously trying to be sarcastic, but check out: http://www.firingsquad.com/hardware/bioshock_directx10_...

It takes careful examination to reveal any actual difference between the DX10 and DX9 modes of Bioshock. Extra eye-candy is good, but this is almost insignificant. We're not talking about the difference between Half-Life 2's DX9 and DX8 paths here.
August 30, 2007 4:50:59 PM

spaztic7 said:
Ok, first... We are not talking about 25 to 30 fps, nor are we talking about 35 to 40.

Third, half a year later AMD had better have released a good video card. If it were worse than a card that had been out for more than 6 months... that would be the end of them. Sorry if Nvidia releases hardware quicker than AMD can.

And fourth, what's up with this "Intel 975XBX2, eVGA 680i SLI" (scroll to the bottom where it gives the system setup)? They used two different motherboards! From my understanding, when comparing hardware like video cards, you want all the same hardware other than the video cards being compared. All this shows is that one configuration or the other may do better or worse. Which benchmarks go with which motherboard?


Firstly, OK, I will admit that at those high framerates, 5 frames is not a lot when you are actually playing. However, it still shows that the 2900 is faster than the GTX, and in gaming terms 5 fps represents a reasonable difference in GPU performance, even if the real-world experience is not drastically different.

Secondly, I agree, ATI did hold back for 6 months on the 2900, and again you are right: if they hadn't pulled something awesome out, they would have been screwed. The point is that they did, and the 2900 came out half a year after the 8800GTX and beats the GTX in this game. Its release date is of no consequence; it is cheaper and it wins. That is awesome for me because I'm getting one on Monday.

Thirdly, I agree that it is not smart to use different mobos if you want to isolate the performance of the GPU. However, the difference cannot be said to account for the 2900 beating the GTX at all; it should not happen, as they are not even supposed to be compared to each other. Hell, if the mobo made the 2900 faster than the GTX, then I'll take one of those too.
August 30, 2007 5:09:34 PM

And what about DX10? I see the GTX killing the HD on this one, not to mention recent games like World in Conflict. I agree the HD 2900 is an excellent card and probably the best bang for the buck, though.
August 30, 2007 5:16:12 PM

Rabidpeanut said:
I think it is because the R600 in the Xbox got a nice workout on Unreal Engine 3 in Gears of War, so it is going to kick *** in UT3 because it is nicely tuned for Unreal Engine 3, and in case you have not noticed, a LOT of games are using Unreal Engine 3. To all of you who said the R600 would suck... like Nelson says: HAAAAAAAAAAA HAAAAAAAAAAAAAAAAA. For now, though, somehow I see Nvidia sorting this out very soon.
Uhh, the Xbox 360 has the "R500", also known as Xenos; it has 48 unified shaders at 500MHz, integrated eDRAM, and doesn't support geometry shading or DirectX 10.
August 30, 2007 6:01:03 PM

In DX10 all the 8800s beat the 2900. Also, in DX10 the 8800s get a performance gain with higher settings! This is an accomplishment in my mind for Nvidia, with all that we've seen with DX10.

For these high-end cards that are all DX10, why do all these reviews show DX9?!? If it's a DX10 game then show it in DX10, especially when there are noticeable differences. Even if they are small, they should be shown.

Who the hell reviews something and turns off certain settings to make a biased conclusion about who is the winner?!?
August 30, 2007 6:43:28 PM

For everyone who has looked at the pictures in the FiringSquad review:

Quoted from the FiringSquad article:

"UPDATE 8/29/07: Since publishing this article we've discovered that toggling between DX9 and DX10 in BioShock's graphics settings menu doesn't work 100% correctly, which explains why we couldn't see the difference in water ripples. Click here for the full story."

If you check these images out and read the update, you will see the huge differences between DX9 and DX10 and why the previous images did not show them. Got to love drivers!!
August 30, 2007 7:12:49 PM

OH MY GOODNESS, AND THE R500 IS NOTHING EVEN REMOTELY LIKE THE R600 IN ANY WAY WHATSOEVER, so they could *NEVER* in a billion years have figured out how to use Unreal Engine 3 on the R600 from that, now could they?
August 30, 2007 8:19:53 PM

Yes, well, if you want to run Bioshock in DX10 then the GTX is the way to go. It's quite a lot more to pay for DX10 though, especially when there is hardly any difference at this stage.
August 30, 2007 9:20:18 PM

What about AF and AA? Are the 2900s still having a miserable time with these two features enabled? I noticed that FiringSquad and Gamespot did not enable them.
August 30, 2007 11:27:44 PM

AF is not too bad with the 2900; in fact I turn it to 16x in all my games, but AA is still HORRIBLY bad.
August 31, 2007 2:03:31 AM

Dr_asik said:
You're obviously trying to be sarcastic, but check out: http://www.firingsquad.com/hardware/bioshock_directx10_...

It takes careful examination to reveal any actual difference between the DX10 and DX9 modes of Bioshock. Extra eye-candy is good, but this is almost insignificant. We're not talking about the difference between Half-Life 2's DX9 and DX8 paths here.

OK... enough of all the serious talk, I am tired and my brain hurts now. I also agree with you that it does take a lot of effort to notice the differences; I had about 4 people looking when comparing the PC against the 360. The PC (in DX10) had a little bit more detail, ran smoother, and reacted faster. I completely agree that this is not a jump like DX8 to DX9. I fear that those days are gone now :( 

No, I did not check out FiringSquad. I will have to look at that in a bit.

Goldenboy, thank you for agreeing with me, and I also agree with you. I would be dumb not to post the best. I just hope that the drivers will fix everything... the next batch.
August 31, 2007 7:30:00 PM

Rabidpeanut said:
OH MY GOODNESS, AND THE R500 IS NOTHING EVEN REMOTELY LIKE THE R600 IN ANY WAY WHATSOEVER, so they could *NEVER* in a billion years have figured out how to use Unreal Engine 3 on the R600 from that, now could they?


You said it, not us.

That was a joke.

You know what else favors the HD 2900 XT? CrossFire! Who would have guessed...

Who is this ATi company they speak of? I thought they were bought out and turned into AMD... or DAMNIT.
September 1, 2007 9:05:11 PM

Not that I'd expect objectivity in this thread, but some people need to read up first before commenting.

xaositect said:
That is pretty amazing, and one of the reasons I quit going to Gamespot years ago. If you go to a more reputable gaming source like FiringSquad:

http://www.firingsquad.com/hardware/bioshock_directx10_...

Isn't it amazing how the ATI card doesn't come in faster than any of the Nvidia cards but the old 7k series?


Isn't it also amazing that while the nV cards have a higher average and actually gain performance using DX10, they in general have lower minimum FPS? I know personally I'd want the higher min fps even if the 'average' favoured the other option. Neither drops too low, but the min FPS of the XT is noticeably higher than the GTS's, and the Ultra's is only slightly higher than the XT's. And where is the GTX? Or, as you imply, was FiringSquad paid off for their review? Since I know folks over there, I doubt it's bias; I'll just say it was a choice made for space, even if it wouldn't be my choice.
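To illustrate why the minimum matters, here's a quick sketch with made-up frame times (Python; not numbers from either review):

# Two hypothetical cards rendering the same 100-frame run (per-frame times in ms).
steady = [20] * 95 + [25] * 5     # ~49 fps average, worst frame 25 ms (40 fps minimum)
spiky  = [16] * 95 + [50] * 5     # ~56 fps average, worst frame 50 ms (20 fps minimum)

def avg_fps(frame_times_ms):
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def min_fps(frame_times_ms):
    return 1000.0 / max(frame_times_ms)

for name, times in [("steady", steady), ("spiky", spiky)]:
    print(f"{name}: avg {avg_fps(times):.0f} fps, min {min_fps(times):.0f} fps")

# The "spiky" card wins on average fps but stutters on its worst frames;
# the steadier card feels smoother even though its average is lower.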

Rabidpeanut said:
I think it is because the R600 in the Xbox got a nice workout on Unreal Engine 3 in Gears of War, so it is going to kick *** in UT3 because it is nicely tuned for Unreal Engine 3, and in case you have not noticed, a LOT of games are using Unreal Engine 3.


As has been mentioned, the R600 is not the R500/Xenos; there are many differences. Add to that that it's M$ that does the Xbox drivers, so their experience in Gears and other titles (like Lost Planet) doesn't necessarily translate to the HD2K line. Associating the two for desktop PC gaming performance/benefits is naive.

spaztic7 said:

Third, half a year later AMD had better have released a good video card. If it were worse than a card that had been out for more than 6 months... that would be the end of them. Sorry if Nvidia releases hardware quicker than AMD can.


Which means they also had 6 months to work on drivers. Both have weaknesses, and so far AMD's is hardware and nVidia's is software/drivers.

Quote:
And fourth, what's up with this "Intel 975XBX2, eVGA 680i SLI" (scroll to the bottom where it gives the system setup)? They used two different motherboards! From my understanding, when comparing hardware like video cards, you want all the same hardware other than the video cards being compared. All this shows is that one configuration or the other may do better or worse. Which benchmarks go with which motherboard?


Most sites do that now, and it's wrong. They both should be tested on a single platform; the variance is too great, and both FiringSquad and Gamespot are guilty of that.

They should do single card tests with the same mobo, and then later do Xfire vs SLi tests separately.

Quote:
Who is this ATi company they speak of? I thought they were bought out and turned into AMD... or DAMNIT.


Guess you never heard of Chevy, Lincoln, Gatorade, Tropicana, Pizza Hut, etc?
AMD is the owner of ATi now, but it still exists as their graphics division and the name they sell their cards under; so either you're ignorant or obtuse, neither of which is an admirable trait in someone posting about credibility. :sarcastic: 

In the end I think both reviews, compared and contrasted, internally show that Bioshock favours neither maker exclusively. And based on how well it scales, either choice would be fine for this game.

I'll find out for myself soon enough once I have the laptop up to speed (imagine that, a ton of driver issues with Vista [and I'm talking about more than just graphics] :whistle: ).
