Finally the GTX280 scores....sigh

Here they are...

http://www.theinquirer.net/gb/inquirer/news/2008/06/14/complete-gtx280-scores-here

And still the GTsuX doesn't smoothly max out Crysis at 1920*1200...sigh...
  1. mihirkula said:
    Here they are...

    http://www.theinquirer.net/gb/inquirer/news/2008/06/14/complete-gtx280-scores-here

    And still the GTsuX doesn't smoothly max out Crysis at 1920*1200...sigh...

    Wow, that was disappointing. Only an average 40% performance increase over a single G92 core. Maybe we just had overinflated expectations. :sarcastic:
  2. That's not amazing at all looking at the specs. It beats the OC'd 9800GTX, so it's pretty much on par with a 9800GX2 or slightly better. Very disappointing. Hopefully it's a driver issue and performance will increase a lot more later on.
  3. dagger said:
    Wow, that was disappointing. Only an average 40% performance increase over a single G92 core. Maybe we just had overinflated expectations. :sarcastic:


    At this point CPU is the bottleneck.
  4. And the door is left wide open for ATI.
  5. marvelous211 said:
    At this point CPU is the bottleneck.

    You must be joking, right? There is no way a QX9650 (as tested) can possibly bottleneck anything in existence.
  6. It is the Inquirer and I take whatever they say with a bucket of salt, but if it is true you also have to consider that it was done with pre-release drivers, and mature drivers will likely add another 10-20% performance. Not a bad card, but I hope ATI's new cards can compete so the 280 is priced reasonably.
  7. Yeah, that's a fast CPU, it shouldn't bottleneck it at all; if it did, I doubt it'd be more than a frame or 2.
  8. Lame... they only give you one proc clock @ 3GHz, so not really helpful.
  9. invisik said:
    Yeah, that's a fast CPU, it shouldn't bottleneck it at all; if it did, I doubt it'd be more than a frame or 2.

    With performance so close, any CPU that bottlenecks one card would also bottleneck the other almost as much.
  10. Actually it's not so bad. 40% more speed for a next generation is quite good! And as you say, drivers are immature at the moment.
    In some tests there definitely seem to be reasonably good drivers, and in some others not so good at the moment. This is a new card; it takes time until all games get optimized drivers.
  11. dagger said:
    With performance so close, any CPU that bottlenecks one card would also bottleneck the other almost as much.


    Agreed! Man, I was going to pick this card up, but it's a huge disappointment. I'm gonna wait for ATI cards to come out; hopefully they can give Nvidia a can of "whoop ass".
  12. But I think, as you can see, the card really shows off at high resolution with filters turned on. And the 9800GTX is in OC'd form, and we still don't know how well the GTX 280 can OC.

    Shouldn't we wait and see other sites do their reviews?
  13. iluvgillgill said:
    But I think, as you can see, the card really shows off at high resolution with filters turned on. And the 9800GTX is in OC'd form, and we still don't know how well the GTX 280 can OC.

    Shouldn't we wait and see other sites do their reviews?

    This review is damning enough to convince anyone not to pay the $700 price tag for it. :sarcastic:
  14. Does anyone know if BFG will include a PCIe 8-pin adapter for the GTX 280 like they did on the 9800GX2?
  15. iluvgillgill said:
    But I think, as you can see, the card really shows off at high resolution with filters turned on


    Where? A paltry 27 FPS at 1920*1200.

    $600 for that? pass.
  16. I guess I don't see what everyone is complaining about. Yeah, $700 is mighty high for anything, but it beats the previous-gen card in everything by a long shot, and it seems to me what this card is REALLY good at is high resolutions. Maybe I'm not reading this chart right, but the gains at 2560x1600 are usually above 100%! How can you complain about that? I think this thing is going to rock on my new XHD3000!
  17. inquirer..... 'nuff said
  18. First off, you guys are bitching about a 30 to 40 percent increase at high res...

    Secondly, I hear all kinds of 600 and 700 dollar guesstimates... yes, if this is true then ATI may come out better, but right now the price is just a guess, and there was no OC'ing on the 280 or the CPU, so this might not even be close to what it can do maxed.

    While it's a nice preview, no sense getting all worked up till we see more reviews next week and see the actual price.
  19. From what I can see this card, ATM with the "pre-release drivers", is scarcely better than a 9800GX2 and will soon cost more too. Very, very disappointing. Whoever said it is a 40% increase over the past generation is wrong. TBH I don't know why the Inquirer didn't test a GTX280 with a GX2, as they are the high-end cards from each generation. The GTX260 would be a better comparison with the 9800GTX, and I think we would see little, if any, improvement with a GTX260 over a 9800GTX. Come on ATI, make a decent card for once so that we can actually buy a good card worth what we pay for it...
  20. I'm curious how cool the GTX280 will be at full load. With this huge chip Nvidia is gonna need an aircon to cool it :D lol
  21. Best_Reviewss, sure the GTX 280 can max out Crysis at 19x12, but you would have to be crazy to pay $1400 for graphics performance. I think it's reasonable to expect to be able to max out Crysis at 19x12 for $700, which is what I was expecting with this "great new generation" of cards. All my hope now lies in the HD4K series...
  22. Quote:
    http://www.hardware-infos.com/news.php?news=2151

    SLI'd GTX280s can max out Crysis at 19x12 with AA AND AF. And max it out @ 25x16

    Yeah, you're going to pay 1.4k to SLI those? :na:
  23. Is the GTX280 a dual GPU setup or a single GPU?
  24. Yonef, the fact that the chip (package size too) is "huge" is actually better; it will be easier to cool - look up "power density."
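The "power density" point can be sketched with rough numbers. The die areas and TDPs below are approximate published figures, used only for illustration:

```python
# A larger die can actually be easier to cool: what matters for the heatsink
# is watts per unit of area, not total watts. Figures below are approximate
# published numbers (GT200 ~576 mm^2 / ~236 W, G92 ~324 mm^2 / ~140 W).

def power_density(tdp_watts: float, die_mm2: float) -> float:
    """Heat flux through the die surface, in W/mm^2."""
    return tdp_watts / die_mm2

gt200 = power_density(236, 576)  # GTX 280
g92 = power_density(140, 324)    # 9800 GTX

print(f"GT200: {gt200:.3f} W/mm^2, G92: {g92:.3f} W/mm^2")
```

Despite the much higher total TDP, the heat flux per square millimetre comes out slightly lower for the bigger chip, which is the point being made.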
  25. The GTX280 is a single GPU (I think?) but don't ever expect a GTX280X2, cos I doubt it will happen.
  26. Ketchup_rulez said:
    The GTX280 is a single GPU (I think?) but don't ever expect a GTX280X2, cos I doubt it will happen.

    If there are people willing to pay through the nose out there, why not? :p
  27. This is sad.. the GTX280 was up against the 9800 GTX.. which is slower than the 8800 Ultra..

    WTF Nvidia? Were you hiding the specs/benchmarks because you were ashamed of the performance of this card?

    I was expecting something spectacular.. something that would be amazingly fast.. not some slow **** card that is basically a 9800GX2.

    I'm going to switch to ATI and support Canada. At least they put out cards with awesome performance for low prices..

    Pricing the GT280 at **** 600+ would be the most idiotic idea Nvidia ever had, considering its poor, poor performance.
  28. I really doubt the real price will be 700 bucks unless ATI blows chunks.
  29. sacre said:
    This is sad.. the GTX280 was up against the 9800 GTX.. which is slower than the 8800 Ultra..

    WTF Nvidia? Were you hiding the specs/benchmarks because you were ashamed of the performance of this card?

    I was expecting something spectacular.. something that would be amazingly fast.. not some slow **** card that is basically a 9800GX2.

    I'm going to switch to ATI and support Canada. At least they put out cards with awesome performance for low prices..

    Pricing the GT280 at **** 600+ would be the most idiotic idea Nvidia ever had, considering its poor, poor performance.

    Some people don't know what a gpu is. Can you imagine the profit margin, even if just a few chumps buy those? :na:
  30. Ketchup_rulez said:
    Come on ATI, make a decent card for once so that we can actually buy a good card worth what we pay for it...


    Well, if the AMD card performs better than the Nvidia card, or is at least on par, then we will be able to witness something really, really ugly happening to the price.
  31. dagger said:
    If there are people willing to pay through the nose out there, why not? :p


    I'm not totally sure about this but... I think the manufacturing process, and the heat and power this card will need, mean that there will never be a GTX280X2. I think it's impossible, but someone with more knowledge will have to confirm this! :D
  32. I like how everyone is getting pissed at numbers they pull out of thin air...rofl.

    I made my prediction; I also said it was just a guess, and I'm certainly not gonna get pissy about it if I am wrong... lmao!
  33. Ketchup_rulez said:
    I'm not totally sure about this but... I think the manufacturing process, and the heat and power this card will need, mean that there will never be a GTX280X2. I think it's impossible, but someone with more knowledge will have to confirm this! :D

    If they feel the need to develop such a card they will just lower each core's speed, like they did with the 9800GX2, to keep it from overheating, but I agree it's not very likely they will build such a card.
  34. Ketchup_rulez said:
    I'm not totally sure about this but... I think the manufacturing process, and the heat and power this card will need, mean that there will never be a GTX280X2. I think it's impossible, but someone with more knowledge will have to confirm this! :D

    Bah! Make it a tri or quad slot. It'd be like a block of card and whatnot... :na:
  35. They should have OC'd the 280 if they're going to compare it that way...
  36. Heads up all you CPU fans, this card is CPU bottlenecked, just wait, you will see. Not in most games, no, but in the highest, most demanding games, yes; you'll need at least 4GHz for this card, if not more. And please expect much higher performance with better drivers.
  37. dagger said:
    Bah! Make it a tri or quad slot. It'd be like a block of card and whatnot... :na:


    Yeah, and the cpu just to make sure....
  38. I know my proc holds back this 8800... I wouldn't be surprised if new cards need more CPU at 1600 and 1900.
  39. JAYDEEJOHN said:
    Heads up all you CPU fans, this card is CPU bottlenecked, just wait, you will see. Not in most games, no, but in the highest, most demanding games, yes; you'll need at least 4GHz for this card, if not more. And please expect much higher performance with better drivers.

    I don't buy it. The QX9650 doesn't bottleneck the 9800GX2 in quad SLI. So how can it bottleneck something with less performance? :sarcastic:
  40. First of all, quad SLI IS the bottleneck for that setup, and secondly, are you saying that something that isn't just plain faster as a single card can't be bottlenecked by a CPU? Here: http://www.evilavatar.com/forums/showthread.php?t=54223 - and we will see more of this.
  41. dagger said:
    I don't buy it. The QX9650 doesn't bottleneck the 9800GX2 in quad SLI. So how can it bottleneck something with less performance? :sarcastic:


    Uh, because SLI scales like crap and only works well with certain cherry-picked games?

    And tri and quad SLI = tri and quad Crappe' de Scale'
  42. JAYDEEJOHN said:
    First of all, quad SLI IS the bottleneck for that setup, and secondly, are you saying that something that isn't just plain faster as a single card can't be bottlenecked by a CPU? Here: http://www.evilavatar.com/forums/showthread.php?t=54223 - and we will see more of this.

    Lol, so if this is true, that means the real performance we get would be less than that benchmark? Since most people don't have the $1000+ "QX" CPU that they used for the benchmark... :na:
  43. Oh yeah, and in the world of computing, things don't scale well across multiple cores when the work is result- and input-dependent, like games. It's good to read up on coarse- and fine-grained parallelism as it applies to rendering to see why.

    Plus, maybe there isn't anything out now to push that many shaders?
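The scaling point can be illustrated with Amdahl's law; the serial fractions below are made-up values for illustration, not measurements of SLI:

```python
# Amdahl's law: if a fraction s of the per-frame work is serial (CPU-side
# setup, synchronization, driver overhead), n GPUs can never speed the frame
# up by more than 1/s, and each extra GPU helps less than the last.

def amdahl_speedup(s: float, n: int) -> float:
    """Ideal speedup with n processors when fraction s of the work is serial."""
    return 1.0 / (s + (1.0 - s) / n)

for s in (0.1, 0.3):
    row = ", ".join(f"{n} GPU: {amdahl_speedup(s, n):.2f}x" for n in (1, 2, 4))
    print(f"serial fraction {s:.0%} -> {row}")
```

Even a modest serial fraction flattens tri/quad scaling quickly, which is why adding a third or fourth card returns so little.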
  44. Add in the crap drivers, add in the slow CPU (well, slow for the needs of this game), yes. It used to be that around 3.4 to 3.6 would max out the old series; now it's more. A lot more.
  45. Well, I see it could be a slight bottleneck with current CPUs, but honestly anyone with 3.0GHz is fine; it will bottleneck, but what, like a few FPS.
  46. I dunno, I'll believe it when we get more reports and they match up, not one saying it rocks and the other saying it sucks....

    That said, I too will be disappointed if it's so-so after some well-known sites review it.

    P.S. Happy Father's Day to all you fellow dads/gaming wankers :P
  47. invisik said:
    Well, I see it could be a slight bottleneck with current CPUs, but honestly anyone with 3.0GHz is fine; it will bottleneck, but what, like a few FPS.


    So how come my 2.8GHz CPU bottlenecks my 8800GTS 512 by a third... if I put in an 8500... I could get nearly 50 percent faster on average, gamewise, according to the CPU charts on here.
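The arithmetic behind "bottlenecked by a third" vs "nearly 50 percent faster" checks out; the frame rates below are hypothetical, not benchmark results:

```python
# If a CPU caps the frame rate at 2/3 of what the GPU could deliver
# ("bottlenecked by a third"), removing the cap yields 1/(2/3) - 1 = 50%
# more FPS. The 40 FPS starting point is made up purely for illustration.

capped_fps = 40.0                    # hypothetical CPU-limited frame rate
uncapped_fps = capped_fps / (2 / 3)  # what the card could do on its own

gain = uncapped_fps / capped_fps - 1
print(f"{capped_fps:.0f} -> {uncapped_fps:.0f} FPS ({gain:.0%} faster)")
```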
  48. Bah! So the card costs $700, and needs a $1000+ "QX" CPU, which still bottlenecks it (just not as much), to get a 40% performance increase over a single lowly G92 core? :na:
  49. dagger said:
    Bah! So the card costs $700, and needs a $1000+ "QX" CPU, which still bottlenecks it (just not as much), to get a 40% performance increase over a single lowly G92 core? :na:


    Lol, we'll see when people start getting them :) :kaola: I still think the scores are a lie; till they are fully released and everything, we won't know 100%.