HOLY CRAP! A DX10 BENCH G80 vs R600!!!!!

Guru3D, unless I don't understand English... Guru3d's Call of Juarez Bench

Thank god this is out also!!!
  1. Based on what I've read (which admittedly isn't that much), it seems that comparison isn't worth much at this point. I assume the DX10 version of that game is pretty fresh, and the R600 drivers appear to be pretty unpolished. I don't think we'll really know where things stand for a little while; time for another driver revision or three. Not that ATI doesn't deserve getting beaten, showing up so late to the party, and after having already developed a unified-shader card for a console.
    At the end of the day though, where the card stands now is relevant only if you're buying a card right now.
    If you're a few months off from a new purchase, as I am, we can wait and see what's best for the money... if you're on a budget, that is.
  2. Very nice find! Looks like I may get myself an 8800 GTS 640, since I think that's the best my power supply can handle (550W Antec NeoHE).

    Best,

    3Ball
  3. Wow! Those are nice lookin graphics! :D
  4. Quote:
    Wow! Those are nice lookin graphics! :D


    I might need to change my shorts after that.... 8O
  5. I want to have graphics like that available to me...
  6. I wonder if my upgrades next month, an E6700 and an 8800 GTS 320, will handle the fps at 1280x1024 resolution.
  7. Nice article! Come on ATI, step up with the XTX.
  8. Nice-looking benchmark!

    But 42 fps at 1024x768... I get a feeling an SLI/CrossFire setup may just be a good thing for stuff like that. We'll have to see :)
  9. I second that... those reviews are all I've been waiting for before buying a new rig. I reckon an overclocked 6600 and a GTS 320 will do the trick, because that's what I'll be getting. Otherwise I just might add another 320 to the setup (going for an ASUS SLI mainboard).

    Anyhow... I am very disappointed by ATI, not because of the performance (no right to be, because the price is so good) but because it runs so bloody hot; I've lost 3 Nvidia cards to that :( They had plenty of time to come up with decent cooling and sort out the drivers, and now Nvidia has probably sold enough of their GeForce 8 series to get their initial investment back, while ATI is just starting to sell theirs, with a very disappointing launch, at a very low price (for such a card).
  10. Quote:
    I second that... those reviews are all I've been waiting for before buying a new rig. I reckon an overclocked 6600 and a GTS 320 will do the trick, because that's what I'll be getting. Otherwise I just might add another 320 to the setup (going for an ASUS SLI mainboard).

    Anyhow... I am very disappointed by ATI, not because of the performance (no right to be, because the price is so good) but because it runs so bloody hot; I've lost 3 Nvidia cards to that :( They had plenty of time to come up with decent cooling and sort out the drivers, and now Nvidia has probably sold enough of their GeForce 8 series to get their initial investment back, while ATI is just starting to sell theirs, with a very disappointing launch, at a very low price (for such a card).
    The power usage is a drag; just like the X1900s, they use it up. The heat issues, I'm sure, will be dealt with by other AIB partners, so that'll help. The GTS started out at $400 or so, so it's priced right. If I were you, I'd be looking for more vRAM, maybe the 2900 or the GTS 640, as the memory doesn't add up in SLI or CrossFire. What res are you looking to play at? The 320 version starts to drop off once you go higher than 16x10.
  11. Quote:
    Wow! Those are nice lookin graphics! :D

    They are great looking.

    *drools*
  12. Quote:
    What res are you looking to play at? The 320 version starts to drop off once you go higher than 16x10.


    I love to play games at 1280x1024; 1024x768 looks so jaggy, IMO.
    This DX10 review made me doubt upgrading again. The game is running at only 27 fps; that's not even a "good" frame rate to play at, and there'll be lots of hiccups too, I think.
    This is just one game; what about FSX or Crysis? Will they run at only 5 fps in DX10?
    What about the people who just bought an 8600 series card? They can't even play at 1024 res.
  13. Quote:
    What res are you looking to play at? The 320 version starts to drop off once you go higher than 16x10.


    I love to play games at 1280x1024; 1024x768 looks so jaggy, IMO.
    This DX10 review made me doubt upgrading again. The game is running at only 27 fps; that's not even a "good" frame rate to play at, and there'll be lots of hiccups too, I think.
    This is just one game; what about FSX or Crysis? Will they run at only 5 fps in DX10?
    What about the people who just bought an 8600 series card? They can't even play at 1024 res.

    8) Play on XP instead of Vista, then.
  14. I think buying a DX10 card means you want to play DX10 games. ATM only Vista supports DX10, so XP is out of the picture.
    And do you think Gates will release DX10 support for XP? I don't think so; he will push people to buy Vista even if it's crap, full of patches everywhere.
  15. Definitely going to have to upgrade to play that. 8O
  16. Heh, here is a funny article about this benchmark.

    http://www.legitreviews.com/article/504/1/

    Quote:
    Basically it seemed that we were getting run in a circle as each company told us to contact the other company and no one would give up the latest benchmark build. At first I wasn't going to use the benchmark utility as when a company hands you a disc and says try this out it throws off every red flag in my book, not to mention the fact that the benchmark utility was based off an old build version that had bugs fixed in newer versions. After running the benchmark and taking screenshots to compare the image quality I couldn't resist not running this benchmark as the findings were actually not in AMD's favor. Yes, the benchmark that AMD and ATI handed out to the press for the launch of the ATI Radeon HD 2900 XT actually played out to favor NVIDIA even though the latest build was not handed out to the media.
  17. Interesting. The 2900 XT can keep up with the GTX in DX10, but falls behind slightly in image quality, which was one of its advantages on previous cards?
  18. Read my post above about the legitimacy of the CoJ benchmark.

    There is also this:

    http://www.pcgameshardware.de/?article_id=601352
  19. Quote:
    Interesting. The 2900 XT can keep up with the GTX in DX10, but falls behind slightly in image quality, which was one of its advantages on previous cards?


    It keeps up with the GTS, not the GTX. :wink:

    I saw that bit about the image quality as well, though; that's a big turnaround for ATI. :oops:
  20. Quote:
    Interesting. The 2900 XT can keep up with the GTX in DX10, but falls behind slightly in image quality, which was one of its advantages on previous cards?


    It keeps up with the GTS, not the GTX. :wink:

    I saw that bit about the image quality as well, though; that's a big turnaround for ATI. :oops:

    I was referring to this article, not its general performance:
    http://www.legitreviews.com/article/504/1/
  21. I'm at work so I can't see those.
  22. Quote:
    Wow! Those are nice lookin graphics! :D

    They are great looking.

    *drools*

    Yeah, the thing is I don't see much difference in the two screenies that would require DX10 alone. There's heavy use of displacement mapping, but that's been around since the Parhelia and P10 days. The HDR is nice, but I would need to see more as to why it's DX10; it looks very DX9 FP16, though maybe they bumped up the floating-point range.

    It is interesting to see greater use of displacement mapping; the mountains look really good, but the example by the fire is ridiculously exaggerated. If that were to occur to that extent on a road or floor or something, I'd expect it to be physical geometry, not just displacement, as anything that falls on top of it would float in the air above the gaps/surface.

    Anywhoo, nice to see the greater adoption; now we just need geometry shaders to be exploited. (There's a rough sketch of the displacement idea at the bottom of the thread.)
  23. I see. :D

    I had only looked at the benchie linked by the OP.
  24. I wonder if there is a problem with moving from a DX9-coded game to a DX10 patch on top. If it is a DX10 game from the start, with adaptations for DX9 support, will you get better performance? If not, knock out all the mainstream DX10 cards and make them all DX9, because even the high-end enthusiast cards can barely do it.

    I do remember, though, that the DX10 Crysis demos were done with one 8800 GTX at pretty high resolutions, and those didn't look bad.
  25. The main thing is that what you have right now are fundamentally DX9-built games with DX10 tacked on top. They didn't design the grass, trees, and terrain with a DX10 code path in mind; what most of the early benchmarks show is still a foundation of DX9 where most of the work is done, with certain features tacked on, like the displacement mapping on the mountains. They aren't saving work by doing the same thing with less; they have a dull mountain and they add features on top. That's different from building the mountain for DX10, or having DX9 do the same thing and then seeing DX10 do it quicker. The same effect (for the mountains, not the floor) could be shown using parallax-occlusion/virtual-displacement mapping, but that wasn't done in the original, so it's not a question of replacing code so much as tacking it on afterwards.

    When it's a ground-up effort where development involves two branches, DX9 and DX10, and each is built to exploit the benefits of its API, that's when we should start to see the benefits of DX10 compared to the older DX9 methods. For now it's more of an afterthought that will initially increase the load on the cards, not reduce it (see the path-split sketch at the bottom of the thread).
    I suspect there are many areas where Crysis should see major differences and will be specifically coded to take advantage of DX10, like the vegetation.

    Just FYI, the early demos of Crysis were mostly at 720p (i.e. 1280x720 / 1366x768), and it's often mentioned in the gameplay videos.
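
Since displacement mapping comes up a couple of times above, here is a minimal CPU-side sketch of the core idea: each vertex is pushed out along its normal by a height sampled from a map, so the silhouette actually changes, unlike normal or parallax mapping. This is illustration only, not anything from the game; the sampleHeight function and every other name are made up, and in a real engine this work would run in a vertex or geometry shader on the GPU.

Code:
    // Minimal sketch of displacement mapping, assuming a made-up height function.
    #include <cmath>
    #include <cstddef>
    #include <cstdio>
    #include <vector>

    struct Vec3 { float x, y, z; };

    // Hypothetical height-map lookup returning a value in [0, 1].
    float sampleHeight(float u, float v) {
        return 0.5f + 0.5f * std::sin(u * 10.0f) * std::cos(v * 10.0f);
    }

    // Core of the technique: move each vertex along its normal by the sampled
    // height. Real geometry is displaced, unlike normal/parallax mapping,
    // which only fakes detail in the lighting on a flat surface.
    void displace(std::vector<Vec3>& positions, const std::vector<Vec3>& normals,
                  const std::vector<float>& us, const std::vector<float>& vs,
                  float scale) {
        for (std::size_t i = 0; i < positions.size(); ++i) {
            const float h = sampleHeight(us[i], vs[i]);
            positions[i].x += normals[i].x * h * scale;
            positions[i].y += normals[i].y * h * scale;
            positions[i].z += normals[i].z * h * scale;
        }
    }

    int main() {
        std::vector<Vec3>  pos = {{0.f, 0.f, 0.f}, {1.f, 0.f, 0.f}};
        std::vector<Vec3>  nrm = {{0.f, 1.f, 0.f}, {0.f, 1.f, 0.f}};
        std::vector<float> us  = {0.0f, 0.5f};
        std::vector<float> vs  = {0.0f, 0.5f};
        displace(pos, nrm, us, vs, 0.25f);
        for (const Vec3& p : pos)
            std::printf("%.3f %.3f %.3f\n", p.x, p.y, p.z);
        return 0;
    }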
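
And for the DX9-foundation-versus-ground-up point in the last post, here is a rough, purely hypothetical sketch of the difference at the engine level; every class and function name below is invented for illustration and not taken from any real engine or API.

Code:
    // Hypothetical contrast between a "tacked-on" DX10 path and a ground-up one.
    #include <cstdio>

    enum class RenderPath { DX9, DX10 };

    struct TerrainRenderer {
        RenderPath path;

        // Tacked-on approach: the DX9 content is always drawn, and DX10 only
        // layers extra effects on top, so the DX10 path costs more, not less.
        void drawTackedOn() {
            drawDenseBakedMountain();                 // DX9 foundation, always paid for
            if (path == RenderPath::DX10)
                addDisplacementDetail();              // extra work on top
        }

        // Ground-up approach: each branch is built for its own API, so the
        // DX10 path can start from a cheaper coarse mesh and add detail on the GPU.
        void drawGroundUp() {
            if (path == RenderPath::DX10)
                drawCoarseMeshWithGpuDisplacement();
            else
                drawDenseBakedMountain();             // legacy pre-built geometry
        }

        void drawDenseBakedMountain()            { std::puts("dense baked DX9 mesh"); }
        void addDisplacementDetail()             { std::puts("extra DX10 detail pass"); }
        void drawCoarseMeshWithGpuDisplacement() { std::puts("coarse mesh + GPU displacement"); }
    };

    int main() {
        TerrainRenderer r{RenderPath::DX10};
        r.drawTackedOn();  // more total work than plain DX9
        r.drawGroundUp();  // same look, lighter baseline
        return 0;
    }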