CoD4 with 3870x2 - should I be more satisfied with the results?

OK, here's the deal. According to benchmarks around the web, the 3870x2 is supposed to get an average of 70+ fps in CoD4 @ 1680x1050 with everything set to max.

That means AA, AF, all high, basically the works. When I play the game, sometimes the fps will go up to 95, but then it can dip to 28fps! I mean, 28fps! WTH. Anything below 50fps I can feel, and it often dips to 50 or lower. Am I expecting too much? Is the bar for AF pushed too far to the right?

I can set it to 2xAA, but I mean, I just paid $500+ for a video card and I should be able to rock this game no problem.
  1. It's your RAM... if you've got Vista.
  2. That last bit of Nvidia fanboyism wasn't helpful, mousemonkey. At least ATI is honest and doesn't fudge drivers the way Nvidia did with Crysis (i.e. not displaying the water the way it's meant to be played just to get a few extra fps).

    Quanger, benchmarks can be tricky guides and results will vary. It depends on what CPU you have as well as what GPU. Note that most of the benchmarks by Tom's, Anandtech etc. are done using quad cores. Sometimes it's the Q6600, but more recently the latest Intel EE.

    Benchmarks are averages anyway when FRAPS is used. Results can vary from session to session, and Tom's and Anandtech rarely average over several sessions. Though H tries to do "real world" game tests, they often pick better drivers and settings for their favored Nvidia cards over ATI.

    I'm CPU limited until I get a new LCD monitor this Friday, but I still get up to 60 fps at 1024 x 768 in The Witcher; it drops down to 19 fps in some intense combats but averages 30-40. Anandtech calculated 46 fps in The Witcher using the first in-game cut scene at 1680 x 1050. I've noticed that I get more fps in cutscenes than I do in gameplay where there are quite a few monsters in a group.

    That shows me that Crossfire is recognized in that game, whereas with LOTR Online I get 42 fps max and it can drop to 20 fps in DX10 mode with Vista. When I get to play on the new 20" LCD, I won't be as CPU limited. When I get a Phenom 9750 in May on a Crossfire board, I definitely won't be CPU limited.

    As for AF, I found that I could only get 20 or so fps in LOTR Online with 16x AF, but lower AF looks just as good, with very sharp detail. So, if you don't like less than 50fps, cut down on the AF. The next generation of ATI cards is supposed to do better with AA and AF. Lastly, some games just don't recognize Crossfire or SLI.

    So, you are probably doing okay, because you don't have a quad core and you are getting real-world gameplay. Note that your Athlon X2, while better than mine, is still a weak CPU by today's standards. If you can get a decent average fps with everything enabled at 1680 x 1050, then what's there to complain about?
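    To see how much an average can hide: a rough Python sketch (nothing to do with FRAPS itself; it just assumes a plain list of per-frame render times in milliseconds, and the numbers here are made up for illustration) comparing the overall average against the slowest stretch:

```python
# Hypothetical per-frame render times in milliseconds: mostly smooth,
# with one rough patch (illustrative numbers, not measured data).
frame_times_ms = [10.5] * 900 + [35.0] * 100

# Average fps over the whole run.
total_seconds = sum(frame_times_ms) / 1000.0
average_fps = len(frame_times_ms) / total_seconds

# "1% low": the fps implied by the slowest 1% of frames.
slowest = sorted(frame_times_ms)[-(len(frame_times_ms) // 100):]
one_percent_low_fps = 1000.0 / (sum(slowest) / len(slowest))

print(round(average_fps), round(one_percent_low_fps))  # prints: 77 29
```

    So a run can honestly average 70-odd fps while the slowest frames feel like sub-30, which is exactly the gap between a review's average and what you notice mid-firefight.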
  3. Hmm, you really think the X2 6400 is limiting? I mean, it performs on par with the E6750 or so. CoD4 specifically does not utilize quad cores yet, so that can't be a factor. I am running Win XP SP2 with 2GB of RAM, which I think should be enough.

    Looking back, I remember when I had my Athlon 700 with 256MB RAM and a Radeon 7200 (original), I was getting good fps in Quake 3, but the problem was it kept dipping low. I solved it by adding another 512MB stick to the system.

    But I really doubt an extra 2GB will solve my problem with fps dipping. Win XP only recognizes about 3GB, so having 4GB would be a bit of a waste. I refuse to upgrade to Vista. I would consider it if it were free, but otherwise I can't justify paying $$$ for that OS. I wouldn't even buy it for $20.
  4. With framerates that vary like that, it's most likely memory or CPU. If your framerates were low but holding really steady, then I would say you could safely look at the GPU as the fault. But in your case, with frame rates going very high and dipping very low, it's going to be memory, or CPU tasks that occasionally bottleneck them; for example, any background task on the CPU, especially Norton or McAfee, which are notorious for doing whatever the hell they want in the background.
  5. yipsl said:
    That last bit of Nvidia fanboyism wasn't helpful, mousemonkey. At least ATI is honest and doesn't fudge drivers the way Nvidia did with Crysis (i.e. not displaying the water the way it's meant to be played just to get a few extra fps).

    Not fanboyism mate, although I do have a preference for nVidia cards over ATI's. As both companies have been equally guilty over the years of 'fixing' drivers to gain a few more points in Futuremark and other benchmarks, I was merely out to show that either CoD4 is an nVidia title, or the OP should question whether or not their card is any good, as 95fps can be achieved with a single card that has a single GPU.
  6. quanger said:
    Hmm, you really think the X2 6400 is limiting? I mean, it performs on par with the E6750 or so. CoD4 specifically does not utilize quad cores yet, so that can't be a factor. I am running Win XP SP2 with 2GB of RAM, which I think should be enough.

    If your CPU is limiting you, then how the heck does my 939 3800 X2 @ 2.0GHz manage over 200fps with all the eye candy maxed out @ 1680x1050?
  7. It sounds like a problem other than your hardware. You definitely have the horsepower to run that game; try different drivers?

    Hmmm, mousemonkey, I think something is being reported wrong if it says 200fps.

    One time FRAPS told me Assassin's Creed was running at 150 fps, but the next time I ran the game it said 40-60, so I think it's giving you a wrong reading.

    But maybe you have some futuristic technology we don't know about? Just giving you a hard time.
  8. nkarasch said:
    It sounds like a problem other than your hardware. You definitely have the horsepower to run that game; try different drivers?

    Hmmm, mousemonkey, I think something is being reported wrong if it says 200fps.

    One time FRAPS told me Assassin's Creed was running at 150 fps, but the next time I ran the game it said 40-60, so I think it's giving you a wrong reading.

    But maybe you have some futuristic technology we don't know about? Just giving you a hard time.

    No futuristic technology, just consistently high fps on all the indoor bits and 50's and 60's outdoors. Oh, and it's not FRAPS that I'm using :kaola:
  9. I will definitely rule out the issue if it is memory related. Luckily, my friend has the same 2 sticks of Ballistix DDR800 and I will pop those in my PC. So basically, instead of 2GB of RAM it will be 3GB, since WinXP doesn't recognize 4GB. I wish Microsoft could fix that crap. They are well aware that motherboards can support over 3GB, so why limit it to 3? Lame. Just another excuse for us to buy Vista.
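    For what it's worth, the ~3GB thing isn't really Microsoft being lame; it's the 32-bit address space. A rough back-of-envelope (the 1GB device reservation is an assumption, the real figure varies by board):

```python
# Back-of-envelope: why 32-bit XP "sees" only ~3GB of a 4GB kit.
# A 32-bit address space covers 2**32 bytes total; the chipset maps
# PCI devices (video memory apertures, etc.) into the top of that
# range, so the RAM behind it is shadowed. The 1GB reservation below
# is an illustrative assumption, not a measured value.
address_space_gb = 2**32 / 1024**3          # 4.0 GB total addressable
reserved_for_devices_gb = 1.0               # assumed device/MMIO window
usable_ram_gb = address_space_gb - reserved_for_devices_gb
print(usable_ram_gb)                        # prints: 3.0
```

    So with a 4GB kit, roughly 3GB ends up usable on 32-bit XP, regardless of what the motherboard itself supports.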
  10. I get around 75-125 (capped at 125), all max settings, AA and AF all the way up.
    1440x900 res.
  11. Quote:
    Sorry monkey boy, but that card will smoke your GT's.

    Maybe so, but not on this occasion

    Quote:
    Well, hitting 95fps and averaging 95fps are two different things. I consistently hit 95fps with my 256MB card, but at 1280 x 1024, and I average about 65fps.

    Then put your hand in your pocket and shell out some dollars for a better card or cards :kaola:
  12. quanger said:
    Hmm, you really think the X2 6400 is limiting? I mean, it performs on par with the E6750 or so. CoD4 specifically does not utilize quad cores yet, so that can't be a factor. I am running Win XP SP2 with 2GB of RAM, which I think should be enough.

    Looking back, I remember when I had my Athlon 700 with 256MB RAM and a Radeon 7200 (original), I was getting good fps in Quake 3, but the problem was it kept dipping low. I solved it by adding another 512MB stick to the system.

    But I really doubt an extra 2GB will solve my problem with fps dipping. Win XP only recognizes about 3GB, so having 4GB would be a bit of a waste. I refuse to upgrade to Vista. I would consider it if it were free, but otherwise I can't justify paying $$$ for that OS. I wouldn't even buy it for $20.


    The first wave of 3870x2 reviews did tell of big 'dips' in the framerate, apparently to do with heavy data transfers between system memory and the card's memory (memories), and said that future drivers should help eliminate/smooth this out.

    Is it a sustained low, say, standing still (if you can in CoD4 lol), and after 10 seconds it's still at 28fps? Btw, you have super-human eyes if you can feel when something's dropped under 50fps...
  13. Mousemonkey said:
    If your CPU is limiting you then how the heck does my 939 3800x2 @2.0ghz manage over 200fps with all eye candy maxed out @1680x1050?


    Eye candy is subjective. Settings are not. I also suspect that your X2 3800+ is holding you back.

    Tom's reports that CoD4 at 1680 x 1050 gets 59.2 fps, but 45.3 with 4x AA and 8x AF. I can't see you getting 200 fps unless your eyes are accustomed to candy that's more or less 800 x 600 with low res textures. :lol:

    http://www.tomshardware.com/2008/01/28/ati_r680_the_rage_fury_maxx_2/page15.html

    Yes, a single GPU 8800gtx Ultra wins in this game. The 3870x2 wins in others. That's the way it goes. Overall, the differences are minor, especially when the price is considered.

    Anonymous said:
    The first wave of 3870x2 reviews did tell of big 'dips' in the framerate, being something to do with heavy data transfers between the system memory and the cards memory (memories), and that future drivers should help eliminate/smooth this out.


    I noticed that with the first drivers I used, the 8.2 Catalysts, but it's less of an issue with 8.3. Still, the biggest dips I get are in LOTR Online, but there are both Crossfire and SLI issues with that game, even though it's part of Nvidia's "The Way It's Meant to Be Played" program.
  14. Yes, I honestly do feel like everything slows down/gets choppy when the fps dips, because it's much harder to aim.

    The fps usually dips during heavy action. Sometimes if I am faced a certain way on a certain map, the fps will drop and stay dropped until I look elsewhere.

    Talking about future drivers improving things, I have yet to see such a thing. Catalyst 8.3 was a big disappointment; almost like Hyundai releasing the Pony in the 80s.

    And ATI's CCC is a big PITA, and that's an understatement. Overall, I really have to say that I am quite disappointed in purchasing a 3870x2. The card is a monster and looks cool, but in terms of best bang for the buck, I should've gone SLI 8800GTS OC.

    I wonder if anyone wants to trade.
  15. quanger said:
    Overall, I really have to say that I am quite disappointed in purchasing a 3870x2. The card is a monster and looks cool, but in terms of best bang for the buck, I should've gone SLI 8800GTS OC.

    I wonder if anyone wants to trade.


    Okay, you have an Nvidia motherboard. Do you have bios issues that affect the 3870x2? When I had an MSI Nvidia 405 chipset board, the MSI documentation said the chipset did not support ATI X300 to X700 cards, and their customer support said that it did not support the X1000 series either, so I went 7600gs on that board. I switched to a cheap ATI board to hold me over so I could use this card.

    Does your motherboard have limitations like the MSI K9N6SGM-V :

    http://www.msicomputer.com/product/p_spec.asp?model=K9N6SGM-V&class=mb

    If it doesn't have "chipset limitations", and if it's SLI, then could Nvidia specific settings in the bios be mucking things up with an ATI card? The opposite problem has occurred too. Here's a thread here at Tom's with virtually identical issues:

    http://www.tomshardware.com/forum/248345-33-major-3870x2-performance-issues

    I am guessing that, aside from old drivers wreaking havoc, there need to be BIOS tweaks that allow an ATI card to work on an Nvidia motherboard without issues, and vice versa. The MSI tech people said "use Nvidia with Nvidia" and that's good advice, which is why I use ATI with ATI.

    How is CCC bad? I didn't find Nvidia's interface all that much better the year I had the 7600gs. If you went 3870x2 after years of Nvidia, perhaps it's just what you were used to? Why don't you sell it and get a 9800gx2 when it arrives? I'm sure you'll have fun with that monstrosity.

    http://www.nordichardware.com/index.php?news=1&action=more&id=7501
  16. quanger said:
    Yes, I honestly do feel like everything slows down/gets choppy when the fps dips, because it's much harder to aim.

    The fps usually dips during heavy action. Sometimes if I am faced a certain way on a certain map, the fps will drop and stay dropped until I look elsewhere.

    Talking about future drivers improving things, I have yet to see such a thing. Catalyst 8.3 was a big disappointment; almost like Hyundai releasing the Pony in the 80s.

    And ATI's CCC is a big PITA, and that's an understatement. Overall, I really have to say that I am quite disappointed in purchasing a 3870x2. The card is a monster and looks cool, but in terms of best bang for the buck, I should've gone SLI 8800GTS OC.

    I wonder if anyone wants to trade.


    Mate, you've only had 2 different drivers on the card, and quad Crossfire and AA support in UT3 were the 'only' things coming in 8.3 anyway. Hold on until Catalyst 8.6 before you make a big judgement about the card. The next driver (8.4) is gonna be about bug fixes anyway, in all likelihood.

    "was a big disappointment. almost like Hyundai releasing the Pony of the 80s." Mate, what are you on about? The Pony was total class!!

    = )
  17. Your CPU is not holding you back. A high-end GPU + any dual-core CPU is more than good enough for any game. You don't need a mega quad core OC'd to hell to play games at max settings/high resolution (except maybe FSX and Supreme Commander).

    Here's an important point though: make sure you have Catalyst AI enabled, because if it's not, only one GPU on the card is going to be doing the work. Also, turn off V-sync if you have it on! V-sync with the X2 doesn't work well for me in CoD4 or ET:QW, but I can play both with AA and AF maxed at 1680x1050. Hope this helps.
  18. yipsl said:
    I can't see you getting 200 fps unless your eyes are accustomed to candy that's more or less 800 x 600 with low res textures. :lol:

    Which is why the screenshot was taken at 1680x1050, sorry I've just got a better rig (and eyes) than you. :kaola:
  19. nkarasch said:
    It sounds like a problem other than your hardware. You definitely have the horsepower to run that game; try different drivers?

    Hmmm, mousemonkey, I think something is being reported wrong if it says 200fps.

    One time FRAPS told me Assassin's Creed was running at 150 fps, but the next time I ran the game it said 40-60, so I think it's giving you a wrong reading.

    But maybe you have some futuristic technology we don't know about? Just giving you a hard time.


    How are you able to play a game that hasn't even been released yet? (If you say beta tester, can you hook me up with a copy? :P)
  20. I agree with Nevesis, I also want a copy of Assassin's Creed, and I don't want to wait till April/May!
  21. No, you should not be satisfied with the results. It's not your CPU, and it's not your RAM.

    With the rig in my sig, I never dip below 50fps, with all settings maxed, 1680 x 1050, in an online game with 20+ players.

    Something is up my friend.

    I'd suggest checking your running processes and looking for a program running in the background that may be sapping performance. Also, make sure you have the AMD Dual-Core Optimizer, which, if you Google search, is easy to get. Also, make sure Vista is updated.

    Instead of getting into a debate with other people, I will be happy to solve the problem at hand. :)
  22. Really, the best way is to just make sure you've got the latest 8.3 drivers, and check card and CPU temps. What kind of cooling have you got in your case? Run a defrag too, won't hurt.


    Tbh though, when you say AA to max, do you mean 8x? Because there is simply no need for that, or in fact for several other settings, as they add no visual benefit yet kill performance. I had 1280x720, 2x AA, and around 1/3rd of the settings turned down slightly, and I never went below 60FPS in any scene. It was really good.

    http://uk.gamespot.com/features/6183967/index.html

    Use this as a guide.
  23. I have a 3870x2 with the 8.2 drivers, and CoD4 almost never dips below 50 fps in multiplayer. And that's in heavy foliage or smoke at 1920x1200 with everything maxed (except soft smoke edges), with moderate AA and AF (full AA and AF make things look blurry to me). Something's up on your machine.

    (Processor is an A64 X2 6400+, not OCed, running XP with 2GB of RAM.)
  24. Again, I'm guessing you either don't have Catalyst A.I. enabled or you are using V-sync. Fill us in.
  25. V-sync is off. I have to check about Catalyst AI when I get home from work.

    I don't have anything running in the background, because I just reformatted the PC just for the sake of CoD4.

    My cooling is more than adequate. I have a Coolermaster 212 with dual fans and a couple of system fans.

    I am currently using 8.2 instead of 8.3 because I find it much faster (3DMark06 definitely confirms this).

    1. Do you guys click on (dual GPU) in the CoD4 settings?
    2. I do have filters set to trilinear, with everything on extra. The AF bar is pushed all the way to the right (maximum).
    3. I am running Win XP SP2, not Vista.
    4. My motherboard is not nForce/SLI; it's a 690G board by Asus. So there are no Nvidia-related parts in it. The BIOS is updated.
    5. The AMD Dual-Core Optimizer is also installed.

    Well, by the time Catalyst 8.6 is released, it will be fall. That's a bit long of a wait, I think. I hope 8.4 will do wonders.
  26. Btw mousemonkey, how do you display that? I just use \com_drawfps 1, I believe.

    And I can get 200+ fps if it's indoors. But of course most of the massacre is outdoors, with lots of lighting and smoke effects, which brings my fps down. I hope to be able to test out a friend's RAM by Saturday and see if it solves my dipping issue.
  27. quanger said:
    Btw mousemonkey, how do you display that? I just use \com_drawfps 1, I believe.


    I don't doubt he has a better rig than mine. I just doubt his fps. That might be the max he gets, but I doubt it stays at the max for long. As you said, 200fps is indoors. Sort of like dungeon Oblivion vs. outdoor Oblivion with HDR and a gate spawning four or more Daedra.

    Since I don't play COD, I can't say, but even with 8800gtx's in SLI, I have my doubts at the resolution he plays at. When proof is offered, then my doubts will be over.
  28. Why the big fuss?

    You've maxed the game... it's running great, and looking that way too.

    Do you complain when you're watching a movie on your PC with FRAPS enabled?

    ''Damn, this graphics card sucks, I'm only getting 25 FPS at all times!''
  29. ^ that's the funniest post I've read today :D

    Yes, I know, I need to read more posts...
  30. YEA, that's pretty funny!! And I thought my GFX card sucked!! lol
  31. Quanger, I do select dual GPU settings in CoD4. I even did it in CoD2 when I used an x800xl and a 6800gt, because I heard it helped FPS even if you didn't have a dual GPU setup.
  32. quanger said:
    Btw mousemonkey, how do you display that? I just use \com_drawfps 1, I believe.

    And I can get 200+ fps if it's indoors. But of course most of the massacre is outdoors, with lots of lighting and smoke effects, which brings my fps down. I hope to be able to test out a friend's RAM by Saturday and see if it solves my dipping issue.

    http://img245.imageshack.us/img245/6924/proofrequestem8.th.jpg
    Rivatuner 2.06 with the Everest plugins enabled. It requires a bit of setting up in the 'hardware monitoring' tab, but you can have all sorts of things displayed on screen whilst you frag.

    yipsl said:
    I don't doubt he has a better rig than mine. I just doubt his fps. That might be the max he gets, but I doubt it stays at the max for long. As you said, 200fps is indoors. Sort of like dungeon Oblivion vs. outdoor Oblivion with HDR and a gate spawning four or more Daedra.

    Since I don't play COD, I can't say, but even with 8800gtx's in SLI, I have my doubts at the resolution he plays at. When proof is offered, then my doubts will be over.

    If a screenshot taken at 1680x1050 is not proof, then what would you deem good enough to satisfy? And I never said that 200fps was an average, just that it can be reached, however briefly; outdoors, with foliage and a face full of incoming ordnance, the fps can drop to the 48s.
  33. Sounds fine to me.
    My SLI'd GTXs (which should in fairness whip your 3870X2 :)) give up to about 325fps in CoD4, full everything @ 1280x1024, but occasionally it'll drop to 40-50fps in intensive scenes (heavy smoke with explosions all going on at the same time, etc.), probably averaging around 100-140FPS.
  34. LukeBird said:
    Sounds fine to me.
    My SLI'd GTXs (which should in fairness whip your 3870X2 :)) give up to about 325fps in CoD4, full everything @ 1280x1024, but occasionally it'll drop to 40-50fps in intensive scenes (heavy smoke with explosions all going on at the same time, etc.), probably averaging around 100-140FPS.

    Nice! [:mousemonkey]

    *strokes Alpha Dogs* "Don't worry my puppies, scary man with GTX's will go away soon"
  35. Mousemonkey said:
    Nice! [:mousemonkey]

    *strokes Alpha Dogs* "Don't worry my puppies, scary man with GTX's will go away soon"

    Ha ha, thanks! :D
    It's taken me a long time to get it (my system) to where it is now, and I'm just waiting for a nice quick Phenom B3; then I'm going to lock the case up and leave it alone (for once!) :lol:
    I'm ridiculously CPU-limited though; in 3DMark the 2nd GTX doesn't even ramp its fan up! So the next purchase will be a 24"! :)
    Your GTs are more than enough though! :sol:
  36. quanger, 4GB of RAM wouldn't hurt. What are your RAM timing settings? What does your system score in 3DMark06? Is the CPU overclocked? Is the GPU overclocked? GPU performance will probably improve in 8.3 and versions to come. Take a look at the BIOS settings, and also in Windows, to make sure that nothing is hindering the machine. Remember that sites like Tom's start with a clean OS install, so there isn't a lot of old software running. They don't have MS Office, AOL, or an antivirus/firewall running; they use a clean standalone system. So the software you have using CPU cycles, and the condition and state of tune of your hardware, will affect your FPS. Try a quick housecleaning, run 3DMark, and let us know how it comes out. What motherboard do you have?

    Just for the gallery...the guy is asking about his machine, his VC, his Ram. He and we don't need to know that you have Nvidia sli or how great you think Nvidia is. His question is about his hardware.
    Cheers mate, and yes, you really do need a 24" at the very least. I'm using a 22" Samsung that I found on the clearance shelf in PC World of all places for £224; USB hub, speakers, web camera & mics, and most importantly a DVI socket. As 1680x1050 seems to be the sweet spot for the 512 GTs, I took the one box that was still sealed and headed for the checkout :) . Your GTXs, I would think, will perform best at 1920x1200, which is 24" territory and quite a leap up in price here in rip-off Blighty as you well know, so I don't envy you on your quest.

    @ OP, have you tried enabling the 'dual GPU' option yet? And if so, what happened?
  38. I wouldn't worry about getting a quad-core Phenom for gaming to replace your more than capable 6000+. At least wait for the 45nm Phenoms.

    edit: Aren't two 8800GTXs a little overkill at 1280x1024? Spend that money on a bigger monitor if you really want to see your cards shine.
  39. topper743 said:
    Just for the gallery...the guy is asking about his machine, his VC, his Ram. He and we don't need to know that you have Nvidia sli or how great you think Nvidia is. His question is about his hardware.

    Ooohhh :kaola: . Actually, the OP was asking whether he/she should be satisfied with the performance of a $500 card in but one game. And in the 'Graphics Cards' section, not the 'ATI' subsection.
  40. San Pedro said:
    I wouldn't worry about getting a quad-core Phenom for gaming to replace your more than capable 6000+. At least wait for the 45nm Phenoms.

    edit: Aren't two 8800GTXs a little overkill at 1280x1024? Spend that money on a bigger monitor if you really want to see your cards shine.

    Sorry to thread-hijack a little (but I did answer your question on the last page! :)), but yes, 2 GTXs at 1280 is stupid overkill. I was running one and found the other at a great price on eBay two weeks ago and thought, what the hell! :) I already planned to get a new monitor, so in a couple of weeks that'll be my next purchase! :)

    mousemonkey - Not bad at all with the monitor! systemlord recommended a BenQ 24" that has DVI & HDMI, which I managed to find over here for £255, which is pretty outstanding for the money really! So that'll be my next purchase; just need the money now! But until then, my GTXs laugh at the resolution! :lol:
  41. Well, I have 2 Sapphire 3870X2s, and with Crossfire turned off I just checked on a 27" screen; I don't go lower than 117 with it all maxed. I'm running Vista, 2GB DDR3 at 1800MHz, a 9650 at 3.5, 8.3 drivers.
  42. honor said:
    Well, I have 2 Sapphire 3870X2s, and with Crossfire turned off I just checked on a 27" screen; I don't go lower than 117 with it all maxed. I'm running Vista, 2GB DDR3 at 1800MHz, a 9650 at 3.5, 8.3 drivers.

    Really? :heink:
  43. Yay, I solved my problem. I basically reformatted my whole PC and started from scratch. I am using the 8.2 because the 8.3 is brutal.

    yipsl >> I had a lot of problems getting Catalyst to work previously. Every time I tried to run it, it would give me an error pop-up and then close. I hated it dearly.

    My CPU/video card is not OC'd. I get about 12500 in 3DMark06. The fps is much better now; it will dip, but not as low and not as often. And I do have dual GPU selected in CoD4. Maybe it's time to buy another 3870x2.
  44. quanger said:
    Yay, I solved my problem. I basically reformatted my whole PC and started from scratch. I am using the 8.2 because the 8.3 is brutal.

    yipsl >> I had a lot of problems getting Catalyst to work previously. Every time I tried to run it, it would give me an error pop-up and then close. I hated it dearly.

    My CPU/video card is not OC'd. I get about 12500 in 3DMark06. The fps is much better now; it will dip, but not as low and not as often. And I do have dual GPU selected in CoD4. Maybe it's time to buy another 3870x2.

    Have you had a chance to play with Rivatuner yet? As I, for one, would be interested in what fps your card can output both indoors and outdoors.
  45. Mousemonkey said:
    Have you had a chance to play with Rivatuner yet? As I, for one, would be interested in what fps your card can output both indoors and outdoors.

    Yeah, it's a shame there isn't a benchmark built in for that (like the Crysis one), as it would be great to compare cards in the 'real world', not in 3DMarks! As my system proves, my 3DMark is relatively poor for what I have, but in game you'd never know!
    With regard to my FPS: while I was playing last night, the lowest recorded FPS was during the nuclear explosion with all the 'copters dropping out of the sky etc. I went down to 48FPS; the average was more than I thought, at about 130-160, edging up to 200 fairly frequently and up to 300 not very often at all! In the loading screens though (during the vid) I get up to 700FPS, does that count? :lol:
    I would be really interested to see how the 3870X2 performs in comparison :)
  46. Quote:
    Sorry monkey boy, but that card will smoke your GT's.

    What are you talking about? SLI 8800GT will beat the HD3870x2 way more often than not, sometimes by quite a bit too.
    Here's CoD4, but look over every game, not just this one: http://www.anandtech.com/video/showdoc.aspx?i=3209&p=4

    Here is SLI 8800GT vs Crossfire 3870:
    http://www.legionhardware.com/document.php?id=716&p=8
  47. Wow, I play at 1920x1200 with everything maxed and it never dips below 46 fps, with an average of 59. On a GTS 512, that is.