GTX 570 SLI

I read somewhere that when the 570 is in SLI playing Battlefield 3 on Ultra, it uses all of its VRAM and then starts to stutter.
If I used one of these 2560MB cards as the primary http://www.amazon.co.uk/EVGA-025-P3-1579-KR-GeForce-Graphics-2560MB/dp/B0051FFEHI/ref=sr_1_fkmr0_3?ie=UTF8&qid=1344718187&sr=8-3-fkmr0 and a standard 1280MB card as the secondary, would it use the 2560MB or just the 1280MB?
  1. mrtwing said:
    I read somewhere that when the 570 is in SLI playing Battlefield 3 on Ultra, it uses all of its VRAM and then starts to stutter.
    If I used one of these 2560MB cards as the primary http://www.amazon.co.uk/EVGA-025-P3-1579-KR-GeForce-Graphics-2560MB/dp/B0051FFEHI/ref=sr_1_fkmr0_3?ie=UTF8&qid=1344718187&sr=8-3-fkmr0 and a standard 1280MB card as the secondary, would it use the 2560MB or just the 1280MB?


    It would use 1280MB of VRAM.

    Why don't you sell your GTX 570 and get a 670? You'll see no stuttering :)
  2. VRAM across multiple cards is mirrored in SLI/CFX, not joined. Having a second GTX 570 would not help your VRAM capacity bottleneck. You would need to replace your GTX 570 with a similarly or better performing card that has a higher memory capacity if you want to solve this problem.
  3. It's mirrored, then, so the only way is to have two 2560MB 570s; a 2560MB with a 1280MB would just use 1280MB.
  4. You did notice it was a card with twice the VRAM of a standard GTX 570? I knew it didn't double the VRAM; I was just wondering if it would use the primary card's RAM rather than the secondary's.
  5. mrtwing said:
    You did notice it was a card with twice the VRAM of a standard GTX 570? I knew it didn't double the VRAM; I was just wondering if it would use the primary card's RAM rather than the secondary's.


    No, it would be limited to the weakest link in the SLI setup, so both cards would be limited to 1280MB. The 2560MB card can only use as much VRAM as the lowest-capacity card it is in SLI with, so paired with a GTX 570 1280MB it would be limited to 1280MB, meaning half of its VRAM is effectively inaccessible. In order to use the full 2560MB, it would need either to be by itself or in SLI with another GTX 570 2560MB (or more).
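
    As a minimal sketch of that rule (illustrative numbers only, not any real driver API):

    ```python
    # Hedged sketch: in SLI each GPU mirrors the same frame data, so the
    # usable capacity is the minimum across the bridged cards, not the sum.
    def effective_sli_vram_mb(cards_mb):
        """Usable VRAM for an SLI setup, capped by the smallest card."""
        return min(cards_mb)

    print(effective_sli_vram_mb([2560, 1280]))  # 1280: the 2560MB card is capped
    print(effective_sli_vram_mb([2560, 2560]))  # 2560: matched cards use all of it
    ```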
  6. maxh22 said:
    It would use 1280MB of VRAM.

    Why don't you sell your GTX 570 and get a 670? You'll see no stuttering :)



    With your sig, I'm very surprised you said to buy a 670 and not a 690.
  7. OK Blaze, I'd best look into buying a cheap card just to do PhysX rather than putting my 570 into SLI.
  8. mrtwing said:
    With your sig, I'm very surprised you said to buy a 670 and not a 690.


    If I had the money, I'd rather get two 670s than a 690, simply because it's $200 cheaper and has similar performance.
  9. I got the 570 for £180 with Borderlands 2, which I thought was a great price, thinking I could put it in SLI at a later date to get similar performance to a 680. But I mainly just play Battlefield 3, and then I read that the VRAM was bottlenecking it on Ultra settings in BF3, so I'm a bit gutted. The 570 is pushing my PSU to the limit anyway; I was going to go SLI in a whole new system, and currently my CPU is bottlenecking my 570. Still contemplating what to do when it comes to my new system. I've got quite a bit of money coming soon, so it's whether to splash out now or live with my current system, maybe adding an SSD to my current specs:
    Foxconn A76GMV mobo, AMD Phenom II 850 CPU, EVGA GTX 570 HD Superclocked at 797MHz, OCZ ModXStream 500W PSU, and two 7200RPM 500GB hard drives (tried to set them up in RAID, but they're two different models and one won't do it).
  10. Sorry, but when you thought you'd get to SLI that 570 to great effect, you were incorrect. I've been saying it for a long time now: people need to make sure their cards have much more than enough VRAM for their time if they want them to last in multi-GPU setups a year or two (or even more) after they come to market. 1.25GB was simply not enough. Even the 580's and 480's 1.5GB, while a large improvement, shows weakness. Nvidia likes to use non-future-proofed amounts of VRAM on many of their cards (especially their most high-end cards), and if I had to guess why, I'd say it's to get their customers to upgrade often.

    If your CPU is insufficient and your PSU is being pushed to its limits, then maybe you should stick with what you have and save up to replace more of the computer at a later time. First off, I'd suggest overclocking the CPU to about 4GHz and upgrading the graphics; later on, upgrade the CPU and motherboard as well, and give the graphics a boost either by overclocking, by another upgrade (such as buying one GTX 670 or Radeon 7950 now and making the next upgrade a second one), or both.

    Right now, I'd say that at least 2GB is best for you. I think 3GB per video card would be even better, and that is a big part of why I think the Radeons would be the better option, in addition to the fact that the Radeon 7900 cards handle heavier and heavier graphics loads better than the GTX 600 cards do.
  11. I have not trusted ATI cards since my 9200SE caught fire years ago with no overclocking. It may have been down to the PSU, but that same PSU ran my overclocked MX440 fine for years afterwards in the same motherboard.
  12. mrtwing said:
    I have not trusted ATI cards since my 9200SE caught fire years ago with no overclocking. It may have been down to the PSU, but that same PSU ran my overclocked MX440 fine for years afterwards in the same motherboard.


    ATI and AMD are two different companies: AMD bought ATI, but AMD is not ATI. AMD has been on mop-up duty for ATI and has been steadily improving pretty much everything. Also, a card catching fire is almost definitely not ATI's fault, and it was most certainly not AMD's fault because they hadn't even bought ATI by then. Beyond that, Nvidia released an official driver that fried the GPUs of cards using it by disabling the cards' cooling (though without anything catching fire; and again, your incident is unlikely to be the fault of ATI/AMD, or even of Nvidia), so it's not like Nvidia hasn't personally screwed up similarly. It was version 196.75, if I remember correctly, and it was even a WHQL release, so it wasn't as if it were an alpha or a beta.