Why does ATI use GDDR5 and Nvidia use GDDR3?

I thought that on the new Nvidia GeForce GTX 295 they should use GDDR5 like ATI does. Why didn't they? Is there much of a difference?
  1. There is a difference: GDDR5 is better, but there are other contributing factors in how good a card is, such as:

    - the GFLOPS
    - the pixel fill rate
    - the texture fill rate
    - the memory, core, and shader clocks
    - the memory interface
    - the bandwidth

    Just go to a GPU comparison site. You can choose two cards and compare them by overclocking them or putting them in CF or SLI, all of this being virtual.

    Some things might be off, but it's as accurate as virtual comparisons can get (out of everything I've seen).
  2. I was thinking about going from a GeForce 8800 GTS 640 MB to a GeForce GTX 295. Would I notice a big difference from the upgrade?
  3. Yeah, probably like 4 times faster.
  4. NOT if he is CPU-limited... more like no performance gain :sarcastic:
  5. My CPU is only a Q6600. So does this mean I wouldn't see much difference upgrading from the 8800 GTS to a GTX 295?
  6. Nope, the Q6600 should do pretty well. What have you got it clocked at? If you raise your clocks you will see some more performance as well...

    What resolution are you playing at that you think you need a GTX 295? At least 19xx/10xx, I hope.
  7. Only using 1152x864 resolution, but I want to get a new 24-inch monitor so that I can use a higher resolution. The main reason for wanting a new card was to improve my FPS in World of Warcraft. It drops to around 15 FPS in busy areas, even with 4 GB of RAM.

    Will the GTX 295 fit onto the Asus Maximus Formula (Intel)?
  8. Has WoW really gotten that much better graphically over the years? When it came out, I played it at 1280x1024 on an Athlon XP 2000+ w/ a 9600XT 128MB and barely experienced any slowdowns in the capital cities at all, even during capital-city raids.

    A GTX295 will work with that mobo, as would any PCI-Express card. If you're worried about it "fitting" then I suppose you're really asking if it will fit in your case. As most versions of both ATI and nVIDIA's more powerful cards are between 10 and 11 inches long, only you can answer that question. Get out the tape measure and check.

    When you said "only Q6600" like you did, I think it's safe to say that you haven't realized just how potentially powerful a CPU you actually have. If you're willing to do so, that CPU will often overclock like crazy.
  9. If you really want to upgrade, first buy a 19x10-res display, then the GTX 295, because I learned recently that bigger displays are required to see the greater performance.
  10. Not entirely true, Upid. In your thread, the point some of us were discussing was how stronger video cards don't really show how much stronger they are until they are forced to draw the increased quantity of textures that higher-resolution monitors require. But I will say this: buying both a powerful video card and a bigger, higher-res monitor together provides an incredible level of excitement.
  11. Bigger monitor = more pixels, meaning a sharper image, so you get a better picture. But more pixels also means more things to draw, which is why more memory gets used (hence the 1 GB and 2 GB cards). As for why ATI uses GDDR5: they probably found it cheaper to use faster memory to get the throughput, the ability to take X amount of data, process it, and push it back out. Nvidia, taking the more classic route, increased the memory interface width instead.

    Basically, the top-performing cards have about the same power, or somewhere near it.

    ATI is like a high-pressure hose: it has a small nozzle but uses more force to push the water out faster.

    Nvidia is like a hose with a much bigger nozzle, so water comes out in greater quantity.

    ATI gets water out by using more force, so the water moves faster.
    Nvidia gets water out by widening the hole, so more water can get out at once.

    But overall, when the day is done, they both fill the bucket at about the same time...

    Hope that analogy worked for you; the only real test of performance is to look up benchmarks.
  12. OK, Nvidia don't use GDDR5 because at this point they don't need to. They made a good architecture when they made the 8800, and "if it's not broke, don't fix it" has been their motto for a while now. This approach has since bitten them on the ass big time, and now, with its new many-SP (ATI's SPs are different from Nvidia's) and GDDR5 cards, ATI is very much back in the game performance-wise.
    Who knows where they (Nvidia) would be performance-wise now if they had pushed on when they were on top. Even so, they are still competitive performance-wise, and I guess they still don't feel the need to change.

    Sounds like you have experienced some CPU restriction due to resolution. When the CPU can't feed a good GPU as fast as it would like, it stops the card running as fast as it could. Sometimes, and this isn't always the case and is getting increasingly rare these days, increasing the res gives the GPU more to do and so gives the CPU some breathing room to get more work done, i.e. it can give more info to the GPU.

    Not really correct; I see where you are coming from, but actually GDDR5 gives ATI more bandwidth than Nvidia, and as such it is they who have the larger hose (alright, behave, children :) )

  13. @RazberyBandit: subsequent patches to WoW have added the following:
    - fully programmable shaders are now used in several cases, like version 3.0's HQ shadows
    - texture sizes for new areas have increased: the Shattrath and Dalaran cities require two to four times more video RAM than, say, Ironforge
    - texture sizes for lvl 70+ items have been quadrupled (3.1.0)
    - viewing distances have been increased by 40% (2.4.0)

    In short: if you played WoW with everything maxed out in version 1.0.0, your settings would now equate to low/average settings (but you would still have the same rendering and speed that you had at the time). However, maxing out everything WILL cause slowdowns, especially in complex/populated areas (the Dalaran streets bring my HD4850/512 MB down to 10 fps; it jumps back up to 40 fps when entering, say, a tavern, even when said tavern is crowded).
  14. The recent patch has really dropped my FPS on max settings: around 15 FPS in a popular city. I read that RAM has the biggest effect on WoW, but I have 4 GB of RAM, so the only thing I can think of is upgrading my graphics card.
  15. WoW is really CPU-bound as well. Make sure to overclock your processor! On max settings I was at a constant 60 FPS while running a 4870X2. When I swapped to a 4870 it stayed the same, and this was at 1920x1200.

    I would say that a GTX 295 is overkill, and I'm not even sure WoW supports dual graphics cards. My 4870X2 didn't work very well at all with that game, and I know there were a lot of posts on the forums about it not being supported.
  16. I haven't overclocked my Q6600 because every time I do, my computer starts to crash more. I have an Asus Maximus Formula (Intel) and I overclock by going into the BIOS and selecting the CPU speed I want; it then changes all the settings itself.
  17. Bidy, OC'ing is kinda done in a baby step fashion, and often in latter stages requires increases in voltage. The guide in the overclocking forums regarding Intel processors is pretty good. Give it a look. http://www.tomshardware.com/forum/240001-29-howto-overclock-quads-duals-guide

    And I really can't believe WoW's graphics changed that much. Still not enough to make me want to play again, though. I enjoy sleep so much more than phat lewtz.
  18. Will the GeForce GTX 295 work on the Intel X38?
  19. PCI-Express card on a PCI-Express enabled motherboard? Take a guess, Bidy :)

    Kidding aside, the answer is yes. However, I don't think I'd bother investing in a socket 775 system currently. Intel has clearly moved on to their new 1366/Core i7 platform, so it's likely that the potential for future upgrades (CPU, RAM) is only going to become increasingly limited on 775 boards. It's safe to assume that sometime in the not-so-distant future, production of 775 CPUs will be gradually phased out. No, I don't have a timeframe for such events, but it will happen. Just something to consider, and it's totally debatable.
  20. Quote:
    Will the geforce gtx 295 work on the intel X38?

    seriously, get a console instead.
  21. wh3resmycar said:
    seriously, get a console instead.

    QFT! :kaola:
  22. The reason I asked is that I thought Intel boards did not support SLI, so there might be some problem, since the GTX 295 is a dual-chip card or something.
  23. No need to be rude guys, he's asking valid questions.

    Bidy, if you want my advice, you should start by overclocking your CPU. Read up on it a bit. You should be able to get to 3GHz on a Q6600 without too much trouble.

    Let's assume you get to 3GHz; you now need a GPU that fits it. A GTX 295 is just a little too much for a Q6600, even at 3GHz. What you want to do is balance your GPU and CPU. A good GPU can make stuff look nicer on a poorer CPU, but it will still be a lot worse than it should be.

    A stock Q6600 is only good enough for an HD4850/GTS 250. Overclock your CPU to 3GHz and you'll get the most out of an HD4870/GTX 260. Others will argue this, but I've probably had more graphics cards and CPUs than they have had hot dinners, so they can bring it on ^^

    As for the GTX 295, forget about it. It won't help you in WoW or anything else. WoW is a very old game and was never optimised for the tech level we are at. It might look like crap compared to most games now, but it's still built around really quite ancient technology, and that's why it's not performing very well even with far more advanced GPUs. Neither ATI nor Nvidia built their recent GPUs around making crap-looking games run faster, and I doubt they ever will. :)
  24. I tell you, it's because it was a smart idea.
  25. It's all about bandwidth and cost. For one, GDDR3 is far cheaper. Secondly, Nvidia typically uses a wider data bus, so even though the RAM is slower, they can use the bus to transfer more data at once.
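    To put rough numbers on that trade-off, here's a small Python sketch of the peak-bandwidth arithmetic: bandwidth is just bus width times effective transfer rate, where GDDR3 moves two data transfers per memory clock and GDDR5 four. The two example cards (HD 4870 with a narrow bus and fast GDDR5, GTX 260 with a wide bus and slower GDDR3) use commonly quoted specs, so treat the exact figures as illustrative.

```python
def peak_bandwidth_gbs(bus_width_bits, mem_clock_mhz, transfers_per_clock):
    """Peak theoretical memory bandwidth in GB/s.

    bus_width_bits: width of the memory interface (e.g. 256, 448)
    mem_clock_mhz: base memory clock in MHz
    transfers_per_clock: 2 for GDDR3 (double data rate), 4 for GDDR5
    """
    bytes_per_transfer = bus_width_bits / 8
    transfers_per_sec = mem_clock_mhz * 1e6 * transfers_per_clock
    return bytes_per_transfer * transfers_per_sec / 1e9

# ATI Radeon HD 4870: narrow 256-bit bus, fast 900 MHz GDDR5
hd4870 = peak_bandwidth_gbs(256, 900, 4)   # ~115.2 GB/s

# Nvidia GeForce GTX 260: wide 448-bit bus, slower 999 MHz GDDR3
gtx260 = peak_bandwidth_gbs(448, 999, 2)   # ~111.9 GB/s

print(f"HD 4870: {hd4870:.1f} GB/s, GTX 260: {gtx260:.1f} GB/s")
```

    The two approaches land within a few percent of each other, which is exactly the "both hoses fill the bucket at about the same time" point from earlier in the thread.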