I am having a hard time understanding the difference between DDR and GDDR memory on a graphics card.
I am considering purchasing a new graphics card, and I am looking for a good one for under $100.
I also want an NVIDIA card, because I have had bad personal experiences with ATI-built cards. That is just a personal preference, though.
I have read on Wikipedia that ATI coined the term GDDR, although NVIDIA was the first to use
the technology on their cards.
However, that's pretty much where the definition of GDDR stops.
Many other forums suggest the following:
- GDDR is the same as DDR on graphics cards; the graphics card manufacturers just get the terminology wrong.
- The G in GDDR just means graphics
- GDDR is more environmentally friendly than DDR and can do more things than DDR can.
- GDDR has a smaller power footprint than DDR.
- GDDR is more expensive.
While other forums suggest the opposite:
- GDDR is totally different from DDR on graphics cards; it used to be that DDR and GDDR were used interchangeably, but
now graphics card manufacturers use either actual DDR or GDDR RAM on their video cards.
- The G in GDDR means Graphics, but it is still different from DDR and cannot be used as regular motherboard RAM, yet.
- GDDR is limited to under 512 MB of onboard graphics card memory, and although better than DDR, GDDR still has
limitations, and it might be better to use only DDR graphics cards.
- GDDR takes more power to run and typically needs a power supply rated over 300 W.
- GDDR is less expensive in the long run.
Who am I supposed to believe? What is the real difference between DDR and GDDR on a graphics card?
Some manufacturers label their graphics card as GDDR, but on the box it clearly states DDR.
It is just so confusing to try to buy a graphics card. I don't want to get ripped off when I only have
to pay $10 more to get a GDDR card, but if there is no real difference between the two
on a graphics card, then why not get DDR3 or DDR4 rather than GDDR3 or GDDR4?
If anyone could please help answer this question, I would be grateful.
You're wrong. DDR3 is used on Intel's X58 and P55 platforms and AMD's AM3 socket, I think.
On video cards, GDDR3 was used in nVidia's GT and GTX 200 series, and GDDR5 has been used by AMD since the 4000 series and by nVidia since the GT 240.
Both the AMD 5000 series and nVidia's Fermi are/will be GDDR5.
It does make a performance difference. Have you ever benchmarked a PC with DDR2 RAM against one with DDR3 RAM? The difference is huge, since both clock speed and data rate improve. The same goes for GDDR3 versus GDDR5.
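To see why the data rate matters so much, you can work out the peak theoretical memory bandwidth yourself. This is just a back-of-the-envelope sketch with made-up example figures, not the specs of any particular card:

```python
# Peak theoretical memory bandwidth:
#   bandwidth (GB/s) = effective data rate (MT/s) * bus width (bits) / 8 / 1000

def bandwidth_gb_s(data_rate_mt_s, bus_width_bits):
    """Peak theoretical bandwidth in GB/s for a given effective data rate and bus width."""
    return data_rate_mt_s * bus_width_bits / 8 / 1000

# Hypothetical comparison on the same 128-bit bus (assumed, illustrative rates):
ddr3  = bandwidth_gb_s(1000, 128)   # e.g. 500 MHz DDR3, double-pumped -> 1000 MT/s
gddr5 = bandwidth_gb_s(3600, 128)   # e.g. 900 MHz GDDR5, quad-pumped  -> 3600 MT/s

print(ddr3)   # 16.0  GB/s
print(gddr5)  # 57.6  GB/s
```

With identical bus widths, the faster effective data rate alone more than triples the peak bandwidth in this made-up example, which is why the memory type shows up in benchmarks.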
One thing, however, is that there's no such thing as DDR4 in mainstream (at least none that I know of).
About the deal with GDDR and DDR:
From what I have researched on the Web, the difference lies in the application: DDR memory is used as computer RAM, while GDDR is only for video cards, with the G meaning Graphics.
I found those results by searching Google for "GDDR DDR".
Well, I contacted a manufacturer about this, and here is what they had to say:
Thank you for contacting Sparkle,
GDDR3's power draw and heat output are slightly reduced, allowing slightly better
performance, but it is a bit more expensive.
DDR3 consumes more power, but it is less expensive.
If you have any more questions, please email us and we'll be more than happy
to assist you.
So, what I have learned over these few days is that video card manufacturers are using either GDDR or DDR for their video cards.
Typically, most video cards under $100 use standard DDR RAM chips, which "draw more power and possibly produce more heat" and probably cause more strain in gaming applications.
However, most cards above $100 typically use the newer GDDR RAM, which "produces heat, but disperses it differently," probably causing less strain in gaming applications.
However, from what I can tell, between DDR3 and GDDR3 there really isn't much of a difference. The video card is still going to get hot regardless, and I could probably still get the same graphics quality with DDR as with GDDR; it's just that the performance might be slightly different.
So... why would anyone spend hundreds of dollars on a "GDDR3" video card when a DDR3 card could deliver the same quality?
This has been a really tough call, as I want to purchase a video card, but I don't want to get ripped off in the process.
But DDR3 and GDDR3 seem to produce the same quality, although GDDR3 seems to put less strain on the card and can run at slightly higher performance.
So, if you're just looking for a basic gaming card that can play games at default settings, why pay more for only a slight difference in performance?
Now, obviously this doesn't apply to GDDR5... as that is most likely a totally different beast to begin with.
I would like to re-emphasize that last point. You can't change the RAM on a card, so just accept the performance of the card as a whole and buy it. It's the processor inside the card that really matters.
So, your question is:
"I am considering purchasing a new graphics card and I am looking for a good graphics card for under $100."
The best Nvidia card under $100 is a 9800GT. No other card comes close in this power/price range. All 9800GTs have GDDR3, so it doesn't matter. You can find them on Newegg in the $80-90 range.
If you want to spend more, the best card in the $115 price range is the GTS 250.
(The newer GT 240s that you see with GDDR5 are actually slower cards; the better RAM helps a little, but there's a much cheaper processor inside.)