Is GDDR2 slower than GDDR3?

Flakes

Distinguished
Dec 30, 2005
1,868
0
19,790
yes,

technically,

DDR is the fastest memory in terms of latency, but DDR3 is the fastest memory in terms of Hz...

It's all down to the architecture. DDR3 is meant to be faster and use less power than DDR2, although, as with DDR2 at launch, the latencies will be higher until the manufacturing process improves, in the same fashion that the DDR2 manufacturing process has improved.
 

physx7

Distinguished
Sep 21, 2007
955
0
18,980


So basically yeah GDDR3 is a good bit faster than GDDR2. But I don't think there is a huge difference between GDDR3 and GDDR4....

~Physx7
 

basketcase

Distinguished
Jun 1, 2006
561
0
18,980
Yeah, they have reached a barrier where they can't improve the latency anymore. So, GDDR4 isn't much faster than GDDR3 in final performance.

To the OP: GDDR3 is significantly faster than GDDR2. Whatever you do, do not buy a GDDR2 card if you can avoid it. I don't have exact numbers, but I have heard GDDR2 cards can be anywhere from 10-20% slower than similar cards with GDDR3.
 

Asian PingPong

Distinguished
Jan 21, 2008
131
0
18,680



This is false: latency is measured in clock cycles, not milliseconds, so the more clock cycles per second, the less time between clock cycles, which translates into lower real-world latency.
 

homerdog

Distinguished
Apr 16, 2007
1,700
0
19,780

What? That's just wrong.

To the OP, GDDR3 is MUCH faster than GDDR2. Avoid GDDR2 if you are buying a gaming card.
 

Flakes

Distinguished
Dec 30, 2005
1,868
0
19,790


what are you on?

Typical DDR3: 1600 MHz effective, latencies 7-7-7-20

Typical DDR2: 1066 MHz effective, latencies 5-5-5-15

Typical DDR: 400 MHz effective, latencies 2-3-3-6


As the clock speeds get higher, the latencies (in clock cycles) get longer; this is down to the manufacturing/design process not yet being good enough to improve both at once.
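To put rough numbers on those figures, here is a back-of-the-envelope sketch (assuming the example timings above, and that absolute CAS delay is CAS cycles divided by the memory clock, which is half the effective data rate):

```python
# Convert CAS latency from clock cycles to nanoseconds for the timings above.
# The memory clock is half the effective (double data rate) transfer rate.
modules = [
    ("DDR-400 CL2", 400, 2),
    ("DDR2-1066 CL5", 1066, 5),
    ("DDR3-1600 CL7", 1600, 7),
]

for name, data_rate_mhz, cas_cycles in modules:
    clock_mhz = data_rate_mhz / 2            # DDR transfers twice per clock
    latency_ns = cas_cycles / clock_mhz * 1000
    print(f"{name}: {latency_ns:.2f} ns")

# DDR-400 CL2:   10.00 ns
# DDR2-1066 CL5:  9.38 ns
# DDR3-1600 CL7:  8.75 ns
```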


 


First off, I can't seem to post without quoting, so sorry for that.

As far as graphics cards go, GDDR3 is faster than GDDR2; that's just a fact, I don't care about latencies.
As far as system RAM goes, well, that's all totally different, and I think you two are maybe getting the two confused?
Mactronix
 
G

Guest

Guest
Yeah, try not to buy a GDDR2 card. They're considerably slower than the GDDR3 cards.
 

Asian PingPong

Distinguished
Jan 21, 2008
131
0
18,680

You misunderstood me. What I'm trying to say is that it's pointless to get slower memory in an effort to get lower latencies, because 800 MHz at a latency of 4 would take the same amount of time to return the data as 1600 MHz at a latency of 8.
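A quick sanity check of that equivalence (a rough sketch; absolute delay = cycles divided by clock frequency, ignoring all the other timings):

```python
# Same absolute delay despite different cycle counts.
print(4 / 800e6 * 1e9, "ns")    # 4 cycles at 800 MHz  -> 5.0 ns
print(8 / 1600e6 * 1e9, "ns")   # 8 cycles at 1600 MHz -> 5.0 ns
```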
 

derek85

Distinguished
Feb 4, 2008
44
0
18,530
GDDR3 and DDR3 are NOT the same thing; for anyone who isn't familiar with it, check out Wiki's entry on GDDR3...

GDDR3 is better than GDDR2 (or plain DDR2) because it is more optimized for graphics use than DDR memories and can reach higher frequencies than GDDR2/DDR2.
 

cyberjock

Distinguished
Aug 1, 2004
305
0
18,780


First, there appears to be a lot of confusion about the relationships between latency, clock speed, time, and performance for DDR (1, 2, 3) and GDDR (1, 2, 3, 4, 5).

Go here: http://www.thetechrepository.com/showthread.php?t=160

Now, with this newfound knowledge, let's look at two speeds (I'll use only the grey boxes for this example).

The DDR-1000 (DDR2) spec is CAS 4 (8.0 ns). The DDR-1333 (DDR3) spec is CAS 6 (8.99 ns).

Now some people are going to go, "What!? 8.0 ns < 8.99 ns, so how is DDR3 faster when it has more latency?"

The answer is simple. The prefetch buffer is 4 bits deep (per data pin) for DDR2, but 8 bits deep for DDR3. That means 4 bits per access for DDR2, and 8 bits for DDR3. We lose 0.99 ns, but we gain twice the amount of data. Make sense?
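A quick sketch of that arithmetic (assuming the CAS figures above, an I/O clock of half the DDR data rate, and prefetch depth counted per data pin):

```python
# CAS latency in ns and data fetched per access for the two examples above.
for name, clock_mhz, cas, prefetch in [
    ("DDR2-1000", 500.0, 4, 4),      # 500 MHz I/O clock, CAS 4, 4n prefetch
    ("DDR3-1333", 1333 / 2, 6, 8),   # ~667 MHz I/O clock, CAS 6, 8n prefetch
]:
    latency_ns = cas / clock_mhz * 1000
    print(f"{name}: CAS {cas} = {latency_ns:.2f} ns, {prefetch} bits per pin per access")

# DDR2-1000: CAS 4 = 8.00 ns, 4 bits per pin per access
# DDR3-1333: CAS 6 = 9.00 ns, 8 bits per pin per access
```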

Now, some of us will remember way back to when DDR2 first came out. Some applications ran faster with DDR than DDR2 in testing. This was because many programs were pulling data from many different locations and only using 1 or 2 bits of data. So even though you might have had "faster DDR2", you could be slower because you'd receive 4 bits of data at a slightly higher latency, only use 2 bits(of the 4bits), and the other data was extraneous and ignored. Latency was all that mattered. Of course, now that DDR3 is coming about, this problem is somewhat rearing its ugly head again. Some programs are hard coded to work with certain bit sizes to maximize the RAM performance, others aren't. This is why some benchmarks show DDR3 slightly slower than DDR2, and others show the opposite.

Overall, DDR3 is the future, because it can transfer more data per unit time.

Sidenote: ever seen those cool PC ratings like PC2-5300 or PC3-14400?

The PC shows personal computing, then the DDR type (in this case DDR2 or DDR3), and the last number shows the MB/s bandwidth for the bus (in this case roughly 5.3 GB/s and 14.4 GB/s).
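As a simplified sketch of where those numbers come from (assuming a standard 64-bit, i.e. 8-byte, module bus): the rating is roughly the data rate in MT/s times 8 bytes per transfer.

```python
# Module rating ~= data rate (MT/s) * 8 bytes per transfer (64-bit bus).
for name, data_rate_mts in [("PC2-5300", 666.67), ("PC3-14400", 1800)]:
    print(f"{name}: {data_rate_mts * 8:.0f} MB/s")

# PC2-5300:  5333 MB/s  (rounded to 5300 in the module name)
# PC3-14400: 14400 MB/s
```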

The GDDR family isn't really related to the DDR family. GDDR is designed for use on video cards, and DDR is more for general system memory. I remember reading, back when GDDR came about, that the only significant difference was that GDDR can read and write to the same block of memory at the same time. I can't find that info on Google though.

GDDR > DDR for video processing. DDR > GDDR for system processing.

To answer the OP: there are differences, and they are quite complex. What you need to know is that each generation is supposed to be superior to the previous one. You are probably wanting to know whether you should buy a video card that has GDDR2 or GDDR3. GDDR3 is better than GDDR2. What kind of performance difference you will see is hard to ascertain; I couldn't find anything on Google that shows hard-and-fast benchmarks comparing the two types of memory performance-wise.

Sorry for the long post. I felt that a little good home schooling would help out everyone in this thread.


 

krypt2711

Distinguished
Nov 26, 2009
5
0
18,510
Since you guys are talking about GDDR2, will DDR3 RAM on the mobo conflict with it, or are they totally unrelated things?
 

IzzyCraft

Distinguished
Nov 20, 2008
1,438
0
19,290

Just here to ask: where are you getting these numbers from? I don't believe that is GDDR, which is what this topic is about.

And if those are DDR2 and DDR3 figures, they're not typical, as modules require higher voltages to run at those levels; if it can't run them at the rated voltage, the RAM technically isn't really running at those levels natively but is being overclocked. Until about last year, most DDR2 800 was DDR2 667 overclocked to 800, and likewise DDR2 1066 is usually DDR2 800 OCed, with a few exceptions here and there from newer products like those from G.Skill.

Just pointing out what I don't understand here (I have very limited knowledge of GDDR, only that it is quite different from DDR). :{
All I know is that most cards run off GDDR3 for a reason, and GDDR4 was pretty much skipped for another reason.
I'd like to see articles and references :D