I have seen articles claiming that DDR3 could result in significant savings on data centers' power bills, since DDR3 has lower voltage requirements. However, the OCZ FAQ on DDR3 says otherwise. From the FAQ:
"The nominal supply voltage of DDR3 is lower than that of DDR2, that is 1.5V compared to 1.8V, respectively and that results in lower power per clock cycle. However, since DDR3 will run twice as fast as DDR2, the average power consumption will be 38% higher. This means that there will be 38% more heat as well that needs to be managed and, coincidentally, DDR3 will use SPD entries to define the presence or absence of heatspreaders."
Can anyone tell me if there is a definitive answer as to whether DDR3 is a net reduction or increase in power consumption compared to DDR2?
Let's see... Data centers use a huge number of DIMMs, so yeah, it's possible. For consumer desktops, it's insignificant. You'd MAYBE save 10 cents over a year with DDR3.
True, but if the overall power consumption is 38% higher, then the net change to a data center's power bill would be an increase with DDR3 over DDR2, right? It seems to me that the OCZ FAQ is indicating that the savings from the lower nominal voltage going into the module are outweighed by the higher clock rate, resulting in overall higher power consumption, unless I am misunderstanding it.
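Just to put numbers on that 38% figure: it falls out if you assume dynamic power scales roughly as C·V²·f (a common first-order approximation, not something the FAQ itself states) and take the FAQ's "twice as fast" claim at face value:

```python
# Rough sanity check of OCZ's 38% figure, assuming dynamic power scales
# roughly as P ~ C * V^2 * f, with capacitance C held constant and the
# clock frequency f doubled (per the FAQ's "twice as fast" claim).
v_ddr2, v_ddr3 = 1.8, 1.5        # nominal supply voltages (V)
freq_ratio = 2.0                 # assumed DDR3 clock vs. DDR2 clock

power_ratio = (v_ddr3 / v_ddr2) ** 2 * freq_ratio
print(f"DDR3/DDR2 power ratio: {power_ratio:.2f}")  # ~1.39, i.e. roughly 38-39% higher
```

So the voltage drop alone cuts power by about 30% per clock, but doubling the clock more than gives that back, which is consistent with your reading of the FAQ.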
I guess if you underclocked DDR3 RAM so that the net power consumption is less, then it might be a win. Remember that memory latencies aren't JUST governed by the clock speeds; they're governed by the combination of clock speeds and timings. If you reduce the clock speed and tighten up the timings, you can get pretty close to the same latencies, and perhaps do so at a lower overall power consumption.
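A rough sketch of that trade-off; the specific modules and CAS timings below are just typical illustrative values, not measurements:

```python
# Back-of-the-envelope CAS latency in nanoseconds, using the usual relation
# latency_ns = CAS cycles / I/O clock. Module names and timings are assumed,
# illustrative values only.

def cas_latency_ns(data_rate_mts, cas_cycles):
    io_clock_mhz = data_rate_mts / 2.0   # DDR transfers twice per I/O clock
    return cas_cycles / io_clock_mhz * 1000.0

for name, rate, cl in [
    ("DDR2-800 CL5", 800, 5),
    ("DDR3-1600 CL9", 1600, 9),
    ("DDR3-800 CL5 (underclocked, tightened)", 800, 5),
]:
    print(f"{name}: {cas_latency_ns(rate, cl):.2f} ns")
    # -> 12.50 ns, 11.25 ns, 12.50 ns respectively
```

In other words, an underclocked DDR3 module with tightened timings can land at effectively the same access latency as the DDR2 part it replaces, while running at the lower DDR3 voltage.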
Look at it this way: from a voltage standpoint, DDR2 runs at 1.8V to 2.2V while DDR3 runs at 1.3V to 1.65V. Using less voltage equates to less power usage, plus DDR3 has higher throughput bandwidth than DDR2.
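One way to reconcile the two views, with hypothetical normalized numbers: if you measure energy per bit transferred rather than power per module, the doubled bandwidth can more than offset the FAQ's 38% power increase:

```python
# Hypothetical energy-per-bit comparison: even if a DDR3 module draws ~38%
# more power than a DDR2 module, it moves roughly twice the data, so the
# energy spent per transferred bit goes down. Normalized, illustrative
# numbers only.
power_ddr2, power_ddr3 = 1.00, 1.39   # relative module power (FAQ's +38%)
bw_ddr2, bw_ddr3 = 1.0, 2.0           # relative bandwidth (2x data rate)

print(f"DDR2 energy per bit: {power_ddr2 / bw_ddr2:.2f}")  # 1.00
print(f"DDR3 energy per bit: {power_ddr3 / bw_ddr3:.2f}")  # ~0.70, about 30% less
```

Which metric matters depends on the workload: a bandwidth-bound data center gets more work done per watt with DDR3, while one whose DIMMs sit mostly idle just pays the higher per-module power.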