From the manual:
"Memory - 4 x 1.8V DDR2 DIMM sockets supporting up to 16 GB of system memory"
I'm sure you are aware of memory latencies. I've written up a good example somewhere else recently, but I can't find it, so I'll try again: if you buy DDR2-800, its timings are likely 4-4-4-12; if you buy DDR2-1066, the timings are 5-5-5-15 or -18. The reason is that there are inherent physical limits to how fast the transistors and their interconnects can respond, so when you raise the speed (the frequency), you must also raise the latencies (the delays you wait for a transaction to 'lock in' and become available to the CPU). People think that because 1066 is 33% higher than 800, they should therefore see a 33% increase in memory throughput - not so! Look at the cycle time: at 800, a RAM bus cycle is 1.25 ns; at 1066, it is 0.938 ns. BUT, when you multiply by the latency, 1.25 x 4 gives you a 5 ns 'turnaround', while 0.938 x 5 is 4.7 ns - you really only get about a 6% gain ([5 - 4.7]/5)...
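To make that arithmetic easy to check or redo for other speed grades, here's a quick back-of-the-envelope sketch in Python. The function name and the "cycle time = 1000 / data rate" treatment of the numbers are my own framing of the calculation above, not anything from a spec:

```python
# Back-of-the-envelope check of the latency math above:
# effective first-word 'turnaround' ~= cycle time (ns) * CAS latency.

def effective_latency_ns(data_rate, cas_latency):
    """Cycle time in ns (1000 / rate, treating the rate as MHz) times CAS."""
    cycle_ns = 1000.0 / data_rate
    return cycle_ns * cas_latency

ddr2_800 = effective_latency_ns(800, 4)    # 1.25 ns * 4  = 5.0 ns
ddr2_1066 = effective_latency_ns(1066, 5)  # ~0.938 ns * 5 ~ 4.69 ns

gain = (ddr2_800 - ddr2_1066) / ddr2_800   # roughly a 6% improvement
print(f"DDR2-800:  {ddr2_800:.2f} ns")
print(f"DDR2-1066: {ddr2_1066:.2f} ns")
print(f"Gain:      {gain:.1%}")
```

Running it reproduces the 5 ns vs. ~4.7 ns figures and the ~6% gain, which is why a 33% clock bump does not buy 33% better latency.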
January 17, 2009 6:10:43 PM
Will this work well in terms of performance and reliability, at under $100 for 2 x 2GB?
My error - good info. Yes, the reviews for the Mushkin 1066 that you suggested do show users with my board having success at 1066, even at factory voltage. I would like to manually tweak the voltage for the 1066. If you know the link to this info, please share; if not, I will be back to search. THANKS! MUSHKIN 1066 IS NOW IN MY CART!
I would not personally buy an nVidia based board for three reasons:
-to take full advantage of an nVidia GPU, you really want a MOBO w/an nVidia chipset - they're 'sub-optimized' for each other;
-I have had, over the years, a huge number of strange problematic behaviors with nVidia drivers, culminating in a particular incident where I troubleshot a glitch in a forty-thousand-line Excel macro, meant to feed processing-plant error messages to a serial-encoded RF paging unit, for two whole weeks before discovering it was an nVidia video driver problem (how the H could THAT interrupt DDE-driven macro execution?); that led me to swear off nVidia and use ATI exclusively: I've had few video-related problems since, and have always gotten respectable tech service from them, though I viewed their recent purchase with great trepidation;
-I am particularly incensed by nVidia's handling of the ESA (Enthusiast System Architecture) debacle: much-lauded at its release as an 'open standard', it requires an nVidia chipset MOBO to get your fan controller (or any other d$#ned ESA device) to work properly! 'Open Standard' should mean 'SDK available', but nVidia guards this info like a pack of rabid wolverines. I know the writer of SpeedFan has been after them for some time now to get access, as have I and any number of people: the Gods at nVidia have not seen clear to smile on us...
Ok - rant over. That said, my policy is to buy last year's darling of the gaming set, now that it's 'obsolete' to them and selling for half what it cost them last year. (Frankly, I think they're all nuts - who would pay six thousand dollars for a system optimized to squeak a few extra frames per second out of some brain-numbing game? - but God bless 'em, they keep the wars between AMD & Intel, and ATI & nVidia, et al., revved up and capitalized, and we profit from their cheap 'left-overs'!) I'm running a pair of Sapphire 3850s which were well under a hundred a pop, and they tame Aero to the point that the machine seems telepathic... If I had to pick, I'd stick with ASUS, EVGA, or MSI, all of whom seem to do a good job of implementation; the GeForce 9600 boards seem to be a good value at that price point - take a look at this one @NE: http://www.newegg.com/Product/Product.aspx?Item=N82E168...