More RAM, less performance

ViddyD

Honorable
Jul 24, 2013
I recently upgraded from 8GB of DDR3-1600 RAM timed at 11-11-11-28 to 16GB (same speed) timed at 9-9-9-24. However, my Windows Experience Index (memory) dropped from 7.9 to 7.6 and now video encoding is SLOWER! It also takes an extra second or two to start the computer (booting from SSD), but that may be because there's more memory to test during POST. I ran an extended memory test using the Windows Memory Diagnostic tool and it all checked out fine (though it took several hours to complete). I have XMP enabled in the BIOS and have already checked the speed and timings there, and it all checks out. So what gives? Any ideas?
 

ViddyD



OLD RAM
G.Skill Ripjaws
DDR3, 8GB, 1600, 11-11-11-28, dual-channel, 1.5V

NEW RAM
Crucial Ballistix Sport
DDR3, 16GB, 1600, 9-9-9-24, dual-channel, 1.5V

I'm not using them together since my OS can only support 16GB memory max, so I wouldn't need to match their timings (that is what you mean, right?). Is there any more info you want? I don't know what other specs to list off the top of my head.

Also, my old G.Skill Ripjaws are supposed to be set to 9-9-9-24, but they were timed higher because I did not have XMP enabled at the time. It's enabled now, so they're currently at their proper timings, but the fact remains that they got better performance when set to 11 than my new RAM does when set to 9.
 

ViddyD

Motherboard: Gigabyte Z87X-D3H
CPU: i5-4670K (at stock)

I just flashed my BIOS from F5 to F7 and there is no difference. The test I'm running for encoding speed is converting a 350MB AVI file to MP4 using Any Video Converter. Is this a fair test? I've read everywhere that video encoding is heavily affected by RAM, unless I'm misunderstanding. If there's a better way to determine RAM performance using real-world methods, I'd be glad to try it.
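
(For what it's worth, a GUI converter is hard to time consistently from run to run. A more repeatable version of the same test could be scripted; the sketch below is only an example and assumes ffmpeg is installed rather than Any Video Converter, with input.avi standing in for the 350MB file.)

# Minimal sketch of a repeatable encode benchmark. Assumes ffmpeg is on PATH;
# "input.avi" is a placeholder for the 350MB test file.
import subprocess
import time

RUNS = 3
times = []
for _ in range(RUNS):
    start = time.perf_counter()
    subprocess.run(
        ["ffmpeg", "-y", "-i", "input.avi", "-c:v", "libx264", "-c:a", "aac", "out.mp4"],
        check=True,
        stdout=subprocess.DEVNULL,
        stderr=subprocess.DEVNULL,
    )
    times.append(time.perf_counter() - start)

print("runs:", [f"{t:.1f}s" for t in times])
print("best:", f"{min(times):.1f}s")  # best-of-N reduces background/caching noise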

Tradesman1, you really weren't kidding when you said "see you around." Haha.
 

ViddyD

Here are the results for the Crucial RAM (16GB):

PASSMARK 7
Memory Mark - 4265.9
Allocate Small Block - 7278.8
Read Cached - 2976.0
Read Uncached - 2900.6
Write - 2962.6
Large RAM - 13017.9

AIDA64
Memory: Read (23660 MB/s), Latency (58.2 ns)

MEMTEST86+
Memory - 16G, 20859 MB/s
Settings - RAM: 0MHz, (DDR3- 0), CAS: 19-15-15-31, Triple Channel (this is way inaccurate)
Cached - 16G
RsvdMem - 960k
MemMap - e820
Cache - On
ECC - Off
Test - std
No errors


And for the G.Skill (8GB):

PASSMARK 7
Memory Mark - 2532.3
Allocate Small Block - 6391.7
Read Cached - 3096.6
Read Uncached - 2897.8
Write - 2962.3
Large RAM - 6324.6

AIDA64
Memory: Read (24122 MB/s), Latency (59.2 ns)

MEMTEST86+
Exactly the same except that it was 8080MB instead of 16G. Even the wildly inaccurate "Settings - RAM" reading was the same.
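
(As a rough sanity check on the PassMark/AIDA64 read figures, copy bandwidth can also be measured with a short script. This is only a ballpark sketch and assumes Python with numpy installed; it is not one of the tools above, and a single-threaded copy will generally read lower than AIDA64's numbers.)

# Ballpark memory-copy bandwidth check. Assumes numpy is installed.
import time
import numpy as np

N = 512 * 1024 * 1024             # 512 MB buffer, far larger than any CPU cache
src = np.ones(N, dtype=np.uint8)
dst = np.empty_like(src)

start = time.perf_counter()
np.copyto(dst, src)               # one full read plus one full write of the buffer
elapsed = time.perf_counter() - start

moved_mb = 2 * N / (1024 * 1024)  # bytes read + bytes written
print(f"~{moved_mb / elapsed:.0f} MB/s effective copy bandwidth")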



SiSoft Sandra was freezing up on me, and honestly the whole thing was confusing anyway. I'm too tired (and kinda busy) to keep retrying it and try to figure it out to boot.

I tested the encoding process with Any Video Converter again and here are the newest results, which are pretty steady: the Crucial 16GB takes 2 minutes 17 seconds to convert and the G.Skill 8GB takes 1 minute 57 seconds.

Regarding the model number, I discovered something. The 16GB two-stick kit is BLS2KIT8G3D1609DS1S00, but the ones I have are both BLS8G3D1609DS1S00, which I looked up and are the single sticks sold individually, not the dual-channel kit. Would that make a big difference? If so I'm gonna be P.O.'d at the eBay seller (though I just checked and he did list the part number, but how was I supposed to know?).

Anyway, I apparently changed the voltage in the BIOS but it made no difference. I say "apparently" because PC Wizard 2012 still tells me it's set to 1.5V.
 

Tradesman1

Legenda in Aeternum
If they were single sticks rather than a true set, might be they just don't want to play together, this happens all the time, which (as you may have guessed is why DRAM is sold in sets, with tolerances so tight, the manufacturers test sticks to ensure they'll play for 2-3-4-6-8 sticks sets - which is also why the sets cost more than single sticks), what you might do is manually set them up (type in the individual base timings and freq to lock them in, then as mentioned raise DRAM voltage to + 0.05 over spec and raise the VCCSA up a hair to around 1.00
 
Solution

ViddyD

Note that this is in the Gigabyte UEFI DualBIOS on the Z87X-D3H.

Okay, I went to the Performance tab, then Memory, clicked Memory Sub Timings, and manually set everything to the standard numbers listed for both Channel A and Channel B. However, under the field Round Trip Latency (DIMM1/Rank0) and its Rank1 counterpart, the standard (or auto) setting shown is 39, but when I go to manual I can only choose -15 through +15. Similarly, for loLatR0D0, loLatR1D0, loLatR0D1 and loLatR1D1, the automatic settings shown are -, -, 3 and 3, respectively, yet I can only choose -3 through +3 for any of them.

In the Performance > Voltage tab I increased it to 1.550V and clicked "on", yet when I went to a different tab and came back it had switched to "off", though the manual control still read 1.550V. As stated earlier, PC Wizard 2012 still reads my memory voltage as 1.5V. BUT, in the top left of the BIOS it shows the DRAM Voltage as 1.548V, as if it changed successfully, though it is a little off (and it stays that way even when I restart the computer and reboot into the BIOS).

I could not find anything that said VCCSA in the BIOS; I even checked under the CPU voltage settings.

After all this the converting process takes the exact same amount of time. I'm starting to think I wasted $112.50.
 

Tradesman1

May be listed as System Agent voltage - and PC Wizard is probably reading the stock DRAM voltage for the stick from the SPD, as opposed to what it's actually running at - sounds like sticks that just don't want to play - see if you can find System Agent voltage and bump it
 

PassMark

Distinguished
It isn't really conclusive that the speed of the RAM is what caused the 17% change in video encoding performance. Seems like a big change to attribute to the RAM. Especially when the RAM benchmarks don't show any significant difference (ignoring the 'Large RAM' benchmark as that depends on the quantity of RAM in the box and not the speed).

Maybe the cause is something more obscure. For example, with a file of this size, disk caching could have a big impact. Or maybe the CPU's performance is changing (e.g. due to higher temperatures).

Not saying it isn't the RAM, but it might be worth considering other possibilities.
Are there any other apps that show a change in performance?
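
(One way to check the temperature/throttling idea is to log CPU clock speed and load while the encode runs. A minimal sketch, assuming the psutil package is installed; on many Windows systems temperatures aren't exposed to psutil, so a clock speed that sags under full load is used here as the throttling tell.)

# Log CPU load and frequency for roughly the length of one encode run.
# Assumes the psutil package is installed (pip install psutil).
import psutil

DURATION = 150   # seconds, roughly one 2:17 encode
INTERVAL = 5

for _ in range(DURATION // INTERVAL):
    load = psutil.cpu_percent(interval=INTERVAL)  # blocks for INTERVAL seconds
    freq = psutil.cpu_freq()                      # current/min/max in MHz
    print(f"load {load:5.1f}%   freq {freq.current:7.1f} MHz")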

 

ViddyD

I went ahead and shipped the sticks in question back to the eBay seller today and already received my refund. Thanks for all the help, everybody! Though I did run some more tests with rather intriguing results, if you're hungry for more!

Last night I used my 2x4GB set (old G.Skill) in slots 1 and 2 and a single 8GB stick (new Crucial, the RAM in question) in slot 3, giving me 16GB total while using both brands and avoiding redundancy (thanks to Windows 7 Home Premium's memory limitation). The results were the same as using only 8GB, whether in the form of a single 8GB Crucial stick or both 4GB G.Skill sticks. I also used only a single 4GB G.Skill stick and the encoding process still finished in the same amount of time (2:22-2:24) as when I used either 8GB or 16GB.

So... what the heck does that mean? It's like Any Video Converter simply doesn't benefit from having more than 4GB of RAM (or even less!), or maybe my old 2x4GB set is bad too... I dunno. Are there any real-world programs (like encoders) besides benchmark tools that I can test with in order to see real benefits from having more RAM? AVC just doesn't seem to show any difference at all. I can still test the difference between one 4GB stick and both in dual-channel.

Further note that I also ran a comparison test using the 2x4GB set and the 2x8GB set in VirtualDub, compressing a large, multi-gigabyte file (I don't remember how big, 2-4GB I think). That program also showed no performance difference. I didn't test that with any other combination, though, since it took a lot longer and I simply didn't have the patience.
 

PassMark

The tests you have been running, video encoding and compression, both make significant use of the hard drive.
A typical hard drive runs at maybe 150MB/sec.
Typical RAM runs at 20,000MB/sec.
So RAM is more than 100x faster.

So in any test that involves using the hard drive, the CPU and the RAM, the time spent getting data to and from the hard drive will tend to dominate.

What you could try is making a RAM drive and copying the video files onto the RAM drive. You can then remove the hard drive speed from the equation.
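
(A crude way to see that gap without building a RAM drive: read the video file twice in a row and time both passes. The second pass is normally served from the Windows file cache in RAM rather than from the disk. A minimal sketch, with "input.avi" as a placeholder path; if the file was touched recently, the first pass may already be partly cached.)

# Time a cold-ish read from disk against an immediate re-read from the OS file cache.
import time

PATH = "input.avi"          # placeholder for the test video file
CHUNK = 8 * 1024 * 1024     # read in 8 MB chunks

def read_speed(path):
    start = time.perf_counter()
    total = 0
    with open(path, "rb") as f:
        while True:
            data = f.read(CHUNK)
            if not data:
                break
            total += len(data)
    elapsed = time.perf_counter() - start
    return total / (1024 * 1024) / elapsed

print(f"first read (disk):   {read_speed(PATH):8.1f} MB/s")
print(f"second read (cache): {read_speed(PATH):8.1f} MB/s")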

Very few applications make use of more than 4GB of RAM. The advantage of lots of RAM is normally that you can run lots of applications at the same time.
 

ViddyD

Just ran it to and from my SSD (Samsung 840 Pro) and there was no difference except for the same ~1 second variance I always saw. It didn't even improve over my HDD speeds. (Note that this isn't regarding the 8GB vs 16GB tests, just the 4GB vs 8GB, because that's all I can test now.)

Since I can now only test the difference between either one or both 4GB sticks, and those sticks are the only ones I have, creating a RAM drive will prove problematic. But supposing I could, do you think it would show a difference in encoding speed that the 840 Pro couldn't?