To be honest, memory sticks running below 1000 MHz are usually slow. Say you have a 4 GB stick running at 780 MHz and you run a graphics benchmark (in my experience, a graphics benchmark is the best way to see whether your memory clock speed is affecting frame rates). Suppose you get 40 fps. Swap in a 1333 MHz stick and you'll see a definite improvement of 4-5 fps, but above that, with a 1600, 1800, or 2430 MHz stick, you might only squeeze out a max of 2-3 more fps. In short, higher frequencies stop making a noticeable difference.

Next comes memory size. Increasing the size reduces processing time, but only up to a point. If I had a 4 GB stick installed at 1333 MHz and it took me 30 seconds to copy a 500 MB file from one place to another, then going up to 8 GB could produce a significant difference and might cut the time by 10-12 seconds, which is a huge improvement. But with 16 GB installed, you'd barely see another 0.5-1 second of improvement. So what I can conclude is that the difference in times, latencies, or fps only improves up to a definite extent, because software these days isn't really optimized to take advantage of the more advanced hardware.
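If you want to try the file-copy comparison yourself, here's a minimal Python sketch of how I'd time it. The file paths are placeholders, and averaging a few runs helps smooth out disk caching noise; this is just one way to do it, not a proper benchmark suite.

```python
import shutil
import time

def time_copy(src, dst, runs=3):
    """Copy src to dst several times and return the average wall-clock time in seconds."""
    total = 0.0
    for _ in range(runs):
        start = time.perf_counter()
        shutil.copyfile(src, dst)  # raw byte copy, no metadata
        total += time.perf_counter() - start
    return total / runs

# Example usage (paths are placeholders, use your own test file):
# avg = time_copy("bigfile.bin", "copy.bin")
# print(f"average copy time: {avg:.2f} s")
```

Run it once with your current RAM, swap sticks, run it again on the same file, and compare the averages. Keep in mind the OS file cache will speed up repeat runs, so the first run is usually the most honest one.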
I hope this clears everything up for you.
Any questions or problems, I'm always available.
My sys specs:
i7 3770K @ 4.6 GHz
ASRock Z77 Extreme4
2x R9 290 OC versions
Corsair Vengeance 32 GB
Seagate Barracuda 3 TB
1 TB Samsung SSD
Corsair H55i with two Noctua fans