The answer, as with many things in computing, is: it depends. Typically you'll see only a 2-5% performance increase in average fps, but if the increase in total system cost is less than that, it's kind of hard to argue against the investment. In addition, where you usually see an impact is on min fps more than average fps.
In the above link, we see a 4.5% increase in min fps at an extra cost of $44 for 8GB. On a $2k box, that's about a 2% increase in cost, or roughly a 2-to-1 return on investment. Some would argue that you won't notice the difference, but then again, if you're spending $2k, you're not likely to notice the $44 either.
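If you want to sanity-check that math, here's a minimal sketch in Python. The $44 and 4.5% figures are from the linked review; the 2% and 2:1 numbers are just the ratios that fall out of them.

# Quick sanity check of the cost/benefit math above.
# Figures from the linked review; the ROI is (fps gain %) / (cost increase %).

extra_cost = 44.0        # extra cost of the faster 8GB kit, USD
system_cost = 2000.0     # total build cost, USD
min_fps_gain_pct = 4.5   # reported min-fps improvement

cost_increase_pct = extra_cost / system_cost * 100   # -> 2.2%
roi = min_fps_gain_pct / cost_increase_pct           # -> ~2.0, i.e. about 2:1

print(f"Cost increase: {cost_increase_pct:.1f}%")
print(f"Return on investment: {roi:.1f} : 1")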
Here's another investigation where the results were a bit more dramatic:
22.3% (SLI) increase in minimum frame rates w/ C6 instead of C8 in Far Cry 2
18% (single card) / 5% (SLI) increase in minimum frame rates w/ C6 instead of C8 in Dawn of War
15% (single card) / 5% (SLI) increase in minimum frame rates w/ C6 instead of C8 in World in Conflict
Here's what I'm currently putting in my builds (all DDR3-1600):
I keep seeing your post http://www.anandtech.com/show/2792/12 and the assumption that it translates to ALL mobos, but it doesn't with LGA 1155. The 'fixed' BCLK, and to a degree the SB IMC, of LGA 1155 hits more of a wall at DDR3-1600 CAS 8/9. On any other platform that allows BCLK/FSB adjustment, i.e. higher settings, then sure, you get additional FPS beyond a ±1~±2 fps difference, which is within the margin of error.
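For what it's worth, here's a hypothetical sketch of how you might decide whether an fps delta is real or just run-to-run noise. It assumes you've logged min-fps from several benchmark runs per config; the ±2 fps noise floor mirrors the margin of error mentioned above, and the numbers are made up for illustration.

# Hypothetical sketch: is an fps delta real, or within run-to-run noise?
from statistics import mean

def fps_delta_is_significant(runs_a, runs_b, noise_floor=2.0):
    """Return True if the average min-fps gap exceeds the noise floor."""
    return abs(mean(runs_a) - mean(runs_b)) > noise_floor

# Example: CAS 8 vs CAS 9 min-fps over three runs on an LGA 1155 board
cas8 = [42.1, 41.7, 42.4]
cas9 = [41.3, 41.9, 40.8]
print(fps_delta_is_significant(cas8, cas9))  # False: within the margin of error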