In real-world use you'd see no noticeable difference; synthetic benchmarks would show only small differences.
A server should not use higher-speed RAM (which is why all premade servers use "slower" 1333 and now 1600MHz RAM), due to the fact that the processor *must* be overclocked in order to run RAM above 1333MHz (AMD, 1st and 2nd gen i Series) or 1600MHz (3rd gen i Series). Servers also generally use ECC RAM, which is designed to detect and correct memory errors.
It is a general rule of thumb that the faster you push something, the more likely errors become, and servers are designed to be rock solid and reliable.
Gaming, however, is a different story: you aren't looking for 24/7 uptime and zero error tolerance. If you put a solid OC on your CPU and bring your RAM speeds up, you will get a pretty solid boost to your FPS, but that boost comes more from the CPU overclock than from the RAM.
I have found the sweet spot for RAM speed to be around 1866MHz; going above that can be touchy without any real benefit. It can also be reached without much of a CPU overclock.