In real-world use you would see no noticeable difference; synthetic benchmarks would show only small differences.
A server should not use higher-speed RAM (which is why prebuilt servers ship with "slower" 1333MHz, and now 1600MHz, RAM), due to the fact that the processor *must* be overclocked in order to run RAM above 1333MHz (AMD, and 1st/2nd-gen Core i series) or 1600MHz (3rd-gen Core i series). Servers also generally use ECC RAM, which is designed to detect memory errors and correct them.
As a general rule of thumb, the faster you push something, the more likely errors become, and servers are designed to be rock solid and reliable.
Gaming, however, is a different story: you aren't looking for 24/7 uptime and zero error tolerance. If you put a solid overclock on your CPU and bring your RAM speed up, you will get a pretty solid boost to your FPS, but that boost will come more from the CPU overclock than from the RAM.
I have found the sweet spot for RAM speed to be around 1866MHz; above that things can get touchy, without any real benefit. It can also be reached without much of a CPU overclock.
What is the real difference between 1600MHz and 2400MHz?
+0 to +4 FPS, plus a huge amount of instability. So I'd stick with DDR3-1600 kits for either SB/SB-E or IB CPUs. A "server" uses ECC or registered ECC RAM and never exceeds the CPU's (Xeon or Opteron) rated frequency, all in an effort to eliminate errors. Every non-ECC kit will throw 1-2 errors a month, and with DDR3-2400 that number is closer to 1-2 a day.
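If you're on Linux with ECC RAM, you can actually watch those corrected-error counts via the kernel's EDAC sysfs interface. A minimal sketch; the `/sys/devices/system/edac` path is the standard kernel EDAC location, but on a non-ECC kit (or with the EDAC driver not loaded) nothing shows up there, which is exactly the point: without ECC those errors go completely undetected.

```python
import glob
import os

def read_edac_ce_counts():
    """Return corrected-error counts per memory controller from Linux EDAC.

    Returns an empty dict if no EDAC counters are exposed (non-Linux,
    non-ECC RAM, or EDAC driver not loaded).
    """
    counts = {}
    # Each memory controller appears as mc0, mc1, ... with a ce_count file
    # holding the number of corrected (single-bit) errors since boot.
    for path in glob.glob("/sys/devices/system/edac/mc/mc*/ce_count"):
        mc = os.path.basename(os.path.dirname(path))  # e.g. "mc0"
        with open(path) as f:
            counts[mc] = int(f.read().strip())
    return counts

if __name__ == "__main__":
    counts = read_edac_ce_counts()
    if counts:
        for mc, n in sorted(counts.items()):
            print(f"{mc}: {n} corrected errors since boot")
    else:
        print("No EDAC counters found (no ECC RAM, or EDAC not loaded)")
```

A nonzero count creeping up after a RAM overclock is a strong hint to back the speed down, long before you see crashes.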