When chips are manufactured, they aren't inherently limited to a speed of, say, 2.8 GHz. Manufacturing a chip is something like printing a chip design: a wafer (a large, thin disc of silicon) has many copies of the chip 'printed' on it, which are then cut out and packaged into individual processors.
In theory, there's no fixed ceiling on a processor's clock speed, so in theory you could run a Pentium II at 100 GHz. The practical problem is heat: the faster you clock a chip, the more power it draws and the more heat it puts out.
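To make the heat problem concrete: the dynamic (switching) power of a CMOS chip scales roughly as P ≈ α·C·V²·f, and higher clocks usually also need higher voltage, so heat grows faster than the clock does. Here's a rough sketch of that relation in Python; all the constants are made-up illustrative values, not real chip parameters:

```python
# Rough illustration of why clock speed runs into a heat wall.
# Dynamic CMOS power scales roughly as P ~ alpha * C * V^2 * f,
# and higher frequencies usually also need higher voltage.
# All constants here are made-up illustrative values.

def dynamic_power_watts(freq_ghz, voltage, capacitance_nf=1.0, activity=0.5):
    """Approximate switching power: P = activity * C * V^2 * f."""
    freq_hz = freq_ghz * 1e9
    capacitance_f = capacitance_nf * 1e-9
    return activity * capacitance_f * voltage**2 * freq_hz

print(dynamic_power_watts(3.0, 1.2))  # ~2.16 W
print(dynamic_power_watts(6.0, 1.2))  # ~4.32 W: double the clock, double the power
print(dynamic_power_watts(6.0, 1.5))  # ~6.75 W: add a voltage bump and it's ~3x
```

Double the clock and bump the voltage, and you're dissipating roughly three times the heat in the same tiny piece of silicon.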
What actually happens is that the chips on a single wafer vary in quality. Chips near the center of the wafer are typically the best: they consume the least power and produce the least heat, so they go into portable devices like laptops.
Chips farther out from the center run hotter and draw more power, so they're binned for desktops, which can handle those higher values.
Then, depending on what each chip can actually handle, along with market conditions, manufacturing costs, and so on, chips are shipped with different default clock rates (the numbers in GHz).
Say the demand for the i5-2400 is very high: Intel can simply lock the multiplier and reduce the clock rate of an i5-2500K, then ship it as an i5-2400. Or if a chip 'meant' to be sold as an i5-2500K produces too much heat or consumes too much power, its clock rate is reduced and it ships as an i5-2400.
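This sorting-and-relabelling is what the industry calls binning. Here's a toy model of the decision just described; the clock thresholds, power limit, and test values are hypothetical, and real binning uses far more detailed electrical tests:

```python
# Toy model of the binning / down-binning described above.
# Thresholds, SKU assignments, and test numbers are hypothetical.

def bin_chip(max_stable_ghz, power_at_max_w, demand_for_lower_sku=False):
    """Assign a die to a SKU based on how it tested, plus market demand."""
    if max_stable_ghz >= 3.3 and power_at_max_w <= 95 and not demand_for_lower_sku:
        return "i5-2500K"   # top bin: unlocked multiplier, highest clocks
    if max_stable_ghz >= 3.1:
        return "i5-2400"    # same silicon, locked multiplier, lower clock
    return "i5-2300"        # weaker die: binned even lower

# A die that tests well but is down-binned because demand
# for the cheaper part is high:
print(bin_chip(3.5, 90, demand_for_lower_sku=True))  # i5-2400
# A die that simply runs too hot for the top bin:
print(bin_chip(3.5, 110))                            # i5-2400
```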
What overclockers do is push the clock rate back up, adding better cooling to compensate for the extra heat (the extra power consumption isn't much of a problem on a desktop), so as to get more speed at the same price.
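As a back-of-the-envelope check on why cooling is the main obstacle, you can reuse the P ≈ V²·f approximation from earlier to estimate how much extra heat an overclock produces. The baseline TDP and the specific overclock below are made-up examples:

```python
# Rough estimate of the extra heat an overclock produces, using
# the same P ~ V^2 * f approximation as above.
# The baseline TDP and the overclock settings are made-up examples.

def overclocked_power(base_power_w, freq_scale, voltage_scale):
    """Scale baseline power by the frequency and voltage increase."""
    return base_power_w * freq_scale * voltage_scale**2

base_tdp = 95.0  # hypothetical stock TDP in watts

# +20% clock at stock voltage: ~20% more heat to dissipate.
print(overclocked_power(base_tdp, 1.20, 1.00))  # ~114 W
# +20% clock with a 10% voltage bump: ~45% more heat,
# which is why overclockers invest in better coolers.
print(overclocked_power(base_tdp, 1.20, 1.10))  # ~138 W
```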