Well, Nvidia and AMD make GPUs, and some individual chips are better than others, meaning they can run stably at a faster clock rate. Certain areas of the silicon wafer tend to yield better chips, so those are sold to board partners, who put them in OC (factory-overclocked) versions because they can run faster than the reference clocks. They charge a bit more money, but you're basically paying to get a better chip.
Those chips also usually overclock further on your own, because they're the cream of the crop. You're paying more for the better silicon than for the small factory OC they usually ship with.
CPUs are the same way, and it's done to save money. With AMD's FX-8350 and FX-8320, there's only a 500 MHz difference in clock speed; the rest of the chip is identical. So FX-8350s that fail validation at 4 GHz are factory-set to 3.5 GHz and sold as FX-8320s. If one of the cores is bad, they disable two cores (a full module), and the chip becomes an FX-6350 or FX-6300, depending on the speed the remaining cores work at.
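The binning decision described above can be sketched roughly like this (the part names and base clocks are real, but the test procedure and thresholds here are simplified assumptions for illustration, not AMD's actual validation process):

```python
# Illustrative sketch of chip binning: working core count and max stable
# clock decide which SKU a die is sold as. Thresholds are assumptions.

def bin_chip(working_cores, max_stable_ghz):
    """Decide which FX part a die is sold as, given test results."""
    if working_cores >= 8:
        # All eight cores good: bin by the clock it validated at.
        return "FX-8350" if max_stable_ghz >= 4.0 else "FX-8320"
    elif working_cores >= 6:
        # One bad core means the whole two-core module gets disabled.
        return "FX-6350" if max_stable_ghz >= 3.9 else "FX-6300"
    else:
        return "not salable as an FX-6000/8000 part"

print(bin_chip(8, 4.2))  # -> FX-8350
print(bin_chip(8, 3.6))  # -> FX-8320 (good cores, missed the 4 GHz bin)
print(bin_chip(7, 3.8))  # -> FX-6300 (one bad core kills a whole module)
```

Same die, same wafer; the test results just route it into a different box with a different price tag.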
Back in the day this was better for buyers, because with a GPU or CPU they would often just disable part of a fully working chip to make the lower-end models when those were in short supply. Some people got lucky with awesome GPUs that had shaders locked off simply because the manufacturer needed to meet demand for cheaper cards. Eventually the manufacturers caught on and started lasering off the extra cores, shaders, etc., or otherwise permanently disabling them. Athlon X2s were a good case of this: motherboard makers caught on and added a core-unlock option to the BIOS. You could usually get a third core running, sometimes a fourth, and turn a cheap dual-core into a quad-core.