WARNING: WALL OF TEXT
"Now, I also know there are more complicated concepts at play, which I have questions about. For instance, how each core splits up the work of a CPU. In a four core processor, is the 3.4 Ghz divided among each core? So each runs at 850 Mhz? Or does it mean that each core runs at 3.4 Ghz? If each core runs at 850 Mhz, then is a 100 Mhz OC applied to each core, for 100 extra Mhz a piece, or split among each of them, for 25 Mhz a piece? Would a dual core at 3.4 then have each core running at 1600 Mhz, or would each run at 3.4 still?"
First, I can't find anything that matches your question on Google. Essentially, the question is: is it better to put a 100 MHz OC on a CPU or a GPU, and why?
To partially answer myself, and your query: each core runs at 3.4 GHz if the CPU is 3.4 GHz; the clock is not divided among the cores. So if you OC a 3.3 GHz quad-core processor by 100 MHz, you are running each core at 3.4 GHz. Counting in MHz (millions of cycles per second), each core gains 100 per second, or 6,000 per minute, so across four cores we get 24,000 extra per minute. The base speed, in the same units, is 3,300 per core, or 13,200 per second across four cores, for a total of 13,200 × 60 = 792,000 per minute.
24,000 / 792,000 ≈ 3.03%
So we can run 4 jobs at once, each with about 3% more cycles per unit of time than before.
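The arithmetic above can be sketched in a few lines of Python. All clock figures are in MHz (millions of cycles per second), matching the units used in the calculation:

```python
# All clock figures in MHz (millions of cycles per second).
base_mhz = 3300   # 3.3 GHz per core
oc_mhz = 100      # overclock added to each core
cores = 4

# Totals per minute, summed across all cores.
base_per_minute = base_mhz * cores * 60    # 792,000
extra_per_minute = oc_mhz * cores * 60     # 24,000

relative_gain = extra_per_minute / base_per_minute
print(base_per_minute, extra_per_minute, round(relative_gain * 100, 2))
# 792000 24000 3.03
```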
In a dual core, a 100 MHz OC means only 2 jobs at about 3% better performance apiece. At 3.3 GHz, we have 3,300 × 60 × 2 = 396,000 per minute. A 100 MHz per-core OC adds 6,000 per core per minute, 12,000 in total. 12,000 / 396,000 is the same ≈3% relative increase.
Second, I want to qualify my answer. These percentages are all RELATIVE to your perception of the previous performance of your system. A 100 MHz OC on a dual-core processor is mathematically identical to a 100 MHz OC on a quad core: on a 3.3 GHz processor it is a ~3% performance increase, be it quad or dual core, so you will see the same RELATIVE performance increase. BUT THIS DOES NOT MEAN THAT A QUAD CORE AND A DUAL CORE AT THE SAME CLOCK SPEED ARE EQUAL. The quad core does much more work at the same 3.3 GHz: 792,000 per minute as opposed to 396,000 for the dual core. A 100 MHz OC puts you at the exact same 3.03% RELATIVE improvement on both CPUs, but 3% of 792,000 is more than 3% of 396,000. So any OC on a quad core should make a larger objective difference than the same OC on a dual core, even though the subjective difference is the same.
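One way to see the relative-vs-absolute point is to compute both quantities side by side, in the same MHz-per-minute units as above:

```python
def mhz_minutes(clock_mhz: int, cores: int) -> int:
    """Total millions of cycles per minute, summed across all cores."""
    return clock_mhz * cores * 60

for cores in (2, 4):
    base = mhz_minutes(3300, cores)   # work before the OC
    extra = mhz_minutes(100, cores)   # work added by a 100 MHz per-core OC
    print(cores, base, extra, round(extra / base * 100, 2))
# 2 396000 12000 3.03
# 4 792000 24000 3.03
```

The relative gain is identical either way, but the quad core's 24,000 extra is twice the dual core's 12,000, which is the objective difference described above.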
In a GPU we have a totally different concept. It has parallel cores, which sounds like a CPU to me, but I know much less about how a GPU works. What we do know is that we can measure and adjust both the memory clock and the core clock, however the architecture actually works underneath. If we treat it like a single-core CPU, then a 100 MHz OC adds fewer total extra cycles per minute than the same OC on a multi-core CPU would; it won't make nearly as large an objective jump. But considering my previous statement, that most GPUs run at slower clock speeds than CPUs, you would notice a much larger relative difference from that 100 MHz.
Considering we aren't computers, I doubt the objective difference matters nearly as much as the subjective one. So it may be more important to boost your GPU, because 100 MHz is roughly a 10% relative increase there, compared to a ~3% relative increase on the CPU, despite the fact that a 100 MHz core OC actually adds fewer total clock cycles per minute to a GPU than to a CPU.
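As a rough comparison, here is that trade-off in numbers. Note the 1,000 MHz GPU clock is an assumed, illustrative figure; the post only says GPUs clock lower than CPUs, not how much lower:

```python
cpu_base_mhz = 3300   # the 3.3 GHz CPU from the examples above
gpu_base_mhz = 1000   # ASSUMED GPU core clock, for illustration only
oc_mhz = 100          # same 100 MHz OC applied to each

cpu_relative = oc_mhz / cpu_base_mhz   # ~3% relative gain
gpu_relative = oc_mhz / gpu_base_mhz   # 10% relative gain
print(round(cpu_relative * 100, 1), round(gpu_relative * 100, 1))
# 3.0 10.0
```

The lower the base clock, the bigger the same 100 MHz looks in relative terms, which is the whole argument for favoring the GPU OC subjectively.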