
Would CPU load increase with decreasing clock speed?

January 12, 2013 4:02:08 AM

An interesting article is posted at:
http://mymediaexperience.com/intel-hd-3000-vs-discrete-...

The article compares HD 3000 graphics to a GeForce card and indicates that a demanding graphics task loads the Core i3-2105's 3.1 GHz CPU cores to 25%.

Does this mean the same task would load the Core i3-2337M's 1.5 GHz cores to about 50%?
January 12, 2013 4:07:00 AM

The graphics portion of these CPUs is separate from the actual x86 cores. So technically it would not load the CPU cores but the HD 3000 ALUs.

But this might be easier to answer if you explain why you are asking.
January 12, 2013 4:21:23 AM

A task that loads a Sandy ~3GHz i3 to about 25% will probably load a Sandy 1.5GHz i3 to around 50%, but that is far from a guarantee. It's a little more complicated than that because i3s have two HyperThreaded cores, so without knowing how the workload applies to each thread, it'd be pretty much impossible to guarantee how the workload will run on the lower frequency model without testing it directly.
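A rough sketch of that back-of-the-envelope arithmetic (in Python), assuming the workload is purely CPU-bound and utilization scales exactly inversely with clock speed, which HyperThreading and the GPU can easily break:

def estimated_load(observed_load_pct, observed_clock_ghz, target_clock_ghz):
    # Naive inverse-scaling estimate: the same work per second on a slower
    # clock shows up as a proportionally higher utilization figure.
    return observed_load_pct * observed_clock_ghz / target_clock_ghz

# 25% on the 3.1 GHz i3-2105 from the article vs. the 1.5 GHz part in the question
print(estimated_load(25.0, 3.1, 1.5))   # prints roughly 51.7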
January 12, 2013 5:02:17 AM

Novuake said:
The graphics portion of these CPUs is separate from the actual x86 cores. So technically it would not load the CPU cores but the HD 3000 ALUs.


The article indicates the CPU load with the independent GeForce card is only 7%. This suggests the HD 3000 is using CPU-driven EUs and not independent GPU cores.

If that is true then the HD 3000 cannot be tested by itself but only in conjunction with a CPU, and overall performance should vary with the pairing.

It also means the device will score high on graphics benchmarks or CPU benchmarks but have significantly lower performance with applications that tax both the GPU and CPU at the same time.

blazorthon said:
...it'd be pretty much impossible to guarantee how the workload will run on the lower frequency model without testing it directly.

Yes, and I was hoping that someone on this forum would have a lower speed Core i3 and could run a high resolution video and post the result.
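If anyone does run that test, here is a minimal Python sketch of the measurement side, assuming the psutil package is installed (the video playback itself is started separately in whatever player you use):

import psutil

samples = []
for _ in range(60):                                         # one sample per second for a minute
    per_core = psutil.cpu_percent(interval=1, percpu=True)  # utilization % per logical CPU
    total = sum(per_core) / len(per_core)
    samples.append(total)
    print(f"total {total:5.1f}%   per-core {per_core}")

print(f"average over the run: {sum(samples) / len(samples):.1f}%")

Posting the average together with the CPU model and clock speed would make the numbers easy to compare against the 25% figure from the article.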
January 12, 2013 5:27:03 AM

Murray B said:
The article indicates the CPU load with the independent GeForce card is only 7%. This suggests the HD 3000 is using CPU-driven EUs and not independent GPU cores.

If that is true then the HD 3000 cannot be tested by itself but only in conjunction with a CPU, and overall performance should vary with the pairing.

It also means the device will score high on graphics benchmarks or CPU benchmarks but have significantly lower performance with applications that tax both the GPU and CPU at the same time.

Yes, and I was hoping that someone on this forum would have a lower speed Core i3 and could run a high resolution video and post the result.



From what I can tell, only ONE of the benches, MPEG playback, taxes the CPU considerably more without the GPU.

One bench is not enough to conclude that ALL tasks are heavily offloaded to the CPU. If that were the case then gaming on an i3 would be near impossible. Since it is not, it's obvious that the graphics ALUs are being utilized properly.
January 12, 2013 8:07:24 AM

blazorthon said:
A task that loads a Sandy ~3GHz i3 to about 25% will probably load a Sandy 1.5GHz i3 to around 50%, but that is far from a guarantee. It's a little more complicated than that because i3s have two HyperThreaded cores, so without knowing how the workload applies to each thread, it'd be pretty much impossible to guarantee how the workload will run on the lower frequency model without testing it directly.


Agree, it should be testable by simply finding something that is not at 100% load (that's the difficult bit) and then changing the turbo level to a smaller number on an Asus board (as that's how they OC).
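A small companion sketch for that approach, again Python with psutil (assumed installed): read back the effective clock so you know the lower turbo setting actually took effect, and scale the measured load by frequency so runs at different clocks can be compared directly.

import psutil

freq = psutil.cpu_freq()               # reported in MHz; can be None on some systems
load = psutil.cpu_percent(interval=5)  # average utilization % over a 5 second window

if freq is not None:
    # load x clock is a rough "work rate"; if it stays roughly constant as the
    # clock is lowered, utilization really is scaling inversely with frequency
    print(f"{load:.1f}% at {freq.current:.0f} MHz  ->  load x GHz = {load * freq.current / 1000:.1f}")
else:
    print(f"{load:.1f}% (clock speed not reported on this system)")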
January 12, 2013 3:33:38 PM

Novuake said:
From what I can tell, only ONE of the benches, MPEG playback, taxes the CPU considerably more without the GPU...

It was not clear to me from the text whether the HD video playback was some sort of artificial benchmark or just playing back an HD video. From what I read, the benchmark results were similar but the GeForce card performed significantly better when playing the game.

If the HD 3000's performance depends heavily on the CPU then it could follow that its performance will vary significantly with the CPU it is paired with.

I remember when the minicomputer-on-a-chip (later called microprocessor) first came out. Many early systems used the CPU to generate the video but some later ones had a graphics co-processor. Benchmark results on the two types were similar but there was a huge difference in real-world performance. It mattered a great deal if a system was expending 15% of its CPU cycles on system tasks versus 75% or 85%. Even the mighty 68000 slowed to a crawl when it had to draw the screen too.

Tom's has a GPU list that does not differentiate between different HD 3000 and CPU pairs, and I fear that it must do that to be accurate.

Knowing whether CPU loading increases with lower clock speeds using HD 3000 graphics will help eliminate the possibility that the 25% loading is an anomaly.