Specs are the same as in the images: i3-2120 with stock cooler and an Asus P8H67-V, BIOS 0806.
CPU voltage 1.045 at offset -0.120 (default 1.100 at offset +0.065).
GPU Boost = enabled (raises the Bclk to 103).
GPU Boost raises the 2120's speed from 3.3 GHz to 3.3994 GHz (whoopee!).
This is running Prime95 for about half an hour (ambient 20.5°C, but this isn't really about temps).
This is two minutes after stopping the workers.
I am pretty stoked to see a max of 39 W under load and just under 5 W at the lowest (mostly around 5.3 W), with voltages of 0.67-0.81 V at idle and a max of 1.08 V.
Though what I am wondering is: could I go farther down?
The default CPU voltage in my BIOS was 1.100, shown on the left, with a +/- selection and the offset value to the right. When I changed the offset and rebooted back into the BIOS, the change was reflected in the voltage on the left.
I started increasing the negative offset by 0.005 at a time: save/exit, run Prime95 with no errors for 20-30 minutes, reboot, increase the value again. That worked until the voltage read 1.040 at offset -0.125 and Prime95 failed (BSOD) three minutes in. Then I got curious about how low it would still POST: at offset -0.185, no POST. I hate looking for my CMOS reset jumper.
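That step-down routine is really just a loop, so here it is as a sketch. This is purely illustrative: the function name and the fake stability callback are mine, and a real "stability test" means a 20-30 minute Prime95 run at each step, not a function call.

```python
def find_lowest_stable_offset(start, step, floor, is_stable):
    """Step the voltage offset downward until the stability test fails,
    then return the last offset that passed (None if none passed)."""
    last_good = None
    offset = start
    while offset >= floor:
        if not is_stable(offset):
            break                        # BSOD / Prime95 error at this offset
        last_good = offset
        offset = round(offset - step, 3) # round to avoid float drift
    return last_good

# Fake tester reproducing my observed results: -0.125 and below fail.
fake_prime95 = lambda off: off > -0.125

print(find_lowest_stable_offset(0.0, 0.005, -0.200, fake_prime95))  # -0.12
```

With my numbers it stops at -0.120, the last offset that passed, which matches where my board landed.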
Along with how low it can go:
Would disabling SpeedStep, thereby keeping my voltage constant (like when overclocking), be of any use?
I sorta figure keeping it enabled stops the CPU from dropping too low.
Also, would decreasing the PLL voltage be of any use?
And finally, could this cripple my processor?
I don't think so, but I don't want to save watts and have a giggle at the cost of chopping off performance.
a small update of cost performance while waiting for some feedback:
looking at the prices on newegg
i3 2100T $135
i3 2120 $128
i3 2130 $135
So: by enabling GPU Boost and raising the Bclk to 103, which puts the speed close to a 2130's 3.4 GHz while using only slightly more power than the 2100T's rated 35 W TDP, I saved a few bucks on my purchase.
However, $8 isn't all that stunning. But if I consider the power savings of 26 W (the difference between my 39 W max and the 2120's 65 W TDP) over a day:
26 W × 24 h = 0.624 kWh of savings per day. (Remember the old saying, a penny saved is a penny earned?)
OK, so I won't be running the CPU at 100% load 24/7, so the savings will be less while idling. It would be nice to see other i3 users post their volts and wattage for comparison.
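For anyone who wants to plug in their own numbers, the savings math works out like this; note the electricity rate below is my own assumption, not something from the measurements:

```python
# Checking the savings math: 2120 TDP minus my measured undervolted max.
tdp_watts = 65            # i3-2120 rated TDP
measured_max_watts = 39   # my max under Prime95 after undervolting
rate_per_kwh = 0.12       # assumed $/kWh - substitute your own rate

watts_saved = tdp_watts - measured_max_watts       # 26 W
kwh_per_day = watts_saved * 24 / 1000
print(kwh_per_day)                                 # 0.624
print(round(kwh_per_day * 365 * rate_per_kwh, 2))  # 27.33 ($/yr at 100% load 24/7)
```

So a bit over $27 a year at an assumed $0.12/kWh, and that's the best case of full load around the clock; real savings at idle will be smaller.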
A little bit more about performance while encoding an MPEG-4/AVI container to a VOB using ConvertX 4:
This was a 1 h 48 min, 908 MB movie using the "quality" (short project) 1-pass settings in ConvertX (IMO the best DVD encoding and burning software).
As you can see, the watts were 28.37 at 1.06 V under 58/64% load (about 61% average across both cores). I'm starting to believe the wattage does not scale with the load on the cores: my first post (the Prime95 bench) showed a max of roughly 40 W, which is 61.5% of the 65 W TDP.
If I compare the 28.37 W to the 40 W max, it's 70.9% (at about 61% load);
but if I compare it to the 65 W TDP, it's 43.6%.
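A quick sanity check on those percentages:

```python
measured = 28.37   # watts while encoding
prime_max = 40.0   # rounded Prime95 max from my first post
tdp = 65.0         # i3-2120 rated TDP

print(f"{prime_max / tdp * 100:.1f}")       # 61.5 -> Prime95 max as % of TDP
print(f"{measured / prime_max * 100:.1f}")  # 70.9 -> encoding as % of observed max
print(f"{measured / tdp * 100:.1f}")        # 43.6 -> encoding as % of TDP
```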
So I think it begs the question:
Does undervolting become less efficient (not scale) under lighter loads?
Is it perfectly acceptable that a ~60% load generates ~71% of maximum wattage?
I'm thinking I should reset my BIOS to defaults and see just how much juice my 2120 sucks up at full load, to use as a control.
As before, feedback, comments, and criticism are welcomed.
BTW, I did not start HWMonitor until after I started encoding, so the minimum values are for the current process only. And I'm really happy to be encoding a DVD at 12x the frame rate of the media.
Heh, sounds like something I had planned a while back: buy an ultra-low-wattage CPU, then underclock/undervolt it as much as possible to squeeze even more, erm... less electricity out of it. I was going to build a "green" machine out of it. Ah well, cheers to your efforts though!
A bit of performance playing COD MW3 (single player, Spec Ops, Resistance):
1920x1080, 4xAA, ambient occlusion and everything enabled (looking to put a load on the graphics card).
I ran AIDA64 logging to a .csv file while playing.
Then I sorted the cells by highest GPU load, column O:
What threw me off is how COD MW3 used the Hyper-Threaded thread of the second core (#4) but seemed to ignore the thread of the first one (#2). At no point during the 550 Ti's heaviest loads did the CPU appear to be under more than a 20% load. Since thread #2 was ignored, if I throw out that value, the heaviest load on the CPU was 27.33% when the GPU was at 59% load. However, looking at the multiplier, it seems SpeedStep had throttled the CPU down. The largest load on the GPU was 67% while the CPU was at 25.66%, again throwing out the 0% on thread #2; though again, SpeedStep had the multiplier throttled to 31x as opposed to 33x.
Then I sorted by CPU load, column F:
It seems obvious that the heaviest CPU loads came while the game was loading, and it would be ridiculous to call a 0% or 3% GPU load a CPU bottleneck. Since the Spec Ops scenario has instances where waves of attackers load in, I'm going to guess the timestamps within a few seconds of each other were those moments. Regardless, at no time did the CPU seem to be under such a load (34%, throwing out thread #2) that it would bottleneck the GPU.
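For anyone who'd rather not sort in Excel, the same column sort can be done in a few lines of Python. The column names and toy values below are stand-ins of mine, not AIDA64's actual headers:

```python
import csv
from io import StringIO

# Toy stand-in for the AIDA64 log; in practice, open the real .csv file.
log = StringIO("""time,cpu_load,gpu_load
0:01,34,3
0:02,27.33,59
0:03,25.66,67
""")

rows = list(csv.DictReader(log))
# Sort by GPU load, highest first - the equivalent of sorting column O.
rows.sort(key=lambda r: float(r["gpu_load"]), reverse=True)
for r in rows:
    print(r["gpu_load"], r["cpu_load"])
# prints:
# 67 25.66
# 59 27.33
# 3 34
```

Swapping the sort key to `r["cpu_load"]` gives the column-F sort from the second screenshot.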
I took a snapshot of the minimums and maximums, though the lowest values were logged just prior to gameplay:
I snipped this just to confirm all values were accounted for in the data I analyzed. I smirk at the 0 watts, but I'm pretty encouraged that the maximum was 28.17 W; looking at the CPU load data, that was at 1.064 V. I'll sort the whole log later to see what the watts were at 1.072 V.
So it really does not seem to HINDER gameplay to undervolt the i3 2120. I know I should have run Fraps during this, but the gameplay at no time lagged or stuttered, so I don't know if that's important ATM.
However, considering I am using a 550 Ti (a basic mid-entry gaming card), I really would like to see the difference with a 560 Ti, 570, or 6970, but that isn't in my budget.
Thanks phyco126 for your comment, and everyone for looking. Feedback welcomed.
Ran more appropriate benches in Battlefield: Bad Company 2, more of a stress on the GPU and, I hoped, the CPU also.
Settings (yeah, I'm a sissy with AA):
The raw .csv file from AIDA64; yes, I figured out how to convert a data column to a graph using Excel.
You can see when I started the game in relation to the row number, which helps show when things affect the CPU and GPU.
A more focused view of the charts. You can see the jump in the CPU when the game started, but then it leveled out at about 50% load, while the GPU jumps to 99% for most of the time.
The volt (max 1.072) and watt (max 39.2) load:
A snapshot of the minimums and maximums. Again, notice all four "cores" have a load on them, and we all know an i3 is a dual core with Hyper-Threading. I thought games didn't use Hyper-Threading.
I also ran Fraps to see the FPS. Towards the end I kamikaze'd to get as much action as I could. This was in the last mission, where you try to rush to the cockpit after finding the weapons room.
40-45 FPS with a Dell E2211H at 1080p. Yes, it could be better, but to my perception it seemed playable.
I was satisfied to get the 550 Ti maxed out and see the i3 2120 at 3.4 GHz come nowhere near choking, though I, along with anyone else, doubted it would. This really makes me anxious to see how a GTX 570 or HD 6970 would be handled.
Thank you for the info!
It's interesting to see that an i3 2120 is only at about 50% usage in Battlefield: Bad Company 2 at 1920x1080 with a GTX 550 Ti.
Not much use buying an i7, it seems. Even an i5 would be pretty useless.