Q6600 in an HTPC

chrssmale

Hi,
I've had this CPU (Q6600 G0 stepping) coupled with a P35 motherboard, an ATI 3850, 4 GB of RAM, ample storage (64 GB SSD, 2x 500 GB and 1x 1 TB hard drives), and DVB-T and DVB-S2 TV cards. I also have decent cooling from my old overclocking days.
This PC is now hooked up in my living room and is primarily used for watching and recording TV, watching video files, and the odd game of Street Fighter.
I think the CPU is overkill in this system, and for the majority of the time I have it underclocked and undervolted. But I am looking at reducing its running costs; the PC is running for most of the day (and sometimes night).
I'm not planning on updating the motherboard, but I am considering replacing the CPU with a dual core, possibly bought from eBay and maybe funded by selling my Q6600.
Would this make a noticeable difference to the annual running cost of the PC? Or would replacing the graphics card be a better solution?
Thanks in advance.
Chris
 
Going from the Q6600 (105W TDP) to an E6600 (65W TDP)
source - http://ark.intel.com/Product.aspx?id=29765
wouldn't make a drastic difference,
and I think keeping the Q6600 would be better.

Going from the HD 3850 to a lower-powered HD 5k or 6k card
would help a lot.

For example, look at the HD 6570:
http://www.tomshardware.com/reviews/radeon-hd-6570-radeon-hd-6670-turks,2925-15.html

Less power draw than an HD 5670 (I have an HD 5670 and it maxes out at 64W).
Remember, your HD 3850 has a 6-pin 75W power connector
plus the 75W from the PCI Express slot.

So going to a card like the HD 5670, 6570, or 6670 would definitely lower power.
You could even run an HD 5450 for home theater;
they use under 30W at max if I remember right.
 
There are three ways to decrease power consumption.

1. Replace the power supply (PSU) with one that is 80PLUS certified. That guarantees a minimum efficiency of 80% at 20%, 50%, and 100% load. PSUs that are not 80PLUS certified can be as inefficient as 60%, depending on the load. 80PLUS PSUs are rated from Bronze up to Platinum; 80PLUS Platinum PSUs are up to about 91% efficient.

Assuming your computer needs 150w in total when idle, a non-80PLUS PSU may draw as much as 250w from the AC outlet (150w / 60%) if it is only 60% efficient, or about 214w if it is 70% efficient (150w / 70%). An 80PLUS Bronze PSU that is 80% efficient will draw about 187.5w from the AC outlet (150w / 80%). An 80PLUS Platinum PSU would lower that to about 165w (150w / 91%). The excess power (beyond the 150w) is wasted, mostly as heat and a tiny, tiny bit as noise.
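If it helps, here's that arithmetic as a tiny Python sketch. The 150w load and the efficiency figures are just the assumed examples from above, not measurements of your actual system:

```python
# Sketch of the wall-draw arithmetic above. The 150 W load and the
# efficiency figures are illustrative assumptions, not measurements.

def wall_draw(dc_load_w, efficiency):
    """AC power pulled from the outlet to deliver dc_load_w to the PC."""
    return dc_load_w / efficiency

load = 150.0  # assumed total DC load in watts
for label, eff in [("non-80PLUS (60%)", 0.60),
                   ("non-80PLUS (70%)", 0.70),
                   ("80PLUS Bronze (80%)", 0.80),
                   ("80PLUS Platinum (91%)", 0.91)]:
    print(f"{label}: {wall_draw(load, eff):.1f} W at the wall")
```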

2. Replacing the CPU can help lower power consumption, but at the cost of performance. That said, based on what you want to do, the drop in performance should not be too noticeable. The Q6600 consumes about 36w of power while idling and 87w under load. An option would be to replace it with the dual-core E6600, which idles at about 21w and uses 52w under load.

[Chart: CPU power consumption under load]


From:
http://www.xbitlabs.com/articles/cpu/display/core2extreme-qx6700_11.html#sect0

[Chart: CPU power consumption]


From:
http://www.xbitlabs.com/articles/cpu/display/core2duo-shootout_11.html#sect0

Under load, the Q6600 consumes 35w more power than the E6600. However, watching a movie or recording a TV show will not max out the CPU, so it is unlikely the Q6600 will use 87w in your situation. Your HTPC will probably be idle a good deal of the time, and the power saving is only 15w at idle. While difficult to estimate, going from a Q6600 to an E6600 will probably save you an average of about 25w under mixed conditions.
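That mixed-conditions figure is just a duty-cycle weighting; here's a quick sketch. The 50/50 idle/load split is an assumption, so adjust it to match real usage:

```python
# Rough duty-cycle estimate of the average saving from a Q6600 -> E6600
# swap. Watt figures are the X-bit Labs numbers quoted above; the
# idle/load split is an assumption -- adjust it to your own usage.

idle_saving_w = 36 - 21   # 15 W saved at idle
load_saving_w = 87 - 52   # 35 W saved under full load

idle_fraction = 0.5       # assumed: half the time idle
load_fraction = 0.5       # assumed: half the time working

avg_saving = idle_saving_w * idle_fraction + load_saving_w * load_fraction
print(f"Estimated average saving: {avg_saving:.0f} W")  # 25 W with these guesses
```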

3. Replacing the video card can reduce power consumption and it is possible to have a video card that is just as powerful as the old video card, or even more powerful. First and foremost, do not buy a Radeon HD 6xxx series video card. You have an old motherboard that only has a PCI-e 1.x slot. All Radeon HD 6xxx series cards are PCI-e 2.1 cards. There can be compatibility issues inserting a PCI-e 2.1 card into a PCI-e 1.x slot. The initial Radeon HD 5xxx series cards were PCI-e 2.0 cards (no compatibility issues with a PCI-e 1.x slot), but later Radeon HD 5xxx series cards were PCI-e 2.1 cards.

I would recommend a Radeon card over a GeForce card because Radeons generally consume less power. I would also recommend a Radeon HD 5xxx series card (PCI-e 2.0, of course) over the older Radeon HD 4xxx series because of the lower power consumption. The Radeon HD 3850 uses about 63w of power. As a point of reference, the HD 3850 is less powerful than the GeForce 9600GT; it is probably about as powerful as the Radeon HD 4670.

[Chart: video card power consumption]


I recommend buying a PCI-e 2.0 HD 5670. It is more powerful than the GeForce 9600GT, which means it is more powerful than the HD 3850, and it consumes only about 29w of power. However, since you do not seem to play games very often, you may want to consider the slower HD 5570 instead, which should be roughly equal to the HD 3850, probably a little slower. The HD 5570 consumes roughly 20w of power from what I remember.

[Chart: HD 5670 power consumption]


From:
http://www.xbitlabs.com/articles/graphics/display/axle-radeon-hd5670-1gb_3.html#sect0

Just be aware that those maximum power consumption figures only occur when playing games. Watching a movie will not stress the video card, so under normal conditions you are not going to see that 34w power saving with the HD 5670 (43w for the HD 5570). The HD 3850 uses about 21w in 2D/HD video conditions, while the HD 5670 uses about 16w under the same conditions.

Here is a PCI-e 2.0 HD 5670 for $63 after rebate:
http://www.newegg.com/Product/Product.aspx?Item=N82E16814102870&cm_re=hd_5670-_-14-102-870-_-Product

Unfortunately, the only PCI-e 2.0 HD 5570 on Newegg is selling for $80, and it does not make sense to buy a less powerful card for more money. Perhaps you can find a similar card elsewhere.
http://www.newegg.com/Product/Product.aspx?Item=N82E16814103177&cm_re=hd_5570-_-14-103-177-_-Product


===========================================================

Clearly the best way to lower power consumption is to replace the power supply if it is inefficient. It is difficult to determine exactly how inefficient or efficient your PSU is unless you can find a professional review about it that goes into that type of detail.

Replacing the CPU with a dual-core CPU will probably be the best choice if you already have an efficient PSU, but my guess is that on average you may only save about 25w under mixed conditions. If you play games or encode video, you will save more power.

The video card doesn't really provide much power savings unless you play a lot of games. The difference between the HD 3850 and HD 5670 in 2D / HD video playback is around 5w.
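Since the original question was about annual running cost, here's a quick sketch of how a watt saving translates into money. The hours per day and the electricity price are assumptions, so plug in your own tariff:

```python
# Converting a power saving into annual running cost. Hours per day and
# price per kWh are assumptions -- plug in your own usage and tariff.

def annual_cost(saving_w, hours_per_day, price_per_kwh):
    kwh_per_year = saving_w * hours_per_day * 365 / 1000
    return kwh_per_year * price_per_kwh

# Example: 25 W saved, 18 hours/day, $0.15 per kWh (all assumed figures)
print(f"${annual_cost(25, 18, 0.15):.2f} per year")  # ~$24.64
```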
 

puttsy

It is likely that the money you spend attempting to decrease the system's energy footprint will be higher than if you simply kept the Q6600. Graphics may be a place to change; I'm not sure which "light" video cards will still keep games running. What program(s) do you primarily use on this media center? I usually just use Windows Media Center due to its tight integration and the fact that it "just works" with most cards, but let us know if you use something different. VLC is also decent, but it isn't a really 10-foot-friendly interface, so it isn't huge in the HTPC arena.

It is also possible your board won't support more efficient processors. Look into 45nm LGA 775 processors and see if your board will even support them. I have a Q6600, E6600, and E6750, and none of them draw all that much more power than my E8200 (45nm, 65W TDP vs. the Q6600's 65nm, 105W). I don't notice a difference in my electricity bill, and over a longer period on a Kill-a-watt it averages out to a negligible difference. Even in the same system, the E8200 and E6750 were not far enough apart for the electricity savings to justify the purchase cost.

Looks like graphics would be your choice then. I can't speak to that end, though. Good luck! The GeForce 210s are okay units, but if you want glitch-free watching, look at something a bit better. I have one and it doesn't even keep up at 1680 x 1050 playing 720p video, so... your call.
 
Anyone know how disabling cores through Windows affects power usage? I'm in a similar situation... looking to relegate my Q9650 from my desktop to other duties, possibly an HTPC. A 3.0GHz C2Q is overkill, of course. I was wondering whether, if I disabled two of the cores (effectively making it an E8400) and ran it underclocked at 2.0GHz, it would be more appropriate. I'm thinking more about heat output than power usage, though the two are linked.

In short, if you disable two of the cores in your 105W Q6600, will it be in the same power envelope as a 65W E6600? Do the disabled cores still suck up power and generate heat?

I've got a kill-a-watt, I might do some experimenting with my Q9650 when I get some time.
 


Changing to a newer video card like the HD 5670 I mentioned above would only be beneficial if the OP plays games a lot. The Radeon HD 3850 consumes 63.1w of power under gaming conditions compared to 28.7w for the HD 5670. That's a significant difference.

When watching a movie or video the HD 3850 would consume 20.1w of power, while the HD 5670 would consume 15.5w. A small difference of only 4.6w.

If the video card is idling, then the HD 3850 consumes 13.5w of power vs. 8w for the HD 5670. A small difference of only 5.5w.
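To roll those three scenarios into a single number, the same duty-cycle weighting applies. A sketch with assumed time splits for a typical HTPC day:

```python
# Weighted-average saving for an HD 3850 -> HD 5670 swap, using the
# scenario figures above. The time splits are assumed fractions for a
# typical HTPC day -- change them to match real usage.

savings_w = {"gaming": 63.1 - 28.7,   # 34.4 W
             "video":  20.1 - 15.5,   #  4.6 W
             "idle":   13.5 - 8.0}    #  5.5 W
splits    = {"gaming": 0.05, "video": 0.45, "idle": 0.50}  # assumed

avg = sum(savings_w[k] * splits[k] for k in savings_w)
print(f"Average saving: {avg:.1f} W")  # ~6.5 W with these splits
```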

If you want some gaming and low power then get a 5750 or 6750 and underclock/undervolt it.

The HD 5750 consumes 58.7w of power under typical gaming conditions vs. 63.1w for the HD 3850. A good choice if you want more performance in games, but a game like Street Fighter does not need a video card that powerful, and the energy savings are small.

The OP may have compatibility issues with a Radeon HD 6750, since the old motherboard only has a PCI-e 1.0 slot and all Radeon HD 6xxx video cards are PCI-e 2.1 cards. Some people have had issues installing a PCI-e 2.1 video card in a PCI-e 1.x slot.
 


Even if it is possible to "disable" half the cores, those disabled cores will still consume electricity and generate heat since they are still drawing power from the socket.

The only way to stop those from drawing power is to physically cut the circuitry inside the CPU from the other cores and the power pins which supplies the power from the socket.

Your best bet is to simply underclock your CPU.
 
I think I will still try the experiment... I'm curious about the outcome, and no tech sites have done it AFAIK. I'll see how 1, 2, 3, and all 4 cores scale in power usage and temperature when stressed. It should be similar to AMD's X2 and X3 chips that have disabled cores but use less power. I'll post back when I get some data.
 
Yes, I would underclock/undervolt regardless. The Q9650 is too hot for my HTPC location.

The reduction of core count occurred to me when a suggestion was offered that the OP get an E6600 to replace the Q6600, and my brain was thinking: the Q6600 is just two E6600s stuck together... he's already got two E6600s... why spend $$$? :lol:

Anyway, I'm curious, especially after looking at AMD's offerings. For example, the Phenom II X4 955 3.2GHz has a TDP of 125W, while the Phenom II X2 555 3.2GHz is the exact same Deneb chip, just with two cores and L2 disabled, and has a TDP of 80W.

If it doesn't do anything, no harm done. I'm just curious how power draw is affected. Actually, I'm waffling on retiring my Q9650. I can't really justify to the wife that I need to upgrade. :D
 

chrssmale

Sorry for the delayed reply guys. I’m currently offshore on an oil rig in the North Sea and the net was down all last night.
All I can say is wow, you guys are awesome, I wasn’t expecting this many comments, and graphs!!! You guys rock. Thanks very much for all the replies, just the info I was hoping for.
Off the top of my head the mobo is an MSI P35 Neo-F, so the PCI-e is definitely 1.0. As for the PSU, I can't remember the exact model; it's a Coolermaster 450w modular one, about 4 years old. I think that should be OK for now, though a more energy-efficient one may be planned for the future. I've managed to clock the CPU down to around 1800 MHz, and the RAM timings and voltages are as low as I can stably get them.
So for now I'll think about the GFX card. A mate has a 6750 (yeah, AMD fan, ever since the awesome 9500 Pro mod ;) ), so I'll see if I can borrow that one.
Once again, many thanks for all the advice.
Cheers,
Chris
 
OK, I satisfied my curiosity. I had some time to play around with my C2Q to see what happens if you disable cores. For my experiment I disabled fan control so the fans ran at a constant RPM, to reduce temperature and power-draw variability; load readings were taken after running Prime95 for 10 minutes; watts were measured at the wall (whole system, not just CPU) using a kill-a-watt meter; and temps are the reported CPU temp from SpeedFan. Here's what I got:

Idle
1 core: 113W, 39°C
2 cores: 113W, 39°C
3 cores: 114W, 39°C
4 cores: 114W, 39°C

Load
1 core: 141W, 52°C
2 cores: 161W, 52°C
3 cores: 178W, 51°C
4 cores: 186W, 52°C

So it looks like the dormant cores suck power whether disabled or not, since at idle everything was equal. Under load, however, power usage scales with the number of active cores. But my CPU temps didn't change at all, which I found strange. Less power should equal less heat.
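For anyone skimming, the per-core increments from those load readings work out as below (remember these are whole-system wall figures, not isolated CPU draw):

```python
# Incremental wall power per extra active core, from the kill-a-watt
# load readings above (whole-system measurements, not CPU-only).

load_watts = {1: 141, 2: 161, 3: 178, 4: 186}
for n in range(2, 5):
    print(f"{n - 1} -> {n} cores: +{load_watts[n] - load_watts[n - 1]} W")
# 1 -> 2: +20 W, 2 -> 3: +17 W, 3 -> 4: +8 W
```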

The only possible explanation I can offer is that perhaps my load temps are all at equilibrium with the case temperature... basically rock bottom, going no lower since I have a massive HDT cooler. During load testing the system temp read 46 to 48°C in all tests, and perhaps my HSF is capable of cooling down to 6-8°C above ambient case temp. I wonder what would happen with a less massive HSF that would struggle more with all cores blazing away? Maybe the stock cooler would show a bigger delta.

Interesting. Doesn't help the OP much though.
 

chrssmale

LOL, doesn't help me at all, rwpritchett. But definitely interesting, mate. I would have expected a more linear scale as each core is activated: a 20W difference between 1 and 2 cores, yet only 8W between 3 and 4 cores.
I wonder how AMD's quads would fare?
 

axipher

I'm running an old slimline Acer on its side with a 220W power supply, a Q6600, and an ATI 5670, with 2 HDDs and a BD drive. It's been on 24/7 for about 6 months now, never gets over 60 degrees, runs any video like a champ, and acts as a file/backup server for a house with 5 students.

As long as you're not running stress tests on it, you shouldn't have to worry about switching to a lower-power processor.