How does the Intel Q6600 quad-core hold up to today's CPUs?

August 25, 2013 8:16:36 AM

Hello

How well does an Intel Q6600 quad-core overclocked to 2.9 GHz (3) compare to today's CPUs? Is it still a viable CPU?
August 25, 2013 8:23:34 AM

Tom's did a comparison of some Core 2 CPUs to the Ivy Bridge stuff a while ago. The Q6600 isn't there, but overclocked it's probably close to the Q9550's performance.
August 25, 2013 8:24:33 AM

It really depends what you want to do with it.

It's pretty outdated. I used to have a Q6600 in my old office machine, and it had problems when it came to games, and with PS2 emulation too.

It would do OK with a GTX 460 or something like that, but if you paired a GTX 780 with a Q6600, the CPU would be a massive bottleneck.

I would look for another CPU if I were you, just because that CPU is pretty outdated (it was good in its prime, trust me).
August 25, 2013 8:25:05 AM

ACTechy said:
I mean obviously it's outdated and it's not going to get near the performance of Intel's modern quad-cores, but it's still a decent CPU...

Here's how it stacks up next to a middle-of-the-road Core i5:

http://cpuboss.com/cpus/Intel-Core2-Quad-Q6600-vs-Intel...

CPU Boss is unreliable.
August 25, 2013 8:52:20 AM

Sakkura said:
ACTechy said:
I mean obviously it's outdated and it's not going to get near the performance of Intel's modern quad-cores, but it's still a decent CPU...

Here's how it stacks up next to a middle-of-the-road Core i5:

http://cpuboss.com/cpus/Intel-Core2-Quad-Q6600-vs-Intel...

CPU Boss is unreliable.


That's a pretty blanket statement. I don't think the comparison charts are debatable; they're facts. If you're referring to their conclusions, I'm not arguing.
August 25, 2013 8:55:35 AM

I read the article, and the Q9550 didn't do too badly.
Why do you feel that CPU Boss is unreliable?
August 25, 2013 9:04:20 AM

g335 said:
I read the article, and the Q9550 didn't do too badly.
Why do you feel that CPU Boss is unreliable?

They only use synthetic benchmarks, with no real-world situations like encoding a video in HandBrake, gaming, or even compressing a file.
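
For illustration, here's a minimal sketch of the kind of real-world test that says more than a synthetic score, assuming Python; the file name is just a placeholder for any large file you have lying around:

```python
# Hypothetical real-world micro-benchmark: time an actual file
# compression instead of reading off a synthetic score.
import time
import zlib

def time_compression(path, level=6):
    """Return (seconds taken, compression ratio) for zlib-compressing the file."""
    with open(path, "rb") as f:
        data = f.read()
    start = time.perf_counter()
    compressed = zlib.compress(data, level)
    elapsed = time.perf_counter() - start
    return elapsed, len(compressed) / len(data)

if __name__ == "__main__":
    # "big_test_file.bin" is a placeholder; substitute any large file.
    seconds, ratio = time_compression("big_test_file.bin")
    print(f"Compressed in {seconds:.2f} s (ratio {ratio:.2%})")
```

Run the same script on two machines and you get a number that reflects an actual workload, which is the point being made about synthetic suites.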
August 25, 2013 9:14:56 AM

ACTechy said:
That's a pretty blanket statement. I don't think the comparison charts are debatable; they're facts. If you're referring to their conclusions, I'm not arguing.

The comparison charts are also highly unreliable. They compare irrelevant things, include incorrect information, and use only synthetic benchmarks.
August 25, 2013 9:35:29 AM

Sakkura said:
ACTechy said:
That's a pretty blanket statement. I don't think the comparison charts are debatable; they're facts. If you're referring to their conclusions, I'm not arguing.

The comparison charts are also highly unreliable. They compare irrelevant things, include incorrect information, and use only synthetic benchmarks.


How are clock speed, L2 cache, power consumption, multipliers, and other pertinent architecture details irrelevant? And please do show where there is incorrect information in the link shared.

Like I said, I'm not arguing for their conclusions and point system, but you don't throw the baby out with the bathwater.
August 25, 2013 9:51:24 AM

You can't compare clock speed unless it's the same architecture; 1 GHz can beat 2 GHz, so throw that out. Even in the CPUs you linked to, the i5 has 1 MB of L2 versus the C2Q's 8 MB, and the 1 MB chip wins, so let's throw that out too. TDP is not power consumption, so that's misleading info. The multiplier is irrelevant; it's just one part of the equation that determines CPU speed. Other architecture info is irrelevant as well: really, who cares how many transistors you have?

In the end, real-world performance for the price you're paying is all that matters. Throw the spec sheet out the window. Synthetics can give you an idea of performance, but different software has different workloads, and even two multithreaded programs will perform differently. Take Photoshop and Premiere, for example: in the other link posted, the Q9550 beats an i3 in Photoshop by a good amount but is slower than (almost equal to) it in Premiere.
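
To put the clock-speed point in concrete terms, here's a toy Python sketch; the IPC figures are invented purely for illustration, not measured:

```python
# Toy illustration: effective throughput ~ IPC x clock speed.
# The IPC values below are made up to show the principle.
def throughput_gips(ipc, clock_ghz):
    """Billions of instructions retired per second."""
    return ipc * clock_ghz

old_chip = throughput_gips(ipc=1.0, clock_ghz=2.0)  # hypothetical 2 GHz, low-IPC CPU
new_chip = throughput_gips(ipc=2.5, clock_ghz=1.0)  # hypothetical 1 GHz, high-IPC CPU

print(f"2 GHz low-IPC chip:  {old_chip:.1f} GIPS")
print(f"1 GHz high-IPC chip: {new_chip:.1f} GIPS")
# The 1 GHz chip wins despite half the clock, which is why raw clock
# speed can't be compared across different architectures.
```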
August 25, 2013 10:19:52 AM

ACTechy said:
How are clock speed, L2 cache, power consumption, multipliers, and other pertinent architecture details irrelevant? And please do show where there is incorrect information in the link shared.

Like I said, I'm not arguing for their conclusions and point system, but you don't throw the baby out with the bathwater.

The clock speed is irrelevant because the amount of work done per clock cycle varies dramatically. L2 cache is not irrelevant, but in this case it's a meaningless comparison because the Q6600, unlike the Core i5-3450, has no L3 cache, and cache bandwidth and latencies can again vary dramatically. I can guarantee that the cache in the Core i5 is superior, yet CPU Boss makes it seem like the Q6600 has a huge advantage there. The multiplier is even more irrelevant than the clock speed, because what the multiplier is multiplying can vary: the Core i5-3450 has a 100 MHz BCLK, while the Q6600 has a 1066 MHz FSB. Which brings us to one huge difference between the two: the FSB doesn't exist on a Core i5-3450. It's one of the largest steps forward from the Q6600 to the Core i5-3450, yet CPU Boss doesn't even mention it in passing.

But I'm going to be generous here and only say that CPU Boss is utterly terrible and useless.
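
To make the multiplier point concrete, here's the arithmetic as a short Python sketch, assuming the stock values: a 9x multiplier on the Q6600's 266.67 MHz FSB base clock (quad-pumped to the quoted 1066 MHz), and a 31x multiplier on the i5-3450's 100 MHz BCLK:

```python
# Core clock = multiplier x base clock, but the base differs per platform.
def core_clock_mhz(multiplier, base_mhz):
    return multiplier * base_mhz

q6600   = core_clock_mhz(multiplier=9,  base_mhz=266.67)  # FSB base clock
i5_3450 = core_clock_mhz(multiplier=31, base_mhz=100.0)   # BCLK

print(f"Q6600:    9 x 266.67 MHz = {q6600:.0f} MHz")    # ~2400 MHz stock
print(f"i5-3450: 31 x 100.00 MHz = {i5_3450:.0f} MHz")  # 3100 MHz stock
# The i5's multiplier is over three times higher, yet that alone tells
# you nothing: the Q6600's base clock is 2.67x faster. Only the product
# matters, and even that ignores per-clock performance.
```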
August 25, 2013 10:24:04 AM

haha touché, gents

Best solution

August 27, 2013 8:31:09 PM

Well, I currently use a Q6600 and am a sucker for the most demanding games.

My specs:
Q6600 @ 3.33 GHz
MSI GTX 560 Ti Twin Frozr 2 GB
GA-EP35-DS3P mobo
4 GB DDR2-800 RAM @ ~740 MHz due to the OC

Crysis 3 is punishing my CPU. It's the first game to do it to the extent that I'm thinking "I need to upgrade soon." Of course I realise that every single game I own would benefit from an upgrade to a 4670K, but with that comes around $500 in costs to replace the CPU, mobo, and RAM. I've got a pretty sweet custom graphics setting going for Crysis 3 that has me averaging around 40-45 fps, with drops to around 30. It looks fantastic; I can barely tell the difference between it and the max preset. Most games have been able to utilize the GPU enough to sit at 99% usage nearly constantly, but due to the CPU bottleneck in Crysis 3 it sits around 85% at minimum, bouncing up to 99% within that range.

Thankfully, Crysis 3 is going to be about as demanding as it gets for the next year or so. Battlefield 4 might be more taxing, but it's only going to be marginally more demanding (if it isn't in fact around the same or less). So I would expect to get at least until the middle of next year before I need to upgrade, and even then it would solely be so I don't have to play games at less than the second-highest preset.

The difference between 2.4 GHz stock and my 3.33 GHz OC doesn't actually change performance that much either. Sometimes when I'd mess around in my BIOS I'd forget to re-apply the OC and end up playing Far Cry 3 or something on the stock clock: not much difference, maybe 10-15 fps.
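
For anyone who wants to check for a CPU bottleneck the same way, here's a rough sketch assuming an NVIDIA card with nvidia-smi on the PATH; sustained readings well below 99% during gameplay usually mean the GPU is waiting on the CPU:

```python
# Poll GPU utilization once a second via nvidia-smi; stop with Ctrl+C.
# Sustained values well below 99% under load suggest a CPU bottleneck.
import subprocess
import time

while True:
    out = subprocess.check_output([
        "nvidia-smi",
        "--query-gpu=utilization.gpu",
        "--format=csv,noheader,nounits",
    ])
    print(f"GPU utilization: {out.decode().strip()}%")
    time.sleep(1)
```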