E8600 beats out Q9550? Help quickly!!! Need to send back.

November 15, 2008 7:41:32 AM

This can't be so, is it? According to this benchmark http://www.tomshardware.com/charts/desktop-cpu-charts-q... the E8600 is way better than the Q9550 that I just bought for more money.

I can only assume it's because the E8600 is at 3.33GHz and the Q9550 is at 2.83GHz, and Crysis doesn't utilize more than two cores.

So if I OC mine to 3.4GHz, which I have it at right now stable, does that mean mine is now better and would perform better on that benchmark? Please tell me it is, or just tell me the truth quickly so I can send it back and get the E8600 instead. Hurry, I only have a few days left to send it back to Newegg.

What I am trying to figure out is: is it just the GHz that is making the difference? If it is, then I guess I will keep it for future quad-core games and keep it OCed to 3.4GHz.

Please let me know right away!
November 15, 2008 8:07:18 AM

http://www.newegg.com/Product/Product.aspx?Item=N82E168...

http://www.newegg.com/Product/Product.aspx?Item=N82E168...


Gee, $50 difference. That's okay, enthusiasts aren't supposed to be very price sensitive.



Have you seen the highly clocked E4300s take on processors worth four times as much?

Have you considered the fact that you get two more cores for that money?

Further, it's not hard to figure out whether it's the GHz or something else, even without using Google.
Clock your CPU to 3.33GHz and see what happens under the same conditions.

I'm amazed you can overclock, considering your logical ineptitude.
November 15, 2008 5:15:49 PM

The thing is, yeah, it was about a $70 difference in price, and when I went from an E6600 to the Q9550 I only noticed a slight performance increase, and when I OCed it to 3.4GHz I didn't notice any difference at all. So why pay for something that you don't use or don't see a difference with? That's all I am saying, but if you guys say no, I should be seeing a significant difference with the Q9550, then I may have a broken one and will just replace it. I like getting what I pay for and not just paying for a name, and so far I have only paid for the name, or so I feel.
November 15, 2008 5:28:35 PM

My initial reaction is... "overreaction". But that's just me. I think you're freaking out over nothing, personally.

It's your money, but I'd never spend that much on a single component. Then again, that's because I can't afford to blow that much on one component either.

Dual-cores typically beat out quads in games right now. Period. That's because games aren't designed well enough for multi-core CPUs yet. Crysis isn't always the best benchmark either, as it stresses every part of the system.

How big a difference you see going from your E6600 to a Q9550 isn't entirely CPU related. Memory bandwidth, video card capabilities, motherboard chipset, and the software you're using all play big roles in whether you can unleash the full abilities of that Q9550 in the first place.

It all depends on what you're doing: gaming, video editing, etc. But right now, high-clock-speed dual-cores typically beat out quads in most games. That's just the way it is so far.

Give it a couple of years: dual-cores will be bargain bin, quads will be mainstream, and octo or whatever we get to will be the high-end stuff. ;0
November 15, 2008 5:34:54 PM

jerreece said:
My initial reaction is... "overreaction". But that's just me. I think you're freaking out over nothing, personally.

It's your money, but I'd never spend that much on a single component. Then again, that's because I can't afford to blow that much on one component either.

Dual-cores typically beat out quads in games right now. Period. That's because games aren't designed well enough for multi-core CPUs yet. Crysis isn't always the best benchmark either, as it stresses every part of the system.

How big a difference you see going from your E6600 to a Q9550 isn't entirely CPU related. Memory bandwidth, video card capabilities, motherboard chipset, and the software you're using all play big roles in whether you can unleash the full abilities of that Q9550 in the first place.

It all depends on what you're doing: gaming, video editing, etc. But right now, high-clock-speed dual-cores typically beat out quads in most games. That's just the way it is so far.

Give it a couple of years: dual-cores will be bargain bin, quads will be mainstream, and octo or whatever we get to will be the high-end stuff. ;0


So all I do right now is pretty much play games. Do you think I should send it back and get the E8600 and some money back? Honestly, will I see a bit of improvement in gaming performance going down to the E8600?
November 15, 2008 5:40:56 PM

Intel processors within the same architecture scale pretty predictably:
higher clock = better frames, but that only holds in games or apps that use two cores or fewer.
The E8600 is at 3.33GHz versus your 2.83GHz, so yes, it will perform better. If you overclock your quad to 3.33GHz, you'll be about the same.

Also, I wouldn't use that Crysis benchmark as a suitable test. For one, Crysis is limited by GPU power before your processor. Seriously, the GPU comes first, and by the time dual vs. quad matters you're already pulling crazy frame rates, so it doesn't matter. I usually get 230+ fps in CoD4 unless there's a lot of action. But 210 or 230, it's not like it matters. You think you can tell the difference of 20 frames?

And Habitat... I've seen the Q9550 stable at 4GHz. Most do it on water for temperature reasons, but I've seen many builds with the Q9550 at or just under 4GHz with a good air cooler.

Now I know I'll get flamed here, but it's the truth.
Some games are moving toward quad utilization. Not soon, but it will definitely happen. And when it does, your 2.83GHz quad will win or be about the same as a 4.5GHz dual, and once you overclock it you'll win.
November 15, 2008 5:49:35 PM

Silverion77 said:
Intel processors within the same architecture scale pretty predictably:
higher clock = better frames, but that only holds in games or apps that use two cores or fewer.
The E8600 is at 3.33GHz versus your 2.83GHz, so yes, it will perform better. If you overclock your quad to 3.33GHz, you'll be about the same.

Also, I wouldn't use that Crysis benchmark as a suitable test. For one, Crysis is limited by GPU power before your processor. Seriously, the GPU comes first, and by the time dual vs. quad matters you're already pulling crazy frame rates, so it doesn't matter. I usually get 230+ fps in CoD4 unless there's a lot of action. But 210 or 230, it's not like it matters. You think you can tell the difference of 20 frames?

And Habitat... I've seen the Q9550 stable at 4GHz. Most do it on water for temperature reasons, but I've seen many builds with the Q9550 at or just under 4GHz with a good air cooler.

Now I know I'll get flamed here, but it's the truth.
Some games are moving toward quad utilization. Not soon, but it will definitely happen. And when it does, your 2.83GHz quad will win or be about the same as a 4.5GHz dual, and once you overclock it you'll win.


Isn't there anything to say about the 12MB cache on the Q9550, or does that also not really matter compared to the 6MB on the E8600?

So my Q9550 at 3.4GHz is equivalent, if not better, in game performance than the stock E8600 at 3.33GHz?

Also, keeping it at 3.4GHz, I wouldn't notice any difference in game performance by getting the E8600 at 3.33GHz?
November 15, 2008 5:53:29 PM

At the moment, the Intels go by clock speed in games (and architecture, but we're only talking about Penryn here).

A dual at 3.33GHz will do about the same as a quad at 3.33GHz.
The cache helps but doesn't make an amazing difference.

The only reason I'd change, really, is if you want to overclock the E8600. They OC to around 4-4.5GHz, which is out of reach of the quads.
But if you got the E8600 and left it stock, there would be no difference between the stock 3.33GHz dual and the OCed quad at 3.33GHz.
November 15, 2008 5:58:17 PM

I'll lay it out simply using the numbers:
Q9550: stock 2.83GHz, OCed 3.4GHz
E8600: stock 3.33GHz, OCed 4.0GHz+

Q9550 2.83 stock < E8600 3.33 stock (because 3.33 is more than 2.83)
E8600 3.33 stock ~= Q9550 3.4 OC (about the same, with a slight edge to the Q9550)
Q9550 3.4 OC < E8600 4.0+ OC

Now keep in mind that this is for dual-threaded games only, which is where we are now. In any quad-utilizing game your Q9550 will win.
All I can say is I have a Q9550 and love it. I average around 50 fps in Crysis. Indoors I get around 70-100 fps (don't ask how) and outdoors I drop to 30-50 (the 30 is in really intense environments).
On some maps I only average 30, but I just turn off my lights and it looks fine.
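The rule of thumb in the comparison above (for games that use at most two cores, Penryn chips perform roughly in proportion to clock speed) can be sketched as a toy model. This is purely illustrative, not real benchmark data:

```python
# Toy model: for a game using `game_threads` threads, extra cores beyond
# what the game uses contribute nothing; performance tracks clock speed.

def effective_game_clock(clock_ghz, cores, game_threads=2):
    """Effective clock for a game that uses `game_threads` threads."""
    usable_cores = min(cores, game_threads)
    if usable_cores >= game_threads:
        return clock_ghz
    # Fewer cores than threads: scale down proportionally (crude model).
    return clock_ghz * usable_cores / game_threads

q9550_stock = effective_game_clock(2.83, cores=4)
e8600_stock = effective_game_clock(3.33, cores=2)
q9550_oc = effective_game_clock(3.40, cores=4)

assert q9550_stock < e8600_stock  # stock quad loses to stock dual
assert q9550_oc > e8600_stock     # OCed quad edges out the stock dual
```

The model just restates the post's point: in a two-threaded game, the quad's extra cores sit idle, so only clock speed separates the chips.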
November 15, 2008 6:03:26 PM

Thanks, Silverion77. Have you ever gotten into Flight Simulator X? I'm curious how your 9800GX2 with your Q9550 is doing with that game. They say FSX is a fully quad-utilizing game and needs it.

Also, I see you have made the choice of the Q9550 over the E8600. What clocks are you reaching with your CPU?
November 15, 2008 6:08:21 PM

Sadly I haven't tried OCing yet... I know it's pathetic :( 
But I haven't had a lot of time lately.

Also, I only have a stock cooler, which isn't doing too hot right now (lol, didn't mean the pun).
I'm either going to get a nice air cooler, or, since I'm doing some builds soon, if I can make some cash I'm going to see if I can go H2O. But that's because I'm crazy and OCD.
With a good air cooler, on Xtremesystems.com I generally see good OCs in the 3.6-3.85GHz range, and some going to 4.0GHz.

I haven't tried Flight Sim X yet... never really got into those, but I may try it.
November 15, 2008 6:12:46 PM

Silverion77 said:
Sadly I haven't tried OCing yet... I know it's pathetic :( 
But I haven't had a lot of time lately.

Also, I only have a stock cooler, which isn't doing too hot right now (lol, didn't mean the pun).
I'm either going to get a nice air cooler, or, since I'm doing some builds soon, if I can make some cash I'm going to see if I can go H2O. But that's because I'm crazy and OCD.
With a good air cooler, on Xtremesystems.com I generally see good OCs in the 3.6-3.85GHz range, and some going to 4.0GHz.

I haven't tried Flight Sim X yet... never really got into those, but I may try it.



I am looking for a new GPU. What are your thoughts on the GTX 280? Should I wait and save for that, or just get the GX2 at a significantly lower price?
November 15, 2008 6:24:52 PM

I personally do not like the price/performance of the 280.

I'd either get the new GTX 260 Core 216 (the remade one) or the HD 4870.
I got the GX2 because it had great price/performance, beating the GTX 280 in all benchmarks of the games I played. That changed with newer drivers, but I'm still happy with it.
November 15, 2008 6:36:46 PM

Spitfire7 said:
Isn't there anything to say about the 12MB cache on the Q9550, or does that also not really matter compared to the 6MB on the E8600?

So my Q9550 at 3.4GHz is equivalent, if not better, in game performance than the stock E8600 at 3.33GHz?

Also, keeping it at 3.4GHz, I wouldn't notice any difference in game performance by getting the E8600 at 3.33GHz?


The 12MB cache vs. 6MB cache really doesn't matter with existing games. Basically, the 6MB cache in the E8600 is shared by two cores. The 12MB cache in a Q9550 is used by four cores: two share one 6MB half, and the other two share the other 6MB half.

So if the game only uses a maximum of TWO cores, you're still only using 6MB of cache.

Silverion77 is right, though. The future is quad+. If you like to keep background tasks running (antivirus, Fraps, etc.), the quad might give you a small edge there, if those apps are actually doing something. Not that I recommend running an antivirus scan while playing a game!! Your hard drive would limit you there. ;) 
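The cache argument above can be put in numbers. A minimal sketch, assuming the split described in the post (each dual-core die on the quad gets its own 6MB of L2, and a two-threaded game scheduled onto one die can only touch that die's half); this is an illustrative model, not a hardware simulation:

```python
# Illustrative model of L2 cache visibility on 2008-era Core 2 chips.
# Each entry: (total L2 in MB, number of dies the L2 is split across).
cpus = {
    "E8600": (6, 1),   # one die, 6MB shared by both cores
    "Q9550": (12, 2),  # two dual-core dies, 6MB per die
}

def l2_visible_to_two_thread_game(cpu):
    """L2 a two-threaded game can use when its threads sit on one die."""
    total_mb, dies = cpus[cpu]
    return total_mb / dies  # only that die's share is reachable

assert l2_visible_to_two_thread_game("E8600") == 6.0
assert l2_visible_to_two_thread_game("Q9550") == 6.0  # same as the dual
```

Per this model, a game that only uses two cores sees 6MB of L2 on either chip, which is the post's point about the 12MB figure being misleading for gaming.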
November 15, 2008 6:58:09 PM

Let it eat.

You won't get back enough cash, after the restocking fee, to be worth the aggravation. Later you will be happy you had the quad.
November 15, 2008 7:05:00 PM

Quote:
Have you seen the highly clocked E4300s take on processors worth four times as much?


E4300s have an overclocking cap of around 3.4GHz, and that's with watercooling.

Quote:
2. The quad can't do 4.0GHz easily on stock or any voltage, stable, for that matter. Let alone put up a fight with a 4.5GHz dual below 1.4V. I don't even want to mention the 5.0+GHz I've seen on air; that's just brutal.


That's a lie. I've seen the Q9550 [E0] hit 4.2GHz on air with a certain batch of processors... the guy put up benchmarks, Prime, Orthos and all that; he was 110% stable @ 1.35V.

Which also makes the purchase of the Q9650 very questionable... the Q9550 can OC pretty much just as high even without the Q9650's extra 0.5x multiplier, albeit without the guarantee of an E0 stepping.
November 15, 2008 7:09:40 PM

Alright, so I should keep it and see if I can get her up to about 3.8 to 4GHz. Let's see. Hey, if I am at 3.4GHz right now at 1.35V, what do I make my voltage for 3.8 to 4GHz?
November 15, 2008 7:11:12 PM

No way to tell...

Just keep OCing the way you would:
raise the frequency, watch your temps, and if you crash, raise the vCore.
November 15, 2008 7:46:56 PM

OK, will do. Thanks. Hey Silverion, is there really even a noticeable reason to go higher? I mean, I can crank this thing up to its max limit, but is there even a point in doing that other than showing off a benchmark number that doesn't hold for everyday gameplay? Should I really even do it?
November 15, 2008 8:00:16 PM

Not really.

Wait until you need the extra cycles for something specific. Based on your original question, I have to assume that you have limited OCing experience. You will shorten the life of the CPU at extremes, possibly to zero.

Don't fix it if it ain't broke.
November 15, 2008 8:12:46 PM

Zorg said:
Not really.

Wait until you need the extra cycles for something specific. Based on your original question, I have to assume that you have limited OCing experience. You will shorten the life of the CPU at extremes, possibly to zero.

Don't fix it if it ain't broke.


Zorg,

Yes, you are right. I have only OCed my GPU and my previous E6600 (from 2.4 to 3.2GHz, successfully and stable). So I would still say I am new to it and cross my fingers when doing it.

So you suggest just going back to the stock 2.83GHz and walking away? I do have it stable at 3.4GHz without any issues, but like I said before, there's not really any difference, to be honest.
November 15, 2008 8:19:32 PM

If you have it at 3.4GHz, just leave it :p 

It's free performance whether you see it or not, and it's nothing extreme, so you're fine.
November 15, 2008 8:32:08 PM

Agreed... the only reason to really overclock your CPU nowadays is to relieve a bottleneck. The lower the resolution, the higher the OC should be, only because games rely heavily on the GPU now.

So at 1680x1050, 3.4-3.6GHz should do just fine.

I've seen that at 1280x1024, even an OC'd Core 2 Extreme at 4.0GHz just doesn't cut it.
November 15, 2008 8:50:26 PM

Spitfire7 said:
So you suggest just going back to the stock 2.83GHz and walking away? I do have it stable at 3.4GHz without any issues, but like I said before, there's not really any difference, to be honest.
That depends. What are your Prime95 small-FFT temps, measured with Core Temp or Real Temp?
November 15, 2008 8:51:46 PM

Crysis benchmark, Q9550 CPU:

CPU at 2.83GHz (stock)
Best average fps: 29.79
Min fps: 14.96
Max fps: 41.96

CPU at 3.4GHz (overclocked)
Best average fps: 29.69
Min fps: 14.87
Max fps: 41.54


Help me out here, guys. Why did my fps all go down drastically when I overclocked it?
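For reference, the difference between those two runs is tiny. A quick check using the average fps values from the post above:

```python
# Percent change between the stock and overclocked Crysis averages above.
stock_avg, oc_avg = 29.79, 29.69

change_pct = (oc_avg - stock_avg) / stock_avg * 100
print(round(change_pct, 2))  # prints -0.34
```

A change of roughly a third of a percent is well within run-to-run benchmark noise, which is the point the replies below make.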
Anonymous
November 15, 2008 8:55:37 PM

Drastically... drastic is like 10 fps, not 0.4 fps...

Anyway, what's your resolution? If it's high, it won't matter.


Yes, keep the Q9550.

And @habitat87:

You're an idiot... I have a C0-stepping Q9550, not the newer and better E0 revision, and I am at 3.85GHz at 1.25 volts... ON AIR...

So yeah, it's possible... and yeah, OF F*CKING COURSE you're not going to be able to OC as far; it's got two more cores. He had valid points, and now you look like the idiot... I'd like to see your f*cked-up rebuttal of this...
November 15, 2008 9:01:09 PM

1781333,28,305739 said:
Drastically... drastic is like 10 fps, not 0.4 fps...

Anyway, what's your resolution? If it's high, it won't matter.


Yes, keep the Q9550.



OK, not drastically, but still lower frames where OCing should have brought an increase, right? I have my res at 1680x1050. Is my GPU bottlenecking?
November 15, 2008 9:07:22 PM

Spitfire7 said:
OK, not drastically, but still lower frames where OCing should have brought an increase, right? I have my res at 1680x1050. Is my GPU bottlenecking?

The change in your FPS you showed in your post is really no change at all. It's that minor. If you ran that benchmark three times, you'd probably get slightly different results each time, much like the results you got.

So basically, your CPU was not bottlenecking your GPU in the first place. That's why you did not see any gains in FPS from OCing: your CPU was already keeping up with the GPU.
November 15, 2008 9:15:28 PM

Right, half a frame. Let it go.

By the way, and more importantly, what are your Prime95 small-FFT temps?
November 15, 2008 9:16:29 PM

Quote:
@habitat87

You're an idiot... I have a C0-stepping Q9550, not the newer and better E0 revision, and I am at 3.85GHz at 1.25 volts... ON AIR...

So yeah, it's possible... and yeah, OF F*CKING COURSE you're not going to be able to OC as far; it's got two more cores. He had valid points, and now you look like the idiot... I'd like to see your f*cked-up rebuttal of this...
A little stressed?

My Q6600 OCs to 4.6G, and that's with no HS at all. :lol: 

Let it go, it's not worth it.
November 15, 2008 9:21:47 PM

Zorg said:
Right, half a frame. Let it go.

By the way, and more importantly, what are your temps?



My temps at 3.4GHz are 47 degrees under load and 42 at idle. They pretty much stay like that each time. Is that good, bad, or great?
November 15, 2008 9:29:40 PM

habitat87 said:
For idle that's a bit high, but load is good. A 5-degree temp difference between idle and load? Ummmm... Anybody want to comment on that?


Habitat87, I know you are being sarcastic, but to a noob like me, what did you mean by "want to comment on that"? Were you saying a 5-degree difference is a good thing or a bad thing?
November 15, 2008 9:37:37 PM

OK, I have reduced my idle temps instantly. My CPU halt state was disabled, so I enabled it, which brought my voltage down from a constant 1.3V to 1.1V at idle. Also, my CPU multiplier now drops to 6.0x at idle rather than always staying at 8.5x. It's better now: it dropped by about 5 degrees instantly after running all day. I should notice more of a drop after my computer has been off for a while or overnight.
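Those multiplier figures imply the idle clock directly. A quick derivation from the numbers in the post above (3.4GHz load clock at an 8.5x multiplier):

```python
# Derive the FSB and idle clock from the multipliers quoted above.
load_clock_ghz = 3.4
load_multiplier = 8.5
idle_multiplier = 6.0

fsb_mhz = load_clock_ghz * 1000 / load_multiplier   # ~400 MHz
idle_clock_ghz = idle_multiplier * fsb_mhz / 1000   # ~2.4 GHz

print(fsb_mhz, idle_clock_ghz)
```

So with the halt state (C1E) enabled, the chip idles at roughly 2.4GHz and 1.1V instead of a constant 3.4GHz and 1.3V, which is where the temperature drop comes from.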
November 15, 2008 9:41:51 PM

habitat87 said:

LOL! WTF! It's all drawing from the same cache. Only the previous-gen chips had this feature, and currently AMD, I believe. Also, it will say x2 or x4 with the actual sizes split accordingly. How the hell did you come up with that? Yeah, if you're using TWO processors in the quad, it's still using 12 MEGS of cache.


My statement was based on my knowledge of the Q6600, which is basically two dual-core chips in one package. Each dual-core chip has 4MB of cache. However, the total of 8MB cache cannot be shared between individual cores.

http://www.sharkyextreme.com/hardware/cpu/article.php/3...

Quote:
Although the Kentsfield does include two 4MB Core 2 processors, it also means that many Core 2 features are not fully realized. Core 2 processors feature either 2MB or 4MB of Advanced Smart Cache, which is shared between the two cores. This allows the Core 2 processor a great deal of flexibility, and it could allocate all 4MB of L2 cache to a single core in gaming scenarios, while dynamically shifting the L2 cache between cores in a multi-threading environment. This type of flexibility is not present in the Kentsfield core, there is no facility for sharing of the 8MB of total L2 cache, and instead it acts like dual Core 2 processors, each sharing 4MB of L2 cache. This also affects the entire processor, with absolutely no shared resources between the dual processor dies, resulting in duplication between the two distinct processors.

There are still some advantages to the architecture. The entire L2 cache is not shared, but there is still a total of 8MB, with each Core 2 unit able to dynamically allocate its own 4MB share. The Core 2's L2 bus is fully 256-bit, and the L1 cache is 32KB instruction/32KB data caches per core, and each has 8-way associativity. The Core 2 includes support for Intel SpeedStep technology, and so too does the quad core Kentsfield. This is a huge advantage, as with four cores under the same roof, power management is a major concern. The Kentsfield also supports features like Execute Disable Bit, Intel 64 Technology, SSE4, and Intel Virtualization Technology, among others.


Now, that's for the Kentsfield, including my Q6600. The Q9550 is a Yorkfield chip. From what I can find so far, the Yorkfield (Penryn) chips are no different in this respect; they just have a larger cache and a faster clock. So far it appears the architecture is the same as the Kentsfield, only a die shrink (65nm to 45nm). Both are still two dual-core chips, and each dual-core shares its own cache internally but cannot share it with the other on-package dual-core.

I'd look more, but I'm at work right now. If you can find something showing my earlier post was wrong, I'd be interested to see it and correct my knowledge. ;) 
Anonymous
November 15, 2008 9:53:17 PM

Habitat... do you want to see it as a fact? I can, and I will... but I won't, as it's a waste of time and energy proving to you that I can...

I have a Q9550 on a P5Q Deluxe with an FSB of 453, GTL voltages set to auto (haven't messed with them; could definitely get more out of the processor), 1.25V vCore, and 1.36V northbridge.

And a Core Contact Freezer Pro cooler... load temp is 70C in a 30C ambient room...

Why would I lie about it? And why would I regret going from an E6420 to a Q9550? I noticed a MASSIVE increase in performance... not only is my computer noticeably faster in games, it doesn't get all choppy when doing lots of multitasking.

Seriously, habitat... you're an idiot.
November 15, 2008 9:55:43 PM

habitat87 said:
No, you got it wrong; I actually wanted to hear what someone had to say. Seriously, a 5-degree difference between idle and load. Have you ever heard of that before? That's why I also asked what cooler you had.


My question would be: are we talking 100% load on all four cores? Or "load" as in opening Firefox and downloading something? Realistically, if you only go from 42C to 47C going from idle to 100% load, you've got an extremely impressive cooling system on that CPU.
November 15, 2008 9:57:01 PM

habitat87 said:
However, the total of 8MB cache cannot be shared between individual cores.


Right, I'm pretty darn sure that's what I said in the first place. So what on earth are you arguing about? :heink: 
November 16, 2008 12:02:54 AM

Spitfire7 said:
My temps at 3.4GHz are 47 degrees under load and 42 at idle. They pretty much stay like that each time. Is that good, bad, or great?
I am loving this feud... it gets the blood flowing.

Are these Prime95 numbers with Core Temp monitoring? I think not.


Listen, I don't care if you cook your CPU. I will help you if you care.

Give me the data I asked for, or not.
November 16, 2008 12:13:36 AM

Spitfire7 said:
OK, I have reduced my idle temps instantly. My CPU halt state was disabled, so I enabled it, which brought my voltage down from a constant 1.3V to 1.1V at idle. Also, my CPU multiplier now drops to 6.0x at idle rather than always staying at 8.5x. It's better now: it dropped by about 5 degrees instantly after running all day. I should notice more of a drop after my computer has been off for a while or overnight.
I was enjoying the fight, and then I saw that you were searching and learning.

Very good.

So where are my temps?
November 16, 2008 1:05:13 AM



What's your point? The Q6600 on Newegg.com shows 2x4MB cache. The Q9550 simply says "12MB", and the AMD Phenom (which was never a part of our conversation) says L1 cache 4x128KB, L2 cache 4x512KB, L3 cache 2MB. Again, we're talking about Core 2 Duo vs. Core 2 Quad in this thread. Or did you somehow miss that?

I'm not denying saying anything. You're arguing against nothing. Here you come into this thread blowing off steam about how mighty you are, and yet you're arguing about nothing.

The Intel quad-core processors are TWO Core 2 Duo chips put together. If Core 2 Duo = 2 CPUs, then Core 2 Quad = 4 CPUs.

A Q9550 has 12MB of L2 cache. Half of that (6MB) is allocated specifically to ONE of the Core 2 Duo chips (which is TWO processing units). The other half (6MB) is allocated specifically to the SECOND Core 2 Duo chip (which is also TWO processing units).

If current games are only programmed to take advantage of dual-core chips, and not quads (and therefore use TWO processing units, not FOUR), then a game will only be using 6MB of that L2 cache in the first place. The second half of the L2 cache (the extra 6MB of that 12MB total) cannot be used by the game you're playing.

Therefore, the difference between the 6MB L2 cache on an E8600 and the 12MB L2 cache on a Q9550 won't matter in a game that can only use two cores, since the Intel quad-core processors (not AMD, mind you, since we never mentioned AMD) do not allow the two Core 2 Duo chips within the quad to share L2 cache with one another.

Therefore, shut up with your pointless argument over nothing.
November 16, 2008 1:13:53 AM

habitat87 said:
Quote:
However, the total of 8MB cache cannot be shared between individual cores.


I know this; that's a feature of that chip generation. The specs on the Q6600 actually do say 4MBx2, and yes, they did split it between the two duals in the processor. As for why, I asked myself the same thing; I guess it was better that way.


And BTW, I love how you edited/changed this post AFTER I quoted it and replied to it.
November 16, 2008 1:22:40 AM

Where are my temps?
November 16, 2008 1:41:33 AM

Bottom line: whether you buy an E8500 or a Q6600, there ain't a dime's worth of difference.
November 16, 2008 2:05:59 AM

All I can say is wow... I leave for five hours and this happens...

The only things I see about Habitat are:
1) He's a fool.
2) He's a troll who loves dual cores.
3) He knows little of others' experiences. Habitat, go to f*cking XtremeSystems and look at their OCs for a Q9550. DO SOME DAMN RESEARCH.
4) "LOL!" is his trademark line.
5) I really feel like he's just jealous of quad owners... I mean, he thinks we all regret the quad. Why in hell would I regret a quad? Because I don't have a cheaper dual? So I can OC and get 3 fps?! Oh right, we did prove that it's 3 fps. Let's do another test. Let's run Norton Antivirus, Fraps, and hmmm, what's another hogging program... oh, I know, extracting a file, that's a good one... on a dual and a quad while gaming, and let's see what happens...
November 16, 2008 3:08:30 AM

You guys are all responsible for hijacking this thread over something that is not that important.

It gets in the way of my request for Prime temps.
November 16, 2008 3:10:06 AM

THG is messing around. They have the right.
November 16, 2008 3:57:57 AM

Temps?
November 16, 2008 7:38:34 AM

Remember several months ago when everyone was saying what habitat87 is saying now? So he's a little behind the curve. Let it go.

Spitfire7, I can't answer your question unless you give me the Prime temps.

Can you do that, or should I just move on?

Or did you not need an answer?
November 16, 2008 9:03:25 AM

habitat87 said:
Well, things have changed a lot in the last few months, haven't they? Actually, I pulled that from a recent thread. He mentioned that quads are in a bad position as of now, so that's got to be recent. The nail in the coffin is the i7 release.

I want to see some links, though. They can't talk all that **** and not give links.
What you are saying now has been beaten to death for at least six months. He!!, I argued this before the Q6600 came out. It's a very, very old argument, and you are on the wrong side at this point. It's so old that it's boring. Think of something new.

Although, I have to admit, you did get some takers, and a bad rep in the process.

I just recommended a dual core very recently. It's not a war; quad is the way it's moving. It's not a big deal.

As you said, i7 is the future, and we are all on board, once we see the OC results on the available mobos and prices come down.


I am personally holding out for the octo with HTT: 16 virtual cores.

I'm guessing that the developers are going to accelerate multicore optimization... or die.

And a little ray tracing, anyone? That might actually get me excited again.