Solved

i5-3570K vs i7-2600K?

May 14, 2012 8:39:47 PM

I have two choices,

i5-3570K on a Z77 board with a 7850, so PCIe 3.0
i7-2600K on a Z68 board with a 7850, so PCIe 2.1

Which one will be better for gaming?


May 14, 2012 8:44:55 PM

OK... (for a little change... please, no negativity intended)... as a starter for us: what are your (the OP's) thoughts on the two? Then we can continue.

May 14, 2012 9:53:02 PM

Oh OK, sure. I know the 2600K is about 10% faster than the 2500K, and therefore somewhere around 5-10% better than the 3570K.

However, without an Ivy Bridge CPU I cannot enable PCIe 3.0, and so far I have no idea what effect PCIe 3.0 has on gaming. Will PCIe 3.0 over 2.1 more than make up for the 10% advantage of the 2600K?

EDIT: Oh, I forgot as well: out of the 2500K, 2600K and 3570K, which one has the best overclocking capabilities?
May 14, 2012 9:54:00 PM

Because you are choosing K-series chips and Z77 and Z68 chipsets, I assume you will want to overclock. It's the same card either way, and if the rest of the system is the same, then I would shoot for the 2600K if you plan to OC. Ivy Bridge runs really hot when overclocked.

Hope this helped! :) 

Best solution

May 14, 2012 9:57:11 PM

zijin_cheng said:
Oh OK, sure. I know the 2600K is about 10% faster than the 2500K, and therefore somewhere around 5-10% better than the 3570K.

However, without an Ivy Bridge CPU I cannot enable PCIe 3.0, and so far I have no idea what effect PCIe 3.0 has on gaming. Will PCIe 3.0 over 2.1 more than make up for the 10% advantage of the 2600K?

EDIT: Oh, I forgot as well: out of the 2500K, 2600K and 3570K, which one has the best overclocking capabilities?

PCI-E 3.0 just gives more bandwidth. You don't need it, and a PCI-E 3.0 card will work in a PCI-E x.x board; they are backwards compatible. In gaming, the 2500K and 2600K are the same chip, just the 2600K has 100MHz more speed per core and 2MB more L3 cache. The Core i7's main difference from the 2500K is Hyper-Threading! The 2500K should overclock better than both those chips due to it not having Hyper-Threading and not being Ivy Bridge. Haha :lol:  Go for the 2500K if you plan to game, the 2600K if you plan to video edit/encode.
May 14, 2012 10:03:57 PM

The 2600K brings Hyper-Threading, which adds nothing to the gaming arena (at this time, anyway; I haven't seen anything to indicate it will in the future either).

In stock configuration, the 3570K is 5-15% faster than the *2600K* depending on the app/game you are running.

How far do you plan to OC? Are you trying to achieve a specific number or just a reasonable bump?

IB is not that much hotter up to around 4.5 or 4.6GHz (which is the gaming equivalent of a 2600K at 4.7 or 4.8). I left voltage on auto (max 1.119V) and got a stable 4.4 at 60°C under the OCCT small data set.

As a point of reference, Battlefield 3 never hits 100% utilization in multiplayer on a 3.8GHz clock except during initial level loading. At 3.8, I barely break 40C.

Don't believe the zomgz it's hot hype.
May 14, 2012 10:04:48 PM

HostileDonut gave a good answer; really it comes down to overclocking and how much you think you are going to try to get out of the processor. Also, if you are (or might be) using the IGP at some point, then you might want to go with the Intel Core i5-3570K, since it does have the better Intel HD 4000 graphics. For a mild overclock you might as well go with the Intel Core i5-3570K; if you are looking for some higher numbers, then go with the Intel Core i5-2500K.



Christian Wood
Intel Enthusiast Team
May 14, 2012 11:11:34 PM

I know PCIe 3.0 is backwards compatible, but won't I see any performance increase?

Also, the 2600K barebones kit is $80 more expensive than the 3570K barebones kit, so I should probably get the 3570K kit, right? (Assuming everything else is the same.)
May 14, 2012 11:21:48 PM

zijin_cheng said:
I know PCIe 3.0 is backwards compatible, but won't I see any performance increase?

Also, the 2600K barebones kit is $80 more expensive than the 3570K barebones kit, so I should probably get the 3570K kit, right? (Assuming everything else is the same.)

Say a PCI-E 3.0 slot has a road that can hold 30 cars on it, a PCI-E 2.x slot 20 cars, and a PCI-E 1.x slot 10 cars. Your card will be putting fewer than 10 cars through it. It's all about bandwidth, but your card won't use that much bandwidth anyway. You're good with 2.0. ;) 
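If anyone wants the real numbers behind the road sizes, here's a rough Python sketch; the per-lane rates and encoding efficiencies are the published PCIe figures, and the helper name is just made up for illustration:

```python
# The "road capacity" analogy with real numbers: raw rate in GT/s per lane,
# times encoding efficiency (usable bits per wire bit), times lane count.
PCIE = {
    "1.x": (2.5, 8 / 10),     # 8b/10b encoding: 20% overhead
    "2.x": (5.0, 8 / 10),     # 8b/10b encoding: 20% overhead
    "3.0": (8.0, 128 / 130),  # 128b/130b encoding: ~1.5% overhead
}

def effective_gb_s(gen: str, lanes: int = 16) -> float:
    """Usable bandwidth in GB/s for a slot of the given generation and width."""
    raw_gt_s, efficiency = PCIE[gen]
    return raw_gt_s * efficiency * lanes / 8  # divide by 8: bits -> bytes

for gen in PCIE:
    print(f"PCIe {gen} x16: {effective_gb_s(gen):.1f} GB/s")
```

That works out to roughly 4, 8, and 15.8 GB/s for an x16 slot, and a single mid-range card doesn't come close to saturating the 2.0 figure.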

If you plan on OCing, go for the 2600k as it runs much cooler! :) 
May 14, 2012 11:26:27 PM

Quote:
I know PCIe 3.0 is backwards compatible, but won't I see any performance increase?


With a PCIe 2.0 GPU, absolutely not. With a PCIe 3.0 GPU, MAYBE, if it's a top-end card running all out. PCIe 3.0 is nothing more than window dressing for now, in gaming at least.
May 14, 2012 11:27:55 PM

DJDeCiBeL said:
Quote:
I know PCIe 3.0 is backwards compatible, but won't I see any performance increase?


With a PCIe 2.0 GPU, absolutely not. With a PCIe 3.0 GPU, MAYBE, if it's a top-end card running all out. PCIe 3.0 is nothing more than window dressing for now, in gaming at least.

It's not about the card being a PCI-E 3.0 card. That doesn't say how much bandwidth it needs; it just means it's compatible. The PCI-E x.x label doesn't really mean anything.
May 14, 2012 11:29:47 PM

HostileDonut said:
It's not about the card being a PCI-E 3.0 card. That doesn't mean how much bandwidth it needs, it just means it's compatible. PCI-E x.x doesn't really mean anything.


I used to think that, but I've been corrected quite a few times now by people saying that PCIe 3.0 is beneficial and useful in GPGPU usage. But in gaming, it doesn't matter a bit.
May 14, 2012 11:34:52 PM

DJDeCiBeL said:
I used to think that, but I've been corrected quite a few times now by people saying that PCIe 3.0 is beneficial and useful in GPGPU usage. But in gaming, it doesn't matter a bit.

The way you get PCI-E 3.0 to run is by having a PCI-E 3.0 card and board. It really doesn't matter whether your HD 7870 (or whatever card supports it) says PCI-E 2.x or 3.0; they will use the same bandwidth. I think what you are thinking of is PCI-E 1.x not having enough bandwidth with very high-end cards running synthetic benchmarks. PCI-E 2.x has more than enough bandwidth for any card out at the moment.
May 14, 2012 11:39:18 PM

HostileDonut said:
The way you get PCI-E 3.0 to run is by having a PCI-E 3.0 card and board. It really doesn't matter if your hd7870 (or whatever card that supports it) says PCI-E 2.x or 3.0. They will use the same bandwidth. I think what you are thinking is about PCI-E 1.x not having enough bandwidth with very high-end cards running synthetic benchmarks. PCI-E 2.x has more than enough bandwidth for any card out at the moment.


Again, I used to think that and I agree that PCIe 2.0 is more than enough for gaming, but the people saying the PCIe 3.0 is beneficial for GPGPU usage were using SLI 680's, so I tend to believe them.
May 14, 2012 11:41:39 PM

DJDeCiBeL said:
Again, I used to think that and I agree that PCIe 2.0 is more than enough for gaming, but the people saying the PCIe 3.0 is beneficial for GPGPU usage were using SLI 680's, so I tend to believe them.

SLI splits the lanes between the GPUs. If you split an x16 PCI-E 2.0 link, each card gets x8 PCI-E 2.0, which has the same bandwidth as x16 PCI-E 1.0. If you run two full PCI-E 2.0 x16 links with two GTX 680s, each card gets the exact same bandwidth as a single GTX 680 on a single PCI-E 2.0 x16 link.
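A quick back-of-the-envelope sketch of that lane math (Python, assuming the usual 500 MB/s usable per PCI-E 2.0 lane; the function is just for illustration):

```python
# Sketch: per-card bandwidth when a CPU's 16 PCIe 2.0 lanes are shared for SLI.
# Assumes 500 MB/s usable per PCIe 2.0 lane (5 GT/s with 8b/10b encoding).
PER_LANE_MB_S = 500

def per_card_mb_s(total_lanes: int, n_cards: int) -> int:
    """MB/s each card gets when the lanes are split evenly."""
    return (total_lanes // n_cards) * PER_LANE_MB_S

print(per_card_mb_s(16, 1))  # one card at x16: 8000 MB/s
print(per_card_mb_s(16, 2))  # SLI at x8 each: 4000 MB/s, same as x16 PCIe 1.x
```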
May 14, 2012 11:42:48 PM

What is GPGPU?
May 14, 2012 11:44:17 PM

HostileDonut said:
SLi would be for each GPU. If you split a 16x PCI-E 2.0 lane, it turns into a x16 PCI-E 1.0 lane. If you run two PCI-E 2.0 x16 lanes and two GTX 680s it's the same exact bandwidth as one single GTX 680 and one single PCI-E 2.0 lane.


OK, I'll trust you, since I don't do GPGPU stuff myself, but I've gotten into the same argument before (arguing YOUR point) and I got shot down multiple times.
May 14, 2012 11:45:12 PM

HostileDonut said:
What is GPGPU?


General Purpose GPU usage. Anything that uses the GPU besides gaming (i.e. video encoding, etc.).
May 14, 2012 11:58:08 PM

DJDeCiBeL said:
Again, I used to think that and I agree that PCIe 2.0 is more than enough for gaming, but the people saying the PCIe 3.0 is beneficial for GPGPU usage were using SLI 680's, so I tend to believe them.


So you're saying that you won't see any performance increase going from 2.0 to 3.0 because current cards can't max out the bandwidth of a 2.0 slot?

If the above is true, will current cards max out 2.0 anytime soon? If so, would it be better to get a 3.0 board to future-proof, or will 2.0 be enough for 2-3 years?
May 15, 2012 12:02:51 AM

zijin_cheng said:
So you're saying that you won't see any performance increase going to 3.0 from 2.0 because current cards can't max out the bandwidth transfer of a 2.0?

If the above is true, will current cards max out 2.0 anytime soon? If so, would it be better to get a 3.0 to future proof it or will 2.0 be enough for 2-3 years?


You have the right idea. PCIe 3.0 is just a future-proofing thing in the realm of gaming. It will take 1 or 2 more generations of GPUs (who knows how many years that will be, though 2 or 3 is my best guess) before PCIe 2.0 becomes an issue in gaming.
May 15, 2012 1:03:10 AM

Same problem here. The only thing currently holding me back in choosing between the 2500K and the 3570K is the PCIe 3.0 support. I'd like to see more opinions on how much longer 2.1 will hold out before it eventually exhausts its bandwidth, since I upgrade CPU+mobo every 5-6 years and the only thing I replace every 2-3 years is the GPU.
May 15, 2012 1:40:15 AM

tobats120 said:
Same problem in here. The only thing that's currently holding me back from buying 2500k vs 3570k is the PCIE 3.0 support. I'd like to see more opinion on how much longer will 2.1 stand-out before it eventually exhaust's it's bandwidth since I do upgrades of CPU+Mobo in 5-6 years and the only thing I do replace every 2-3 years is the GPU.


Well, I already decided and bought a 3570K barebones kit, but I'll wait until you get your answer, then I'll choose a best answer.
May 15, 2012 2:27:07 AM

Always buy the latest tech: Z77 + Ivy Bridge. Sure, PCIe 3.0 may be useless now, but later on, who knows. With good cooling (an H100 or NH-D14) you can get the i5-3570K up to 4.8GHz, around 80-85°C under load.
May 15, 2012 2:37:16 AM

Best answer selected by zijin_cheng.
May 15, 2012 9:21:21 PM

zijin_cheng said:
So you're saying that you won't see any performance increase going to 3.0 from 2.0 because current cards can't max out the bandwidth transfer of a 2.0?

If the above is true, will current cards max out 2.0 anytime soon? If so, would it be better to get a 3.0 to future proof it or will 2.0 be enough for 2-3 years?

Yes. PCI-E 1.x is just about getting maxed out now; PCI-E 2.x has twice its bandwidth, and PCI-E 3.0 twice that of PCI-E 2.x. :) 

Thanks for the best answer too! :) 
June 25, 2012 6:53:22 PM

Just to clarify some points for people.

PCIe 3.0 = 2x bandwidth PCIe 2.x = 4x bandwidth PCIe 1.x
(so PCIe 2.x = 2x PCIe 1.x)

Also, when using SLI on a motherboard whose CPU socket has fewer than 2011 pins,
each slot (that was PCIe 2.x x16) becomes PCIe 2.x x8 (NOT PCIe 1.x x16).
Bandwidth-wise they are equivalent, but the specifications are different (which is not just about bandwidth, but also data encoding etc.).
On a 2011-pin motherboard they would both be x16 (or with 3 cards, x16/x8/x8, etc.).

Which brings up another important improvement of PCIe 3.0 over 2.0.

2.0 has about a 20% data overhead, whereas 3.0 only has about a 1.5% overhead,
so that makes the bandwidth difference even bigger.
3.0 also has enhanced signaling and data integrity.
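To put rough numbers on that overhead point, here's a small Python sketch (assuming 8b/10b encoding for 2.0 and 128b/130b for 3.0, with raw rates of 5 and 8 GT/s per lane):

```python
# PCIe 2.0 encodes 8 data bits in 10 wire bits (~20% overhead);
# PCIe 3.0 encodes 128 data bits in 130 wire bits (~1.5% overhead).
def usable_fraction(data_bits: int, wire_bits: int) -> float:
    return data_bits / wire_bits

# Usable throughput per lane, in Gbit/s of actual data.
gen2 = 5.0 * usable_fraction(8, 10)
gen3 = 8.0 * usable_fraction(128, 130)

print(f"overhead 2.0: {1 - usable_fraction(8, 10):.1%}")     # 20.0%
print(f"overhead 3.0: {1 - usable_fraction(128, 130):.2%}")  # 1.54%
# The raw rate only went up 1.6x (5 -> 8 GT/s), but usable bandwidth ~doubles:
print(f"usable speedup 3.0 vs 2.0: {gen3 / gen2:.2f}x")      # 1.97x
```

So "3.0 = 2x the bandwidth of 2.x" holds for usable data even though the raw signaling rate is only 1.6x higher.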

As far as gaming goes, right now it isn't going to change much.
But as far as GPGPU goes, it will make a significant difference.
Read this article about that: http://www.anandtech.com/show/5261/amd-radeon-hd-7970-r...
In games that support CUDA physics (e.g. Mirror's Edge; if you don't have it, buy it),
this could make a difference.

Just my 2 cents.

*DISCLAIMER*
I am in no way an expert about this subject. All information was found online and subjected to scrutiny by me.
I am currently completing a degree in Mechatronic engineering, computer science, and mathematics, so I do know a little about using GPUs for General Purpose stuff, and about how computers actually work (Those lectures are worth attending, interesting stuff).
June 25, 2012 6:59:22 PM

Oh, and I am also planning on getting an i5-3570K, though I'm hoping that when I OC it, it doesn't get as hot as 80-85°C under load; that's fairly hot considering it's rated at somewhere like 65°C max. (Or maybe 80-85 is the core temp, in which case it's probably fine. The 65°C max is measured on top of the CPU, i.e. the case, hence: Tcase.)

Though I'm going to invest in an awesome case that should have some serious airflow happening, so that will reduce the temp a bit more. It'll have 8 Noctua 12cm fans + an NH-D14 and a giant 20cm exhaust fan. One of the fans will be behind the CPU, which is kind of cool.

It's going to cut my compile time down by 99%. I can't wait.