2500K, 3570K or 2700K

October 11, 2012 5:53:28 AM

I've got a Z68 board, and last winter when I built my system I cheaped out and went with an i5-2310 I got on some daily deal. I didn't get a K chip because I told myself I would never want to overclock. Of course, now I do. I've got my 2310 running at 3.6GHz with a 103 BCLK, but that's pretty boring.
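
For reference, the BCLK math is just base clock × max turbo multiplier (the 35x multiplier below is an assumed value, but it lines up with the 3.6GHz result; CPU-Z will show the real one):

```python
# BCLK overclocking: effective clock = base clock x max turbo multiplier.
# The 35x multiplier is an assumption that happens to match the quoted
# 3.6GHz; non-K Sandy Bridge chips only allow small BCLK bumps like this.
bclk_mhz = 103      # base clock raised from the stock 100MHz
multiplier = 35     # assumed max turbo multiplier
print(bclk_mhz * multiplier / 1000, "GHz")  # 3.605 GHz
```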

Anyway, I am heading into a town that has a Microcenter next week, so I thought I would take advantage of their low CPU prices and upgrade to something unlocked. They have the 2500K for $160, the 3570K for $180 and the 2700K for $230. The 2700K is doable budget-wise, but only if it is really going to be a noticeable jump over the i5's after overclocking. I'm not sure I'm sold on the usefulness of HT.

Anything I save on the CPU will help me get a nicer GPU, which I am also upgrading shortly. So maybe another way to look at it would be a 7850 and 2700K, or a 2500K and 7950, etc.

I basically do games and photo editing with some occasional short video editing.

Your advice and opinion is appreciated.


October 11, 2012 6:13:06 AM

i5-2500K for your motherboard
jaguarskx
October 11, 2012 6:17:59 AM

HT can be useful for photo editing and video editing programs. Other than HT, the only significant difference for you between the 2500k and the 2700k (both are Sandy Bridge CPUs) is 200MHz. So you need to ask yourself: is the extra 200MHz + HT worth the extra $70? I don't believe the 2700k would give you any better OC than the 2500k. Then again, I never really looked into the 2700k.
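
To put rough numbers on that (stock base clocks are from Intel's specs; prices are the Microcenter ones from the original post):

```python
# What the $70 premium buys at stock speeds (2500k: 3.3GHz base,
# 2700k: 3.5GHz base, per Intel's specs; prices from the post above).
price_2500k, price_2700k = 160, 230
ghz_2500k, ghz_2700k = 3.3, 3.5

extra_cost = (price_2700k - price_2500k) / price_2500k  # ~44% more money
extra_clock = (ghz_2700k - ghz_2500k) / ghz_2500k       # ~6% more clock
print(f"{extra_cost:.0%} more money for {extra_clock:.0%} more clock, plus HT")
```

And once both chips are overclocked to the same ceiling, that 200MHz stock gap disappears entirely, so the premium is really just buying HT.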

Before buying the i5-3570k (Ivy Bridge CPU), make sure that your mobo's BIOS is flashed to a version that will recognize the 3570k, assuming such a BIOS version was released. The 3570k consumes a little less power than a Sandy Bridge CPU. It is also on average about 5% faster than a Sandy Bridge CPU at the same clock speed, but it does not OC as high as a Sandy Bridge CPU because of heat.
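
Those two effects roughly cancel once you overclock. A quick sketch, with assumed (not guaranteed) air-cooled ceilings:

```python
# Clocks-vs-IPC trade-off: Ivy does ~5% more work per clock (per the
# post) but tops out lower on air. The 4.8/4.5GHz ceilings below are
# illustrative assumptions, not guaranteed results for any given chip.
IVY_IPC = 1.05      # Ivy Bridge's assumed per-clock advantage
sandy_ghz = 4.8     # assumed typical 2500k air-cooled OC
ivy_ghz = 4.5       # assumed typical 3570k air-cooled OC (runs hotter)

sandy_score = sandy_ghz            # Sandy Bridge as the 1.0 IPC baseline
ivy_score = ivy_ghz * IVY_IPC      # scale Ivy's clock by its IPC edge
print(f"Sandy {sandy_score:.2f} vs Ivy {ivy_score:.2f}")  # 4.80 vs 4.73
```

So at those clocks it's close to a wash, which is why the deciding factors are really price and whether your board supports Ivy Bridge at all.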

I would simply choose the 3570k, if your mobo supports it.
October 11, 2012 6:20:04 AM

Well, those really wouldn't be that much of an upgrade over your current CPU.
October 11, 2012 8:18:28 AM

jaguarskx said:
HT can be useful for photo editing and video editing programs. Other than HT, the only significant difference for you between the 2500k and the 2700k (both are Sandy Bridge CPUs) is 200MHz. So you need to ask yourself: is the extra 200MHz + HT worth the extra $70? I don't believe the 2700k would give you any better OC than the 2500k. Then again, I never really looked into the 2700k.

Before buying the i5-3570k (Ivy Bridge CPU), make sure that your mobo's BIOS is flashed to a version that will recognize the 3570k, assuming such a BIOS version was released. The 3570k consumes a little less power than a Sandy Bridge CPU. It is also on average about 5% faster than a Sandy Bridge CPU at the same clock speed, but it does not OC as high as a Sandy Bridge CPU because of heat.

I would simply choose the 3570k, if your mobo supports it.

How did you join in 1970 lol?
October 11, 2012 8:22:58 AM

I wouldn't bother with any of them, or at least not the i5's; they won't get you any upgrade for your gaming, so you're better off getting a high-end GPU. The i7 may help your editing, but is your editing at such a level (i.e. professional paid work) that you need the extra grunt? If so, you should have gotten the i7 to begin with.

In short, I would get a top-shelf GPU and run that CPU/mobo combo till at least the next generation.
October 11, 2012 9:00:13 AM

Can't sleep tonight, so I've been monkeying with my overclocks, and I'm getting essentially stock i5-3570K results in benchmarks (got my 2310 to 3.71GHz max turbo). So maybe you are right about not bothering. I mean, I am not going water-cooled, so I am probably going to get low 4's on my overclock, which wouldn't be night and day in any apps vs 3.7GHz.
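
Quick math on that (the 4.2GHz target is an assumption for an air-cooled K chip, not something I've measured):

```python
# Headroom a K chip would add over the current 3.71GHz max turbo.
current_ghz = 3.71    # the 2310's current overclocked max turbo
assumed_k_ghz = 4.2   # assumed "low 4's" air-cooled target, not measured
gain = assumed_k_ghz / current_ghz - 1
print(f"~{gain:.0%} more clock")  # ~13%: real, but not night and day
```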

I also overclocked my 5770 pretty well, and now I can play BF3 at 1440x900 on High (on a 1680x1050 LCD), so maybe I could hold out on the GPU, too.