
i3 / i5 / i7 uses and discussion?

November 11, 2012 12:21:51 PM

My thinking on the Intel Core i3 / i5 / i7 series is that they are obviously used for different things, but would you agree if I said:

i3 = 2 cores | Hyper-Threading
i5 = 4 cores | no HT
i7 = 4/6 cores | Hyper-Threading

i3s are used for general desktop work with some light gaming, taking advantage of the 2 extra threads.
i5s are used for i3 purposes but with medium to heavy gaming, overall faster performance and some light workstation use.
i7s are used for i5 purposes but for heavy gaming, and by utilizing their quad/hex-core Hyper-Threading they excel at workstation tasks like photo / sound / video editing and are a lot better at rendering / 3D modelling.

I'm curious because this is what I believe the Intel Core iX tiers to be for, and I am upgrading at some point to an i5, but may consider an i7 as I will use it for video editing, gaming, photo editing etc.

Don't come back with anything AMD. I know how weak the current FX chips are, so I'm not interested. I'm by no means an Intel fanboy, but I really do prefer Intel's current lineup to AMD's.

I'm not looking to start a flame war!

What are your opinions?


November 11, 2012 3:50:37 PM

I do not think i7s are meant for gaming at all; they benefit gaming very little. I believe they are made for content creation, Photoshop etc.

The iX's are just extreme versions for extreme multithreading. Also Adobe etc...
November 11, 2012 4:02:41 PM

It's pretty easy:

i3-fast
i5-faster
i7-fastest

They can all be used for the same things and most will buy based more on budget than intended use.
November 11, 2012 4:37:39 PM

twelve25 said:
It's pretty easy:

i3-fast
i5-faster
i7-fastest

They can all be used for the same things and most will buy based more on budget than intended use.


Mostly I agree. For gaming an i3 is not sufficient, though; everything above that is. For content creation you want an i7 at minimum. But yes, effectively people buy what they can afford, even if it is overkill...
November 11, 2012 5:36:47 PM

The i3 is more than enough for gaming, even BF3 multiplayer. It's just not the most optimal CPU to go with, considering it's a dual-core CPU backed up by Hyper-Threading.

It also depends on the games you want to play; if you're going to play single player games the majority of the time, then the i3 is meant for you. Pair it up with the beefiest GPU you can afford and you have yourself one heck of a budget gaming rig.

If you're going to play multiplayer games the majority of the time, you would obviously want an i5. The extra two cores on the i5 will benefit you greatly as opposed to the Hyper-Threaded threads on the i3.

As others have stated, there is really no benefit to an i7 for gaming purposes; Hyper-Threading isn't being taken advantage of at the moment, so there's really no need. However, for massively CPU-intensive tasks such as video encoding/editing or photo editing, there is really no better CPU to turn to than the i7s. :)
November 11, 2012 5:44:21 PM

I believe if you're looking at gaming purposes, you obviously want a quad-core CPU. Honestly, the i5 has everything you need for gaming, and it's obviously cheaper than the i7. Sure, the i7 is faster than the i5, but the i7 has other features that wouldn't be necessary for gaming.
November 11, 2012 5:46:53 PM

Sorry, I just kinda re-stated what mocchan said. Didn't see that he posted that. My B!
November 11, 2012 5:47:53 PM

Just compare this:
# of games that use more than 2 cores to # of games that use 1-2 cores.
November 11, 2012 6:07:40 PM

amuffin said:
Just compare this:
# of games that use more than 2 cores to # of games that use 1-2 cores.


Games that use fewer cores far outnumber the ones that use more, BUT think about it like this...

How many games have been released since quad cores became mainstream? Few...

Consider that more and more games are being released that can utilize more than 2 cores. You cannot just count it like that, as we have no idea whether games will continue the 2-core trend or move on to heavier core usage. It would make sense to use more cores, as that is the way platforms are heading: heavily multi-cored. Even Android devices are quad-core now, albeit at lower clock rates.

Bottom line:

Fewer than 4 cores is sufficient NOW, but pretty soon it won't be for mainstream games. (Crysis 3, can't wait!!!!)
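
If anyone wants to check this for themselves rather than argue from lists, here is a rough sketch (assuming Python 3 with the psutil package installed; the 50% "busy" threshold and the 30-second window are arbitrary choices of mine). Run it while your game is going and see how many cores the game actually keeps loaded:

import psutil

SAMPLES = 30          # roughly 30 seconds of sampling
BUSY_THRESHOLD = 50   # % load above which we call a core "busy"

busy_counts = []
for _ in range(SAMPLES):
    # one per-core reading per second, e.g. [72.1, 65.3, 8.0, 4.2]
    per_core = psutil.cpu_percent(interval=1, percpu=True)
    busy_counts.append(sum(1 for load in per_core if load > BUSY_THRESHOLD))

print("Logical cores:", psutil.cpu_count())
print("Typical busy cores:", round(sum(busy_counts) / len(busy_counts), 1))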
November 11, 2012 7:39:26 PM

Doesn't the newest version of Intel Quick Sync (utilized with a 3770K, say) kick the tar out of even the 3960X in video encoding?
As with everything, there are trade-offs. If you want something that can game like hell and can also take multi-threaded apps out to the woodshed, maybe go for an i7-3930K build at the most.

Now, suppose you ONLY worked with highly threaded apps (photo stuff, 3D rendering). Then some dual-Xeon 16-core/32-thread beast would not only take those apps out to the woodshed, they would be begging for mercy and squealing like pigs; BUT you sacrifice light and single-threaded performance (and pay thousands more to do so). Also, some specific functions in some of those applications can utilize the CUDA cores on your Nvidia graphics card (workstation vs gaming doesn't matter; in fact, GeForces typically have more CUDA cores than their more expensive Quadro counterparts) instead of the CPU cores.

You have to decide where on the range from single-threaded performance to maximum multi-threaded performance your needs fall; it's hard to get the best of both at the same time unless you have the money to build separate rigs.

If you want to spend the dough, a 3930K system (especially overclocked) seems a good balance for your applications and for your gaming (I haven't checked the benchmarks in a while, but it should slay games that rely more on the graphics card if you do tri- or quad-SLI, because of the greater number of PCI-E lanes supported by the LGA 2011 processors).

If you want to be a little more budget- (and power-use-) conscious, a 3770K system might be a little slower, but not terribly slower than a 3930K in your applications; it would be slightly better for single-threaded apps, though.

If you are really budget conscious, a 3570K, a non-Hyper-Threaded quad-core, would kick ass at gaming but be a little slower yet in your photo/video editing and 3D modeling & rendering, though not in the specific case of using Quick Sync for encoding.

Haswell is due out in March/April. Can you wait that long to see how it will perform? It might perform as well as or better than the SB-E hex-cores in even a lot of multi-threaded stuff... remember the 2600K vs the 980-990X?
November 11, 2012 8:39:30 PM

Novuake said:
Mostly I agree. For gaming an i3 is not sufficient, though; everything above that is. For content creation you want an i7 at minimum. But yes, effectively people buy what they can afford, even if it is overkill...


The i3 does quite well at gaming. It does content creation just fine. It all goes back to how fast the tasks finish, which again is mainly a function of how much you want to spend.

November 11, 2012 9:23:45 PM

Pleased with the feedback on this thread, guys, and some good points, so thanks! Any views on what to get for the upgrade? I currently have a CX430 PSU but will upgrade to a 600 W unit soon, coupled with a GTX 660 Ti. I have already listed my needs for a processor.
I'm prepared to wait for Haswell if need be, but I think an i7 would be best at the moment. My budget would be somewhere between £180 and £220.
What are your thoughts, guys? Any feedback is appreciated!
November 11, 2012 9:58:59 PM

On a side note, instead of the GTX 660 Ti, get the Radeon HD 7950. With current driver updates it performs better than the GTX 660 Ti, and the sheer overclockability of these cards puts them very close to GTX 670 levels of performance.

If you're purchasing now, then the i7-3770K is primarily your go-to CPU. However, if you're gaming and not going to do any kind of video/photo editing, I would recommend sticking with an i5-3570K and beefing up your GPU :)
November 11, 2012 10:40:56 PM

I would go for the 7950 IF I had 2 or more monitors, but I don't, so I will stick with the 660 Ti; I don't need anything better, to be honest! I'm not buying the CPU yet unfortunately, sometime around Q1 or Q2 2013, and I would be using it for video/photo production and rendering as well as gaming, so I think I should get an i7. But is it really worth the price difference over a 3570K when I could just buy an i5 plus an SSD instead of just an i7?
3570K: £160
120GB SSD: £60
That comes in at £220, whereas an i7 is about £250.
Is the i7 really worth it when I could get the i5 and SSD? And what difference would I see?
November 11, 2012 10:48:22 PM

ebalong said:
Doesn't the newest version of Intel Quick Sync (utilized with a 3770K, say) kick the tar out of even the 3960X in video encoding?
As with everything, there are trade-offs. If you want something that can game like hell and can also take multi-threaded apps out to the woodshed, maybe go for an i7-3930K build at the most.

Now, suppose you ONLY worked with highly threaded apps (photo stuff, 3D rendering). Then some dual-Xeon 16-core/32-thread beast would not only take those apps out to the woodshed, they would be begging for mercy and squealing like pigs; BUT you sacrifice light and single-threaded performance (and pay thousands more to do so). Also, some specific functions in some of those applications can utilize the CUDA cores on your Nvidia graphics card (workstation vs gaming doesn't matter; in fact, GeForces typically have more CUDA cores than their more expensive Quadro counterparts) instead of the CPU cores.

You have to decide where on the range from single-threaded performance to maximum multi-threaded performance your needs fall; it's hard to get the best of both at the same time unless you have the money to build separate rigs.

If you want to spend the dough, a 3930K system (especially overclocked) seems a good balance for your applications and for your gaming (I haven't checked the benchmarks in a while, but it should slay games that rely more on the graphics card if you do tri- or quad-SLI, because of the greater number of PCI-E lanes supported by the LGA 2011 processors).

If you want to be a little more budget- (and power-use-) conscious, a 3770K system might be a little slower, but not terribly slower than a 3930K in your applications; it would be slightly better for single-threaded apps, though.

If you are really budget conscious, a 3570K, a non-Hyper-Threaded quad-core, would kick ass at gaming but be a little slower yet in your photo/video editing and 3D modeling & rendering, though not in the specific case of using Quick Sync for encoding.

Haswell is due out in March/April. Can you wait that long to see how it will perform? It might perform as well as or better than the SB-E hex-cores in even a lot of multi-threaded stuff... remember the 2600K vs the 980-990X?

Haha, good old 2600K ;) It just so happens my birthday is in April, so maybe I'm in luck with Haswell, but we'll have to wait and see. There should be a bigger difference between IB and HW than there was between SB and IB. In terms of my budget, a 3930K is out of the question; a 3770K is my limit really, look at my last post.
November 11, 2012 11:13:40 PM

That sounds good. I don't think you will feel "limited" at all rockin' a "meager" 3770K. The 3930K would be a little faster depending on the app, but not enough to be a big deal. I think of the 3930K as the upper limit that someone who wants an all-around gaming/workstation build would go with, if they wanted bragging rights and to shave a couple of minutes here and there from certain tasks. The 3960X, at $1,000, is kind of ridiculous, even from a bragging-rights standpoint. Like someone else pointed out, you could do a lot of that stuff on an i3, but I bet the difference between an i3 and an i7 in some of those applications is more significant than the difference between a 4-core/8-thread i7 and a 6-core/12-thread Sandy or Ivy Bridge-E. You kind of get diminishing returns for ever more money after the 3770K.

If you were inclined to wait for Haswell, I'd be interested to see how a "4790K" (or whatever the SKU ends up being) stacks up against the six-cores - could be pretty close (a Haswell 4-core might even best the mighty SB-E), even in highly threaded stuff.
November 11, 2012 11:18:10 PM

ebalong said:
That sounds good. I don't think you will feel "limited" at all rockin' a "meager" 3770K. Like someone else pointed out, you could do a lot of that stuff on an i3, but I bet the difference between an i3 and an i7 in some of those applications is more significant than the difference between a 4-core/8-thread i7 and a 6-core/12-thread Sandy or Ivy Bridge-E. You kind of get diminishing returns for ever more money after the 3770K.

If you were inclined to wait for Haswell, I'd be interested to see how a "4790K" (or whatever the SKU ends up being) stacks up against the six-cores - could be pretty close (a Haswell 4-core might even best the mighty SB-E), even in highly threaded stuff.


Haswell is reputed to be not so much a performance increase as a massive reduction in power usage. To be honest, I don't think Intel knows what direction they are going in.
November 11, 2012 11:28:01 PM

Novuake said:
Haswell is reputed to be not so much a performance increase as a massive reduction in power usage. To be honest, I don't think Intel knows what direction they are going in.



It seems like they are thinking of throwing their loyal (but probably small) group of "enthusiasts" - who want a premium desktop but don't want to/don't need to spring for the server chips - under the bus, in an attempt to increase their market share in the mobile/laptop/tablet segment (a much larger group of consumers).
November 11, 2012 11:35:27 PM

Thanks again for the feedback, guys! Like I said before, what would the performance difference be in multithreaded apps between a 3570K w/ SSD and a 3770K? The i5 setup is cheaper...
November 11, 2012 11:39:21 PM

I'm currently using an i3-3220 with a Radeon HD 7770. Plays all the games I play on max settings @ 1280 x 1024 (5 ms, 50,000,000:1 contrast monitor).

Imo it's beautiful (:
November 11, 2012 11:46:48 PM

calumconroy said:
Thanks again for the feedback, guys! Like I said before, what would the performance difference be in multithreaded apps between a 3570K w/ SSD and a 3770K? The i5 setup is cheaper...

For multithreaded apps, once you get things loaded, the 3770K will absolutely shred everything in its path. The i5-3570K will work fine too, but it wouldn't be as productive as an i7 due to the lack of Hyper-Threading.

All in all, if you don't care about how fast applications load or about having the fastest rig on the planet, and just want raw CPU horsepower, the i7 would definitely be your clear pick.
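
If you want a rough feel for how much extra workers actually buy you on your own machine before spending the money, here is a toy sketch (plain Python 3 standard library; the busy-work loop and the chunk count are made up, so read the timings as relative, not absolute):

import time
from multiprocessing import Pool, cpu_count

def burn(n):
    # deliberately CPU-bound busy-work standing in for an encode/export job
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == "__main__":
    jobs = [2000000] * 16  # 16 chunks of work to spread across workers
    for workers in (1, 2, 4, cpu_count()):
        start = time.time()
        with Pool(workers) as pool:
            pool.map(burn, jobs)
        print(workers, "worker(s):", round(time.time() - start, 2), "s")

On a 2-core/4-thread chip the jump from 2 to 4 workers is usually much smaller than on a true quad-core, which is the i5-versus-i3 argument in a nutshell.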
November 11, 2012 11:46:55 PM

mocchan said:
For multithreaded apps, once you get things loaded, the 3770K will absolutely shred everything in its path. The i5-3570K will work fine too, but it wouldn't be as productive as an i7 due to the lack of Hyper-Threading.

All in all, if you don't care about how fast applications load or about having the fastest rig on the planet, and just want raw CPU horsepower, the i7 would definitely be your clear pick.

That's a bit contradictory, don't you think? Talking about how much power is in an i7, then saying if you don't care how much there is, buy an i7?
November 12, 2012 12:00:20 AM

Whoops, I may have been a bit misleading with my previous post, I apologise. I meant "if you don't care about how fast applications load" regarding OP's purchase of an i5 + SSD rather than an i7 + HDD.
November 12, 2012 12:01:56 AM

Oh okay, thanks for clearing that up.
November 12, 2012 12:30:46 AM

calumconroy said:
Thanks again for the feedback, guys! Like I said before, what would the performance difference be in multithreaded apps between a 3570K w/ SSD and a 3770K? The i5 setup is cheaper...



Have a look at these benchmarks to give you an idea.

http://www.tomshardware.com/reviews/core-i7-3970x-sandy...
November 12, 2012 12:58:40 AM

So... even the i5 stacks up pretty well next to the big boys, and even comes out on top, along with the 3770K, in stuff that taxes one core at a time. All those setups used an SSD.
November 12, 2012 2:31:24 AM

Novuake said:
To be honest, I don't think Intel knows what direction they are going in.

I think where Intel is going for the desktop is crystal-clear.

Most desktop applications and games require nowhere near as much processing power as current CPUs could deliver if software could fully and efficiently use them, and programmers are having a hard time making efficient and effective use of multiple threads/cores. So until software catches up, Intel is focusing on power efficiency to get into a better position in the tablet, ultrabook and notebook markets, which is where all the growth is going.

Desktop PC sales shrank for the first time ever, by 7%, in 2011, and the last predictions I have seen for 2012 say the desktop shrink may hit 14%. On the other hand, portable computing device sales are seeing double-digit growth.

Most people simply do not want to have a bulky, expensive and inefficient conventional PC if some form of reasonably affordable portable computing can get their stuff done in a more convenient form factor.

Within the next decade, conventional PCs will become something only people with atypical requirements will bother with.
November 12, 2012 2:33:05 AM

^ +1

I remember talking to a few relatives while I was in Japan this summer; no one buys desktops over there anymore due to space constraints. Portable, fast, and cheap is where the money is at.
November 12, 2012 2:48:38 AM

InvalidError said:
Within the next decade, conventional PCs will become something only people with atypical requirements will bother with.


If gaming counts as an atypical requirement, I agree. Of course gamers are a very small slice of the overall market.
November 12, 2012 4:01:07 AM

DJDeCiBeL said:
If gaming counts as an atypical requirement, I agree. Of course gamers are a very small slice of the overall market.

A year ago, IGPs were still a laughing stock for games.
This year, AMD's A10 brought IGP frame rates into the realm of the playable, with performance approaching their HD 5670.
The next generation of IGPs will likely push playable frame rates equivalent to mid-range GPUs, with performance possibly in the neighborhood of an HD 8770 if AMD pushes aggressively.
The generation after that might very well make discrete GPUs unnecessary for the typical gamer, and that would be 6-7 years in the future.

Intel is planning to push IGPs fairly aggressively on their side as well, with Haswell IGPs having up to three times the performance of the HD 4000, and I would not be surprised if Broadwell bumped that by another 3-4x as well to catch up with AMD's.

For the people who want to run their 9-display panoramic 3D setup with Ultra details, hooking up external GPUs using Lightning v2 will likely be an option by then. I would not be too surprised if GPUs became either embedded or available as plug-in modules for smart displays somewhere along the way.

Basically, the PC ecosystem, the way we look at it and possibly even the definition of 'PC' are going to have to reinvent themselves.
November 12, 2012 4:09:16 AM

Eh, I'm gonna be that old fogey who won't let go, lol.

I hate laptops and tablets and I always have (only ever owned one laptop in my life, and that was just for college, years ago). I'll use a desktop until they literally don't exist anymore. Mobility doesn't mean much to me, since a smartphone can do what I need (internet, mostly), if I'm away from my desktop.

I'll always choose my desktop over any "mobile" device for general computing, though.

As you say, there may come a time when I don't necessarily need a discrete GPU, but that doesn't change my opinion of wanting to have a desktop instead of a "mobile" device (and as long as I do have a desktop, there will probably always be a discrete GPU in it, unless GPU prices soar way out of control).

I guess all of that becomes moot, though, if what we currently think of as a "desktop" is reduced to a tiny little 5"x5" box, and add-ons and upgrades cease to exist. That will be a very, very sad day. I'm not sure I even want that future, lol. Having more power than you need and being able to upgrade without throwing out the whole damn thing is part of what makes this whole thing fun for me. Just buying "throwaway" systems every 2 or 3 years is boring.
November 12, 2012 4:58:52 AM

DJDeCiBeL said:
Eh, I'm gonna be that old fogey who won't let go, lol.

I hate laptops and tablets and I always have (only ever owned one laptop in my life, and that was just for college, years ago). I'll use a desktop until they literally don't exist anymore. Mobility doesn't mean much to me, since a smartphone can do what I need (internet, mostly), if I'm away from my desktop.

I'll always choose my desktop over any "mobile" device for general computing, though.

As you say, there may come a time when I don't necessarily need a discrete GPU, but that doesn't change my opinion of wanting to have a desktop instead of a "mobile" device (and as long as I do have a desktop, there will probably always be a discrete GPU in it, unless GPU prices soar way out of control).

I guess all of that becomes moot, though, if what we currently think of as a "desktop" is reduced to a tiny little 5"x5" box, and add-ons and upgrades cease to exist. That will be a very, very sad day. I'm not sure I even want that future, lol. Having more power than you need and being able to upgrade without throwing out the whole damn thing is part of what makes this whole thing fun for me. Just buying "throwaway" systems every 2 or 3 years is boring.



I agree. I use my computer(s) for architectural/engineering work, and there is still a significant difference in performance between a high end desktop setup and a high end laptop. Not to mention the ergonomics. I need my expensive 13-way adjustable chair, my custom desk that allows me to roll in real close so my mouse arm is fully supported, my multitude of displays, etc. Comfort while working is far more important to me than carrying some tablet around that does everything under the sun except warn me that I am about to be run over by the light rail because I wasn't paying attention to my surroundings because of my nifty does-everything mobile computing device.
November 12, 2012 5:01:54 AM

DJDeCiBeL said:
Eh, I'm gonna be that old fogey who won't let go, lol.

I hate laptops and tablets and I always have (only ever owned one laptop in my life, and that was just for college, years ago). I'll use a desktop until they literally don't exist anymore. Mobility doesn't mean much to me, since a smartphone can do what I need (internet, mostly), if I'm away from my desktop.

I'll always choose my desktop over any "mobile" device for general computing, though.

As you say, there may come a time when I don't necessarily need a discrete GPU, but that doesn't change my opinion of wanting to have a desktop instead of a "mobile" device (and as long as I do have a desktop, there will probably always be a discrete GPU in it, unless GPU prices soar way out of control).

I guess all of that becomes moot, though, if what we currently think of as a "desktop" is reduced to a tiny little 5"x5" box, and add-ons and upgrades cease to exist. That will be a very, very sad day. I'm not sure I even want that future, lol. Having more power than you need and being able to upgrade without throwing out the whole damn thing is part of what makes this whole thing fun for me. Just buying "throwaway" systems every 2 or 3 years is boring.

+1 I couldn't agree more.
November 12, 2012 5:28:43 AM

DJDeCiBeL said:
I guess all of that becomes moot, though, if what we currently think of as a "desktop" is reduced to a tiny little 5"x5" box, and add-ons and upgrades cease to exist. That will be a very, very sad day.

They might not necessarily cease to exist... but I hope they won't exist in the inefficient space-wasting form factors we have today.

Imagine how much space could be reclaimed by simply:
1- switching from PC-XT legacy expansion slots to motherboard edge connectors
2- switching from having wires between the motherboard and HDDs/SSDs to mSATA connectors on the motherboard's edge like laptops do - connect the drive to motherboard and screw it down onto the motherboard's tray, no cables to mess and waste space with.
3- reducing drive bays to quarter-height - slot-load laptop drives are thinner than this
4- make SO-DIMMs standard on the desktop

With those relatively simple changes alone, you could have almost the same expandability as a conventional tower packed in a 1.5-2" thick box.
November 12, 2012 5:34:28 AM

ebalong said:
I agree. I use my computer(s) for architectural/engineering work, and there is still a significant difference in performance between a high end desktop setup and a high end laptop.

I did say "within the next decade".

A laptop may not have the oomph to do what you need to do today, but 5-10 years down the road it will likely be a completely different story.
November 12, 2012 5:37:24 AM

Yeah, that's all well and good, but we have a long way to go for that, I think.

And that also may as well be a laptop, at that point. The two would basically be indistinguishable; it's just that high-end (expensive) models could probably be upgraded and low-end (cheap) models probably couldn't (oh wait, I think it's pretty much like that already with laptops, lol). All that's missing is the screen, in your scenario.



November 12, 2012 5:42:00 AM

InvalidError said:
A laptop may not have the oomph to do what you need to do today, but 5-10 years down the road it will likely be a completely different story.


All I can say to that is "we'll see". The power requirements alone may not allow for that for a long time to come (and yes, I know efficiency is at the forefront of designing the newer CPU architectures. I still think it'll be hard to squeeze top flight, desktop level performance out of a mainstream mobile level CPU for a very long time).
November 12, 2012 10:54:25 AM

DJDeCiBeL said:
I still think it'll be hard to squeeze top flight, desktop level performance out of a mainstream mobile level CPU for a very long time.

CPU-wise, the TDP cap on the desktop is shrinking, at least on Intel's side... 135W, 95W, 77W... Haswell might be capped at 55-60W, close enough to the 45W upper end of mobile CPUs that we may see the days of desktop chips in laptops return.

IGPs/GPUs may take a few years longer but, as I said before, external/dockable GPUs can take care of that.
November 12, 2012 1:53:32 PM

We already have the i7-3920XM, which is a mobile 55W processor that performs really, really close to a stock 3770. Of course, it is 3-3.5x the price.

You can also get graphics cards in laptops like the GTX 680M or the 7970M, which perform like full-size GTX 600-series and HD 7870 cards. Again, they are expensive, but it can be done.

The problem is when comparing desktops to laptops of the same price. If you want $1500 desktop performance, you probably need to spend $3000+ on a laptop. It's expensive to architect cooling and power for a high-end system that is also portable.

Of course there is an upper limit. You can't build a mobile that performs like a $3000 workstation.
November 12, 2012 1:58:59 PM

InvalidError said:
I did say "within the next decade".

A laptop may not have the oomph to do what you need to do today, but 5-10 years down the road it will likely be a completely different story.


The problem is that performance is a moving target. A new $1000 laptop already beats a 5-year-old desktop PC in just about everything. 5 years ago people did high-end engineering work on computers that would be "slow" today. So in 5-10 years laptops will be much faster, but still lagging behind the best desktops, and certainly behind desktops of the same price. An iPhone blows the pants off any computer I had as a teenager by a huge margin, but does that matter anymore?

November 12, 2012 2:26:55 PM

twelve25 said:
The problem is that performance is a moving target.

My point is that the gap between mobile and desktop processing power is shrinking quickly because chip manufacturers are clamping down hard on power requirements, and there is a lot more power to axe off desktop components than mobile, as we can see from Intel axing 22W from SB to IB. Based on Intel's claims about Haswell power efficiency, another 20W there seems very likely.

Once both desktop and laptop CPUs end up at something like 45W, there won't be much of an advantage left in splitting hairs between desktop and laptop CPUs and this might happen as early as Broadwell in 2015... if Intel over-charges for CPUs, laptop manufacturers will move their desktop replacement product lines back to desktop CPUs like they did back in the P3 Coppermine/Tualatin days before desktop power requirements started ballooning.
November 12, 2012 2:33:07 PM

Novuake said:
Mostly I agree. For gaming an i3 is not sufficient, though; everything above that is. For content creation you want an i7 at minimum. But yes, effectively people buy what they can afford, even if it is overkill...


That's completely wrong. The i3 can play any game on the market right now, even BF3 multiplayer on a 64-player server. You might have to turn the settings down from ultra high to high, but games won't completely smother any recent Intel dual-core in any meaningful way.
November 12, 2012 2:41:34 PM

Thanks for all the replies, guys! Although I don't see how this got onto laptops and 5" x 5" desktops xD
I think I will go for the i7, as the difference in gaming is minimal and it fares quite a bit better than i5s at content creation/rendering.
How do I select a best answer? (If that's possible on a discussion like this)
November 12, 2012 3:01:25 PM

InvalidError said:
They might not necessarily cease to exist... but I hope they won't exist in the inefficient space-wasting form factors we have today.

Imagine how much space could be reclaimed by simply:
1- switching from PC-XT legacy expansion slots to motherboard edge connectors
2- switching from having wires between the motherboard and HDDs/SSDs to mSATA connectors on the motherboard's edge like laptops do - connect the drive to motherboard and screw it down onto the motherboard's tray, no cables to mess and waste space with.
3- reducing drive bays to quarter-height - slot-load laptop drives are thinner than this
4- make SO-DIMMs standard on the desktop

With those relatively simple changes alone, you could have almost the same expandability as a conventional tower packed in a 1.5-2" thick box.



Noooooo!!! I love my bulky Antec P280 and will take it with me to the grave. From my cold, dead hands!
November 12, 2012 3:35:51 PM

InvalidError said:

Once both desktop and laptop CPUs end up at something like 45W, there won't be much of an advantage left in splitting hairs between desktop and laptop CPUs and this might happen as early as Broadwell in 2015... if Intel over-charges for CPUs, laptop manufacturers will move their desktop replacement product lines back to desktop CPUs like they did back in the P3 Coppermine/Tualatin days before desktop power requirements started ballooning.


There will still be heat to control and, again, a moving target for battery life expectations. People have higher expectations for the size and shape of laptops now. Can you imagine trying to cool even a low-voltage i3-2100T in a 1"-thick case? That's only a 35W processor.

November 12, 2012 5:26:46 PM

Okay, then don't answer me...
November 12, 2012 5:48:39 PM

Your question is too vague to get a concrete answer. i7s are better than i5s, which are better than i3s. They can all do everything you want, so it's just a matter of the performance desired and the amount you can spend.

November 12, 2012 5:51:11 PM

twelve25 said:
Your question is too vague to get a concrete answer. i7s are better than i5s, which are better than i3s. They can all do everything you want, so it's just a matter of the performance desired and the amount you can spend.


He meant how to select a B.A. (Best Answer), I think, but this is a Discussion thread, so you can't.
November 13, 2012 12:14:49 AM

Ah. Even though it's called 'discussion with an answer'?!