Great... it's Quad Core... but why?

EDIT: Please read through this whole thread, it's got a LOT of great points and some replies with GREAT information. THANKS!

I've gotta say that 4 cores doesn't really seem all that appealing to me at this point in time. At least NOT for your typical home user. Especially since dual core is finally starting to pick up some real steam. I'm hoping that app and game programmers will start actually utilizing these "multi-core" CPUs before we take off into the "10-core" realm. It just seems like WAY too soon to already be thinking 4 cores. (anyone else share these thoughts?) I wish that even now the current dual cores could use the second core for something like physics processing. (a PPU just seems like a waste of money in its current form... just check out the most recent Tom's review) The only way I can see wanting a quad core machine is if, for example, in games they could work it so that one core does AI, one core does physics, one runs your background apps/resources, and one does whatever a single core would normally process during games. That seems like it would be good utilization, but hey... what do I know? I'm sure there are other uses like high end video editing, or CAD, or Enterprise servers, or something along those lines that some people may be able to use multi-core CPUs for; but not so much for the average user... at least not yet.

EDIT: My real fear is that with C2D and such coming out now, only high end enthusiasts and Enterprise users will find any real value or use for quad cores or more at this time. (Enterprise b/c obviously they can actually use the power in high end servers and such, and the enthusiasts really just for bragging rights) And because only high end consumers and Enterprise will find value in it, there may be a seriously large inventory left over... which could hurt both Intel and AMD. It just seems like too soon. My opinion is that they need to wait just a little bit until apps can actually utilize all these cores to make it worth the money. Dual core is great for now.
  1. Rule of Extensibility: design for the future, because it will be there sooner than you think.
  2. As to the Question why..

    I would believe it to be because of the money to be made off of Enterprises that need that kind of processing power.

    Or money to be made off of people who think they need that kind of processing power.
  3. Quote:
    I would believe it to be because of the money to be made off of Enterprises that need that kind of processing power.


    Ok, fair enough. I can see it for servers and such, but doesn't it seem like waste and/or overkill for the consumer desktop right now? "Yay... I've got 4 cores... but I'm never really using more than 2 at any given time... but I've got 4 of em."
  4. Well, a 4-core desktop for a home user I would find ridiculous. But of course I'm not ripping/gaming/defragging/downloading/formatting/and compiling programs all at the same time.

    I guess you can also compare it to SLI. I mean in most cases you're going to get a somewhat small increase in performance for the amount of money spent, compared to a single GPU setup.

    I just think it's all about the money, and perhaps bragging rights for the person who purchased it.

    Edit: Until they come up with programs that the majority of the computer-using population runs which take good advantage of a 4-core system, there isn't much reason to get one. I mean, 64-bit is still in its early days.
  5. because we can do it, why stop?
  6. Quote:
    But of course I'm not ripping/gaming/defragging/downloading/formatting/and compiling programs all at the same time.


    LOL That's exactly what I'm saying! Honestly, I think it could actually work against Intel and AMD to go quad core right now. I'm afraid that most smart people will realize that quad core isn't worth the price tag and there will be a large inventory just sitting around... which of course could really hurt both companies. Sure some people will buy them, but I don't think the demand will be anything like for the C2D.
  7. Quote:
    because we can do it, why stop?


    Well I don't think we should "stop", just wait a little bit before we go quad core. They should get a little more power out of the current dual cores, and give devs time to start programming for multi-core CPUs, THEN start going quad core when they would actually be worth it.
  8. Quote:
    because we can do it, why stop?


    Well I don't think we should "stop", just wait a little bit before we go quad core. They should get a little more power out of the current dual cores, and give devs time to start programming for multi-core CPUs, THEN start going quad core when they would actually be worth it.

    You've got a point there, but if they can optimize for dual cores, wouldn't it be easier to just optimize programs to use all cores?
  9. Quote:
    because we can do it, why stop?


    Well I don't think we should "stop", just wait a little bit before we go quad core. They should get a little more power out of the current dual cores, and give devs time to start programming for multi-core CPUs, THEN start going quad core when they would actually be worth it.

    Exactly. Once the programming is written to take advantage of multiple cores, you will definitely see an increase in performance, meaning there will be much better gaming with physics (hopefully). And if you are rendering stuff, it will take much less time than on a single core processor any day (I have spent hours rendering stuff and it is a complete pain in the ass to sit and wait when you could be putting that time to good use), plus faster encoding with music and video. There are endless amounts of possibilities for them; it's just waiting for the programming to come along and catch up.
  10. Man, what's with these calculator things?! My abacus does the job just fine!

    If you have the capacity, you'll find a need.
  11. Quote:
    You've got a point there, but if they can optimize for dual cores, wouldn't it be easier to just optimize programs to use all cores?


    Well maybe I'm wrong on this, but I would assume that once they optimize software to work with more than one core then it would just scale as more cores are added... at least that's how I think it should work. LOL So even if they program for dual core now, then when quads are available it would just scale for 4 cores, or 8, etc.
  12. There are several coding ways to detect multiple cores, so if a program is made for dual core, a lot of the time it would work on say... 96 cores, with a performance boost (see the sketch after this post). Plus there could still be "reverse hyper threading"

    Going quad core, even if you can't use it now means that someone wouldn't have to upgrade when they need more power for multitasking.

    Quad Core Pwns (QCP)
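
    For illustration, here's a rough sketch of the "detect the cores and scale with them" idea, using Python's standard multiprocessing module; the function name and workload are made up and it's just a hypothetical example, not anything a poster here actually wrote:

    ```python
    # Hypothetical sketch: size a worker pool to however many cores the machine
    # reports, so the same code runs on 2, 4, or 96 cores without modification.
    import multiprocessing as mp

    def crunch(chunk):
        # stand-in for real per-chunk work (encoding, rendering, etc.)
        return sum(x * x for x in chunk)

    if __name__ == "__main__":
        cores = mp.cpu_count()                     # detect available cores
        data = [range(i, i + 10_000) for i in range(cores * 4)]
        with mp.Pool(processes=cores) as pool:     # one worker per core
            results = pool.map(crunch, data)       # work spread across all cores
        print(f"{cores} cores, {len(results)} chunks processed")
    ```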
  13. IMO AMD will go with their, what, 2x2-core systems?
    Called 4x4 for I have no idea why.
    Then once they have their quad cores they'll transition to a total of 8 on the board.
    True, this won't have a performance increase that scales.
    Also, the price they can manufacture them at will come into play.
    There are those with too much money, though, that'll walk around screaming they have 8 cores.
    I'll just shrug bc my OC'd C2D will suit me fine and run cooler/more efficiently.
  14. Quote:
    IMO AMD will go with their, what, 2x2-core systems?
    Called 4x4 for I have no idea why.

    Two dual CPUs and two dual GPUs.
  15. Quote:
    IMO AMD will go with their, what, 2x2-core systems?
    Called 4x4 for I have no idea why.

    Two dual CPUs and two dual GPUs.

    Now THAT might be kinda sweet, but it still seems like WAY overkill. Still a lot of unused power!
  16. Quote:
    IMO AMD will go with their, what, 2x2-core systems?
    Called 4x4 for I have no idea why.

    Two dual CPUs and two dual GPUs.
    Two grand later... wow, I couldn't imagine two 7950s.
  17. Well, not everything multithreads well. If you have modular code that is threading several non-threadable routines, it might not scale beyond the number of non-threadable operations (see the sketch after this post).

    But lots of things won't have this problem. I have a 16-core "workstation" at work and all I want is more cores....
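
    The usual way to put numbers on that limit is Amdahl's law: if some fraction of the work is inherently serial, the speedup can never exceed one over that fraction, no matter how many cores you add. A tiny illustrative calculation follows; the 20% serial fraction is just an assumed example, not a measurement from any real program:

    ```python
    # Rough Amdahl's-law illustration: speedup vs. core count for a given serial fraction.
    def amdahl_speedup(serial_fraction: float, cores: int) -> float:
        return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

    for cores in (1, 2, 4, 8, 16):
        # assume 20% of the program cannot be threaded
        print(cores, round(amdahl_speedup(0.20, cores), 2))
    # prints roughly 1.0, 1.67, 2.5, 3.33, 4.0 -- capped at 5x even with unlimited cores
    ```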
  18. Quote:
    I have a 16-core "workstation" at work and all I want is more cores....


    Dude where the hell do you work? lol I assume you mean it's a "server" not a "workstation" right? Or maybe you're doing some SERIOUS number crunching for like nuclear explosion simulations or something! lol
  19. Quote:
    IMO AMD will go with their, what, 2x2-core systems?
    Called 4x4 for I have no idea why.

    Two dual CPUs and two dual GPUs.

    Now THAT might be kinda sweet, but it still seems like WAY overkill. Still a lot of unused power!

    Yes, unused for now, but I would call it future proofing :) If I spend a lot of money on my system I like to make sure it will last me at least 2 years. My cheaper builds, I couldn't care less if they go 6 months lol
  20. Quote:
    I have a 16-core "workstation" at work and all I want is more cores....


    Dude where the hell do you work? lol I assume you mean it's a "server" not a "workstation" right? Or maybe you're doing some SERIOUS number crunching for like nuclear explosion simulations or something! lol

    The only thing I see that catches my eye is the word..

    MORE

    :lol:

    It will never be enough.. will it?
  21. Quote:
    Rule of Extensibility: design for the future, because it will be there sooner than you think.


    Translation: Build it, and they will come
  22. Quote:
    Grimmy wrote:
    I would believe it to be because of the money to be made off of Enterprises that need that kind of processing power.


    Ok, fair enough. I can see it for servers and such, but doesn't it seem like waste and/or overkill for the consumer desktop right now? "Yay... I've got 4 cores... but I'm never really using more than 2 at any given time... but I've got 4 of em."


    Good point. It is a bit of overkill for all but the enthusiast right now but hey, it's like getting a Ferrari or something. Sure you have a V12 with gobs of hp, but why? Just because... Now for the enterprises... it's all about getting things done and with more cores... more work gets done. I mean I can see where the whole desktop thing plays in but with the way people multitask... it may not be that bad of an idea.
  23. Quote:
    Yes, unused for now, but I would call it future proofing :) If I spend a lot of money on my system I like to make sure it will last me at least 2 years. My cheaper builds, I couldn't care less if they go 6 months lol


    Yes, but with the rate of advance in graphics and such, that "future proofed" super expensive build is going to be WAY out of date in just two years. Take the difference in video cards from two years ago and the generation of DX10 cards coming out soon. That's a huge gap not only in performance but in the technologies that are implemented.

    BTW I'd rather spend less money more often by building/upgrading more often, also keeping up with technology. Instead of spending $2500 every couple of years, I sell my "old" system and take a little bit of saved up money every 6 months and do a new build and/or major upgrades. The total cost was around $2500 on my last build... my most expensive build to date. However only about $300 of that actually came "out of my pocket" as I used money from selling the "old" parts. (which in fact were only 6 months old so they still had a good resell value) So $300 bucks or so every six months means I only spend about half as much money and I'm always up-to-date. (my current build has lasted me the longest, but I'm upgrading here in a couple of months hopefully... just been too busy! lol)
  24. For Enterprises.. time is money.

    For Home Users.. more cores is more cores to brag about :lol:
  25. Right... Enterprise applications may take advantage of all those cores... but that's because of the constant workload and many different tasks that servers are required to do. Home users aren't going to put that kind of load on a CPU without having apps optimized for multi core setups. I dunno... I see your point about "because we can" and those who like to brag... but it just doesn't seem like a smart move on the consumer front at this time. Like I said above, I think they might get stuck with a large inventory of quad cores b/c only high end enthusiasts and Enterprise users will actually have the money and the will to buy them.
  26. Yup... why do you think I'm getting a Woodcrest setup? 2x2 cores means I have plenty of power to spare... and imagine when quad-core Xeons come out... oh dear...
  27. Yeah, I think they should wait before they start flooding the market with quad core setups, because programmers are just starting to make dual core applications. But I know a few friends that would love it: just rendering video in Avid, doing some post production in AE and rendering effects for some other job, and doing some DVD authoring. I think those are the people who would love quad core, because they really gain some production time with it ;) Not gamers; real working computers would gain with quad core tech.
  28. "640 K ought to be enough for anybody."
    -- Bill Gates, 1981

    Remember this quote? 4 cores = future-proofing. Besides, adding more cores will allow software developers to add more power/features to their software through multithreading. Just look at what dual core processors did for the resource-hungry Adobe software.
  29. Currently, it's overkill for the home market - especially if the GPU is the bottleneck right now.

    But, why do we overclock???
    But, why do we compare AMD vs Intel????
    We want speed - even if we really can't see the benefit, except for benchmarks.

    It is the future. So if the hardware is there, software will follow.

    It's cool.

    Once it's perfected - very scalable between low performance and extreme performance. Just add more cores. This will be the roadmap for quite a few years. 10 GHz of processing power in a couple of years.
  30. Quote:
    because we can do it, why stop?


    Well I don't think we should "stop", just wait a little bit before we go quad core. They should get a little more power out of the current dual cores, and give devs time to start programming for multi-core CPUs, THEN start going quad core when they would actually be worth it.

    I'm a VB.NET programmer so I'm not very smart, but I believe we developers don't need to "start programming for multi-core CPUs"; we've already been writing threaded applications for years.

    Please see this article and go to the section "Programming for Dual-Core": http://www.devx.com/amd/Article/26686

    Note the section that says even if the programmer didn't thread his application, Windows and Linux will take advantage of the dual-core processor's ability to run multiple single-threaded tasks simultaneously, thereby increasing overall system responsiveness anyway.

    So I guess the bottom line is, in theory having 4 cores should improve performance for all applications, not just Enterprise apps using many threads (a small threading sketch follows below). But this is all a moot point because quad-core CPUs are coming whether you want it or not (I want it) :D
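
    For anyone wondering what "writing threaded applications" looks like in practice, here is a minimal sketch, in Python rather than VB.NET and with made-up task names: independent tasks are handed to a pool of threads, and the OS schedules the runnable threads onto whichever cores are free, so the program itself never has to know how many cores exist.

    ```python
    # Minimal threaded-application sketch: hand independent tasks to a thread pool
    # and let the OS spread the runnable threads across the available cores.
    from concurrent.futures import ThreadPoolExecutor
    import time

    def fetch(task_id: int) -> str:
        time.sleep(0.1)          # stand-in for I/O or other blocking work
        return f"task {task_id} done"

    if __name__ == "__main__":
        with ThreadPoolExecutor(max_workers=4) as pool:
            for result in pool.map(fetch, range(8)):
                print(result)
    # (In Python specifically, CPU-bound work is usually pushed to processes because
    #  of the GIL, but the scheduling idea described above is the same.)
    ```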
  31. Hard core multitasking and my favorite: Parallelized apps. The latter are easier to implement with a quad core CPU that does not share its cache with the other cores. This favors AMD because of the decreased software overhead that a shared cache requires

    Here's a good article related to the use of quad cores and games. Hardware will drive the software development just like bloated software and a GUI OS drove faster processor development in the 90's.

    Taking Game Performance to the Max with Threading: With the increased popularity of dual-core gaming platforms, the key to gaming successes is mastering threading. Here’s how.

    Quote:
    I've gotta say that 4 cores doesn't really seem all that appealing to me at this point in time. At least NOT for your typical home user. Especially since dual core is finally starting to pick up some real steam. I'm hoping that app and game programmers will start actually utilizing these "multi-core" CPUs before we take off into the "10-core" realm. It just seems like WAY too soon to already be thinking 4 cores. (anyone else share these thoughts?) I wish that even now the current dual cores could use the second core for something like physics processing. (a PPU just seems like a waste of money in its current form... just check out the most recent Tom's review) The only way I can see wanting a quad core machine is if, for example, in games they could work it so that one core does AI, one core does physics, one runs your background apps/resources, and one does whatever a single core would normally process during games. That seems like it would be good utilization, but hey... what do I know? I'm sure there are other uses like high end video editing or CAD or for Enterprise, something along those lines that some people may be able to use multi-core CPUs for, but not so much for the average user... at least not yet.

    EDIT: My real fear is that only high end enthusiasts and Enterprise users will find any real value or use for quad cores or more at this time. (Enterprise b/c obviously they can actually use the power in high end servers and such, and the enthusiasts really just for bragging rights) And because only high end consumers and Enterprise will find value in it, there may be a seriously large inventory left over... which could hurt both Intel and AMD. It just seems like too soon. My opinion is that they need to wait just a little bit until apps can actually utilize all these cores to make it worth the money. Dual core is great for now.
  32. I think you'll find that once application designers/programmers get the hang of dual cores they will quickly take advantage of quad etc. Going from a single threaded application to one designed for multiple threads is a greater leap than going from 2 cores to 4 cores. The real issue is at what point there is no gain because you simply cannot do things in parallel (i.e. each thread is waiting for the others before they can proceed). For common applications (Office, browsing, etc.) there is virtually no need for anything beyond 2 threads. However in things like gaming, photo rendering etc. you could easily create 4 threads handling different things (see the sketch after this post). Heck, in gaming you could in theory create a thread for each active AI opponent in proximity to the player (much like when playing online with 16 other people you have 16 "threads" going).

    So I guess what I am saying, yes your typical home user who uses the PC in a very simple fashion will not need the power, however the use of multiple cores can be harnessed at a lower level than the enthusiast crowd.

    Just my opinion, as it requires good software design ;)
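
    Just to illustrate the "4 threads handling different things" idea, here is a toy sketch with one worker per subsystem; every function name here is invented for illustration, and a real engine is of course far more involved:

    ```python
    # Toy sketch: AI, physics, and audio each get their own thread, so on a quad
    # core each loop can land on its own core while the main thread keeps going.
    import threading
    import time

    def ai_loop(stop):
        while not stop.is_set():
            time.sleep(0.016)        # pretend to update AI at ~60 Hz

    def physics_loop(stop):
        while not stop.is_set():
            time.sleep(0.016)        # pretend to step the physics simulation

    def audio_loop(stop):
        while not stop.is_set():
            time.sleep(0.016)        # pretend to mix audio

    if __name__ == "__main__":
        stop = threading.Event()
        workers = [threading.Thread(target=f, args=(stop,))
                   for f in (ai_loop, physics_loop, audio_loop)]
        for w in workers:
            w.start()
        time.sleep(1.0)              # main thread would run rendering/game logic here
        stop.set()
        for w in workers:
            w.join()
    ```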
  33. Quote:
    Yeah, I think they should wait before they start flooding the market with quad core setups, because programmers are just starting to make dual core applications


    Wow, I didn't know there was such confusion over the way software takes advantage of dual core CPUs. Please see the article:
    http://www.devx.com/amd/Article/26686

    Quote:
    The next big question is: How do application developers take advantage of dual-core processors?

    Answer: The same way you exploit multi-chip SMP, by instituting threading.


    Every program I write uses multiple threads, and I write stupid little programs no one uses; I'm positive big software development companies do the same. So IMHO many average users would see a performance boost in many applications with more CPU cores.
  34. Coding for dual core takes time, and it would be a large portion of extra coding on top of the single core code path.

    Nobody will take dual cores seriously until there are enough people with dual cores to make it worth coding for. For now the majority of people use single cores, and that's what they code for. Enterprise has its own multi-core code and will always use it.

    Just like before, nobody will take quad cores seriously until there are enough people with them to make it worth coding for. I know of one new game coming out that will (as of now) utilize only one core - Hellgate: London. So until the mass market has dual cores, we'll be stuck with single core coding. This applies up the tree of X-cores.
  35. Quote:
    Coding for dual core takes time, and it would be a large portion of extra coding on top of the single core code path.

    Nobody will take dual cores seriously until there are enough people with dual cores to make it worth coding for. For now the majority of people use single cores, and that's what they code for. Enterprise has its own multi-core code and will always use it.

    Just like before, nobody will take quad cores seriously until there are enough people with them to make it worth coding for. I know of one new game coming out that will (as of now) utilize only one core - Hellgate: London. So until the mass market has dual cores, we'll be stuck with single core coding. This applies up the tree of X-cores.


    Could you please post a link to a web page that explains how you "code for multi core", because everything I can find says you use threading which is what the VAST VAST majority of programs already do... Like I've said I'm a small VB.NET programmer and even little old me uses threads in my sucky applications... So trust me popular applications do too!!
  36. The development of faster, more efficient, more powerful technology is simply fueled by the demands of the consumer; everyone posting in this thread is an example. Companies will remain competitive as long as this demand is present, and this demand for faster, more efficient, more powerful CPUs is a symptom of the progression of the modern world.

    Why would people not want a CPU that can do 2x, 3x, 4x 5x the amount of work, using less power and doing it in less time?

    To ask why people would not want the increase in power is somewhat of a myopic point. I find it inevitable that the home user, and what they can actually do with their PC, is something that will grow. Software and game programmers can only write within the parameters of the hardware at their disposal. So we will be able to do things on our laptops and desktops at home that perhaps were not within our reach with older technology; you only have to look at what PCs could do 10 years ago to understand this growth curve.

    Personally I am glad that quad core and beyond will be within the reach of people like myself in the very near future. I find it exciting to think about so many things that will suddenly be within reach of my fingertips: game realism we have not seen, software that can do things perhaps we do not even associate with home PC use.

    Resistance is futile eh?
  37. Quad-core processors have been proven to have four times as many cores as single-core processors.

    I agree that there is no need right now for them, but I can see perhaps at the end of 2007 software developers will be in full multi-core mode.
  38. Quote:
    Quad-core processors have been proven to have four times as many cores as single-core processors.

    I agree that there is no need right now for them, but I can see perhaps at the end of 2007 software developers will be in full multi-core mode.


    OMG please someone tell me how I need to program in "full multi-core mode" because I thought I've been doing this for the last 5 years using threads???
  39. I know it is very hypothetical, but imagine if AMD got the whole reverse hyperthreading thing to work. Even if software engineers could only program for, say, 2 cores, but at the time the new thing was quad core chips. Just think about two processors working together on the same process. It would be like using a 2.2 GHz quad core as a 4.4 GHz dual core or an 8.8 GHz single core! And.... AMD claims its quad core would use the same energy as its current dual cores. That baby could load Windows in a fraction of a second!

    I know its very hypothetical, but man.....imagine......
  40. agent, thanks for your input on this thread, as I am not a programmer and perhaps that is my reason for not understanding. Your input is EXACTLY what I was looking for when I created this topic. So, that being said: you're saying then that most applications (including your own "crappy" ones as you put it) should be able to take advantage of multiple cores no matter what? If this is true, then GREAT!

    As I have read through this thread I've seen a lot of valid points. Don't get me wrong guys... I'm not arguing that we shouldn't go to quad core... I just thought that it was a little soon. Especially with C2D coming out now, and then from my understanding Kentsfield will be out very soon. But with many people buying C2Ds such as the E6600 at fairly affordable prices, who's going to want to buy a quad core only a few months later? Save for the people who hold out for them, don't you think there might be a bit of an abundance of stock as the demand may not be very high for them as of yet?
  41. Quote:
    OMG please someone tell me how I need to program in "full multi-core mode" because I thought I've been doing this for the last 5 years using threads???


    Mr. Agent,

    I am not a programmer, thus I do not know the proper terms. I've only learned the basics of C++, Java, and Visual Basic .net.

    However, one thing I am very aware of is that NONE of the software I currently own and use is multi-threaded. It is all single threaded. If I bought a dual-core or quad-core processor I would not stand to benefit much from it.

    I appreciate your attack though.
  42. Quote:
    agent, thanks for your input on this thread, as I am not a programmer and perhaps that is my reason for not understanding. Your input is EXACTLY what I was looking for when I created this topic. So, that being said: you're saying then that most applications (including your own "crappy" ones as you put it) should be able to take advantage of multiple cores no matter what? If this is true, then GREAT!

    As I have read through this thread I've seen a lot of valid points. Don't get me wrong guys... I'm not arguing that we shouldn't go to quad core... I just thought that it was a little soon. Especially with C2D coming out now, and then from my understanding Kentsfield will be out very soon. But with many people buying C2Ds such as the E6600 at fairly affordable prices, who's going to want to buy a quad core only a few months later? Save for the people who hold out for them, don't you think there might be a bit of an abundance of stock as the demand may not be very high for them as of yet?


    I think the reason Core 2 Duo is pretty cheap is that Intel has in mind that 2 cores will be your Celerons and 4 cores will be your more expensive chips? Maybe further down the road they could go 4 cores for the poor and 8 cores for the more affluent lol This is my tin foil hat theory for the day :)
  43. Quote:
    agent, thanks for your input on this thread, as I am not a programmer and perhaps that is my reason for not understanding. Your input is EXACTLY what I was looking for when I created this topic. So, that being said: you're saying then that most applications (including your own "crappy" ones as you put it) should be able to take advantage of multiple cores no matter what? If this is true, then GREAT!

    As I have read through this thread I've seen a lot of valid points. Don't get me wrong guys... I'm not arguing that we shouldn't go to quad core... I just thought that it was a little soon. Especially with C2D coming out now, and then from my understanding Kentsfield will be out very soon. But with many people buying C2Ds such as the E6600 at fairly affordable prices, who's going to want to buy a quad core only a few months later? Save for the people who hold out for them, don't you think there might be a bit of an abundance of stock as the demand may not be very high for them as of yet?


    I think the reason Core 2 Duo is pretty cheap is that Intel has in mind that 2 cores will be your Celerons and 4 cores will be your more expensive chips? Maybe further down the road they could go 4 cores for the poor and 8 cores for the more affluent lol This is my tin foil hat theory for the day :)

    Oh I see... so now CPU's will discriminate between social and financial classes! ;) lol
  44. I have to wonder. When programming for dual cores, wouldn't you just write code that would take advantage of having multiple cores, not necessarily just two?

    If this is the case, then won't that code scale to 4, 8, 16 cores with no modification (see the sketch after this post)? It just seems like a waste to write code that only uses two cores when everyone knows that processors are gonna keep scaling up in core count.

    -not directed at anyone in particular, just a few posts I've read in the past week-
    It just seems odd to me that one would say, "They just started writing for 2 cores, and now they're talking about 4. They'll have to write all new code for that..."
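
    That is also roughly how such code tends to be written in practice: the worker count is read at run time instead of being hard-coded at 2, so the same program uses however many cores it finds. A small sketch, where the frame-rendering workload is just a made-up stand-in:

    ```python
    # Sketch of code written for "multiple cores" rather than exactly two: the
    # number of worker processes is whatever the machine reports, so no rewrite
    # is needed when the core count grows.
    import os
    from concurrent.futures import ProcessPoolExecutor

    def render_frame(frame_no: int) -> int:
        return sum(range(frame_no * 1000))   # stand-in for expensive, independent work

    if __name__ == "__main__":
        workers = os.cpu_count() or 1         # scales with whatever the machine has
        with ProcessPoolExecutor(max_workers=workers) as pool:
            totals = list(pool.map(render_frame, range(64)))
        print(f"rendered {len(totals)} frames on {workers} worker processes")
    ```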
  45. My only guess would be that it is probably more complicated and time consuming to program for multiple cores. I would think the programming time would progress linearly, as you then have to tell 4, 8, or 16 cores what to do, and write code for each.

    I am in no way a software engineer; that is just a shot-in-the-dark guess at why people (including myself) mention programming for 2 cores.
  46. Herr and Chris, if you guys read the whole thread you'd see that we've discussed that. We said that if programs are written to take advantage of multiple cores (not just two) then it should scale. It wouldn't require programming for just 2 or 4 or however many, but "multiple".

    BTW, this is NOT a flame on you guys, I'm just saying read the whole thread. :)
  47. Heh, that's kinda what I was trying to say. I have seen posts from ppl complaining about "they'll have to write new code for that", and I'm thinking, "the multi-core code should scale."
  48. I did read the thread (not offended or anything). I am merely saying none of us are software engineers, and given that there are few or no multithreaded applications out there (and the ones that are out there for the average user are not written to take 100% advantage of two cores, meaning there is only a marginal gain) AND there are no quad core processors yet, how can we speculate that a multithreaded application will scale perfectly to infinite cores? As it is, it is not mainstream for 2 cores, so how can you predict many, many cores? Is this something that is common knowledge for people that know about software and I am totally off, or is it OK to question the scalability (writing and performance wise) of multithreaded applications?

    Just my $0.02
  49. Well most applications/software would need some optimizing to be effective.

    Quake 4 is one good example. When it came out, it was pretty much designed for single core. Then the SMP update made it so performance increased for the game on dual core systems.

    I can only imagine the same would apply for quad cores as time goes by.

    But having a quad core system now for a home user, I would just think it would be a waste, unless that particular user does rendering for the most part.