Are CPUs losing their clout?

In this day and age of "good enough" for the average Joe, with Fermi, LRB, and multi-core cheapness, are CPUs losing their importance in the average Joe's eyes?
"Leslie Sobon, AMD vice president of product marketing, said most of their market research indicated that consumers found it very challenging to find what they needed in retail settings and many were looking for systems that were geared toward entertainment and gaming. Their research also suggested that these features were more important to consumers over higher-end processors, memory, and even storage space. "
http://www.pcper.com/article.php?aid=811
I'm thinking, oh help me, Jensen may have been right...
Thoughts?
 

TheViper

Distinguished
I think it is simply that more and more consumers are understanding the basics of a computer these days.

A decade ago the average consumer heard "Pentium" and MHz, and that's all they looked at. Now they consider many more components in their purchase.

Have they lost their clout? In a way. I prefer to think of it as the CPU no longer being overemphasized.
 

amnotanoobie

Distinguished
Aug 27, 2006
1,493
0
19,360
I'm thinking this is about having a more "balanced" system. With the advent of YouTube and HD content, the once-capable processor is gasping for air. Before, we could get away with playing DivX content solely on the processor, but today you'd probably need something better than an i7 just to play HD movies.
 
There are still machines out there advertised for gaming because they have a couple of 9500GTs in SLI; what the consumer gets is all the possible noise, heat, and driver issues of multiple GPUs with none of the expected performance, regardless of the CPU it has. That's why consumers are confused.
Overall though, I think this marketing strategy makes some sense. I really don't care if AMD is trying to unload old tech, as long as it does what they say it can do; i.e. it will be fit for the purpose for which it was bought.
Now, will the battery last through a four-hour flight...
 

usspacelabs

Distinguished
Oct 29, 2009
10
0
18,510
In my personal experience, a better GPU will do more for gaming than the processor. There are few games that will tax a quad-core processor at 100%, but the GPU gets hit heavily.
 
I think we have finally reached a point where an inexpensive, mid-range, multi-core CPU is simply "fast enough" for 95% of the users out there. The flexibility of the system, and what you can plug into it, matters a lot more to the average person.
I agree with TheViper that a decade ago people did look only at MHz, and yeah, it HAD to be a Pentium. But a decade ago, processors were slow, and more MHz made a huge, huge difference with everything you did.
 
In case some folks aren't getting my post: I'm not trying to come at this from an enthusiast angle here, but to get us thinking more like the average Joe, and to see his needs, desires, etc.
As enthusiasts, of course we'd want more power, faster, lighter, more power-efficient, etc.
 

Not only can an i7 handle HD alone, it can do so without much effort. It's true that HD is a much bigger strain than video content used to be, but even so, it is easily handled by any modern CPU (the Atom doesn't count here).
 

hypocrisyforever

Distinguished
Mar 30, 2008
155
4
18,715




I agree with this guy pretty much to a T. I would add that it wasn't even a decade ago that this happened, though. I'd say it happened on a smaller scale with the Athlon, and then in a huge way with the Core 2s. What I mean is, you used to be able to bog down a computer, "have too many windows open," or be unable to minimize a game without it locking up your whole system for three minutes. Well, at this point we are past that, and have been for two or three years. I challenge you to slow down any quad core with a pile of random programs. It's damned near impossible.
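If anyone actually wants to run that challenge instead of arguing, here's a rough throwaway sketch (my own illustration, not from any benchmark suite) that piles CPU-bound workers onto the machine and times a fixed chunk of busywork as the load grows:

```python
# Rough stress sketch: spawn CPU-bound workers and see how long a fixed
# amount of pure-CPU busywork takes as the number of workers grows.
import multiprocessing as mp
import time

def busywork(n):
    # A fixed chunk of pure-CPU work (no I/O), like a background app pegging a core.
    total = 0
    for i in range(n):
        total += i * i
    return total

def timed_run(workers, n=2_000_000):
    # Run `workers` copies of the busywork in parallel and time the whole batch.
    start = time.perf_counter()
    with mp.Pool(workers) as pool:
        results = pool.map(busywork, [n] * workers)
    elapsed = time.perf_counter() - start
    return elapsed, results

if __name__ == "__main__":
    for workers in (1, 2, 4, 8):
        elapsed, _ = timed_run(workers)
        print(f"{workers} workers: {elapsed:.2f}s")
```

On a quad core you'd expect 1 through 4 workers to finish in roughly the same wall time; only past the physical core count does the batch finally start queuing up, which is the "damned near impossible to slow down" effect in numbers.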

And, to the comment of the OP a few posts back: if the original thread was about "gaming enthusiasts"... when was the last time the CPU was the deciding factor? As long as I can remember, the GPU has ALWAYS been the bottleneck. I'd say GPUs have stagnated way, way more than processors. The 8800 GTX I used was on top of the market for a year. How does that happen? Also, a few years down the road, this thing is still holding its own. You can't really say that about a P4.
 

uncfan_2563

Distinguished
Apr 8, 2009
904
0
19,010
I think overall computer development has slowed down because software developers are just going after money rather than high-tech stuff. People just want to stay with the biggest market (older technology) to make money and since all this stuff we're talking about has become more affordable and mainstream recently, I think we'll be stuck around here for a while.
 
So, you're saying an old C2Q is three times slower than an i7 currently? Much like your old GTX compared to a 5870 or a 295?
I think not.
This isn't about gaming; that's just one thing the average Joe does. It's about overall usage, which CPUs have handled quite well for a while now, and stagnation has settled in, unless you can show me a CPU today that's three times faster than a C2Q from around the same release date as your GTX.
 
Or, better put, isn't the Q6600 considered by many people the best CPU ever made? To this day it's considered completely capable, and users are still waiting for something good enough to come along to replace it.
How's Crysis, or HD video playback, working out on your GTX?
 

amnotanoobie

Distinguished
Aug 27, 2006
1,493
0
19,360


You just wait; years ago the X2 and FX were unstoppable. Look at where they are now.




Huh? What about the 7800GT, GF Ti 4200, Radeon 9600, etc.

Though back then, developers were pushing the hardware limits year after year. Now advancement in visual quality has stagnated quite a bit; if the rate of development were the same as before, we would have more titles with the same quality as Crysis by now.
 
If we stick strictly to HW, that's what I see.
The GTX also pulls more power than today's cards, runs hotter, etc.
You could put a similar GPU in a mobile system today that would run circles around a GTX in a mobile, in power, battery time, and performance; not so with CPUs.

PS: Add in things like HKMG, and the fact that GPUs are finally catching up to CPUs in process size, and there's an even larger jump to come, since CPUs already employ this today.
 
If you are just talking about the semi-mythical average user, he (or she) can go down to the closest Big Box store and drop $600 and get a nice system - with software - ready to go out of the box.

The average user just wants an appliance good enough to do whatever he wants to do. He doesn't care about the advantages of a particular micro-architecture, anti-aliasing, or NCQ. Good enough is good enough.

I recently built a computer for my sister-in-law - GA-G41-ES2L, E5200 (!!!), 4 GB RAM, 320 GB HD. Primary use is internet and office apps. She's very happy with it. It is waay faster than her old Dell P4 Celeron.

Good enough really is good enough.

Me, I think that's great. It helps drive down the costs for the rest of us.
 
Yep, it also points to trends like LRB/Fermi, where we should see a much more versatile chip doing many of the things CPUs used to do, but using less power, plus graphics.
As GPGPU picks up, a GPGPU with the same performance as a CPU can use much less power, which bodes well for the future as we head more and more into mobile.
 

someguy7

Distinguished
Dec 12, 2007
1,186
0
19,310
I know exactly what you're saying and I agree. The processors of today are fast enough for the average Joe. The average user would be perfectly happy with the lowest-end Intel Core-based dual core. On the AMD side, the average person will be more than happy with any X2 or later CPU.

Put an i7 system and a low-level Core 2-based system side by side. The average consumer wouldn't be able to tell the difference and would be happy with the speed of both.

Put a Phenom II X4 965 and a new Athlon or an old K8 X2 side by side. Same thing.



 
It all comes down to SW.
Today, we're being told that CPUs will have many cores and they'll help the average Joe.
That's just not the case yet.
Using many cores is the only way MT will work correctly, and today's CPUs just don't have enough cores for those threads, so until that happens, GPGPU will be the bridge.
Eventually, when we actually see CPUs and SW doing all this, I think CPUs will be taking somewhat of a backseat in driving future progress.
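For what it's worth, the "more cores don't automatically help Joe" point is basically Amdahl's law. A quick back-of-the-envelope sketch (the 30% and 95% parallel fractions are made-up illustration numbers, not measurements):

```python
# Amdahl's law: speedup = 1 / ((1 - p) + p / n),
# where p is the parallel fraction of the work and n is the core count.
def amdahl_speedup(parallel_fraction, cores):
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

if __name__ == "__main__":
    # Typical single-user desktop work is mostly serial; say 30% parallelizes.
    # A well-threaded job like video encoding might be 95% parallel.
    for cores in (1, 2, 4, 8, 16):
        joe = amdahl_speedup(0.30, cores)   # average-Joe workload
        enc = amdahl_speedup(0.95, cores)   # well-threaded encode
        print(f"{cores:2d} cores: Joe {joe:.2f}x, encoder {enc:.2f}x")
```

With only 30% of the work parallel, even 16 cores give under a 1.4x speedup, while the well-threaded job keeps scaling; that's why the software has to be multithreaded before piling on cores does Joe any good.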
 

That is so going to come back and haunt you. :lol:
 
I don't know if any of you will agree with me - but years ago, it was necessary for power users and gamers to get high end CPUs - I remember a Tom's article a while back when the Athlon XP's were king that recommended buying the second best processor in the series (at the time, the 3000+) for gaming.

The performance curves sure seemed to have changed. Overclocks in those days didn't bring too much more benefit - but the advent of the core series changed all that (and perhaps the original Athlon X2/64 line). Almost overnight the cheapest and second cheapest chips became the most popular...they were easy to overclock the snot out of for enthusiasts, and they were cheap, cool, and quiet for the average joe.

Now the CPU choice gaps have closed - instead of offering 6 or 7 chips with a more gradual price gradient, there are 3 or 4 in a series (with a pricing range that's much more exponential). The demand for high-end chips is really decreasing a lot; only those with cash to burn spend it on such things, while everyone from enthusiasts to average consumers gets the cheaper stuff.
 


I mostly agree with this, to a point, and actually looked into it somewhat. My "Green Gamer" project PC was a test of this, "how low can you go." For Guild Wars, a 4850e was very playable, but was an obvious bottleneck. The 720BE that replaced it was much better, using the same HD4670 GPU. Games like Crysis (esp. at high settings) are outliers, and not at all typical of what the average user runs.

I suspect that a stock 720BE with an SSD will feel way faster than an overclocked quad with a rotating HDD. Hopefully a good Black Friday deal on an 80GB-128GB SSD will let me test this.
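If anyone wants a crude number on that "feel" before buying, most of it is small random-read latency, which you can probe in a few lines. This is just a rough sketch of mine: the file name is a placeholder (point it at any big file on the drive under test), and the OS cache will flatter repeated runs:

```python
# Rough random-read latency probe: small reads at random offsets in a big
# file approximate the seek-heavy access pattern that makes HDDs feel slow.
import os
import random
import time

def random_read_latency(path, reads=200, block=4096):
    # Average the time for `reads` random 4K reads across the file.
    size = os.path.getsize(path)
    latencies = []
    with open(path, "rb", buffering=0) as f:  # unbuffered to avoid Python-side caching
        for _ in range(reads):
            offset = random.randrange(0, max(1, size - block))
            start = time.perf_counter()
            f.seek(offset)
            f.read(block)
            latencies.append(time.perf_counter() - start)
    return sum(latencies) / len(latencies)

if __name__ == "__main__":
    # Placeholder path: use any large existing file on the drive under test.
    avg = random_read_latency("testfile.bin")
    print(f"avg 4K random read: {avg * 1000:.2f} ms")
```

A rotating HDD will typically land in the several-milliseconds range here, an SSD in a small fraction of a millisecond, which is exactly the gap you feel when launching apps and booting.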