AMD 9950 overclocked: it's not what you think
There are finally some overclocked Phenom 9950 numbers against the low-end Intel chips. Here's the site: http://www.xbitlabs.com/articles/cpu/display/core2quad-q9400.html
Looks like the low-cost quad-core processor market has changed significantly. Prices went down and new models came out. However, as we have just seen in our new article, there is hardly anything new we could advise you at this point.
One of the best choices among low-cost quad-core processors is still the old 65nm Core 2 Quad Q6600. Intel has dropped its price so significantly lately that it remains in the spotlight despite its age and relatively high power consumption. Especially since in some applications, such as 3D games, it manages to perform as fast as the more expensive Core 2 Quad Q9300 thanks to its large L2 cache. Its overclocking potential is also pretty encouraging: it doesn't require a specific mainboard, and you can almost always push its frequency up 1.5 times with just an efficient cooler.
In fact, the only drawback of the Core 2 Quad Q6600 is its relatively high heat dissipation and power consumption, which make it unfit for quiet, low-power systems. The new Core 2 Quad Q8200 will suit them much better. Despite its much smaller L2 cache, this CPU is pretty fast and outperforms the fastest processor from Intel's competitor, the AMD Phenom X4 9950. As a result, the Phenom X4 family may be of interest only to those users who want to get a quad-core processor really cheap, but are ready to put up with low overclocking potential, low performance and high power consumption.
Unfortunately, we are not ready yet to offer our verdict on the more expensive Core 2 Quad Q9300 and Core 2 Quad Q9400. Yes, they are fast and have relatively low heat dissipation. But both of them are priced at $266 in the official price list, while the junior Core i7-920 processor coming into retail after November 16 will be priced at $284. Therefore, we will only be able to say whether the Q9300 and Q9400 could be a good buy once we compare them to the upcoming solutions. So we are not ready to comment on them at this point, and would strongly recommend you wait for the performance tests of the upcoming solutions that should be revealed shortly.
Table of contents: Core 2 Quad Q9400 | Core 2 Quad Q8200 | Phenom X4 9950 Black Edition | Testing Participants | Specifications | Testbed and Methods | Performance | General Performance | 3D Games | Media Content Encoding | Final Rendering | Other Applications | Power Consumption | Overclocking | Performance during Overclocking | Conclusion
Seriously, did anybody expect otherwise? Lower IPC + lower clock speed = lower performance. Not exactly a startling revelation...
However, to be fair to AMD, Xbitlabs didn't use an SB750 mobo, so they topped out at 3.2GHz, which is rather low for a 9950BE. But even at the typical SB750 clocks of 3.4 to 3.5GHz, the 9950BE would still be a bit behind the C2Qs, as it's still 10 to 15% slower per clock.
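To make the "slower per clock" argument concrete, here's a rough back-of-the-envelope sketch. The clock and IPC numbers are my own illustrative assumptions (best-case SB750 clocks, a ~12.5% per-clock deficit in the middle of the 10-15% range), not measurements from the article:

```python
# Relative performance estimated as clock speed x per-clock (IPC) factor.
# All numbers below are illustrative assumptions, not benchmark results.

def effective_perf(clock_ghz: float, per_clock_factor: float) -> float:
    """Crude relative-performance estimate: clock x per-clock efficiency."""
    return clock_ghz * per_clock_factor

c2q = effective_perf(3.6, 1.00)       # a C2Q at a common 3.6GHz overclock
phenom = effective_perf(3.5, 0.875)   # 9950BE at 3.5GHz, assumed ~12.5% slower per clock

deficit = (1 - phenom / c2q) * 100
print(f"C2Q: {c2q:.2f}, Phenom: {phenom:.2f}, deficit: {deficit:.0f}%")
```

Even granting the Phenom its best-case SB750 overclock, the per-clock gap keeps it roughly that same 10-15% behind overall.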
OK, I'll take the flame bait. The point of my thread back then was that I was surprised that Phenom could max a G280 out in any newer game, which the links showed. Only a few games, sure, but a few nonetheless.
@jed, where's the OC'ing for Phenom in your link?
Do you mean this? http://www.xbitlabs.com/articles/cpu/display/core2quad-q9400_15.html#sect0
The only reason I posted was because we have very little info on Phenom and its impact on gaming, and here we have an article showing that a higher-clocked (overclocked) Intel CPU stomps an OC'ed Phenom. What it doesn't show, unfortunately, is the two at the same clocks, so people who own these CPUs still don't know how they match up. So, once again, Phenom gets a bad rap? I know Phenom isn't as good as Intel's CPUs; I've never said anything different. Like I said, the link I'd provided in another thread a while ago showed a Phenom clocked at 3GHz maxing out a G280, and thus "keeping up" with Intel's solution, which surprised me. This isn't Tom's Intel CPU section, it's the CPU section, period. People who own Phenoms would surely like to see a few things shown about their CPUs, and many a time they're beset with fanboyism from the other side just because they own one. Like I've said before and I'll say again, I don't care which CPU anyone puts in their rigs, nor do I care which ones go in mine, as long as they perform for the price I've paid for them.
It's true, I've kicked Intel a lil, just because I've seen others do the same to AMD, and besides, Intel can handle it, they're the big boys. But if people can't see through their fanboy-colored glasses and see where I've done the same to AMD, just not as much, because like I said, it's already being done a lot, then they need to start their own Intel CPU section, and stay there.
jed said: when he turns everything bad about Phenom into a plus over the latest Intel chips, even if the numbers don't support it. The same as thunderman and baron.
OK, so this is just flame bait, jejeje. I remember that topic from long ago, and I didn't think Jaydee's comments were fanboyish. I myself own a 9850 with a 780GX and get 3.2GHz without any work at all, and I don't see a difference with my brother's Q6600 at 3.4GHz. I'm sure his is faster, but in everyday tasks you hardly see a difference, maybe in benchmarks. But hell, mine was way cheaper than my bro's CrossFire rig in its time, and it still is cheaper, so I say his isn't better. I don't really see a big deal in 6 frames per second: a 15% difference with the same pair of GPUs isn't that bad in real life, and if we increase the resolution, both rigs become unplayable (Crysis).
I know what you're saying, but do you understand what I'm saying? They aren't at the same clocks, so we don't know what percentage faster Intel is than Phenom, and we can't really make a true comparison. The link I provided shows the only OC'ing there is, on that page, and again, like I said, they weren't at the same speeds.
Listen real good, jed. I know you're a newbie, so listen. Reread what I said about the Intel CPUs, just reread it, OK? After you read the part about me saying the Intel CPUs are clocked higher to their credit, then reread all the times I've said it's too bad they aren't clocked the same for a better, truer comparison, OK?
Some people think if you say anything good about AMD, you're an AMD fanboy, or anything bad about Intel, you're an AMD fanboy, or if you don't go around espousing how good Intel is, you're an AMD fanboy, or... jed? I think you've entered the twilight zone here, where things aren't always what they seeeeem.
My apologies, unnecessary language. I just don't understand why you are making a big deal over a thread that was posted to give information to an obvious minority on these forums. If you don't like the content in that thread, PM jay and discuss it with him. There is no point in making a flame bait thread when a PM will do the trick.
I also apologize for taking the bait in this thread. Use my above posts as examples of crap.
And thanks for the check, Turpit. I want to help on this forum but I overreact sometimes.
jed, at this point I don't really care, but I'll refresh your memory: it was in a few games that the Phenom at 3GHz maxed out a G280; in other words, it got as good fps from that GPU as it could deliver, which was a surprise to me, as I didn't think it could. It didn't, however, max out the 4870X2, which no CPU, Intel's or AMD's, can do, and using the X2, the Intel solution actually won. So, anyone saying that nVidia needs a better CPU for their cards to be at their best isn't as right as it would seem either, since their highest card was maxed out by a Phenom at 3GHz.
I'm not going to dig up the old links. I think they're from PCGamersHardware, but I'm not sure where they were from, so feel free to do so. I know you're certain that I favor AMD's CPUs, and I'm also certain that you don't believe what I tell you, so, in that vein, I'll finish with this:
Whatever CPU works best is the best, and IMO that concerns gaming only. I couldn't care less about how fast WinRAR runs, or decoding media, or anything like that. In gaming, AMD is many times closer to Intel than in other apps, and it's been proven, whether fanboys like it or not. In the link from this thread, we see maybe a 10% difference between an AMD and an Intel CPU in gaming; while that's significant, it's not major, as a better GPU will easily get you more fps for your money or efforts. If you took a Phenom at 3GHz with a 4870X2 vs a $1000 Intel CPU with a G280, the AMD solution would still win, regardless of what anyone thinks, and if THAT doesn't settle well with you, then you ARE a fanboy.
Xbitlabs didn't use a SB750 mobo.
Just thought I'd reassert that.
But the comments in the article were essentially correct.
Even if you extrapolated the results to a better overclock for the Phenom, the issues of power, heat and efficiency just leave it behind as an enthusiast chip.
It was good to see the Q6600 still up there but the newer 45nm cores do really show their strength in a number of areas - despite the smaller cache and clocks.
Does that still leave the Q9450 as the best potential Yorky core for overclocking for the price?
That is a more interesting question for me, since I will be throwing a 45nm core into this rig and retiring the Q6600 to one of the kids' machines soon.
The noise of the fans on my desk is getting annoying...
I am not getting a water cooling setup out of the shed again after the last leak either... Corsair really let me down there when the reservoir base cracked, and Thermaltake stuff is just rubbish.
reynod, you are correct, the Q6600 is still the chip to beat.
How can Nehalem be called a failure when it's just as good as Intel's best in games, and better in all other apps? Now look at Phenom when it came out: not good at all, and when overclocked it's worse.
But it gets talked about like it's a great chip. Talk about a double standard.
JAYDEEJOHN said: jed, at this point I don't really care, but I'll refresh your memory: it was in a few games that the Phenom at 3GHz maxed out a G280; in other words, it got as good fps from that GPU as it could deliver, which was a surprise to me, as I didn't think it could. It didn't, however, max out the 4870X2, which no CPU, Intel's or AMD's, can do, and using the X2, the Intel solution actually won. So, anyone saying that nVidia needs a better CPU for their cards to be at their best isn't as right as it would seem either, since their highest card was maxed out by a Phenom at 3GHz.
That's not entirely true. In the GPU-bound games the GTX280 was the bottleneck, but in more CPU-bound games the Phenom @ 3GHz falls behind:
This is hardly (sic) 'maxxed out':
http://www.legionhardware.com/document.php?id=775&p=5 (Crysis) C2Q @ 3.6GHz = 29fps, Phenom @ 3GHz = 24fps
http://www.legionhardware.com/document.php?id=775&p=5 (ET:QW) C2Q @ 3.6GHz = 94fps, Phenom @ 3GHz = 72fps
http://www.legionhardware.com/document.php?id=775&p=11 (Sup Com) C2Q @ 3.6GHz = 63fps, Phenom @ 3GHz = 54fps
http://www.legionhardware.com/document.php?id=775&p=11 (WiC) C2Q @ 3.6GHz = 40fps, Phenom @ 3GHz = 35fps
I'll address the point about the clocks not being equal before someone inevitably complains: deal with it. C2Qs are hitting higher clocks in real life, and even at identical clocks they are still faster.
So out of 7 games tested, Phenom is clearly bottlenecking the GTX280 in 4 of them. I know you have a soft spot for AMD, JD, but at some point you gotta take off the blinkers, face the music and admit Phenom is inferior for gaming, yes, even on a GTX280...
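Just to quantify the gaps in the Legion Hardware numbers quoted above (the fps figures are copied from those links; the percentage math is my own):

```python
# (C2Q @ 3.6GHz fps, Phenom @ 3GHz fps) for each game, from the links above.
results = {
    "Crysis": (29, 24),
    "ET:QW": (94, 72),
    "Sup Com": (63, 54),
    "WiC": (40, 35),
}

# Percentage the Phenom trails the C2Q in each game.
deficits = {
    game: round((1 - phenom / c2q) * 100)
    for game, (c2q, phenom) in results.items()
}

for game, pct in deficits.items():
    print(f"{game}: Phenom trails by {pct}%")
```

Anywhere from roughly 12% to over 20% behind on the same GTX280, which is hard to call "maxed out".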
epsilon, you forgot what I'd said earlier in several of my comments. I said "some games" many a time. For it to do it at all is still amazing; it means it's not that far off from being a really nice solution for gaming at its price point. Yeah, it was Legion. This is all blown out of proportion simply because I referred to the speed of the G280 as astounding, and a Phenom brings out its best in a few games. Like I said though, that's NOT the case with the X2, where again I'll say NO CPU, be it Intel's or AMD's, can keep up with that card.
JAYDEEJOHN said: epsilon, you forgot what I'd said earlier in several of my comments. I said "some games" many a time. For it to do it at all is still amazing; it means it's not that far off from being a really nice solution for gaming at its price point. Yeah, it was Legion. This is all blown out of proportion simply because I referred to the speed of the G280 as astounding, and a Phenom brings out its best in a few games. Like I said though, that's NOT the case with the X2, where again I'll say NO CPU, be it Intel's or AMD's, can keep up with that card.
Seriously, what is so amazing about certain games being GPU-bound and not benefiting from a faster processor? Were you wowed in the same way when the P4 kept up with the A64 in high-res GPU-bound gaming? Did that make the P4 'amazing'?
The fact is that in the majority of games Phenom fails to bring the best out of the GTX280, and there is no way to spin that. There is nothing amazing about it, but keep on creaming your pants if you want.
As for things being blown out of proportion, have you forgotten you're the one who started all the talk about 'maxing out' the GTX280 and that the Phenom is just as good in gaming (remember your thread?)? Judging by your replies you'd think the GTX280 didn't need anything faster than a 3GHz Phenom, which isn't exactly true, as I have just proven...
If I recall, higher-clocked AMD and higher-clocked Intel CPUs have trouble bringing out the best in the 4870X2, so there should be problems pairing AMD with the GTX280 too. Which is why a 4870 or GTX260 is a better match for higher-clocked Phenoms or older non-overclocked Intel CPUs.
In all honesty, I can't see anyone adding either a 4870x2 or a GTX280 SLI to a B3 Phenom, or a Q6600 rig. Core 2 is faster in games. Phenom can keep up in a few benchmarks and applications, much like P4 vs. Athlon 64 in the old days.
IMHO, the 125-watt 2.6GHz B3s are a stopgap and aren't worth it. Deneb should improve things and get close to (or match) Core 2 at stock, but it probably won't match Nehalem. I don't think AMD will stay in the doldrums forever, and their next architecture should be designed to realistically take on Intel, but that's a few years away.
Right now, I like AMD Phenom AM2+ platforms for their cost and value. They have a place with budget gamers who don't overclock, but do benefit from more cores in other applications.
I do see jay's point that a nice comparison at stock should be done for those of us who don't overclock, but most reviews have the CPUs at stock along with overclocked in their benchmarks. At stock, my triple core's not bad, and a 9750's even better. The 9750 compares nicely to the Q6600:
Tom's Hardware - Benchmark Premiere Pro CS3 HDTV
Yes, I know most people here overclock, but there are more definitions of an enthusiast than "overclocker". I'd like a Phenom @ 3.0GHz stock, but that will have to wait on Deneb, or one of the triple-core versions, before I go that route.
edited for spelling
Quote: The point of my then thread, was that I was surprised that Phenom could max a G280 out in any newer game, which the links showed. Only a few games sure, but a few nonetheless.
1) It is an obvious possibility that a fast GPU would bottleneck on a new game running on a slow CPU. This is even easier to show if you take any game and crank up the res very high while turning down just the CPU intensive details. Without a link and details, we can't ascertain anything unexpected about Phenom's gaming ability.
2) Having just one or a few games where a CPU does not bottleneck a very fast GPU... means that in many games that CPU does bottleneck the GPU... and it would be illogical to recommend that CPU for general pairing with the very fast GPU, right?

Quote: Whatever cpu works best is the best, and IMO, that concerns gaming only.... In gaming, AMD is many times closer to Intel than other apps,
Just a sec. You only care about gaming benchmarks. But all the gaming benchmarks show > 100 fps with most any CPU, or are GPU limited. Therefore, what modern CPU you pick does not matter. Why get excited about 9950 or Deneb or i7 then? Why are you volunteering so many comments in these 9950/Deneb/i7 threads?
If you would examine that xbitlabs article more carefully (there are 16 pages, with a Table of Contents for navigation), you would see that there are stock benchmarks showing the 2.6 GHz 9950BE falling behind the 2.4 GHz Q6600 in each of the four games tested.
Now, while no one needs much of a CPU or GPU for 1024x768 medium-quality gaming, the intent of these benchmarks was to predict which CPU would bottleneck first as new higher-detail games come out. And that prediction points to the Phenom going down first in most cases, with the quad-core 45nm Celeron taking the remaining bench... in no case is the Q6600 going to bottleneck one of those games before the 9950BE, without overclocking one and not the other.
If all you care about is games, then you simply don't need a Phenom, because time and again, overclocked or not, sites have shown the Phenom is a weaker performer in gaming than comparably priced competition, whether it be dual cores or Intel's quads. It is within the other, non-gaming tasks where a Phenom pulls ahead more than once in a blue moon. Especially server type applications.
Of course, a Phenom should suffice for many (not all) games today. So would a Conroe or an Athlon X2 (not always). But you don't upgrade your CPU to last just a month or three. Exactly how far ahead and wide you plan is up to you, but you'd need some favoritism to pick a Phenom with less mileage at the same cost: that's what Xbitlabs is saying.
Tom's has a front-page article where they put an HD4850 in a bunch of Athlon 64s. Guess what? If in 2006 you bought a 2.0-2.4GHz A64/X2 instead of a Core 2 @ 2.66GHz, thinking no then-current games exploited the added CPU power or cores, and just recently you picked up an HD4850, you'd now be feeling the CPU bottleneck at a reasonable 1600x1200 resolution in games like Oblivion, Test Drive Unlimited, CoD4, Crysis... probably many others not tested. And you know the HD4850 isn't the best today. Core 2 meant little or nothing for mainstream gaming in 2006; two years later, Conroe versus Brisbane is a no-brainer. Two years down the line, I fully expect Phenom vs. Yorkfield to be a no-brainer, too. Just like Athlon 64 vs. Pentium 4 was a no-brainer in 2006.

Quote: This is allll blown out of proportion simply because I refer to the speed of the G280 as astounding, and a Phenom brings out its best in a few games.
I'm not sure I read you correctly. A GTX280 has astounding performance for one GPU, but where does it say all games that utilize this performance have to use the CPU a lot? I'm not at all surprised that a few don't.

Quote: Like I said tho, thats NOT the case with the X2, where again Ill say, NO cpu, be it Intels or AMDs can keep up with that card.
You placed this sentence right after the Phenom-GTX280 comment, twice, so I must subject it to the same standard. Are you taking bets on this?
(base system: Penryn 3.0GHz, HD4870x2, X38)
Unreal Tournament 3, max details:
93 fps @ 2560 x 1600
56.9 fps @ 2560 x 1600 with 4x AA/AF
Mass Effect, max details:
87.5 fps @ 1920 x 1200
57.7 fps @ 1920 x 1200 with 4x AA/AF
67 fps @ 2560 x 1600
34 fps @ 2560 x 1600 with 4x AA/AF
But of course, 4x AA/AF presents no extra load to the CPU, so the substantial fps drops you see here (representing SOME, not ALL of the games tested) must be the fault of the graphics card.
Without a link to the games/settings where the Phenom kept up with the GTX280, I'm going to go on a hunch that in those same games/settings, a Yorkfield would keep up with the HD4870X2.

Quote: the G280 ISNT the fastest gpu
Technically, it is. The 4870X2 uses 2 GPUs to accomplish the rendering task, and each of those is, individually, slower on average.
JD, I prefer to believe you are not a fanboy, but the scale of commentary and the overall direction of mistakes made clearly err in AMD's favor, just like what happens with BM (and Tman is more of a fanatic). Believe me, there are ample ways of supporting AMD without overlooking fairness and logic, or being careless.
Like I've said, the G280 ISN'T the fastest GPU, and in some games the Phenom can bring out its best. However, in those same games, even the fastest Intel CPU fails to get the most out of a 4870X2, no matter how OC'ed it is. So the OC'ed Phenom does pretty well with most graphics cards; only the X2 can make it the bottleneck in a few games, as most other cards won't. That's what I'm saying, and it's better than I'd thought it'd do. Now if only Intel could make a CPU that'd max out an X2, then we'd all be happy.
Well, if you check out the complete article at Xbitlabs: at normal clock speeds, the Phenom wins 8 out of 21 tests, with a 200MHz boost over the Q6600. That's not good at all; even with a 200MHz clock speed advantage, it didn't do well. When all chips were overclocked, the Phenom didn't win a test at all, not even close.
OK, try Crysis Warhead, Crysis, AoC, and many more, with more coming soon. Seems as if most newer titles are wanting more from the CPU. I see the GPU, as I'd said, doing 2 gens and increasing its power/abilities many fold before we see anything coming from the CPU arena.
Look at what Valve is doing. Yes, hopefully they'll implement MT in their games, but they are also calling on many things from the CPU that are either in their infancy or not used at all currently, which only asks more from a CPU that, in about half of newer games, is currently just adequate, without the vast leads or stores of power/availability we've seen in the past, as GPUs just keep getting faster. You're trying to pigeonhole one thing here; I'm looking at a far larger picture.
The demands of better AI, physics, etc. and faster GPUs will make more and more CPUs unable to max out GPUs, especially the faster ones in existence today. Sure, pick a few older games and you'll see what you've said. But why is 3DMark06 so passe today? And yet we haven't really seen much improvement in our CPUs since it's been out. And now here we have Deneb and i7 at a standstill for gaming, while the GPU keeps plowing ahead. When we stand still, we lose. Yes, most games today are fine, but go back one year, and you had only one game, and that one is currently still not beaten/playable at higher res, and barely at lower res. I could give example after example, and while you may point out games made in '07 or '06, I'm talking about newer games, games coming out. The PC gaming industry has been hurting for some time now, partially due to the fact that lower-priced GPUs couldn't play games, unlike what we see today. And the major IGP maker makes gaming a total joke using their solution, so the seeds of PC gaming aren't even being spread.
Now, if the CPUs start to fall behind more and more of the abilities of the GPU, that'll only be another hit on PC gaming. Most gamers run single cores, use older GPUs, etc., but here we have the CPU makers almost eliminating dual cores, and they've shown little improvement since '06 as far as gaming abilities go.
I could show you link after link that goes on about the requirements of the CPU for gaming, and it isn't one of these that's been posted at all, as those are just comparisons of a few CPUs in some older, easier games. In the links I'm referring to, they say 3.2GHz or higher in many a game. So, a year from now, after we see 2 more GPU gens come out, you still think 3.2 will be fast enough? Do you really? I can argue that AMD's solution for AMD users is looking better, while at the same time NEITHER company is doing enough, and both are HOPEFULLY leaving it up to the devs to "fix" it. When was the last time Intel relied upon devs in gaming to bring out more from their CPUs? Ever? That's where we're at, and if you, or anyone else, doesn't follow this closely, you may not see it, but it's here, and it's going to get worse.
Also, I'm thinking that in order to make your argument next May, you may, like Intel does as well, have to rely upon devs to make even higher resolutions for your argument, as 25x16 may NOT be enough to humble the next gen of GPUs, so then where will you get your bottleneck? No, don't even think it MIGHT be the CPU; it must be because it's not MT, or some other excuse. This is coming; see it for what it is.
Also, why is it so hard to see my elation for AMD when they're finally heading in the right direction, while at the same time seeing my disappointment with i7, as I see no gaming improvements from it? And why am I looking to Intel for this, and asking it from them? Because I don't expect miracles coming from AMD anytime soon? Ever thought of that? And maybe that's why I'm more let down by i7's gaming performance? And this makes me a fanboy?
^ I feel let down that devs haven't given a crap about MT, or x64. Designing games, heavy apps and OPERATING SYSTEMS nowadays to be mainly 32-bit is absolutely ridiculous. Anyone with an x64-capable processor isn't getting its power used by any of these things. Windows 7 better be exclusively 64-bit or I will switch to OS X and Linux. I swear it.
Also, I have to manually tell my operating system (Vista x64) to boot up using both cores?!?! Come on!
At least Valve is trying to do something about it. There's others as well; it'll just take time. The problem is, it used to be that CPUs got faster, and a game dev had 2 years into a game from scratch, so they could plan on faster CPUs by then. But at some point the CPU stopped becoming faster and went wider, and all the game devs have to start using MT at some point, but it'll be a while yet. Like I said, 2 years for a game; that's 2 years before it becomes the norm, and meanwhile the CPUs have stopped getting faster.
This has been a tumultuous time for HW, as we see the main OS switch kernel/drivers from 32-bit to 64, CPUs going to MT, or multi-core, and all the SW has to catch up. Hopefully MT will come sooner than expected, but even so, there are always going to be serial threads, no matter what, and hopefully game engines won't have many.