A real gaming comparison: Conroe vs. K8

July 18, 2006 7:47:46 AM

I am just a little tired of people comparing only 800x600 for gaming purposes. No one uses that resolution, and the benchmarks shown are basically irrelevant to anyone who wants to know how these CPUs can increase FPS in real life. HardOCP did a pretty good article, IMO. It's sad that I have to say this every time I post, but I will continue to do so: please don't get all upset, as I am not bashing Core 2 CPUs; I am merely looking at things from a real-world sort of perspective.

http://enthusiast.hardocp.com/article.html?art=MTEwOCwx...

Like I have said before, I will probably buy a Core 2 setup when I get back from my current location, but by then things could change... although they probably won't. So let's try to be civil and act like grownups. Ycon, if you could just turn away and not post, I (and everyone else) would appreciate it.

Edit: I am a little out of touch and can't get online very often. I did a quick search, but sorry if this benchmark has already been posted. If that is the case, admins please delete.
July 18, 2006 8:05:58 AM

Conroes don't increase max FPS because the GPU is the bottleneck, but they raise the min FPS significantly. Also, in the future DX10 GPUs will push that framerate much higher.
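
A rough way to picture this: each frame costs some CPU time and some GPU time, and whichever stage is slower sets the pace. Here's a toy sketch of that idea (the numbers are made up purely for illustration, not taken from any review):

def fps(cpu_ms, gpu_ms):
    # a frame can't finish faster than the slower of the two stages
    return 1000.0 / max(cpu_ms, gpu_ms)

gpu_ms = 25.0  # hypothetical GPU-bound scene at 1600x1200, ~40 fps ceiling
for name, cpu_ms in [("slower CPU", 12.0), ("faster CPU", 7.0)]:
    print(name, "average:", round(fps(cpu_ms, gpu_ms)), "fps /",
          "CPU-heavy spike:", round(fps(cpu_ms * 3, gpu_ms)), "fps")

Both chips hit the same ~40 fps average because the GPU sets the ceiling, but in CPU-heavy moments (lots of AI, physics, players) the faster chip holds 40 fps while the slower one dips toward the high 20s -- which is exactly the min-FPS difference.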
July 18, 2006 8:15:20 AM

I am aware of that, and I know it shows the potential of the CPU architecture. But how long will it be until the benchmarks actually come to life? What I mean is, how long until all these 800x600 benchmarks actually translate over to the higher resolutions? How long will it be until a 2.6GHz X2 K8 performs poorly at 1600x1200? Not for quite a while, I imagine. Will DX10 cards actually be so much faster that we will see these differences at higher resolutions? I doubt it, not for at least a year or so.
July 18, 2006 8:26:01 AM

I disagree with this being a good review. They purposely set the test up so the GPU, not the CPU, is the bottleneck, and in one case they HANDICAP the Conroe in order to show the AMD in a good light.

Do I disagree with the whole idea of a 1600x1200 benchmark? NO! I disagree with this one for the following reasons:

1. Playing games at 1600x1200 is NOT the most used resolution! 1280x1024 is used MUCH more! If this were about real-world performance, that is the "real world" resolution that should have been emphasized, and it does show the improvement on Conroe!

2. People who play games at 1600x1200 use the most advanced video cards available. If they are going to benchmark games at that resolution, they should use a better video card. Intel even included a better video card in the test system (in addition to extra cards for the AMD system), but HardOCP DECIDED (ON THEIR OWN) to use an INFERIOR video card!

3. They handicapped Conroe on the one benchmark where it clearly outperforms the AMD at 1600x1200.

Edit: Looks like Jack beat me to it, same content, but he posted links to his source :)
July 18, 2006 8:42:46 AM

Showing games with the settings raised won't really show much of a difference as the bottleneck becomes the GPU. For a real indication of CPU power it would be better to compare things like audio/video encoding, file encryption, decryption, and compression. This is where these CPUs excel. This article does show that those planning to game shouldn't worry about buying the best processor but focus more on a high end GPU.
July 18, 2006 8:51:24 AM

Quote:
I am just a little tired of people comparing only 800x600 for gaming purposes... [...] Sorry if this benchmark has already been posted. If that is the case, admins please delete.


No, this is a good thread to hold open, because this needs to be discussed.

HardOCP rigged the test to get a predetermined/desired conclusion. They intentionally set up the bench so it would throttle on the GPU rather than demonstrate which CPU crunched the gaming code best. That is OK so long as you present the whole story, in which case the statement should have been:

'Until GPUs are fast enough that the CPU load becomes the limiting factor, running at 1600x1200 with all the goodies on a single-card nVidia graphics solution will demonstrate no difference between these CPUs.' That would have been a fair assessment and the appropriate conclusion.

There are a few problems with what they are calling real-world, though:

http://www.steampowered.com/status/survey.html
I posted this link earlier. The wonders of stealth surveys :) ... If you look through the data, which samples some 700,000 users, you find that the vast majority of gamers do not game at 1600x1200 but at 1280x1024 and 1024x768, at 36.1% and 47.5% respectively --- that is over 80% of users running below 1600x1200. Thus "real world" to HardOCP is not the real world of most users. They did a great disservice to those who run at typical resolutions, below the uber-gobsmacked-nut-busting-maxed-out settings that most people do not use.

Furthermore, if you are the ultimate gamer, you will likely get the ultimate rig to run your games -- which means SLI or xFire --- so why not set up an xFire system and bench it? Anandtech did, and at 1600x1200 we begin to see the Core 2 power show separation:
http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=279...

You see it in FEAR, Rise of Nations, even Oblivion (killer game, by the way).

Heck, start at the beginning of the xFire portion and read through them all:
http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=279...
HL2, BF2, Q4 -- all show a dominating position for C2D @ 1600x1200 with full anti-aliasing. How can this be explained? Simple: xFire puts the burden back on the CPU, and since C2D demolishes the FX-62, is it surprising that it now outperforms it? Say you buy a next-gen GPU around Christmas -- do you want a CPU-throttled rig to run it? That is evidently what you would get with an AMD-powered system. Did HardOCP explain this? Nope.


Now here is what is irking me about that review --- it isn't like they could not have done it; they are a hardware site, for goodness' sake. Intel even sent them the capability, as one site published the contents of the press kit:

Quote:
To test its new baby, Intel sent out one of its own D975XBX motherboard with a beta BIOS that adds support for the Core 2 Duo processors. We also received a Core 2 Duo E6700, clocked at 2.66GHz, and a 2.93GHz Core 2 Extreme X6800. There was also an ATI Radeon X1900 XTX and an X1900 CrossFire Edition graphics card in the goodie box to try and keep the test rig similar to the one Intel provided for us for the controlled preview t'other month.

http://www.reghardware.co.uk/2006/07/14/intel_core2_duo...

So what gives? The tone of the article certainly reads like "man, we don't like the idea that Intel is taking back the performance crown." At least that is how I read it. Frankly, this site should be the laughing stock of the net, as every other site that benched at those resolutions also provided the data to demonstrate that it was GPU-limited.

Heck, you could have slapped a Pentium D 920 or 940 into the mix and shown it matching the FPS of an FX-62 --- what conclusions would be drawn then? That the P4 really isn't all that bad, and that you should have bought a P4 in April because it was lower priced?

It was a ridiculous review, it was unethical, it was inappropriate, and they published utter junk that helped no one reach a sane conclusion.

Jack

Agreed.
July 18, 2006 8:52:45 AM

Quote:
Showing games with the settings raised won't really show much of a difference as the bottleneck becomes the GPU. For a real indication of CPU power it would be better to compare things like audio/video encoding, file encryption, decryption, and compression. This is where these CPUs excel. This article does show that those planning to game shouldn't worry about buying the best processor but focus more on a high end GPU.


Correct, but HardOCP took this to the EXTREME, bottlenecking on the GPU as MUCH as possible so that the CPUs would look even closer than they really are.

They even had a better video card that would be more "real-world" for an enthusiast who games at 1600x1200, but they purposely decided not to use it. Did you even read my post or Jack's?
July 18, 2006 9:05:22 AM

Indeed I did, and I agree. The review wasn't showing performance based entirely on the CPU, so I was just suggesting some benchmarks that show more CPU-based performance. Obviously Conroe is the clear winner in almost every situation.
July 18, 2006 9:06:10 AM

Like others have said, this has been set up to show a 'real life' comparison... Do they do the same thing with a Pentium D 820? NO. They would show that the 820 is in every way inferior to any X2, and they would do this by (like most people do) cranking all the settings to a high level that would work the GPU to 100%, therefore offloading some work to the CPU.

This test didn't show any glory for Conroe, because they set the graphics so low that the same 820 would score within 1%, performance-wise, of the processors mentioned in the article.
July 18, 2006 9:13:06 AM

Quote:
Like others have said, this has been set up to show a 'real life' comparison... Do they do the same thing with a Pentium D 820? NO. They would show that the 820 is in every way inferior to any X2, and they would do this by (like most people do) cranking all the settings to a high level that would work the GPU to 100%, therefore offloading some work to the CPU.

This test didn't show any glory for Conroe, because they set the graphics so low that the same 820 would score within 1%, performance-wise, of the processors mentioned in the article.


Correct, but they do this while calling it "real world performance" when in reality it is NOT what most users would experience as real-world performance, for the reasons stated above.
July 18, 2006 9:38:09 AM

The ONLY reason I game in 1920x1200 is because of my LCD monitor. If I still had my Sony CRT I would game in whatever res I could pump all the eye candy into. The VAST majority of the collective community games at a much lower res, be that 1280x1024 or 1024x768, or perhaps even lower.

Modern GPUs can crank out a lot of frames at the resolutions most people game at, and the CPU is a major factor in this scenario. Just how fast can the CPU feed the GPU?

I wholeheartedly agree: bench the gear the right way, and give people the info that really matters. Those who run at extreme resolutions already know the answers to their questions (dual GPUs help a lot!); tell the masses who read the articles what they really need to know.
July 18, 2006 12:36:28 PM

When I read a review of a processor, I want to know at what FPS the GPU bottlenecks. [/SARCASM]


Seriously, if I'm in the market for a new CPU, I want to know which one unequivocally performs the best. The ONLY way to do this is to turn the resolution down, thus eliminating the GPU as a limiting factor and showing what raw power the processors have. Any review that does otherwise has corrupt and skewed data and is only doing a disservice to consumers.
July 18, 2006 12:41:40 PM

Quote:
HardOCP rigged the test to get a predetermined/desired conclusion. They intentionally set up the bench so it would throttle on the GPU rather than demonstrate which CPU crunched the gaming code best. [...]

It was a ridiculous review, it was unethical, it was inappropriate, and they published utter junk that helped no one reach a sane conclusion.

Jack

Jack,
Great analysis; you can't make it any clearer. Nice touch adding CrossFire at 1600x1200. Extreme resolutions call for extreme GPUs. For example, if I were ever going to get Dell's 30" monitor, I would go quad.
July 18, 2006 12:56:32 PM

Definitely reads like a biased article because it seems to blame Intel for making a CPU that's too powerful for their GPU setup. Had it not been biased, I would expect the article to carry a title like, "Do you really need a Core 2 today for gaming?" and the discussion to focus on how if you don't have the latest SLI, you're buying Core 2 only for the gaming potential with a future GPU upgrade, or for its non-gaming performance.

Whenever you get fps scores so close while changing the CPU, you know the same old GPU is the culprit. That's no weak GPU there, but we all know a GPU generation lasts six months while a CPU generation lasts for several years. On top of that, SLI is currently available, and they didn't use it - that's a substantial GPU bottleneck.

The key issue in the tests was that all the processors were very new or at the very high end. Their GPU solution, on the other hand, was a little weak, about six months old.

So what should one do if they're upgrading a gaming computer anyway? I'd say Core 2 definitely. While you probably wouldn't see a performance difference gaming with a mid-range Athlon64 or P4 today, one or two years down you'll definitely be affected by it when pondering what graphics solution to upgrade to.
July 18, 2006 12:59:35 PM

Quote:
blame Intel for making a CPU that's too powerful for their GPU setup.

Classic.
July 18, 2006 1:04:52 PM

Quote:
Like others have said, this has been set up to show a 'real life' comparison... Do they do the same thing with a Pentium D 820? NO. They would show that the 820 is in every way inferior to any X2, and they would do this by (like most people do) cranking all the settings to a high level that would work the GPU to 100%, therefore offloading some work to the CPU.

This test didn't show any glory for Conroe, because they set the graphics so low that the same 820 would score within 1%, performance-wise, of the processors mentioned in the article.



So why did the FX-60 and 965EE get throttled? An 820 would have done worse than a 4200+.

I'm just glad that now Intel people are crying foul. Weren't there enough 800x600 benchmarks? Some people want to know what the box will do at the res they play at. I play at 1280. If I had CrossFire I'd play at 1600.
July 18, 2006 1:08:49 PM

From a gamer's perspective:

When I am playing Counter-Strike: Source, an online first-person shooter usually played against 10-40 people, I need high FPS, not high eye candy.

I run at 1024*768 on my rig (see specs) so that I can have the smoothest experience possible. If your computer is slowing down on some maps you will not do very well.

Contrary to what some people say, your CPU is very important to gaming. You can't just buy a sweet video card and drop it in your Sempron system. Right now my gaming is limited by my slow Athlon 3200+. I'm hoping Valve will update the Source engine to use multiple cores; at that point I would probably upgrade to an X2 3800+ or whatever else is affordable.
July 18, 2006 1:09:40 PM

Poor review, extremely biased and unprofessional. HOCP lost my respect with this garbage reporting.
They should have benchmarked at more resolutions and also used the xFire cards they had at their disposal, but that would have skewed their intent by painting Conroe in a good light.
This is all obvious to those who are willing to think about what they read and analyze the results.
July 18, 2006 1:11:33 PM

I wonder how much hate mail the author's been getting?
July 18, 2006 1:23:38 PM

Quote:
I wonder how much hate mail the author's been getting?
I'm sure Kyle's getting fan mail from AMD fanboys with offers for him to father their children. :wink:
July 18, 2006 1:25:44 PM

It'd be really cool if another review site directly challenged HardOCP's review. A review of a review, one could say.
July 18, 2006 1:31:55 PM

Quote:
Some people want to know what the box will do at the res they play at. I play at 1280. If I had CrossFire I'd play at 1600.


I don't think you read through the article; they claim that 1280x1024 is not real-world performance and that you would only game at it if you wanted low-resolution gaming. Your points make sense, but they just show how bad this article is: they claim 1280 is not a real-world gaming res, then use a single card to test 1600 with max settings (you said you'd only go that high with CrossFire). I have a 7950 GX2 right now, and I am deciding what to upgrade my CPU to. If I didn't know better, some people on these forums would try to tell me that a Core 2 Duo would not give me better performance than the 3500+ I have right now. Anand's benchmark makes much more sense: if you want to show high-res gaming, use a CrossFire or SLI setup.
July 18, 2006 1:55:25 PM

Conroe is just blisteringly fast; nothing out there can compete with it.
There is simply no point in discussing this topic: Conroe is the power on the shelves.
July 18, 2006 2:02:14 PM

Quote:
HardOCP rigged the test to get a predetermined/desired conclusion. They intentionally set up the bench so it would throttle on the GPU rather than demonstrate which CPU crunched the gaming code best. [...]

It was a ridiculous review, it was unethical, it was inappropriate, and they published utter junk that helped no one reach a sane conclusion.

Jack
Well, I disagree; the article's findings are correct. The average game player being at 1280x1024 is correct, but the average game player doesn't have the high-level GPU or CPU in the article either. Cutting the resolution but not the GPU would be incorrect as well. The only CPU in the article even remotely in the average game player's price range is the E6700.

I would like to see an E6700 vs. X2 5000+ with a 7600 GS or ATI X1600 XT; that is as close to the average game player's system as you'll find. This article was clearly a gamer's review, and I hear Intel CPUs have a problem with SLI as well. I guess if you own an Nvidia card, don't buy the X6800, but if you own ATI, buy it because it needs the extra CPU power.

The article also makes another good point: games only use one core currently, so why waste $1000 or even $500 when the same level of gaming can be had for under $200? This article does point out to gamers that it's not worth buying a new system over upgrading. At the end of this year it may be worth upgrading my son's socket 939 AMD system to a dual-core AMD, as we may start seeing dual-core games.

I will be buying an E6600 for myself, as my current C1.8 is just too slow.
July 18, 2006 2:11:40 PM

The article's intent was to shock. They hoped it would rally hordes of people and leave no doubt that the C2D is no faster at games, period, instead of showing a range of resolutions (including xFire) so the user could analyze the facts and decide what is best for them.
July 18, 2006 2:18:13 PM

Quote:
The article's intent was to shock. They hoped it would rally hordes of people and leave no doubt that the C2D is no faster at games, period, instead of showing a range of resolutions (including xFire) so the user could analyze the facts and decide what is best for them.

The article did show some increase for the two top Core 2 Duos, but wouldn't that be the relevant case for Nvidia and Nvidia SLI users? ATI GPU owners already saw there is a good increase for them when the benches were released back in May.
July 18, 2006 2:25:13 PM

Of course SLI mobos on Conroe benefit too.
July 18, 2006 2:33:39 PM

Quote:
From a gamer's perspective:

When I am playing Counter-Strike: Source, an online first-person shooter usually played against 10-40 people, I need high FPS, not high eye candy.

I run at 1024*768 on my rig (see specs) so that I can have the smoothest experience possible. If your computer is slowing down on some maps you will not do very well.

Contrary to what some people say, your CPU is very important to gaming. You can't just buy a sweet video card and drop it in your Sempron system. Right now my gaming is limited by my slow Athlon 3200+. I'm hoping Valve will update the Source engine to use multiple cores; at that point I would probably upgrade to an X2 3800+ or whatever else is affordable.


I helped my son set up his server and gaming systems, and in doing so I've found that setting fps_max below your average fps, adjusted for the number of players, lowers your ping and fps lag. It also helps a lot with those lag hacks he tells me about, so turn on the eye candy and skip a few frames that your eye will never notice anyway.
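
For anyone curious, that cap is just a Source engine console variable, so it can sit in your config. A minimal sketch of what the line might look like in an autoexec.cfg (the exact value is my own guess; pick something a bit under the fps you actually average on full servers):

// autoexec.cfg -- illustrative value only, tune it for your own rig and server
fps_max 90    // cap frames a little below your typical average on busy maps

The idea is simply that a steady, slightly capped framerate tends to feel smoother online than one that swings wildly from map to map.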
July 18, 2006 2:35:52 PM

Quote:
Of course SLI mobos on Conroe benefit too.

Yes, but SLI will see only a small increase, as it is not as CPU-intensive as CrossFire's load balancing.
July 18, 2006 2:38:33 PM

The review actually didn't bother me. I think most people know the tests were about CPUs and not a realistic gaming environment. I personally care more about DivX encoding, 3D Studio rendering, etc. as a better gauge.
However, I do feel that over the next year games will begin using multiple cores and you will actually need all that horsepower. Even most low-end Semprons will be 64-bit and dual-core next year (AMD knows something we don't?). I also always like having an extra core around for background tasks such as VMware.
July 18, 2006 3:01:12 PM

I agree that they throttled the CPUs. What was even more troubling in The Elder Scrolls IV: Oblivion setup is the fact that they LOWERED the shadows two notches on the Athlon system only, to keep the Athlon system competitive. All settings should be the same for a comparison, huh?
July 18, 2006 3:11:17 PM

Cool, I did not read that article too deeply; I merely skimmed through it. I see what you are saying, Jack, and I never doubted the superiority of the Core 2 chips; I just like to see them put into real-world situations, i.e. higher than 800x600. But like you said, most run @ 1280x1024, and after I posted I figured that was probably the case, since most people with LCDs probably don't own 20+ inch monitors. Yeah, I did notice that they lowered the settings on some of the stuff on the AMD setups, and either they are doing it from a pro-AMD stance, or they are doing it to show the real-world difference, i.e. you won't notice the difference and will still be able to play at the same FPS; I imagine it's the first of the two. Sorry for the poor grammar; it will not let me use the hyphen key (it opens up the Firefox search function every time I try to use it), and I don't really have the time to search for the cause.

I feel like my first post may have sounded like I was siding with a certain company.... just wanted to assure everyone I was not trying to sound that way, and I side with neither.

wes
July 18, 2006 3:19:30 PM

Quote:
I agree that they throttled the CPUs. What was even more troubling in The Elder Scrolls IV: Oblivion setup is the fact that they LOWERED the shadows two notches on the Athlon system only, to keep the Athlon system competitive. All settings should be the same for a comparison, huh?

I agree it is odd that they benchmarked with those settings different, but they clearly state the change, and everyone knows Intel's two top CPUs beat the top FX. I wish they had shown them all set at 3, as this game may be the most CPU-intensive. In light of this, the E6700 is the best new-system gaming CPU for the money, but we all know this as well.

Is it worth upgrading from an Athlon 64 system? I would have to say no, as the cost outweighs the benefits.

Good catch there HA_ZEE.
July 18, 2006 3:22:42 PM

But do you believe anyone spending $1000 or more on the X6800 will be running at 1024x768, or even 1280x1024? I certainly hope not... :lol: 
July 18, 2006 3:24:03 PM

I don't think there's anything in that article that anyone reading it didn't know. A GPU will limit the C2D's full potential at high resolutions?!?! No shit. AMD's top-of-the-line processors run similar FPS to C2Ds when limited by the GPU?!?! No shit.
July 18, 2006 3:33:37 PM

That was basically my point. I know it is impossible to know for sure, but how long could it be until a current AMD or Intel (not Conroe) CPU would no longer be good enough for newly released games? I imagine it would be at least a year; I bet it would be more like 2 or 3, unless some major advance in games comes along. I understand the 800x600 benchmarks, but I don't really see the point of people referring to them as some sort of useful benchmark, as there will be no noticeable difference in gaming coming from these CPUs; the best gaming advancements will be coming from ATI and Nvidia, and I just see those as a much bigger deal than some 800x600 benchmarks. I just think that current CPUs have much life left in gaming, especially when games start becoming multi-threaded; that will give them room to expand and should relieve a lot of the CPU bottleneck (at the lower resolutions). It obviously will not make a bit of difference at higher resolutions until faster video cards come out or games demand a more powerful CPU. Any idea of when this time might come?

(sorry for the scattered post; posting on the fly, not much time to make it pretty)
July 18, 2006 3:38:18 PM

Heh, I always find it funny when people start threads like this. They always seem to miss the point. That point being "Sure, Proc X might be faster than Proc Y, but when it comes down to it, is Proc Y now no longer suitable for the games it was playing awesomely before Proc X came out?"

It's going on all the time on the forums, here and elsewhere. Yes, Conroe is fast, undeniably. Conroe is newer; that's why. So yes, it's going to be a lot better than its competitors. But the point still stands: do you REALLY need one for gaming right now?

Yes, some games, notably Oblivion, need all the speed they can get. Some games will run more smoothly. But really, most other games won't benefit much. You can only go so fast before the extra speed stops being apparent.

Right now I'd be happy with an X2 4400+; I don't need stupidly fast processing power. I'd rather save the money and get a better gfx setup.

So, as a final point, I'm going to say this. Is there anyone out there who NEEDS (note the word NEED, not WANT) all that power from a Conroe? Is it really worth getting right now, or would it be wiser to save some money buying a cheaper proc, and making sure the rest of the rig is up to scratch as well?

Edit: For the record, I run at 1024x768 most of the time, and I'm happy with that. I get decent FPS, and good visuals too. And this is on an AGP 6600 with an XP 2800+...
July 18, 2006 3:52:37 PM

Quote:
I agree that they throttled the CPUs. What was even more troubling in The Elder Scrolls IV: Oblivion setup is the fact that they LOWERED the shadows two notches on the Athlon system only, to keep the Athlon system competitive. All settings should be the same for a comparison, huh?

I agree it is odd that they benchmarked with those settings different, but they clearly state the change, and everyone knows Intel's two top CPUs beat the top FX. I wish they had shown them all set at 3, as this game may be the most CPU-intensive. In light of this, the E6700 is the best new-system gaming CPU for the money, but we all know this as well.

Is it worth upgrading from an Athlon 64 system? I would have to say no, as the cost outweighs the benefits.

Good catch there HA_ZEE.

Actually, Intel's top 3 beat the top FX. The E6600 beats the FX-62 and will cost around $315, which isn't too expensive compared to the E6700 and the X6800. Also, the less expensive ones aren't trailing too far behind the FX-62 either. It isn't about the benefits you will see immediately, but down the road, with future games, the investment will be worth it.
July 18, 2006 4:10:38 PM

Quote:
Heh, I always find it funny when people start threads like this. They always seem to miss the point. That point being "Sure, Proc X might be faster than Proc Y, but when it comes down to it, is Proc Y now no longer suitable for the games it was playing awesomely before Proc X came out?"

It's going on all the time on the forums, here and elsewhere. Yes, Conroe is fast, undeniably. Conroe is newer; that's why. So yes, it's going to be a lot better than its competitors. But the point still stands: do you REALLY need one for gaming right now?

Yes, some games, notably Oblivion, need all the speed they can get. Some games will run more smoothly. But really, most other games won't benefit much. You can only go so fast before the extra speed stops being apparent.

Right now I'd be happy with an X2 4400+; I don't need stupidly fast processing power. I'd rather save the money and get a better gfx setup.

So, as a final point, I'm going to say this. Is there anyone out there who NEEDS (note the word NEED, not WANT) all that power from a Conroe? Is it really worth getting right now, or would it be wiser to save some money buying a cheaper proc, and making sure the rest of the rig is up to scratch as well?

Edit: For the record, I run at 1024x768 most of the time, and I'm happy with that. I get decent FPS, and good visuals too. And this is on an AGP 6600 with an XP 2800+...


But an Opteron 175 is $521 and an X2 4400+ is $460 on Newegg...

And you can get a Conroe CPU, motherboard, and 1 GB of DDR2 memory for just the cost of AMD's socket 939 2.2GHz dual-core processor.
And that Conroe will not only outperform it; you are also getting new, efficient technology with a much better upgrade path. Plus, speaking of future-proofing, HOCP's review alone is enough to illustrate that Conroe is future-proof: as graphics cards get more powerful, the Conroe will push them, while AMD chips would need an upgrade to keep up.
July 18, 2006 4:26:19 PM

Quote:
I agree that they throttled the CPUs. What was even more troubling in The Elder Scrolls IV: Oblivion setup is the fact that they LOWERED the shadows two notches on the Athlon system only, to keep the Athlon system competitive. All settings should be the same for a comparison, huh?

I agree it is odd that they benchmarked with those settings different, but they clearly state the change, and everyone knows Intel's two top CPUs beat the top FX. I wish they had shown them all set at 3, as this game may be the most CPU-intensive. In light of this, the E6700 is the best new-system gaming CPU for the money, but we all know this as well.

Is it worth upgrading from an Athlon 64 system? I would have to say no, as the cost outweighs the benefits.

Good catch there HA_ZEE.

Actually, Intel's top 3 beat the top FX. The E6600 beats the FX-62 and will cost around $315, which isn't too expensive compared to the E6700 and the X6800. Also, the less expensive ones aren't trailing too far behind the FX-62 either. It isn't about the benefits you will see immediately, but down the road, with future games, the investment will be worth it.
True, but the X2 5000+'s performance is right at the FX-60 level, with very little difference from the E6600 as well, and the new price list has the X2 5000+ cheaper at only $282.
http://www.dailytech.com/article.aspx?newsid=3361

The E6600 truly only outperforms the FX-62 in non-gaming work, and that's why I'm buying an E6600 system. I'm a programmer by trade, and only my son has a need for a gaming system.
July 18, 2006 4:41:36 PM

Quote:
But do you believe anyone spending $1000 or more on the X6800 will be running at 1024x768, or even 1280x1024? I certainly hope not... :lol: 

Yes; CAD design programs and Oblivion are two good examples of why you would buy a $1000 CPU and not run higher than 1280x1024. Hardware isn't at the point of easily being able to push beyond 1280x1024 with current programs. Only 6 years ago the top 3dfx cards only pushed us to 800x600, and those were the top systems at the time.

I guess hardware doubles both the x and y resolution about every 10 years, relative to the software of the day.
July 18, 2006 4:47:14 PM

Performance-wise, IMO, the E6600 is as fast as or faster than the FX-62, and it performs with inexpensive DDR2 memory, which can save $$$.

Also, with just mild overclocks that are safe and stable for 24/7 use, all the Conroe dual-core chips can exceed the performance of the FX-62.
Meaning that for around $500 you can get the CPU, motherboard, and memory for a bare-bones system that has the potential to outperform the best AMD can offer; even if you overclock AMD's chips to the limit, they are still behind the mildly overclocked Conroes.

Also, I suspect that within a few months the performance of Conroe's new architecture will begin to be enhanced as software takes advantage of its new features. In other words, the performance gap has not peaked yet.

That is my take on the market... for what it is worth :wink:
July 18, 2006 5:03:38 PM

Too bad for big Intel that it took them 3 years to engineer something that can compete with little AMD.
They will sell a lot of them, but with AMD planning to win the price/performance game as they have throughout their existence, I don't think it's going to be the silver bullet Intel fans are hoping for. :( 
July 18, 2006 5:13:49 PM

Quote:
Too bad for big Intel that it took them 3 years to engineer something that can compete with little AMD.
They will sell a lot of them, but with AMD planning to win the price/performance game as they have throughout their existence, I don't think it's going to be the silver bullet Intel fans are hoping for. :( 
You smoke far too much crack.
July 18, 2006 5:24:58 PM

Quote:
Too bad for big Intel that it took them 3 years to engineer something that can compete with little AMD.


How is this too bad for anyone? Pretty sure Intel was raking in profits for all of those years in question, and now they have a badass new line of CPUs.
July 18, 2006 5:26:04 PM

Quote:
Too bad for big Intel that it took them 3 years to engineer something that can compete with little AMD.
They will sell a lot of them, but with AMD planning to win the price/performance game as they have throughout their existence, I don't think it's going to be the silver bullet Intel fans are hoping for. :( 
You smoke far too much crack.
Intel could bomb all of AMD's factories and offices, thus wiping out their entire workforce and obliterating the company, and AMD fans would still try to find a way to spin the situation.
July 18, 2006 5:28:22 PM

Quote:
Poor review, extremely biased and unprofessional. HOCP lost my respect with this garbage reporting.
They should have benchmarked at more resolutions and also used the xFire cards they had at their disposal, but that would have skewed their intent by painting Conroe in a good light.
This is all obvious to those who are willing to think about what they read and analyze the results.


This isn't really what I've come to expect of the [H] crew. I wonder if this was another BFG-sponsored article... perhaps that would explain the use of a single BFG card in lieu of the included ATI product? (It still wouldn't explain the lack of SLI.) Even so, the claims of 'real-world' gaming strike me as odd. The benchmarks contained in the article would have been much better suited to simply showing how the C2D (and the FX, to a lesser extent) has outpaced current-gen graphics.
July 18, 2006 5:57:19 PM

A fair test is run at resolutions that reflect what the gaming community plays at. I haven't seen 800x600 on my machine in years; I play some at 1280x1024 and some at 1600x1200. The arrival of LCDs and the like broadens the playable resolutions.
For a site to purposely restrict them so that a lower-performing CPU is put in a good light against one that is superior is total crap.
Why is everyone so scared now that Intel has a fast CPU?
It was bound to happen, everyone knew it would, and it's good that it has happened. We can all move forward to better hardware and good competition.
It's like watching a NASCAR race: if you concentrate only on the driver you like, you really miss all of the great racing everywhere else on the track. If you align yourself with one CPU manufacturer you really miss all that technology has to offer.
July 18, 2006 6:09:12 PM

Quote:
Heh, I always find it funny when people start threads like this. They always seem to miss the point. That point being "Sure, Proc X might be faster than Proc Y, but when it comes down to it, is Proc Y now no longer suitable for the games it was playing awesomely before Proc X came out?"

It's going on all the time on the forums, here and elsewhere. Yes, Conroe is fast, undeniably. Conroe is newer; that's why. So yes, it's going to be a lot better than its competitors. But the point still stands: do you REALLY need one for gaming right now?

Yes, some games, notably Oblivion, need all the speed they can get. Some games will run more smoothly. But really, most other games won't benefit much. You can only go so fast before the extra speed stops being apparent.

Right now I'd be happy with an X2 4400+; I don't need stupidly fast processing power. I'd rather save the money and get a better gfx setup.

So, as a final point, I'm going to say this. Is there anyone out there who NEEDS (note the word NEED, not WANT) all that power from a Conroe? Is it really worth getting right now, or would it be wiser to save some money buying a cheaper proc, and making sure the rest of the rig is up to scratch as well?

Edit: For the record, I run at 1024x768 most of the time, and I'm happy with that. I get decent FPS, and good visuals too. And this is on an AGP 6600 with an XP 2800+...


Cable, I was trying to make the exact same point you made. Was I so far off that you couldn't tell that what you were just saying is what I was trying to say?!