david__t

Distinguished
Feb 2, 2003
200
0
18,680
Are we still living in the stone age? If Tom has to run a game benchmark at 640 x 480 on a Radeon 9700 Pro to see a decent increase in his P4 overclock review, then I think he is trying too hard to justify the huge expense of the cooling system and of the huge overclock to 4.1 GHz. I mean, what possible advantage is there in going from 460 fps to 492 fps?

And on the Comanche 4 benchmark, even the lowest CPU in the comparison table still scored a very playable 46 fps (2500+).
I could understand if the review had been written from the point of view of buying a cooler that will last for many years (even if it is at stock speeds in the future), but for a minor increase now, what is the point?

Doom 3 will not run in full detail even at 4.1 GHz unless you have a good DX 9 card. As we all know, and as Tom has said before, a good general spec inside any PC is far better than having 1 or 2 good components inside it.

So come on Tom - be realistic and do not drop below 1024 x 768. We should be looking in the range of 60 - 120 fps for maximum game enjoyment (with all effects switched on); anything above that is pointless. OK, massive speed would be a good indication of future performance and would help you to buy a future-proof system, but even the Ti4600 was made redundant very quickly by DirectX 9.

I know that many people enjoy overclocking for the challenge and not for the performance, but what is the point in going to such financially extreme and risky lengths in reality?

4.77MHz to 4.0GHz in 10 years. Imagine the space year 2020 :)
 

slvr_phoenix

Splendid
Dec 31, 2007
6,223
1
25,780
No offense, david__t, but you're <i>really</i> in the dark. Running a game at a low resolution ensures that the graphics card is not a significant factor in the performance. So when we see the benchmark results, we are seeing them influenced almost completely by the CPU (and memory system) alone.

If you ran the benchmark with high resolution settings then the video card would become a significant factor in performance and that would reduce how meaningful the benchmark scores are in comparing the CPU's performance.

If the review was saying "Look how great this video card is!" or "Look how great this whole PC is!" then yes, you would want higher resolution settings for the benchmarks. <i>However</i>, since this article was purely about the <b>CPU</b>, the benchmarks were set up to mitigate the influence of non-CPU factors such as the graphics card.

Get it?

"<i>Yeah, if you treat them like equals, it'll only encourage them to think they <b>ARE</b> your equals.</i>" - Thief from <A HREF="http://www.nuklearpower.com/daily.php?date=030603" target="_new">8-Bit Theater</A>
 

daniel1113

Distinguished
Jan 30, 2003
79
0
18,630
Also, I could be mistaken, but I think benchmarking programs, such as 3DMark, account for the resolution used, along with the filtering, anti-aliasing, etc. Therefore, a system that "only" gets 60 FPS, but is at 1600x1200 with 8x anti-aliasing, should get a similar score to a system that is running at 1024x768 with 120 FPS.

This may not be correct, but it is the way I have always thought the benchmarks worked.

- Daniel (daniel1113@attbi.com)
 

david__t

Distinguished
Feb 2, 2003
200
0
18,680
But my point was that going from 460 to 490 fps is next to useless as a comparison, because nobody runs games at that res nowadays. If they said "look, we can get the Doom 3 demo from 20 fps to 40 fps" then it would be meaningful and a more tangible result. It would also be a much more significant rise in performance.

I understand the video card being a limiting factor, but that doesn't make the result more meaningful, does it? Plus, the main chunk of a game's workload is processed by the graphics card, and this will only become more and more true. So CPU upgrades are less important from a gaming perspective than a good video card - which makes overclocking a P4 even more pointless. When the first Voodoo and PowerVR cards were launched, it started a chain of events that was inevitably going to lead to the GPU becoming the core of any gaming platform - and over the years it has meant that CPU power has taken a back seat in the importance stakes as far as gaming is concerned. The last game I can think of where the CPU was the emphasis was Unreal - it required a beast when it first came out. But now we have swung completely in the other direction, so that Doom 3 players will be lusting after a Radeon 9800 Pro or a GeForce FX rather than a P4 3.06 or an AMD 3200+.

As my main argument was saying, the real-world benefits are what we should focus on - not some unrealistic situation that nobody will encounter. There will always be people trying to justify their very expensive CPU purchases, but the time has come to admit that the video card is now king where gaming performance is concerned.

I might be in the dark, slvr_phoenix, but only about how relevant these scores are :) If you can get excited by 500 fps when 60 is excellent for gameplay, then enjoy the scores and be happy - whatever floats your boat.

4.77MHz to 4.0GHz in 10 years. Imagine the space year 2020 :)
 

slvr_phoenix

Splendid
Dec 31, 2007
6,223
1
25,780
But my point was that going from 460 to 490 fps is next to useless as a comparison, because nobody runs games at that res nowadays. If they said "look, we can get the Doom 3 demo from 20 fps to 40 fps" then it would be meaningful and a more tangible result. It would also be a much more significant rise in performance.
You really just don't get it. You're missing the point entirely. The concern isn't with how much more 'playable' the game is because of the faster CPU. The concern is how much faster the faster CPU is. With just a little math you know that Processor Y is n% faster than Processor X in Application A and m% faster in Application B. <i>That</i> is the point. It's about comparison statistics of performance based purely on the CPU, not about the overall performance of the complete PC for running Doom 3.
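That "little math" is just a ratio of the two benchmark scores. A minimal sketch: the 460/492 pair echoes the figures mentioned earlier in the thread, while the second pair of numbers is invented purely for illustration.

```python
# Hypothetical benchmark scores in frames per second. The second
# application's numbers are made up for illustration.
scores = {
    "Processor X": {"Application A": 460.0, "Application B": 92.0},
    "Processor Y": {"Application A": 492.0, "Application B": 103.0},
}

def percent_faster(fast: float, slow: float) -> float:
    """Return how much faster `fast` is than `slow`, in percent."""
    return (fast / slow - 1.0) * 100.0

for app in ("Application A", "Application B"):
    gain = percent_faster(scores["Processor Y"][app], scores["Processor X"][app])
    print(f"{app}: Processor Y is {gain:.1f}% faster")
```

Run on these numbers, the first comparison works out to roughly a 7% gain and the second to roughly 12% - the kind of per-application percentages the post is describing.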

I understand the video card being a limiting factor but that doesn't make the result more meaningful does it?
Yes, it does. Especially in games, the video card accounts for a considerable amount of the performance at the resolutions most people actually play at.

This is a bit generalized for simplicity's sake, but say you've got the Doom 3 demo at an enjoyable resolution, where the CPU accounts for 20% of the performance and the video card accounts for 80%. When we compare the scores of two CPUs, the improvement we see is actually only about 1/5 of the true amount of extra work that the faster processor can do, because the video card is doing so much more of the work than the CPU.

Now say we reduce the resolution to the crappiest possible and kill as many effects as possible, so that the video card accounts for 20% of the performance and the CPU for the remaining 80%. This time when we run the benchmark, the difference we see is a much closer representation of the faster CPU's actual capability, because it's showing about 4/5 of the true amount of extra work that the faster CPU can do.

By reducing the amount of work that the video card does in the benchmark we will see a much more accurate portrayal of the extra amount of work that the faster CPU can do.
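The 20%/80% split above can be turned into a toy calculation. This is only a sketch, under the simplifying assumption that frame time is a CPU portion plus a video card portion added in series (real pipelines overlap, so actual numbers differ):

```python
def observed_gain(cpu_share: float, cpu_speedup: float) -> float:
    """Percent fps gain seen in a benchmark, under a toy model where
    total frame time = CPU portion + video card portion (in series).

    cpu_share:   fraction of frame time spent on the CPU (0..1)
    cpu_speedup: factor by which the faster CPU does its work (e.g. 1.25)
    """
    old_time = 1.0
    new_time = cpu_share / cpu_speedup + (1.0 - cpu_share)
    return (old_time / new_time - 1.0) * 100.0

# A CPU that is truly 25% faster:
print(observed_gain(0.20, 1.25))  # high res, video-card-bound: only ~4% shows up
print(observed_gain(0.80, 1.25))  # low res, CPU-bound: ~19% shows up
```

Under this model, a CPU that is truly 25% faster shows up as only about a 4% fps gain when the video card dominates, but about a 19% gain when the CPU dominates - which is exactly the reason for benchmarking at 640x480.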

Plus, the main chunk of a game's workload is processed by the graphics card, and this will only become more and more true. So CPU upgrades are less important from a gaming perspective than a good video card - which makes overclocking a P4 even more pointless.
This is the first good point that you've actually given in this whole discussion. You're right. For games, the CPU isn't nearly as important as the graphics card is these days.

That aside, that has <b>nothing</b> to do with showing the performance increase of OCing a CPU or comparing like CPUs. A CPU comparison should be about comparing the CPUs, period. It is up to the consumer to decide whether or not they want to OC their CPU or upgrade their CPU, not up to the CPU comparison review to decide for them and bias all of the comparison benchmarks specifically towards gamers.

As my main argument was saying, the real-world benefits are what we should focus on - not some unrealistic situation that nobody will encounter.
And this is exactly where you're just not getting it. This wasn't a review of a complete PC. The benchmarks are <i>not</i> portraying the increase in performance of a complete system. What they <i>are</i> representing is the performance gain <i>specifically and only</i> from the CPU. Why? Because it is a <b>CPU COMPARISON</b>.

I might be in the dark, slvr_phoenix, but only about how relevant these scores are :) If you can get excited by 500 fps when 60 is excellent for gameplay, then enjoy the scores and be happy - whatever floats your boat.
See, now not only are you proving your ignorance, but you're also proving that you're an arse.

First of all, I don't get excited, period. Frankly, I don't like the case and I'm definitely not concerned with a refrigerant cooling system. I'm not interested in that article on so many levels. All that I'm <i>trying</i> to do is explain to you why you're so very wrong. Perhaps you just didn't know. So I provided the information. Now you have no excuse. You can't claim ignorance, because now you know. So now it's either just plain stupidity, or now you finally get it.

Second of all, what I look for in a CPU comparison is the percentage of extra work that the faster processor can do. Why? Because that's all that a CPU comparison should be.

When it comes down to it, who even cares about FPS from a CPU? As you yourself said, the CPU isn't the important factor in games anymore. So 30, 300, or 3000 FPS: it doesn't matter. What matters is how much more work the faster processor can do, period.

If we want to improve the performance of our PCs for games then we get better video cards, not better CPUs.

"<i>Yeah, if you treat them like equals, it'll only encourage them to think they <b>ARE</b> your equals.</i>" - Thief from <A HREF="http://www.nuklearpower.com/daily.php?date=030603" target="_new">8-Bit Theater</A>
 

Stain

Distinguished
Jun 28, 2002
331
0
18,780
Basically, what this article shows us is something we already knew: there is no need to have more than 2GHz for <b>today's</b> applications.

See, now not only are you proving your ignorance, but you're also proving that you're an arse.
Honestly, you seem more of an arse. However, I believe the original poster probably did not read the article and only looked at the benchmarks. :frown:
 

Stain

Distinguished
Jun 28, 2002
331
0
18,780
You're mistaken. You get a higher score running at lower settings, anti-aliasing off, etc. It's not as much of a difference as you might think, though (not like Quake3 fps).

I've noticed AMD_me likes to post his 3DMark score at 640x480x16; I guess it makes him feel big. IMO, all 3DMark scores should be posted at default settings.
 

slvr_phoenix

Splendid
Dec 31, 2007
6,223
1
25,780
Basically, what this article shows us is something we already knew: there is no need to have more than 2GHz for <b>today's</b> applications.
I'd have to both agree and disagree with that. Basically the article did show us something that we already knew, that an OCed P4 is faster than a non-OCed P4. We could just say "duh" and leave it at that for all that it matters to most people.

As for the need for speed, it really depends on what you're doing. If you're playing Quake3 then it really doesn't matter. If you're running 3DSM, then it really does.

But the thing is, the whole point of the benchmarks wasn't to say "This is what software you can run with your OCed 4GHz monster." The point of the benchmarks was to say "This is how much more CPU performance an OCed 4GHz monster has over a stock 3GHz." We knew that it would perform better. That was obvious. The benchmarks are there to tell us <i>how much better</i>.

Honestly, you seem more of an arse.
Thank you. :)

That is because the way that I work is if someone is being an arse, I am an arse to them. If someone is being helpful or respectful, then I'm helpful or respectful to them. It's an easy and enjoyable system. And I'm more than glad to be an arse to people who deserve it. :)

However I believe the original poster probably did not read the article, only looked at the benchmarks. :frown:
No, I think he genuinely just doesn't get the whole point of comparing one CPU to another. He really doesn't strike me as being very bright or very old. I could be wrong, mind you. These are just my observations based on his posts.

"<i>Yeah, if you treat them like equals, it'll only encourage them to think they <b>ARE</b> your equals.</i>" - Thief from <A HREF="http://www.nuklearpower.com/daily.php?date=030603" target="_new">8-Bit Theater</A>
 

addiarmadar

Distinguished
May 26, 2003
2,558
0
20,780
I really do not see the point in that resolution either. That res is for dinky monitors; the standard now is 1024x768. Unless you're visually handicapped, no one is going to use it, but they still show that setting for those people. For the rest of us, running games at anything below 800x600 would look like Wolfenstein 3D.

Everyone should know that raw speed is not the key to performance; it's what the system can actually do. OCing is really for those who want to show off. If you are just into using a computer for work and games, and not to brag, then who cares? There is really no advantage to OCing; it doesn't enhance things much, due to the nature of the technology being used. It is just for braggers.
 

david__t

Distinguished
Feb 2, 2003
200
0
18,680
I'm not going to try and get into a flame war, and I wasn't trying to be disrespectful when I replied to your post.

Yes, I did read the whole article; no, I am not young; and yes, I do have an IT degree and have been building PCs for years.

I wasn't missing the point that this review was a CPU benchmark - I know what they were trying to do. I was just venting my frustration at, on the one hand, the miniscule increase in speed for the huge amount of effort invested, and on the other hand, the futile mentality of increasing CPU speed at any cost when considerable gaming performance can be attained in much easier ways, such as by getting a better video card.

I appreciate your trying to explain to me that this was a pure CPU benchmark, but no PC is complete without all the other components as well, and I think that reviews should reflect that fact. After all, when Tom has tested PC3200/PC3500 memory he doesn't just say "this memory is the fastest available", which we already know; he goes into detail about why it won't give you a good increase in performance due to asynchronous CPU FSB speeds. So you get a good memory benchmark article, but with some realism in there as well. We all read these things to know what the best parts are to buy in combination - it is no good analysing things on their own, which is why Tom always gives an explanation of the system setups and specifications.

I suppose at the end of the day we all get what we want from these articles, and we can take or leave them based on how we make our buying decisions and how we like to evaluate our prospective purchases. I generally agree with Tom's articles and I find them very informative, but the odd one pops up that I am bemused by, and I just have to say so.

Can I remove my Flame Protector 5000 now ;P

4.77MHz to 4.0GHz in 10 years. Imagine the space year 2020 :)
 

Crashman

Polypheme
Former Staff
Basically what this article shows us is something we already knew, there is no need to have more than 2GHz for today's applications.
You are so far off there it's hilarious! It takes up to 4 hours for me to convert certain video formats into certain other video formats, with editing, on a very good performing P4 2.4B, using <b>today's</b> version of TMPGEnc. I could REALLY USE a CPU 4x as fast as this one, because waiting 4 hours for a process to complete is excruciating.

<font color=blue>Watts mean squat if you don't have quality!</font color=blue>
 

Crashman

Polypheme
Former Staff
Four years ago it was impressive to see a system get 60 FPS at 640x480 in Quake III. A year later, I could compare a new processor getting 120 FPS to that former processor's 60 FPS and say "wow, it doubled". And so forth, and so on, so that testing a processor in Q3 at 640x480 gives me an idea of progress over MANY YEARS of upgrades! It doesn't have to be relevant to modern computing; it only shows what progress has been made over a certain period of time.
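Those year-over-year comparisons at a fixed benchmark setting chain together multiplicatively. A quick sketch, with invented Quake III scores (only the first two figures echo the 60/120 example above):

```python
# Hypothetical Quake III scores at a fixed 640x480 setting, one per
# yearly upgrade; the numbers are invented for illustration.
yearly_fps = [60.0, 120.0, 200.0, 330.0]

# Year-over-year improvement factors...
ratios = [b / a for a, b in zip(yearly_fps, yearly_fps[1:])]

# ...multiply out to the cumulative progress since the first system.
cumulative = 1.0
for r in ratios:
    cumulative *= r

print(ratios)      # the first ratio is the "wow, it doubled" step
print(cumulative)  # same as 330/60, i.e. 5.5x over the whole period
```

This is the sense in which an "irrelevant" 640x480 test stays useful: each year's ratio is meaningful on its own, and their product tracks long-term progress.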

<font color=blue>Watts mean squat if you don't have quality!</font color=blue>
 

Stain

Distinguished
Jun 28, 2002
331
0
18,780
I guess I should have seen this coming. Saying you don't <b>need</b> more than 2GHz, on a CPU forum. (Think of when Bill Gates supposedly said we don't need more than 640K of RAM, or whatever it was :)

Just to be more clear: obviously it's nice to have 4GHz, but for most applications (at least for me) you're not going to see a noticeable difference. Think if you gave your mom an OCed 4GHz P4 - do you think she would notice the difference between that and a 2GHz (using the stereotype that your mom isn't a PC enthusiast like you)? There are just not nearly as many reasons to get the fastest CPU as there used to be. I don't think most people need to convert their 8-hour home-made goat porn vids to DivX so they can burn them to CDs and mail them to Jennifer Lopez - or at least not often enough that it matters that it takes 3 hours instead of 4.

As always IMHO.
 
