Is this 3DMark06 score looking good?

Win Vista SP1
GeForce 9800GTX, stock at the moment
2GB DDR-600
AMD Opteron 170 dual-core at 3GHz

Got 10243 in 3DMark06 Free Edition. Seems a bit low, but I'm thinking it's because I'm not using a quad-core CPU yet. Is this good?
41 answers
  1. addiarmadar said:
    Win Vista SP1
    GeForce 9800GTX, stock at the moment
    2GB DDR-600
    AMD Opteron 170 dual-core at 3GHz

    Got 10243 in 3DMark06 Free Edition. Seems a bit low, but I'm thinking it's because I'm not using a quad-core CPU yet. Is this good?



    Not good, you should do better. I got over 15k on an 8800GTS.
  2. dagger said:
    Not good, you should do better. I got over 15k on an 8800GTS.
    http://ourworld.cs.com/dagger9066/ss3dmark.jpg



    There you go again, telling another stock dual-core owner he should do better, when it's your 3.6GHz overclocked quad CPU and graphics that got 15k.

    Your CPU scores close to 3,000 more than his ("I'm guessing he's around a 2,000 CPU score"), and with your CPU getting that much more, it would easily put your overclocked video card up another 2k.

    And let's not forget AMD vs. Intel: come on, man.

    @OP: it's about right for what you've got.
  3. addiarmadar said:
    Win Vista SP1
    GeForce 9800GTX, stock at the moment
    2GB DDR-600
    AMD Opteron 170 dual-core at 3GHz

    Got 10243 in 3DMark06 Free Edition. Seems a bit low, but I'm thinking it's because I'm not using a quad-core CPU yet. Is this good?

    It's about right for an AMD dual-core, but my friend, the difference between an AMD dual-core and a recent Intel dual-core is really startling, like 6,000 3DMarks startling, I kid you not.
    http://img383.imageshack.us/img383/3994/e840036ghzjf0.th.jpg
    @dagger [:mousemonkey:2] If a quad is better than a dual and a GTS is better than a GT, then your score is a bit on the low side, which, considering the overclock you are claiming on that GTS in your sig, can only mean that Vista must be a real pile of poo.
  4. This is not a good 3DMark06 score for a 9800GTX; the result is in the same range as an 8800GT. I guess your bottleneck is the AMD Opteron 170, an older server-type CPU aimed more at professional 3D computing, while 3DMark06 is more about general 3D computing. It's also certain that over these last two years AMD CPUs have generally posted lower benchmark scores than Intel CPUs.
  5. Mousemonkey said:
    It's about right for an AMD dual-core, but my friend, the difference between an AMD dual-core and a recent Intel dual-core is really startling, like 6,000 3DMarks startling, I kid you not.
    http://img383.imageshack.us/img383/3994/e840036ghzjf0.th.jpg
    @dagger [:mousemonkey:2] If a quad is better than a dual and a GTS is better than a GT, then your score is a bit on the low side, which, considering the overclock you are claiming on that GTS in your sig, can only mean that Vista must be a real pile of poo.


    I guess your 8800GT 512MB with an E8400 @ 3.6GHz got a higher 3DMark06 score than Dagger's 8800GTS 512 with a Q6600 @ 3.6GHz because the E8400 is a newer-generation CPU (45nm vs. 65nm) with a higher FSB.
  6. Crazy-PC said:
    I guess your 8800GT 512MB with an E8400 @ 3.6GHz got a higher 3DMark06 score than Dagger's 8800GTS 512 with a Q6600 @ 3.6GHz because the E8400 is a newer-generation CPU (45nm vs. 65nm) with a higher FSB.

    Nah, I beat Dagger's score with a lower-clocked 65nm dualie as well. :lol:
    http://img388.imageshack.us/img388/1001/0634ghzhz2.th.jpg
  7. Actually, it's because of Windows Vista Ultimate 64-bit, which I've further filled with third-party bloatware and crapware, compared to a stripped-down XP. Anyone who has tested on both OSes should see a significant difference. Also, the GPU core is at 750MHz, not 780; I recently tuned it down since the extra juice isn't really needed.

    The point is, the OP's score is low, not that mine is high. The Opteron server processor is not that bad, and it's not nearly bad enough to justify the poor performance of the 9800GTX by a long shot. If you pretend it's not broken, it'll never get fixed. :sarcastic:
  8. dagger said:
    Actually, it's because of Windows Vista Ultimate 64-bit, which I've further filled with third-party bloatware and crapware, compared to a stripped-down XP. Anyone who has tested on both OSes should see a significant difference. Also, the GPU core is at 750MHz, not 780; I recently tuned it down since the extra juice isn't really needed.

    The point is, the OP's score is low, not that mine is high. The Opteron server processor is not that bad, and it's not nearly bad enough to justify the poor performance of the 9800GTX by a long shot. If you pretend it's not broken, it'll never get fixed. :sarcastic:

    If the OP is using a 939 Opteron, then the score is actually quite high. I don't use a stripped-down version of XP, just the full corporate version, and my GPUs are at 600/1500/900. :whistle:
  9. Your score is fine; I scored a little under 11,100 with:
    AMD X2 5200+ OC'd @ 3.0GHz
    4GB RAM
    9800GTX OC'd, core @ 840 and memory @ 1240 (2480 effective)
  10. It's the AMD CPU you've got. My Intel at the same clock as Invisik's, with an 8800GT, gets just a smidgen under 12,000, and that's with Windows Vista Home Premium. So I'd suggest either being happy with that score or overhauling your system and getting an Intel-based system, lol.

    But to put it in perspective, that score ain't "bad"; your old platform is just bottlenecking your fast video card.
  11. addiarmadar said:
    Win Vista SP1
    GeForce 9800GTX, stock at the moment
    2GB DDR-600
    AMD Opteron 170 dual-core at 3GHz

    Got 10243 in 3DMark06 Free Edition. Seems a bit low, but I'm thinking it's because I'm not using a quad-core CPU yet. Is this good?



    Back to the original question... lol, yeah, that's bad, but it's because of your processor. I get 11,400 stock on my Q6600 and EVGA 8800GT SC, and 14,500+ overclocked! But don't fret it; I only got 8k with an X2 3800+ and my 8800GT. Just for reference, so you can kind of see the difference you'll see. ;)
  12. Oh yeah, forgot to mention: 3DMark06 results don't mean much.
    My other build, with an AMD X2 6400+ OC'd @ 3.4GHz and an 8800GTS G92 OC'd to 815/1090,
    scored 12,140.
    As you can see, it beats my other PC, which scored a little under 11,100
    with an AMD X2 5200+ OC'd to 3.0GHz and a 9800GTX OC'd to 840/1240.
    My build with the X2 5200+ and 9800GTX scored less in 3DMark06, but in games it gets about 6-9fps more than my X2 6400+ and 8800GTS build.
  13. invisik said:
    Oh yeah, forgot to mention: 3DMark06 results don't mean much.
    My other build, with an AMD X2 6400+ OC'd @ 3.4GHz and an 8800GTS G92 OC'd to 815/1090,
    scored 12,140.
    As you can see, it beats my other PC, which scored a little under 11,100
    with an AMD X2 5200+ OC'd to 3.0GHz and a 9800GTX OC'd to 840/1240.
    My build with the X2 5200+ and 9800GTX scored less in 3DMark06, but in games it gets about 6-9fps more than my X2 6400+ and 8800GTS build.



    And how did you scientifically calculate the 6-9fps difference? Or is it just a "feeling"? :p

    People who get low benchmarks always say it doesn't mean much. If you can't reach the grapes, they must be sour! :na:

    The point is, if you don't admit it's broken, it'll never get fixed. Foolish pride won't do you any good. Find out what's wrong, get it fixed, and then boast about it when it's running like clockwork.

    Anyway, look at the first two categories (the SM2.0 and SM3.0 scores) instead of the CPU score or overall score for a more accurate benchmark of the graphics card. And keep in mind CPU performance does matter in real life (duh).
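The argument about how much the CPU score drags the overall number can be made concrete. Going by the weighting formula in Futuremark's published 3DMark06 whitepaper (as best I recall it; the sub-scores below are invented for illustration, not measured results):

```python
# Hedged sketch, not Futuremark's actual code: per the published 3DMark06
# whitepaper (as best I recall it), the overall score is a weighted
# harmonic mean of the graphics score and the CPU score, with graphics
# weighted 1.7 and CPU 0.3. The sub-scores below are invented examples.

def graphics_score(sm2: float, sm3: float) -> float:
    """Average of the SM2.0 and HDR/SM3.0 graphics sub-scores."""
    return (sm2 + sm3) / 2

def overall_score(sm2: float, sm3: float, cpu: float) -> float:
    """Weighted harmonic mean of graphics and CPU scores."""
    gs = graphics_score(sm2, sm3)
    return 2.5 * 1.0 / ((1.7 / gs + 0.3 / cpu) / 2)

# Same hypothetical graphics sub-scores, two different CPU scores:
dual = overall_score(sm2=4100, sm3=4300, cpu=2000)   # roughly a fast dual-core
quad = overall_score(sm2=4100, sm3=4300, cpu=4200)   # roughly an OC'd quad
print(round(dual), round(quad))  # 9013 10500
```

With the graphics sub-scores held fixed, lifting the CPU score from 2,000 to 4,200 moves the overall total by roughly 1,500 points, which is about the size of the gap being argued over in this thread.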
  14. So Dagger...12,000 is a low benchmark?
  15. addiarmadar said:
    Win Vista SP1
    GeForce 9800GTX, stock at the moment
    2GB DDR-600
    AMD Opteron 170 dual-core at 3GHz

    Got 10243 in 3DMark06 Free Edition. Seems a bit low, but I'm thinking it's because I'm not using a quad-core CPU yet. Is this good?


    DDR1-600?!
    What are the memory timing settings?
    It may go up with lower timings and running it around DDR1-520-550; use a non-1:1 ratio if your CPU will run at that speed.
    I run the Opteron 146s in my profile @ 3GHz, and I run the one system I had to rebuild with a stock 6000+.
    You should be able to hit 3.4GHz on any of the 939 Opterons on air with the right cooler/PSU and a DFI motherboard.

    Try running it on XP for a better score as well.

    Also, IMHO you would have been better off with an 8800GTS 512 (G92), as they have been known to OC to 800/2200+ on the stock coolers without overheating.

    The 8800GTS 512 is the best card for the money no matter what company makes it, and for now it leaves ATI in its wake.
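The non-1:1 ratio advice above comes down to how the memory clock is derived on these chips. My understanding (a sketch of the K8 design as I recall it; the example speeds are illustrative, not measured) is that the on-die memory controller can only run RAM at the CPU core clock divided by an integer:

```python
# Hedged sketch: on socket 939, my understanding is the K8's on-die
# memory controller clocks RAM at the CPU core frequency divided by an
# integer, so the real DDR speed is whatever such division lands at or
# below the requested target. The speeds below are illustrative only.
import math

def actual_mem_mhz(cpu_mhz: float, target_mem_mhz: float) -> float:
    """RAM clock from the smallest integer divider that doesn't exceed
    the requested memory speed."""
    divider = math.ceil(cpu_mhz / target_mem_mhz)
    return cpu_mhz / divider

# 3GHz Opteron asking for DDR-500 (250MHz): divider 12 lands exactly.
print(actual_mem_mhz(3000, 250))   # 250.0
# Asking for DDR-550 (275MHz): divider 11 gives ~272.7MHz (about DDR-545).
print(actual_mem_mhz(3000, 275))
```

So on a 3GHz chip, asking for DDR-550 really gets you about DDR-545, and dropping the target a notch can let you tighten timings without losing much bandwidth.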
  16. Hi, I get 6649 with a Phenom 9600 @ 2.473GHz, 2GB of RAM, and a Radeon X1900XT, and I play games very well. I will be upgrading to an HD3870.
  17. addiarmadar said:
    Win Vista SP1
    Gefore 9800gtx stock at the moment
    2gb DDR 600
    AMD Opteron 170 dual core at 3ghz

    Got 10243 for 3dmark06 free edition. Seems a bit low but thinking its cause I aint using a quad core cpu yet. This good?


    The rest of your system is holding it back slightly, along with Vista. You look like you're right around where you should be, so I wouldn't worry about it.

    Best,

    3Ball
  18. Hi, I get 6649 with a Phenom 9600 @ 2.473GHz, 2GB of RAM, and a Radeon X1900XT, and I play games very well. I will be upgrading to an HD3870.

    http://img367.imageshack.us/img367/9955/3dmark06patriotxj5.th.jpg
  19. Crazy-PC said:
    I guess your 8800GT 512MB with an E8400 @ 3.6GHz got a higher 3DMark06 score than Dagger's 8800GTS 512 with a Q6600 @ 3.6GHz because the E8400 is a newer-generation CPU (45nm vs. 65nm) with a higher FSB.


    The improved 45nm process will not create a significant difference between processors with the same architecture. If clocked at the same speed, the performance will be very, very similar. The operating system is the killer in this scenario.

    Best,

    3Ball
  20. I don't think the Opteron 170 is the bottleneck. How many points does it give you?
  21. dagger said:
    Anyway, look at the first two categories (the SM2.0 and SM3.0 scores) instead of the CPU score or overall score for a more accurate benchmark of the graphics card. And keep in mind CPU performance does matter in real life (duh).


    Dagger, you retard!

    Guess what happens when your CPU score goes up. The CPU score has a significant impact on the SM2 and SM3 scores.

    The graphics card might be good, but if the CPU is holding it back, it won't make a difference.

    The graphics card has to communicate with the CPU.
  22. Figured as much that my CPU is holding it back; Futuremark is putting a heavy emphasis on CPUs these days to make people burn more $$$ than just GPU money to get those scores. I just wanted a comparison to other dual-core CPUs. I'm getting the Q6600 soon, but not yet.

    Running COD4 with everything maxed out and it ain't choking at all, but WoW is another thing...
  23. What's wrong with WoW? It shouldn't be as demanding as COD4. I have an 8500GT and it plays both of those games great.
  24. roadrunner197069 said:
    Dagger, you retard!

    Guess what happens when your CPU score goes up. The CPU score has a significant impact on the SM2 and SM3 scores.

    The graphics card might be good, but if the CPU is holding it back, it won't make a difference.

    The graphics card has to communicate with the CPU.

    Gee, they also affect it the same way in real games. Or is that not convenient for you to cite to others? :sarcastic:
  25. @dagger: Do us all a favor and leave, and never come back until you know something.
  26. dagger said:
    And how did you scientifically calculate the 6-9fps difference? Or is it just a "feeling"? :p

    People who get low benchmarks always say it doesn't mean much. If you can't reach the grapes, they must be sour! :na:

    The point is, if you don't admit it's broken, it'll never get fixed. Foolish pride won't do you any good. Find out what's wrong, get it fixed, and then boast about it when it's running like clockwork.

    Anyway, look at the first two categories (the SM2.0 and SM3.0 scores) instead of the CPU score or overall score for a more accurate benchmark of the graphics card. And keep in mind CPU performance does matter in real life (duh).



    No, some of my games have built-in tests and I noticed an increase in fps. Unless those fps are false, then I agree with you.
    =]
  27. Chillax, boys, and move on.
  28. addiarmadar said:
    Win Vista SP1
    GeForce 9800GTX, stock at the moment
    2GB DDR-600
    AMD Opteron 170 dual-core at 3GHz

    Got 10243 in 3DMark06 Free Edition. Seems a bit low, but I'm thinking it's because I'm not using a quad-core CPU yet. Is this good?



    Not bad, but as others have mentioned, you should definitely look into a CPU upgrade.

    For reference, my E8400 dual-core at 3.6GHz and 8800GTS 512 (OC'd at 770/1050) gets 15,800 in XP and 14,500 in Vista with the 3DMark06 trial software.
  29. 3Ball said:
    The improved 45nm process will not create a significant difference between processors with the same architecture. If clocked at the same speed, the performance will be very, very similar. The operating system is the killer in this scenario.

    Best,

    3Ball


    Do you mean Dagger is using Vista while mousemonkey uses XP? If that's the case, it's still the normal understanding and no surprise.
  30. Crazy-PC said:
    Do you mean Dagger is using Vista while mousemonkey uses XP? If that's the case, it's still the normal understanding and no surprise.



    Yep, it makes a big impact on the CPU score. That's why, for graphics performance, you can look at the scores of the first two categories (the results of the two shader models) instead of the CPU score or even the overall score. Although obviously CPU performance matters.

    3DMark is the most authoritative benchmark out there. If you don't think it's believable, there's no other benchmark to believe in. People who say it has nothing to do with system performance are the same ones who got very low scores and claim it's the benchmark that's broken. It's the foot-in-mouth approach. :na:
  31. omahagtp said:
    Not bad, but as others have mentioned, you should definitely look into a CPU upgrade.

    For reference, my E8400 dual-core at 3.6GHz and 8800GTS 512 (OC'd at 770/1050) gets 15,800 in XP and 14,500 in Vista with the 3DMark06 trial software.

    He doesn't need a new CPU unless he wants to shout about how many 3DMarks he can get!
    My system scores just over 13k, but with a decent quad, I'd guess it should easily hit 16-18k.
    Do I care? Absolutely not, because in-game it's fine. 3DMark is synthetic, and it's massively skewed toward giving higher scores to a quad over a dual!
    In the real world, there is barely going to be a difference between a dual and a quad, especially in (most!) games. OP, don't worry about it; it's a fine score unless you want more 3DMarks! :sol:
  32. dagger said:


    3DMark is the most authoritative benchmark out there. If you don't think it's believable, there's no other benchmark to believe in. People who say it has nothing to do with system performance are the same ones who got very low scores and claim it's the benchmark that's broken. It's the foot-in-mouth approach. :na:


    The point is, what matters is real-world performance. How games actually perform is more important than what 3DMark shows you for a score. For instance, the OP is getting a little over 10k. If his gaming performance is good, it doesn't matter if he ups the score to 15k. In most games (except Crysis), unless you are running an extremely high resolution like 1920x1200, anything over 10k in 3DMark06 will most likely mean your games run perfectly fine. That isn't to say there is no correlation between your 3DMarks and how your games perform, but there is a point where it doesn't make a noticeable difference. For example, say you get 150 fps in CS:S. Is it worth upgrading your CPU and mobo (among other parts) to get your FPS up to 200? No, that would be a waste of money. It is all in the eye of the beholder.

    So, the question is: if you are happy with gaming performance, is it worth getting a new CPU (and most likely a motherboard) to increase a synthetic benchmark's score?

    Now, I am not sure whether the OP's games are running well or not; I am just generalizing.
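The bottleneck argument running through this thread can be put in a toy model: each frame costs some CPU time and some GPU time, and (roughly, ignoring any overlap between the two) whichever takes longer sets the frame rate. The millisecond figures here are invented purely to show the shape of the argument:

```python
# A toy model of the CPU/GPU bottleneck argument: each frame costs some
# CPU time and some GPU time, and (roughly, ignoring any overlap between
# the two) whichever takes longer sets the frame rate. All millisecond
# figures are invented purely to illustrate the shape of the argument.

def fps(cpu_ms_per_frame: float, gpu_ms_per_frame: float) -> float:
    """Frame rate limited by whichever unit takes longer per frame."""
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

# GPU-bound at high settings: halving CPU time changes nothing.
print(fps(cpu_ms_per_frame=8.0, gpu_ms_per_frame=20.0))  # 50.0
print(fps(cpu_ms_per_frame=4.0, gpu_ms_per_frame=20.0))  # 50.0
# CPU-bound (e.g. low resolution): now the faster CPU shows up.
print(fps(cpu_ms_per_frame=8.0, gpu_ms_per_frame=5.0))   # 125.0
```

This is why a faster CPU can lift the 3DMark CPU test and the overall number while leaving a GPU-bound game's fps almost unchanged.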
  33. LukeBird said:
    He doesn't need a new CPU unless he wants to shout about how many 3DMarks he can get!
    My system scores just over 13k, but with a decent quad, I'd guess it should easily hit 16-18k.
    Do I care? Absolutely not, because in-game it's fine. 3DMark is synthetic, and it's massively skewed toward giving higher scores to a quad over a dual!
    In the real world, there is barely going to be a difference between a dual and a quad, especially in (most!) games. OP, don't worry about it; it's a fine score unless you want more 3DMarks! :sol:


    Well, he did wonder why it was low, and that is the definitive suggestion for how to raise it.

    Don't quote me on quad vs. dual cores. I have a dual, and I do not, and did not, suggest a quad.

    Unless you think I'm bragging about 3DMark scores. Which I am not. It's only a reference point. You don't see me with a list of my junk in my sig. :kaola:
  34. basketcase said:
    The point is, what matters is real-world performance. How games actually perform is more important than what 3DMark shows you for a score. For instance, the OP is getting a little over 10k. If his gaming performance is good, it doesn't matter if he ups the score to 15k. In most games (except Crysis), unless you are running an extremely high resolution like 1920x1200, anything over 10k in 3DMark06 will most likely mean your games run perfectly fine. That isn't to say there is no correlation between your 3DMarks and how your games perform, but there is a point where it doesn't make a noticeable difference. For example, say you get 150 fps in CS:S. Is it worth upgrading your CPU and mobo (among other parts) to get your FPS up to 200? No, that would be a waste of money. It is all in the eye of the beholder.

    So, the question is: if you are happy with gaming performance, is it worth getting a new CPU (and most likely a motherboard) to increase a synthetic benchmark's score?

    Now, I am not sure whether the OP's games are running well or not; I am just generalizing.

    Then why did the OP spend extra on a 9800 when a 9600 runs games just fine? It's all for a little futureproofing. That extra spend should come out to mean something.

    And as I pointed out, 3DMark has CPU and graphics components. If you think the CPU part is skewed, which is debatable, then just look at the graphics scores. To say the whole benchmark is worthless just because you got a low score is childish. :sarcastic:
  35. dagger said:
    Then why did the OP spend extra on a 9800 when a 9600 runs games just fine? It's all for a little futureproofing. That extra spend should come out to mean something.

    And as I pointed out, 3DMark has CPU and graphics components. If you think the CPU part is skewed, which is debatable, then just look at the graphics scores. To say the whole benchmark is worthless just because you got a low score is childish. :sarcastic:


    Did I say it was worthless?

    And there is a bit of a skew related to quad cores. It gives a significant gain in Marks for having a quad core, which wouldn't be an issue if games did the same. But most games currently available don't really gain that much from a quad-core CPU compared to a similarly clocked dual core. In the future, yes, more and more games will utilize more cores better. But until then, it is skewed. That being said, it doesn't make 3DMark worthless; it just makes judging real-world performance a little bit harder. Not impossible, but you need to read into things a little more. For instance, you could have an E6600 at 3.0GHz and a Q6600 at 3.0GHz. The quad would obviously score higher in 3DMark06, but in real-world scenarios they would (in most games currently available) get ALMOST the same performance.
  36. basketcase said:
    Did I say it was worthless?

    And there is a bit of a skew related to quad cores. It gives a significant gain in Marks for having a quad core, which wouldn't be an issue if games did the same. But most games currently available don't really gain that much from a quad-core CPU compared to a similarly clocked dual core. In the future, yes, more and more games will utilize more cores better. But until then, it is skewed.

    If you think that way, faster dual cores are also skewed, since few games can saturate CPU usage and the GPU is usually the bottleneck. But let's not debate this; just look at the graphics component if you think it's more important.
  37. dagger said:
    If you think that way, faster dual cores are also skewed, since few games can saturate CPU usage and the GPU is usually the bottleneck. But let's not debate this; just look at the graphics component if you think it's more important.


    There is a difference. Most games currently available are not written to fully take advantage of four cores, but 3DMark06 is. That is where the skew comes into play. CPU saturation and GPU bottlenecking are different factors that aren't really skewed in 3DMark, since if that's happening in 3DMark, it's likely happening in your games too (and vice versa).
  38. basketcase said:
    There is a difference. Most games currently available are not written to fully take advantage of four cores, but 3DMark06 is. That is where the skew comes into play. CPU saturation and GPU bottlenecking are different factors that aren't really skewed in 3DMark, since if that's happening in 3DMark, it's likely happening in your games too (and vice versa).




    Compare an E8400 with an 8600GT against an E4400 with an 8600GT in a typical graphically intensive game; you'll get similar, if not identical, fps, because graphics is the bottleneck, not the CPU. It'll show up in benchmarks, though.

    Basically, when some other CPU beats yours in benchmarks, it's not fair, but when you beat others in the same benchmarks, it's fair? :na:

    Life isn't fair. Sometimes you get the longer end of the stick, sometimes you don't. Let it go already.
  39. Ok, I see you are not reading what anyone is saying, so I will stop.
  40. Crazy-PC said:
    Do you mean Dagger is using Vista while mousemonkey uses XP? If that's the case, it's still the normal understanding and no surprise.

    Oh, there's more to it than that. :whistle:
    omahagtp said:
    For reference, my E8400 dual-core at 3.6GHz and 8800GTS 512 (OC'd at 770/1050) gets 15,800 in XP and 14,500 in Vista with the 3DMark06 trial software.

    Same CPU, same clock speed, same OS, a GTS rather than a GT, and still lower. Have you sussed out the 'booster' yet?
  41. @addiarmadar

    If you're getting the Q6600 soon and have a bit more money left over, why not go for the Q6700? The recent Intel price slash = heaven. x) It's $299 on Newegg (not including tax + shipping) and might be cheaper elsewhere, such as Directron at $289 with no tax and free shipping. :) I'm thinking about replacing my Q9450 with the Q6700; 400FSB x 10 = 4GHz sounds really nice!