Solved

3DMark 11 Score Issue

October 6, 2012 8:00:05 AM

Hello :D  (Please excuse my user name, I made it when I was 12 :p  )

So right before I'm about to head to sleep (it's 4am here), I decided to see how my 3DMark 11 score would compare against other computers. I'm doing this to check that all of my parts are getting the scores they are expected to get, and I stumbled upon this thread on Tom's -

http://www.tomshardware.com/forum/344299-28-mark#t26324...

Where, after taking a look at this person's score, I realized that other than his CPU (which I'm still not sure about), my score should be better than or at least on par with his. And yet, my score is well below it. Due to the late hour, could someone please take a look at this and see if I missed something?

Here is my score -

http://www.3dmark.com/3dm11/4574326?loginkey=KdqaGRfd21...

And here is his -

http://www.3dmark.com/3dm11/3894507

Thanks in advance guys!
Light


October 6, 2012 9:05:47 AM

He has an i7-920, which has Hyper-Threading and is overclocked, so that will give him a higher CPU score. As for the GPU, it's not possible to tell how much he has overclocked it, since 3DMark 11's system info hasn't yet figured out how to read the Kepler boost clock.

So you can't compare.

Is your GTX 680 stock? My stock GTX 670 scores 8820 max and 8750 min, which is on par with online benchmarks. Your CPU score is a bit low, though. My i5-3550 at 3.7GHz scores 7490 min to 7550 max, on an Asus P8Z77-V board.

Make sure you update your mobo BIOS; I got a boost of about 200 points in the CPU score after a BIOS update. Run 3DMark right after rebooting your PC for best scores.
October 6, 2012 9:12:10 AM

Because the 3DMark CPU bench takes advantage of all 8 threads on an i7. You've only got 4 cores and 4 threads.

Your physics score: 6823 (i5-3570K @ stock clocks)
His physics score: 8535 (i7-920 @ 4GHz; stock is 2.66GHz)

That's why he scores higher than you.
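As a rough sanity check, here's the back-of-envelope version (just a sketch; the linear clock scaling and the ~25% Hyper-Threading uplift are my assumptions, not Futuremark's actual scoring model):

```python
# Naive estimate of relative physics scores: assume the score scales
# linearly with clock speed, and that Hyper-Threading adds roughly 25%
# throughput on top of 4 physical cores. Both are rough assumptions.

def estimated_ratio(clock_a_ghz, ht_factor_a, clock_b_ghz, ht_factor_b):
    """Predicted physics-score ratio of system A to system B."""
    return (clock_a_ghz * ht_factor_a) / (clock_b_ghz * ht_factor_b)

# i7-920 @ 4.0 GHz with HT vs i5-3570K @ 3.4 GHz without HT
predicted = estimated_ratio(4.0, 1.25, 3.4, 1.0)
measured = 8535 / 6823

print(f"Predicted ratio: {predicted:.2f}")  # ~1.47
print(f"Measured ratio:  {measured:.2f}")   # ~1.25
# The measured gap is smaller than predicted because the newer Ivy
# Bridge core does more work per clock than Nehalem in the i7-920.
```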



Even my overclocked 2600K @ 4.5GHz gets a higher physics score.
http://www.3dmark.com/3dm11/4417459
October 6, 2012 1:36:25 PM

Thanks for the feedback guys, I really appreciate it!

I understand that his physics score will be higher due to HT, but why is his graphics score higher? A GTX 680 should beat a 670. And not only is his better, it's significantly better...
October 6, 2012 1:41:45 PM

You can't compare the results like this, as others are saying. You're not using the same drivers, and there's no way to know how much he overclocked his 670. If he used something like Lucid MVP, that could have boosted his score as well (I can't figure out whether or not 3DMark 11 tells you if Lucid MVP was used).

An overclocked 670 can pretty easily beat a stock 680... the 680 is only about 7% faster to begin with.
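To put numbers on that (made-up round figures, and assuming performance scales roughly with core clock, which is only approximately true):

```python
# Normalize a stock GTX 670 to 100 and compare against a stock GTX 680
# that is ~7% faster, and a 670 with a ~10% core overclock. The figures
# are illustrative assumptions, not benchmark results.

stock_670 = 100.0
stock_680 = stock_670 * 1.07   # ~7% faster at stock
oc_670 = stock_670 * 1.10      # ~10% overclock on the 670

print(f"Stock 680:       {stock_680:.0f}")
print(f"Overclocked 670: {oc_670:.0f}")
# Even a modest overclock is enough for the 670 to edge out a stock 680.
```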
October 6, 2012 6:56:37 PM

Exactly.
October 6, 2012 9:26:17 PM

So this means my hardware is performing up to par then, correct? In your opinion, was it a mistake at the time to get a GTX 680? Should I have just waited for the 670 or something like that?

Thanks again,
Light

P.S. What do you mean by Lucid MVP? Sorry, I'm just not familiar with that term.
October 6, 2012 10:25:56 PM

Only you can decide if the GTX 680 was a "mistake" or not...

I think everyone who gave Nvidia money for their lame-o GTX 6xx series cards made a "mistake" but that's just me :lol: 

Anyway, your hardware is fine. Lucid MVP is a feature on some Z77 boards that works some sort of magic to increase framerates in games. I don't know exactly how it works - maybe by reducing CPU overhead; I haven't read enough about it to know. I do know that it doesn't reliably help in games, but it can boost 3DMark 11 scores considerably.
October 6, 2012 10:40:31 PM

So Lucid MVP boosts 3DMark scores, but not real games.
Got it thanks!

BigMack70 said:
Only you can decide if the GTX 680 was a "mistake" or not...

I think everyone who gave Nvidia money for their lame-o GTX 6xx series cards made a "mistake" but that's just me :lol: 



And regarding your comment, I used to get Radeon cards because they were cheap and did well, but the drivers were killing me, so I decided to just spend the money and get an Nvidia card. I AM NOT STARTING A FAN-BOY WAR! But after 3 bad cards I got sick of it. I still appreciate the input, though.

Again, Thanks for the help guys!
Light
October 6, 2012 11:00:42 PM

Update your driver. You're running an outdated one. And overclock your CPU :) .
October 6, 2012 11:01:15 PM

For clarity - my comment wasn't a fanboy comment. I don't have anything against Nvidia's cards in general, I just don't like the GTX 6xx series in particular for various reasons.
October 6, 2012 11:27:53 PM

Sunius said:
Update your driver. You're running an outdated one. And overclock your CPU :) .


Thanks for the suggestion. What do you think my target clock should be? My CPU cooler is an H60, btw.
October 6, 2012 11:28:37 PM

BigMack70 said:
For clarity - my comment wasn't a fanboy comment. I don't have anything against Nvidia's cards in general, I just don't like the GTX 6xx series in particular for various reasons.


Just out of curiosity, what are those reasons, if you don't mind me asking?
October 7, 2012 2:04:51 AM

***Disclaimer*** These are just my views for not liking the GTX 6xx series cards. I'm not trying to start a fanboy war here. I know that these things aren't going to bother most people the way they bother me. ***/Disclaimer***

GK104 (the chip used in the GTX 660ti/670/680) was originally designed as the midrange Kepler chip - all the rumors from December through February had GK110 being the big chip aka GTX 680 with the GK104 powering stuff like the GTX 660ti. The GK110 was rumored to be as much as 40-50% faster than the HD 7970.

Then, due to manufacturing/yield problems, they had to scrap GK110 completely (for now at least). They caught a lucky break since AMD's 7xxx series didn't blow the lid off performance (IMO because AMD severely underclocked their cards), so they just tweaked their GK104 chip to compete with it and released it as a $500 card.

Then they voltage-locked their cards and added GPU Boost, making it much more difficult to achieve a stable, consistent overclock. It's a great feature for the casual user, but just an annoyance for enthusiasts/overclockers.

So, basically, they released what is essentially a midrange card that has unnecessary gimmicks attached to it at a high end price point, and they looked good doing it because AMD wasn't able to push performance all that much with underclocked cards.

The whole thing just rubbed me the wrong way. I don't like being sold $300 cards at $500 price points just because the GPU manufacturer couldn't get its manufacturing act together to actually produce its $500 card. I don't like being told how I can or can't use my card - why can't I turn GPU Boost off and just use it like a normal GPU? etc

With AMD copying the whole boost thing, I'm sadly thinking that stupid gimmick is here to stay, but I seriously hope Nvidia rights the ship (in my view) with their next series.
October 7, 2012 2:33:05 AM

I don't like GPUs that cost more than $150 :p 
October 7, 2012 4:58:25 AM

amuffin said:
I don't like GPUs that cost more than $150 :p 


My wallet agrees, but my heart wants more :p 
October 7, 2012 2:10:09 PM

BigMack70 said:
***Disclaimer*** These are just my views for not liking the GTX 6xx series cards. I'm not trying to start a fanboy war here. I know that these things aren't going to bother most people the way they bother me. ***/Disclaimer***

GK104 (the chip used in the GTX 660ti/670/680) was originally designed as the midrange Kepler chip - all the rumors from December through February had GK110 being the big chip aka GTX 680 with the GK104 powering stuff like the GTX 660ti. The GK110 was rumored to be as much as 40-50% faster than the HD 7970.

Then, due to manufacturing/yield problems, they had to scrap GK110 completely (for now at least). They caught a lucky break since AMD's 7xxx series didn't blow the lid off performance (IMO because AMD severely underclocked their cards), so they just tweaked their GK104 chip to compete with it and released it as a $500 card.

Then they voltage-locked their cards and added GPU Boost, making it much more difficult to achieve a stable, consistent overclock. It's a great feature for the casual user, but just an annoyance for enthusiasts/overclockers.

So, basically, they released what is essentially a midrange card that has unnecessary gimmicks attached to it at a high end price point, and they looked good doing it because AMD wasn't able to push performance all that much with underclocked cards.

The whole thing just rubbed me the wrong way. I don't like being sold $300 cards at $500 price points just because the GPU manufacturer couldn't get its manufacturing act together to actually produce its $500 card. I don't like being told how I can or can't use my card - why can't I turn GPU Boost off and just use it like a normal GPU? etc

With AMD copying the whole boost thing, I'm sadly thinking that stupid gimmick is here to stay, but I seriously hope Nvidia rights the ship (in my view) with their next series.


**IMO**

I see your point, and I understand why this can be so frustrating. To tell the truth, I completely agree with you. I too don't like graphics card companies charging consumers almost 150% more than what the product is actually worth just because their competitor couldn't hold his own.

On the flip side though, from a business perspective, this makes complete sense. Nvidia is first and foremost a business, and sadly, the main goal of a business is to make a profit. They have no reason to spend all that money on pushing the GK110 out to market if their GK104 manages just fine. Now, I don't like this any more than the next guy, but I can understand it.

I think the main problem here isn't Nvidia, though. We have reached a point where AMD has essentially given up on fighting Nvidia for the top GPU. I believe Tom's wrote an article about this somewhere on the site. This means Nvidia doesn't have the competition we need as consumers to get the products we want at the prices we want.

Thanks again,
Light
October 7, 2012 2:20:58 PM

Nah, AMD has given up fighting Intel for the top CPU. They are neck and neck right now with Nvidia for the top GPU, and you can make a very solid argument for why the 7970 is the best GPU out there.

Nvidia could have blown them out of the water with GK110, but they just couldn't get yields up enough to make it viable.
October 7, 2012 2:41:39 PM

The GTX 680, with its 256-bit bus, beats the 7970 GHz in most games. The only time the 7970 GHz takes the lead is when the resolution is over full HD with crazy 8x AA. But at those settings, both cards produce framerates under 60 in demanding games.

The fact that a 256-bit card beats 384-bit cards shows how much AMD failed with Tahiti.
October 7, 2012 5:00:21 PM

The 7970 GHz is about the same overall (1-3% faster) as the 680 at 1080p, and gains a significant advantage at resolutions above that. Please check your facts before posting green-goggle nonsense :non: 



The 680 beats the 7970 soundly in terms of performance/watt, but your point about 256-bit vs 384-bit is just wrong and meaningless.
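For what it's worth, here's the actual bandwidth math from the published specs (a quick sketch, and bandwidth is only one ingredient in game performance anyway):

```python
# Peak memory bandwidth = (bus width in bytes) x (effective memory clock).
# Bus width on its own tells you nothing; it has to be multiplied by the
# memory speed, and even then bandwidth is just one factor among many.

def bandwidth_gb_s(bus_width_bits, effective_mts):
    """Peak bandwidth in GB/s from bus width and effective transfer rate (MT/s)."""
    return (bus_width_bits / 8) * effective_mts * 1e6 / 1e9

print(f"GTX 680     (256-bit @ 6008 MT/s): {bandwidth_gb_s(256, 6008):.1f} GB/s")
print(f"7970 GHz Ed (384-bit @ 6000 MT/s): {bandwidth_gb_s(384, 6000):.1f} GB/s")
# ~192 GB/s vs ~288 GB/s - yet the cards trade blows at 1080p.
```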
October 8, 2012 2:55:50 AM

******* Warning! Warning! Threat detected. Fan boy war is imminent. *******


:p 
October 8, 2012 2:59:40 AM

BigMack70 said:
Nah, AMD has given up fighting Intel for the top CPU. They are neck and neck right now with Nvidia for the top GPU, and you can make a very solid argument for why the 7970 is the best GPU out there.

Nvidia could have blown them out of the water with GK110, but they just couldn't get yields up enough to make it viable.


I thought AMD gave up competing with Intel and Nvidia, and went full throttle into APUs. They currently have the best on-die GPUs on the market, if I'm not mistaken. So I can definitely see AMD dominating the laptop and even the tablet market in the next few years.
October 8, 2012 3:10:12 AM

I didn't end up reading everything above; it got a bit too long for me. Just keep in mind benchmarking isn't real-world use. If your system is doing what you need it to, don't stress. If you're playing games, you're only using one or two cores. That score uses more, and it wouldn't make a difference in-game. I find this utility more helpful for managing overclocks and upgrades on my own system. Running it before and after a change helps you see the difference on your own rig. But comparing against something completely different is like comparing chalk and cheese.
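Something like this is all the comparison you really need (a minimal sketch; the score numbers are made-up placeholders, and no 3DMark API is being called - you'd just type in your own results):

```python
# Compare your own rig's benchmark score before and after a change
# (BIOS update, overclock, driver update). The scores below are
# hypothetical - read the real ones off your own 3DMark results.

def percent_change(before, after):
    """Percent change of a score relative to its baseline."""
    return (after - before) / before * 100

baseline = 6823        # e.g. physics score before the change
after_change = 7020    # score after the change (hypothetical)

print(f"Change: {percent_change(baseline, after_change):+.1f}%")
# Runs vary a little; differences under ~2-3% are usually just noise.
```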
October 8, 2012 4:17:13 PM

Burgies said:
I didn't end up reading everything above; it got a bit too long for me. Just keep in mind benchmarking isn't real-world use. If your system is doing what you need it to, don't stress. If you're playing games, you're only using one or two cores. That score uses more, and it wouldn't make a difference in-game. I find this utility more helpful for managing overclocks and upgrades on my own system. Running it before and after a change helps you see the difference on your own rig. But comparing against something completely different is like comparing chalk and cheese.


Does this mean you wouldn't recommend using benchmarks as a way of testing whether a system is performing up to par?
October 9, 2012 12:15:06 AM

Use actual games.
October 9, 2012 12:28:02 AM

ilikegirls said:
Does this mean you wouldn't recommend using benchmarks as a way of testing whether a system is performing up to par?


As long as you use one that has a good base of users to compare your card to, it's fine, but there is nothing wrong with your card.
October 9, 2012 6:11:45 AM

I would use it, but for things that are actually similar, i.e. your own system when you modify it. If someone runs an 8-core CPU they are going to get a better benchmark score, but that's irrelevant as games won't use them all.
October 9, 2012 4:18:07 PM

Burgies said:
I would use it, but for things that are actually similar, i.e. your own system when you modify it. If someone runs an 8-core CPU they are going to get a better benchmark score, but that's irrelevant as games won't use them all.


That makes sense, thank you for the explanation.

I'm assuming everyone agrees, though, that my GTX 680 is performing up to par, right?!

Best solution

October 9, 2012 4:34:56 PM

ilikegirls said:
That makes sense, thank you for the explanation.

I'm assuming everyone agrees, though, that my GTX 680 is performing up to par, right?!


Assuming you have stock clocks, the graphics score is very close to what it should be.
October 9, 2012 4:47:46 PM

bystander said:
Assuming you have stock clocks, the graphics score is very close to what it should be.


Everything I have is currently running at stock. I might look into overclocking my CPU at some point.

Again, thank you so so much guys!
October 19, 2012 1:15:20 AM

Best answer selected by ilikegirls.