Your question

GTX 660 NVIDIA performs WAY worse than it should

Tags:
  • Intel i7
  • Intel
  • Nvidia
  • CPUs
  • Graphics
  • Gtx
Last response: in Graphics & Displays
April 27, 2014 11:59:34 PM

Hey there,
I've got an Intel Core i7-3770 CPU @ 3.40 GHz overclocked to 3.9 GHz (it could do far more, but I have bad cooling) as well as a GTX 660 GPU, not overclocked.
Now, what the hell! In the 3DMark 11 benchmark I get around 12.3 FPS in the GPU tests (less than 9 in the combined test, but around 28 FPS in the CPU-only test), and my score is only P2326! http://www.3dmark.com/3dm11/8275616
The thing is, 3DMark says it can't accept the driver; that's because I use a beta version. However, the performance was the same before I installed the beta driver. So what the hell? I hope someone can help me out with this; overclocking isn't doing much here guys, I tried...

April 28, 2014 12:16:07 AM

Try an older driver version. First clean out the old driver with a driver sweeper.
April 28, 2014 9:58:59 AM

What do you mean the CPU is overclocked to 3.9 GHz? It runs at 3.9 GHz without an overclock - that's the chip's Turbo speed. If you added voltage, you might as well remove it, because it's not necessary for stock clocks.
April 28, 2014 12:55:24 AM

Cons29 said:
Try an older driver version. First clean out the old driver with a driver sweeper.

Tried that out; here's what (didn't) change (at all): http://www.3dmark.com/3dm11/8275682
For some reason it couldn't detect the core speed etc., but it's basically the same. I used the 327.23 driver this time.
April 28, 2014 11:53:14 AM

@cst
Alright, great to hear that my GTX is fine at least. I'm ashamed to admit it, but I'm not very familiar with power supplies. Since everyone knows my system information now, can you tell me which power supply (from, say, Corsair) is compatible with my gear? I'm not sure that's even a proper question; I don't know if power supplies differ all that much. In any case, thanks in advance!
@Kekoh
https://www.dropbox.com/s/l70hnqr9ubia7a5/Screenshot%20...
As you can see, it's "@ 3.40 GHz". Again, I'm not all that familiar with components yet; I'm currently trying to get more comfortable with all these terms. However, it seems like my CPU actually IS a 3.40 GHz processor and not 3.9.
And no, I did not add any voltage to the CPU when overclocking. I didn't need it, and I couldn't push the overclock any further anyway, as the CPU was at about 80°C due to my bad cooling.
April 28, 2014 1:03:29 AM

What about your power supply? And what temperatures are you getting?
April 28, 2014 12:49:33 PM

There's no need to be ashamed - knowledge is meant to be shared. It's OK if you don't know; at least you're willing to learn.
I'd say go with a 600 W or 650 W PSU; your CPU is good, and that PSU will let you use any graphics card you want and overclock both the CPU and the graphics card comfortably.
It's OK if you cannot overclock your CPU; it's a pretty good CPU by itself. The 3.9 GHz you see is the Turbo frequency: the CPU saves power when the extra speed isn't needed and delivers more performance when it is.
You should go with a Corsair (HX, TX or AX series; the VS, CX and RM are not very good quality), XFX, Antec or Seasonic PSU. Seasonic is preferable. And apart from the cabling (modular cables can be removed; non-modular cables can't) and the number of cables, there's no difference.

Your GPU is OK if it's working: a fried GPU will not show anything on the screen at all, while a damaged GPU will show artifacts or a distorted picture.
April 28, 2014 1:15:41 AM

cst1992 said:
What about your power supply? And what temperatures are you getting?


First of all, my bad for not restarting my PC - the stats are correct now @ http://www.3dmark.com/3dm11/8275704.
Also, the power supply seems to be an "LC600H-12 V2.31 Active PFC", if that's the information you meant. If not, let me know and I'll try to find more details on it.
In any case, I used SpeedFan for the temperatures: first idle, with nothing running except Windows and a few applications, then under load with MSI Kombustor's GPU burn-in running for a minute at standard settings. I used NO overclocking this time, though I want to use it once the GPU is working properly again.
https://www.dropbox.com/s/kxa2y883mej4pkn/Screenshot%20...
https://www.dropbox.com/s/mxmi7myttthpond/Screenshot%20...
April 28, 2014 1:07:11 PM

Thanks for the help so far, great to see people helping others. :) 
I understand a bit more now. However, I didn't quite understand the difference between modular and non-modular. Does it matter at all for installation or compatibility?
In any case, I was looking at some Seasonic models, and the "Seasonic SS-600ET 600 W" seems alright. Will 600 W be enough for some overclocking action, or is an extra 50 W a must-have for it? (In addition, the full model name for that PSU is 'SS-600ET-F3 80+ Bronze'.)
Thanks in advance!
April 28, 2014 2:04:37 AM

Hmm, I think it's worth mentioning that I overclocked the GPU a while ago - both core and memory (clearly, I had no idea what I was doing with MSI's Afterburner). I didn't go up gradually but took quite a big step, and when I tested it, it worked for a while, then showed some nice graphics glitches and shut itself down, resulting in a black screen. I do believe it's possible I hurt the performance with that; sadly I CANNOT recall how good the card was to begin with. I really don't remember if it sucked this much even before my amateur overclocking.

/edit:
For good measure, I used Everest to provide proper hardware information now:
http://pastebin.com/NFDVdMs1
April 28, 2014 1:18:03 PM

Should be enough for your needs. Get it!
Also, keep us posted on your experience with the new PSU and whether you have any problems installing it.
April 28, 2014 8:31:40 AM

The problem is not your card - it's your power supply.
It was struggling to deliver power to your card before, and even more so when you overclocked it.
The power supply is not good quality.
Get another power supply, a good one from Corsair, Antec or Seasonic. Get a 550 W unit for some upgrade headroom.
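As a back-of-the-envelope sketch of why that headroom advice is plausible: the i7-3770's 77 W TDP is its published spec, the GTX 660's 140 W board power is quoted later in this thread, and the rest-of-system figure below is a loose assumption for drives, fans, RAM and motherboard, not a measurement.

```python
# Rough power budget for this build. cpu_w and gpu_w come from the
# CPU spec sheet and this thread respectively; rest_w is an assumed
# allowance for the rest of the platform.
cpu_w = 77    # i7-3770 TDP
gpu_w = 140   # GTX 660 board power
rest_w = 100  # assumed: drives, fans, RAM, motherboard

load_w = cpu_w + gpu_w + rest_w
psu_w = 550

headroom_pct = (psu_w - load_w) / psu_w * 100
print(f"estimated load: {load_w} W, headroom on a 550 W unit: {headroom_pct:.0f}%")
```

Even with a generous platform allowance, a quality 550 W unit leaves a comfortable margin for this CPU/GPU pair; the 600-650 W suggestion earlier in the thread just buys extra room for a bigger future card.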
April 28, 2014 9:28:25 PM

Alright, sure will! :) 
I am curious, though. My graphics card goes up to 81°C in the GPU burn-in test on auto fan speed.
When I set it to a fan curve I created myself (it basically goes full speed sooner than the default auto fan speed), it's about 70°C.
Now: if the PSU is going to pump even more voltage into the GPU, thus getting full performance out of it, then by that logic the GPU should also get hotter.
That's a slight concern, especially if I were to overclock it - even just a notch - after installing the new PSU. Any ideas on that one? Cleaning the PC is one thing, but I'd like to add a second fan angled at the GPU or something like that to ensure lower temperatures - would it be worth it?
April 28, 2014 10:59:37 PM

It doesn't work that way - the GPU's cooling is adequate for non-overclocked reference cards, and for overclocked non-reference cards (depending on the design).
70°C is pretty good; you could fine-tune your GPU for 75°C, and even 80°C is not bad if the fans are making a lot of noise. If not, then it's all good. :) Also, a burn-in test (like FurMark) will tax the GPU to its limit - games will not push it that hard, even at 99% usage.
A better power supply won't pump more voltage into the GPU; it'll provide better regulation. For the 12 V rail: if the old power supply swings between 11.5 and 12.5 V, the new one will stay between 11.8 and 12.2 V (just an example; these are not actual figures). The GPU's internal regulation circuitry won't need to work as hard because of this, so your GPU will run cooler, not hotter. It'll also let you overclock higher. That's one of the reasons gamers who overclock their cards go for top-notch power supplies.

By getting more performance, I meant that because of insufficient current, the old power supply could only provide, say, 130 W to your GPU, while the new one will be able to provide 140 W (the GPU's rated power). The difference is more current, not more voltage. More voltage would burn the GPU out.

Personally, I don't think you need to add any custom fans; but if there are empty fan mounts, feel free to install fans in them. Just keep the number of intake fans lower than the number of exhaust fans (like 2 intake, 3 exhaust).
While doing so, remember that front and bottom fans take air in (intake) while rear and top fans push it out (exhaust).
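To put a number on the "better regulation" point above, here is a toy sketch; the voltage ranges are the fictional examples from the post, not measurements of any real unit.

```python
# Toy illustration of rail regulation. The ranges are the fictional
# examples from the post above, not real PSU measurements.

def regulation_pct(nominal, lo, hi):
    """Worst-case deviation from the nominal rail voltage, in percent."""
    return max(nominal - lo, hi - nominal) / nominal * 100

old_psu = regulation_pct(12.0, 11.5, 12.5)  # loose regulation
new_psu = regulation_pct(12.0, 11.8, 12.2)  # tight regulation

print(f"old PSU: +/-{old_psu:.1f}%, new PSU: +/-{new_psu:.1f}%")
```

For reference, the ATX12V design guide allows roughly ±5% on the +12 V rail, so both hypothetical units would technically be in spec; the tighter one simply leaves the card's own VRM with less correcting to do.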
April 29, 2014 5:38:49 AM

Hmm, that makes a lot of sense now, to be honest, ahahaha! :)
In any case, I'll keep you informed about the new PSU. It will take some time, as I have to transfer money from my bank account to PayPal, then eBay... you get the point. And no, sadly I don't have any PC stores near me; I have to use eBay. :/

/edit:
I'm sorry, but I can't hide my skepticism... I can't understand how the power supply alone can lower the performance so much - I mean, I'm getting like only 20% out of the actual GPU - and your example is just a 1 V difference, if that - what is this sorcery? :p
April 29, 2014 9:43:45 AM

I told you the numbers are just examples - they're fictional, but they explain the concept.
The real scenario could be different; there are many factors.
I can, however, tell you that you'll see an improvement with a new power supply.
April 29, 2014 9:55:50 PM

Well, alright, you've got a point there - I expected too much from the example, ehehe. :)
Okay, it seems like I'll buy the power supply today. Until then - farewell!
April 30, 2014 1:08:26 AM

Yeah, see you, and good luck.
May 6, 2014 4:35:57 AM

Well, I changed the power supply - and no changes at all. I'm kinda pissed, as it was very hard for me to get everything back together; I expected better results. :/
What am I supposed to do now..?
May 6, 2014 6:21:06 AM

Look at the game benchmarks on the GeForce website. If you have any of those games, try running them at the given resolution; you should get similar FPS. You could also check out reviews of the 660 - those contain benchmarks too.
May 6, 2014 6:26:11 AM

Mate, I know how the GTX 660 should perform. My problem is that I bought a new power supply on your suggestion in order to solve the problem. Looking at further benchmarks doesn't help me solve that at all; I already know it sucks compared to any other GTX 660.
May 6, 2014 7:14:30 AM

Is it a reference 660 or a third-party one?

I've been using the 660 for a while (got two in SLI now) and your scores are lower than they should be.

As you stated earlier, you tried some overclocking which resulted in crashing, so it could well be that the card was damaged by that. That, or the card has simply become faulty anyway.

Have you tried FurMark to see what FPS figures it gets?
May 6, 2014 7:55:05 AM

Hmm, I hadn't used that benchmark before, but just for the heck of it, I did.
https://www.dropbox.com/s/lxnb3n3f4fzt9h4/Screenshot%20...
My GPU went up to 92°C this time - quite hot. I've never had it that hot, ever. o_O
In any case, as you can see, its score is pretty low; I don't get it. Perhaps it is broken - but was the new PSU a complete waste of money then?
May 6, 2014 8:01:19 AM

Hmm, I'm leaning towards a damaged/faulty GPU, as that is a low score for that card, and that temperature is pretty hot.

My 660s never break 65°C with the fans on max in FurMark.

If the GPU is in fact the problem, then technically the new PSU will not have improved anything.

At the end of the day, the only thing that will fix a faulty GPU is replacement or repair.
May 6, 2014 8:32:38 AM

I don't quite have the money for a new GTX 660... Can someone recommend a card close to the GTX 660 - or at least one far better than THIS dead GPU's score? A few bucks less without too much performance loss would be great.
May 6, 2014 8:42:21 AM

GTX 750 Ti.

A little cheaper than the 660, but it uses less power and dishes out decent performance.

Even cheaper is the HD 7770 - cracking performance for the price.

If you can find a good price for it, the HD 7870 is also good.
May 6, 2014 8:55:58 AM

I'm sorry the PSU upgrade didn't work out for you.
However, I'd still say a PSU upgrade is never a waste of money, especially when it's a Seasonic.
You must be getting lower temperatures than before.
940 is a low score; my GTX 650 with a Core 2 Duo scored 1300 with 22 FPS at 1280x720.
The card looks to be throttling. Which exact model of 660 do you have?
Did you ever remove the cooler? The GTX 660 is only a 140 W card, and 92°C is absurdly high for such a card.
May 6, 2014 9:10:52 AM

Hey there,
let me check it out and reply soon - it will be an edit to this post!
May 6, 2014 9:17:13 AM

Another thing that may have happened: you said you used Afterburner, so maybe you set the fan speed to manual. By default, the manual fan speed is the lowest (so that you can adjust it to your liking), and running FurMark at the lowest fan speed will definitely cause very high temperatures.
May 6, 2014 9:27:48 AM

Thanks for reminding me of Afterburner: after fiddling around with the GPU, I had ended up leaving it overclocked - though that only applies to the last FurMark benchmark. In any case, the problem with this GPU is that I can't find any identical picture of it, nor a complete name. Maybe you can identify it: http://s7.directupload.net/images/140506/m2gtbu3w.jpg
May 6, 2014 9:33:41 AM

It's a Point of View GPU.
Can you post a TechPowerUp GPU-Z screenshot? GPU-Z is included as part of FurMark.
I'm afraid you're in for bad news; I'd better confirm it before breaking it to you.
It should look like this:
http://cdn.overclock.net/2/29/29c0f318_5arWnrH.png
May 6, 2014 9:37:53 AM

Hold up, here it is: https://www.dropbox.com/s/71av2g8qwhbpo3w/Screenshot%20...
Oh well, let me guess what's wrong: it's not an original GTX 660 but a rip-off?
I am looking forward to the verdict...

/edit:
in fact, I see huge differences between your GTX 660 and mine - Jesus Christ! O_O
May 6, 2014 10:06:39 AM

screenshot?
May 6, 2014 10:47:02 AM

Ouch. I expected something, but that low...
http://www.hwcompare.com/13298/geforce-gt-640-ddr3-vs-g...
Jesus... that hurt a little inside. Now what to do... I'll probably get myself a true GTX 660 in the future; for now I have to deal with this rip-off, however...
By any chance, can someone tell me if the size of the GPU matters for compatibility with the motherboard, i.e. for being able to connect them? I don't want to step on another landmine...

A little history on why I got this rip-off GPU in the first place: I didn't know much about PCs a few years ago (even now I'm only just getting into it, as you can see), so I bought a pre-built PC from eBay for about 500€. What a shame...
May 6, 2014 11:18:32 AM

The size of the GPU doesn't matter; only the PCI-e slot does.
As long as you have at least a PCI-e 2.0 x16 slot on your motherboard (x16 you do, since this GPU fit; whether it's 1.1, 2.0 or 3.0 you'll have to check), you should be fine with any card. And you have a Seasonic PSU now, so you're good to go!

One thing that could confuse you is the bandwidth.
In the screenshot I provided, the bandwidth is given as 158.6 GB/s. Some people confuse it with the PCI-e bandwidth, which is in the tens of GB/s.
Note that the first one is the GPU-to-GPU-memory bandwidth, and the second is the CPU-to-GPU bandwidth. The two are completely unrelated. PCI-e 2.0 x16 is enough for any card out there.
May 6, 2014 12:15:07 PM

Hmm, I see... So wait, the bandwidth IS the actual bandwidth, no x10? If so, then holy cow is that different... I'm very surprised about the pixel rate, however; it makes me wonder why it went down on the 660.

I'm intrigued: how did you make that chart? I'd be interested in it if it were completely customizable (ergo, usable for all sorts of applications).

Also, this might seem like an obvious answer to you, but if I want a GTX 660 with good cooling, should I consider ones with two fans instead of one, or do two-fan cards only exist because they run hotter than single-fan ones? I'm not getting the gist of it yet, and I don't know which one to take. Personally, I prefer 2 fans, as it's more stylish and "sounds" less hot. It looks cool. *badumtss*
May 6, 2014 12:19:27 PM

A PCI-e 2.0 lane has a (CPU-to-GPU) bandwidth of 500 MB/s,
so 16 such PCI-e 2.0 lanes equal 8 GB/s of bandwidth.
Similarly, a PCI-e 3.0 lane has 985 MB/s of bandwidth,
so a PCI-e 3.0 x16 slot has 15.75 GB/s.

As I said in the first line, this is the CPU-to-GPU bandwidth. It has nothing to do with the 144 GB/s bandwidth of the GTX 660: that is the memory bandwidth of graphics data travelling between the GPU chip and the memory chips on the card itself.

I made that chart in MS Excel myself. It shows everything on the same scale, however, so the texel rate and pixel rate appear lower than they actually are. I wish it showed a separate scale for each metric.
Nevertheless, it serves its main purpose, which is showing you the relative values of the two cards.
I have assumed the value for power consumption, since I don't know what the real chip is (you'd have to remove the heatsink to see). I have assumed a GT 640 (which is a GK107).
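The per-lane arithmetic above can be sketched in a few lines; the per-lane figures are the approximate effective rates quoted in the post (PCI-e 3.0 is really about 984.6 MB/s per lane, rounded here to 985).

```python
# Per-lane effective bandwidth in MB/s, as quoted in the post above
# (approximate; PCI-e 3.0 is ~984.6 MB/s per lane, rounded to 985).
PER_LANE_MB_S = {"2.0": 500, "3.0": 985}

def slot_bandwidth_gb_s(gen: str, lanes: int) -> float:
    """One-way CPU-to-GPU bandwidth of a PCI-e slot, in GB/s."""
    return PER_LANE_MB_S[gen] * lanes / 1000.0

print(slot_bandwidth_gb_s("2.0", 16))  # 8.0
print(slot_bandwidth_gb_s("3.0", 16))  # 15.76
```

The 985 rounding gives 15.76 rather than the 15.75 GB/s in the post; either way, it's an order of magnitude below the ~144 GB/s on-card memory bandwidth, which is the distinction the post is making.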
May 6, 2014 12:53:15 PM

About the bandwidth: I think I'm starting to understand.
The chart is fine. After all, the fact remains that it's certainly anything BUT a GTX 660.
Just to clarify: a PCI-e 3.0 slot can still work with GPUs designed for 2.0 lanes?
I'd love to get a 660 with two fans, as it would ensure relatively well-regulated temperatures.
By any chance, do you have a single- or dual-fan 660? Is it an advantage if it has two, or doesn't it matter in the end?
Thanks in advance! :)

May 6, 2014 1:10:56 PM

It's not that straightforward, really... Some cards have a single fan but good cooling, whereas others have two fans but bad cooling (by good cooling I mean efficient heat transfer).
Some cards have both good cooling and two fans.
Go with MSI's Twin Frozr version of the GTX 660. It's one of the best, and MSI cards are relatively cheap. ASUS also put good coolers on their cards, and they have their own PCBs too, which allows for good overclocks. But they charge a premium for that.

If you can drag on with your current card for now, though, I'd recommend a GTX 760 as the new card. MSI sell those at good prices too, and that card sits at the sweet spot of price vs. performance. It's around $250 (the 660 is $210). Your PSU should also handle that card (assuming you got the SS-600ET like you said).
May 6, 2014 1:45:45 PM

Alrighty then, seems alright!
I guess I can upgrade to a 760 then; it has a slick design in any case (pun unintended).
But are you sure it's only $250? On sites such as eBay (I'm from Germany, that is) it costs about $290, if we're to trust their currency conversion.
http://tinyurl.com/ebay-760

I will check my motherboard's specs to make sure they're definitely compatible, but that's beside the point for now.
Once this last upgrade is done, it just MUST work. Technically, there's nothing left that could cause performance trouble, as long as it's nothing I could mess up manually (which I've made sure is impossible by now).

Waiting for a final reply on that price tag. Once that's done, the topic will freeze for a few weeks until I can get the new GPU. Thanks to everyone contributing to solving this problem (so far)! :)
May 6, 2014 10:06:02 PM

Prices definitely differ by country, since the supply/demand ratio is different. If you can afford it, go for it.
Do you have any other Germany-only site? I could give you a better idea then.
May 6, 2014 10:37:27 PM

Seems a little cheaper.
But searching through the other listings, you'll find EUR 209 to be the minimum price for this card.
I'd say use eBay; the seller there seems to be a good one.
May 6, 2014 10:43:03 PM

Cut the crap, bro. First, don't ever overclock again - what people don't understand is that it's not worth it.
Second, if your card is still under warranty, try a claim - but I bet that overclock voided the warranty.
Third, if nothing works out, sadly, I think you should have your card checked by a hardware store.
Finally, if you have to throw it away, get an R9 270X. It kicks the GTX 660's ass, runs Battlefield 4 at Ultra at 55 FPS, and has extremely good frame latency.
May 6, 2014 10:49:09 PM

No offence, but are you sleeping through the thread or something?
The 660 is already off the table.
Second, the 660 the OP has is fake.
Third, overclocking didn't break the card; it was a bad performer from the start.
Also, overclocking does offer a minor performance boost, and if you have good cooling and a nice power supply, it's definitely worth getting some extra power out of the card for free.

@OP
Found this one. It's from ASUS, made for small cases, but its cooling is adequate. It's cheaper, but refurbished. If that's acceptable to you, you can get it.
http://www.ebay.com/itm/Asus-GeForce-GTX-760-Mini-OC-Ed...
May 7, 2014 12:19:49 AM

cst1992 said:
No offence, but are you sleeping through the thread or something?
The 660 is already off the table.
Second, the 660 the OP has is fake.
Third, overclocking didn't break the card; it was a bad performer from the start.
Also, overclocking does offer a minor performance boost, and if you have good cooling and a nice power supply, it's definitely worth getting some extra power out of the card for free.

@OP
Found this one. It's from ASUS, made for small cases, but its cooling is adequate. It's cheaper, but refurbished. If that's acceptable to you, you can get it.
http://www.ebay.com/itm/Asus-GeForce-GTX-760-Mini-OC-Ed...


I suggested the R9 270X, and you attacked.
I don't have time to read all the posts. I suggested the R9 270X, and that's it.