GTX 660 NVIDIA performs WAY worse than it should
Tags:
- Intel i7
- Intel
- Nvidia
- CPUs
- Graphics
- Gtx
Last response: in Graphics & Displays
Skelpolu
April 27, 2014 11:59:34 PM
Hey there,
I've got an Intel Core i7-3770 CPU @ 3.40 GHz overclocked to 3.9 GHz (it could do far more, but I have bad cooling), as well as a GTX 660 GPU, not overclocked.
Now, what the hell! When I run the 3DMark 11 benchmark, I get around 12.3 FPS in the GPU tests (less than 9 in the combined test, but around 28 FPS in the CPU-only test), and my score is only P2326! http://www.3dmark.com/3dm11/8275616
The thing is, it says it can't accept the driver - that's because I use a beta version of the driver. However, before I installed the beta driver, the performance was the same. So what the hell? I hope someone can help me out with this; overclocking ain't workin' much here guys, I tried...
Skelpolu
April 28, 2014 12:55:24 AM
Cons29 said:
try a lower version driver. try cleaning first the old driver, sweep

Tried that out; here's what didn't change (at all): http://www.3dmark.com/3dm11/8275682
For some reason it couldn't detect the core speed etc., but it's basically the same. I used the 327.23 driver this time.
Skelpolu
April 28, 2014 11:53:14 AM
@cst
Alright, great to hear that my GTX is fine at least. I'm ashamed to admit it, but I am not very familiar with power supplies. Since everyone knows my system information now, can you tell me which power supply (from, say, Corsair) is compatible with my gear? I'm not sure if that's even a proper question; I don't know if power supplies differ all that much. In any case, thanks in advance!
@Kekoh
https://www.dropbox.com/s/l70hnqr9ubia7a5/Screenshot%20...
As you can see, it's "@ 3.40 GHz". Again, I'm not all too familiar with components yet; I'm currently trying to get more comfortable with all these terms. However, it seems my CPU actually IS a 3.40 GHz processor, not 3.9.
And no, I did not add any voltage to the CPU when overclocking - I didn't need it, and I couldn't push the overclock any further anyway, as it was at about 80°C due to my bad cooling.
There's no need to be ashamed - knowledge is meant to be shared; it's OK if you don't know, at least you're willing to learn.
I'd say go with a 600W or 650W PSU; your CPU is good, and that PSU will allow you to use any graphics card you want and overclock both the CPU and the graphics card (if you decide to get a new one) comfortably.
It's OK if you cannot overclock your CPU; it's a pretty good CPU by itself. The 3.9 GHz you see is the Turbo frequency: it saves power when not needed and gives more performance when needed.
You should go with a Corsair (HX, TX or AX series; the VS, CX and RM are not very good quality), XFX, Antec or Seasonic PSU. Seasonic is preferable. Apart from the nature of the cables (modular: removable / non-modular: fixed) and their number, there's no difference.
Your GPU is OK if it's working - a fried GPU will not show anything on the screen at all, and a damaged GPU will show artifacts/distorted picture.
Skelpolu
April 28, 2014 1:15:41 AM
cst1992 said:
What about your power supply? And what temperatures are you getting?

First of all, my bad for not restarting my PC - now the stats are correct: http://www.3dmark.com/3dm11/8275704.
Also, the power supply seems to be an "LC600H-12 V2.31 Active PFC", if that's the information you meant. If not, let me know and I'll try to find other information on it.
In any case, I used SpeedFan for the temperatures - first with nothing running except Windows plus some applications, then with MSI Kombustor running the GPU burn-in at standard settings for a minute. I used NO overclocking this time, though I want to use it once the GPU is working properly again.
https://www.dropbox.com/s/kxa2y883mej4pkn/Screenshot%20...
https://www.dropbox.com/s/mxmi7myttthpond/Screenshot%20...
Skelpolu
April 28, 2014 1:07:11 PM
Thanks for the help so far, great to see people helping others.
I understand a bit more now. However, I didn't quite understand the difference between modular and non-modular. Does it mean anything for installation or compatibility?
In any case, I was looking up some Seasonics, and the "Seasonic SS-600ET 600 W" seems alright. Will 600W be enough for some overclocking action, or are an extra 50W a must-have for it? (In addition, the model name for that PSU is 'SS-600ET-F3 80+ Bronze'.)
Thanks in advance!
Skelpolu
April 28, 2014 2:04:37 AM
Hmm, I think it's worth mentioning that I overclocked the GPU a while ago - both core and memory (clearly, I had no idea what I was doing with MSI's Afterburner). When I was testing it (I didn't go up gradually, but in one big step), it worked for a while, then showed some nice graphics glitches and shut itself down, resulting in a black screen. I do believe it's possible I broke the performance with that; sadly I CANNOT recall how good the card was to begin with - I really don't remember if it sucked this much even before I did this amateur overclocking.
/edit:
For good measure, I used Everest to provide proper hardware-information now:
http://pastebin.com/NFDVdMs1
Skelpolu
April 28, 2014 9:28:25 PM
Alright, sure will!
I am curious, though. My graphics card goes up to 81°C in the GPU burn-in test on auto fan speed.
When I set it to a fan curve I created myself (it basically goes full speed sooner than the default auto fan speed), it's at about 70°C.
Now: if the PSU is going to pump even more voltage into the GPU, thus getting full performance out of it, then by that logic the GPU should also get hotter.
This is a slight concern, especially if I were to overclock it - even just a notch - after installing the new PSU. Any ideas on that one? Cleaning the PC is one thing, but I would like to add a second fan angled at the GPU or something like that to ensure lower temperatures - will it be worth it?
It doesn't work that way - reference coolers are adequate for non-overclocked cards, and non-reference coolers for overclocked ones (depending on design).
70°C is pretty good; you could fine-tune your fan curve for 75°C, and even 80°C is not bad if you're getting a lot of noise. If not, then it's all good.
Also, a burn-in test (like FurMark) will tax the GPU to its limit - games will not push it that hard, even at 99% usage.
A better power supply won't pump more voltage into the GPU; it'll provide better regulation. For example, on the 12V rail, if the old power supply swings between 11.5 and 12.5, the new one will stay between 11.8 and 12.2 (just an example; these are not actual figures). The GPU's internal regulation circuitry won't need to work as hard because of this, so your GPU will run cooler, not hotter. It'll also allow you to overclock higher. That's one of the reasons gamers who overclock their cards go for top-notch power supplies.
By getting more performance, I meant that because of insufficient current, the old power supply was only able to provide, say, 130W to your GPU; the new one will be able to provide 140W (the GPU's rated power). That is more current, not more voltage. More voltage would result in the GPU getting burned out.
Personally, I don't think you need to add any custom fans, but if there are empty fan slots, feel free to install fans in them; just keep fewer intake fans than exhaust (like 2 intake, 3 exhaust).
While doing so, remember that front and bottom fans take air in (intake fans) while rear and top fans push it out (exhaust).
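The regulation point above can be put in numbers. A minimal sketch, assuming the ATX spec's ±5% tolerance on the 12 V rail; the sample readings are the made-up figures from the example above, not real measurements:

```python
# Check whether measured 12 V rail readings stay inside the ATX spec's
# +/-5% tolerance band (11.40 V to 12.60 V).
ATX_TOLERANCE = 0.05  # ATX allows +/-5% deviation on the 12 V rail

def within_spec(nominal, reading, tolerance=ATX_TOLERANCE):
    return abs(reading - nominal) / nominal <= tolerance

old_psu = [11.5, 12.5]  # wide swing: barely inside the band
new_psu = [11.8, 12.2]  # tight regulation: less work for the card's VRM

print(all(within_spec(12.0, v) for v in old_psu))  # True
print(all(within_spec(12.0, v) for v in new_psu))  # True
```

Both supplies stay "in spec" here; the difference is how close they come to the limits, which is exactly the slack the card's own regulation circuitry has to smooth out.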
Skelpolu
April 29, 2014 5:38:49 AM
Hmm, that makes a lot of sense now, to be honest, ahahaha!
In any case, I will keep you posted on the new PSU. It will take some time, as I have to transfer money from my bank account to my PayPal, then eBay ... you get the point. And no, sadly I don't have any PC stores near me; I have to use eBay.
/edit:
I'm sorry, but I can't hide my skepticism... I can't understand how just the power supply can lower the performance so much - I mean, I get only about 20% out of the actual GPU - and your example is just a 1V difference, if at all. What is this sorcery?
Is it a reference 660 or a third-party one?
I've been using the 660 for a while (got 2 in SLI now), and your scores are lower than they should be.
As you stated earlier, you tried some overclocking which resulted in crashing, so it could well be that the card was damaged by that. That, or the card has become faulty anyway.
Have you tried FurMark to see what FPS figures it gets?
Skelpolu
May 6, 2014 7:55:05 AM
Hmm, I hadn't used that benchmark before, but just for the heck of it I did.
https://www.dropbox.com/s/lxnb3n3f4fzt9h4/Screenshot%20...
My GPU went up to 92°C this time - quite hot, that is. Never had it so hot, ever. o_O
In any case, as you can see, its score is pretty low. I don't get it. Perhaps it is broken - but was the new PSU a complete waste of money then?
Hmmm, I'm leaning towards a damaged/faulty GPU, as that is a low score for that card, and that temp is pretty hot.
My 660s never break 65°C with the fans on max in FurMark.
If the GPU is indeed the problem, then technically the new PSU will not have improved anything.
At the end of the day, the only thing that will remedy a faulty GPU is replacement or repair.
Skelpolu
May 6, 2014 8:32:38 AM
I'm sorry the PSU upgrade didn't work out for you.
However, I'd still say a PSU upgrade is never a waste of money, especially when it's a Seasonic.
You must be getting lower temperatures than before.
940 is a low score; my GTX 650 with a Core 2 Duo scored 1300 with 22 FPS at 1280x720.
The card looks to be throttling. Which exact model of 660 do you have?
Did you try removing the cooler at some point? The GTX 660 is only a 140W card, and 92°C is absurdly high for such a card.
Skelpolu
May 6, 2014 9:27:48 AM
Thanks for reminding me of Afterburner: after fiddling around with the GPU, I ended up leaving it overclocked - this only applies to the last FurMark benchmark. In any case, the problem with this GPU is that I can't find any identical picture of it, nor a complete name. Maybe you can identify it: http://s7.directupload.net/images/140506/m2gtbu3w.jpg
It's a Point of View GPU.
Can you post a TechPowerUp GPU-Z screenshot? It's included as part of FurMark.
I'm afraid you're in for bad news. I'd better confirm it before breaking it to you.
It should look like this:
http://cdn.overclock.net/2/29/29c0f318_5arWnrH.png
Skelpolu
May 6, 2014 9:37:53 AM
Hold up, here it is: https://www.dropbox.com/s/71av2g8qwhbpo3w/Screenshot%20...
Oh well, let me guess what's wrong: it's not an original GTX 660, but a rip-off?
I am looking forward to the verdict...
/edit:
In fact, I see huge differences between your GTX 660 and mine - Jesus Christ! O_O
Skelpolu
May 6, 2014 10:08:16 AM
It's right there - I don't know if you saw it, but here it is again: https://www.dropbox.com/s/71av2g8qwhbpo3w/Screenshot%20...
Sorry, I missed it earlier.
But you see, it has far too low memory bandwidth.
I bet you went for the 3GB of RAM.
I fear it's a DDR3 GT 640 or something.
You're not alone: http://www.tomshardware.com/answers/id-2119564/gtx-660-...
Sorry man.
cst1992 said:
Sorry, I missed it earlier.
But you see. It has too low memory bandwidth.
I bet you went for 3GB of RAM.
I fear it's a DDR3 GT640 or something.
You're not alone: http://www.tomshardware.com/answers/id-2119564/gtx-660-...
Sorry man.
OOOF! bummer
Skelpolu
May 6, 2014 10:47:02 AM
Ouch. I expected something, but that low ...
http://www.hwcompare.com/13298/geforce-gt-640-ddr3-vs-g...
Jesus... that hurt a little inside. Now what to do ... I'll probably get myself a true GTX 660 in the future; for now I have to deal with this rip-off, however...
By any chance, can someone tell me if the size of the GPU matters for compatibility with the motherboard, i.e. to be able to connect them? I don't want to step on another landmine ...
A little history on why I got this rip-off GPU in the first place: I didn't know much about PCs a few years ago (even now I'm only getting into it, as you can see), so I bought a pre-built PC from eBay for about 500€. What a shame...
The size of the GPU doesn't matter. Only the PCI-e slot matters.
As long as you have at least a PCI-e 2.0 x16 slot on your motherboard (x16 you do have, since this GPU fit; whether it's 1.1, 2.0 or 3.0 you'll have to check), you should be fine with any card. And you have a Seasonic PSU now, so you're good to go!
One thing that could confuse you is the bandwidth.
In the screenshot I provided, the bandwidth is given as 158.6 GB/s. Some people confuse it with the PCI-e bandwidth, which is in the tens of GB/s.
Note that the first one is the GPU-to-GPU-memory bandwidth, and the second is the CPU-to-GPU bandwidth. The two are completely unrelated. PCI-e 2.0 x16 is enough for any card out there.
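The GPU-memory figure can be reproduced from a card's published specs: effective memory transfer rate times bus width in bytes. A minimal sketch, assuming the GTX 660's reference specs (6008 MT/s GDDR5 on a 192-bit bus):

```python
# GPU <-> VRAM bandwidth = effective memory clock (MT/s) * bus width (bytes)

def memory_bandwidth_gbs(effective_mts, bus_width_bits):
    """Peak memory bandwidth in GB/s."""
    return effective_mts * 1e6 * (bus_width_bits / 8) / 1e9

# GTX 660 reference specs: 6008 MT/s GDDR5 on a 192-bit bus
print(round(memory_bandwidth_gbs(6008, 192), 1))  # 144.2
```

A DDR3 GT 640 manages only a fraction of that (roughly 28.5 GB/s on its 128-bit DDR3 bus), which is why the bandwidth line in GPU-Z gives a fake card away immediately.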
Skelpolu said:
Ouch. I expected something, but that low ... http://www.hwcompare.com/13298/geforce-gt-640-ddr3-vs-g...

It's a little different than that.
Here:

Skelpolu
May 6, 2014 12:15:07 PM
Hmm, I see... So wait, the bandwidth IS the actual bandwidth, no x10? If so, then holy cow is that different ... I'm very surprised about the pixel rate, however; makes me wonder why it went down on the 660.
I am intrigued: how did you make that chart? I'd be interested in it if it were completely customizable (ergo, usable for any sort of application).
Also, this might seem like an obvious answer to you, but if I want to go for a GTX 660 with good cooling, should I consider ones with two fans instead of one, or do those only exist because they run hotter than single-fan cards? I'm not getting the gist of it yet, and I don't know which one I should take. Personally, I prefer 2 fans, as it's more stylish and "sounds" less hot. It looks cool. *badumtss*
A PCI-e 2.0 lane has a (CPU-to-GPU) bandwidth of 500 MB/s,
so 16 such PCI-e 2.0 lanes equal 8 GB/s of bandwidth.
Similarly, a PCI-e 3.0 lane has 985 MB/s of bandwidth,
so a PCI-e 3.0 x16 slot has 15.75 GB/s.
As I said in the first line, this is the CPU-to-GPU bandwidth. It has nothing to do with the 144 GB/s bandwidth of the GTX 660. That is the memory bandwidth of graphics data travelling between the GPU chip and the black memory chips on the card itself.
I made that chart in MS Excel myself. It shows everything on the same scale, however, so the texel rate and pixel rate appear lower than they actually are. I wish it showed a separate scale for everything.
Nevertheless, it satisfies its main purpose, which is showing you the relative values of the two cards.
I have assumed the value for power consumption, as I don't know what the real chip is (you'd have to remove the heatsink to see that). I have assumed a GT 640 (which is a GK107).
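The lane arithmetic above can be sketched in a few lines, using the per-lane throughput figures as quoted (500 MB/s for PCI-e 2.0, 985 MB/s for 3.0):

```python
# CPU <-> GPU slot bandwidth = per-lane throughput * lane count
PER_LANE_MBS = {"2.0": 500, "3.0": 985}  # usable MB/s per lane, as quoted

def slot_bandwidth_gbs(gen, lanes=16):
    return PER_LANE_MBS[gen] * lanes / 1000

print(slot_bandwidth_gbs("2.0"))  # 8.0   (PCI-e 2.0 x16)
print(slot_bandwidth_gbs("3.0"))  # 15.76 (PCI-e 3.0 x16, ~the 15.75 quoted)
```

Either figure is an order of magnitude below the 144 GB/s on-card memory bandwidth, which is why the two numbers should never be compared directly.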
Skelpolu
May 6, 2014 12:53:15 PM
About the bandwidth: I think I'm starting to understand.
That chart is fine. After all, the fact remains that it's certainly anything BUT a GTX 660.
Just to clarify: PCI-e 3.0 lanes still work with GPUs requiring 2.0 lanes?
I would love to get a 660 with two fans, as it would ensure relatively regulated temperatures.
By any chance, do you have a single- or dual-fan 660? Is it an advantage if it has two, or doesn't it matter in the end?
Thanks in advance!
It's not that straightforward, really... Some cards have a single fan but good cooling, whereas others have two fans but bad cooling (by good cooling, I mean efficient heat transfer).
Some cards have both good cooling and two fans.
Go with MSI's Twin Frozr version of the GTX 660. It's one of the best, and MSI cards are relatively cheap. ASUS also put good coolers on their cards, and they have their own PCBs too, which allows for good overclocks; but they charge a premium for that.
If you can drag on with your current card for now, though, I'd recommend a GTX 760 as the new card. MSI sell those too at good prices, and that card sits at the sweet spot of price vs. performance. It's around $250 (the 660 is $210). Your PSU should also handle that card (assuming you got the SS-600ET like you said).
Skelpolu
May 6, 2014 1:45:45 PM
Alrighty then, seems alright!
I guess I can upgrade to a 760 then, as it has a slick design in any case (pun unintended).
But are you sure it's only $250? On sites such as eBay (I am from Germany, that is), it costs about $290 if we trust their currency conversions.
http://tinyurl.com/ebay-760
I will check my motherboard's specs to make sure they definitely are compatible, but that's beside the point for now.
Once this last upgrade is done, it just MUST work. Technically, there's nothing left that could cause performance trouble, as long as it's nothing I could mess up manually (which I've made sure is impossible by now).
Waiting for a final reply on that price tag. Once that's done, the topic will freeze for a few weeks until I can get the new GPU. Thanks to everyone contributing to solving this problem (so far)!
Skelpolu
May 6, 2014 10:31:58 PM
Hmm, I see ... well, there's this one, even though I've never used it before.
http://www.hardwareversand.de/articledetail.jsp?aid=827...
FaqirKhan
May 6, 2014 10:43:03 PM
Cut the crap, yo bro. Firstly, don't do overclocking ever again, because what people don't understand is that it's not worth it.
Again, if your card is under warranty, try that; but I bet that overclock might have voided the warranty.
Again, if nothing works out, sadly, I think you should have your card checked at a hardware store.
Again, if you have to throw it away, get an R9 270X. It kicks the GTX 660's ass, runs Battlefield 4 on Ultra at 55 FPS, and has very good frame latency.
No offence, but are you sleeping through the thread or something?
The 660 is off the table already.
Second thing: the 660 that the OP has got is fake.
Third thing: overclocking didn't break the card; it was a bad performer from the start.
Also, overclocking does offer a minor performance boost, and if you have good cooling and a nice power supply, it's definitely worth it for some extra performance for free.
@OP
Found this one. It's from ASUS, meant for small cases, but its cooling is adequate. It's cheaper, but refurbished. If that's acceptable to you, you can get it:
http://www.ebay.com/itm/Asus-GeForce-GTX-760-Mini-OC-Ed...
FaqirKhan
May 7, 2014 12:19:49 AM
cst1992 said:
No offence, but are you sleeping through the thread or something?
The 660 is off the discussion already.
Second thing, the 660 that the OP has got is fake.
Third thing, overclocking didn't break the card, it was originally a bad performer.
Also, overclocking does offer a minor performance boost, and if you have good cooling and a nice power supply, it's definitely worth getting some extra power out of for free.
@OP
Found this one. It's from ASUS, for small cases. But its cooling is adequate. It's cheaper, but refurbished. If that's acceptable to you, then you can get it.
http://www.ebay.com/itm/Asus-GeForce-GTX-760-Mini-OC-Ed...
I suggested the R9 270X, you attacked.
I don't have time to read all the posts. I suggested the R9 270X, and that's it.