[Nvidia] Graphics Card Clock Halving

Alright, I thought I'd ask the community for answers, and I decided Tom's Hardware was the best place for it. I recently had a Gigabyte GTX 550 Ti 1 GB. For a while it was fine, nothing wrong at all. Then I started having heat issues with it, so I took it out to investigate and noticed that the metal around the ventilation area had discolored from blue to bronze to brown. The metal around the DVI ports on the PCB had the same discoloration. I contacted Gigabyte, but they were clueless, and I had no idea what to do.

The clocks on that card were 900 MHz core / 4100 MHz memory, but then the memory clock got halved to 2050 MHz. Getting frustrated, I tried to overclock it back to its original state, but most tools only allowed me a maximum of 2520 MHz.

So yesterday I bought a new graphics card and took the Gigabyte GTX 550 Ti in under warranty to get a replacement. I didn't know at that moment whether I'd get a new one, but it would have been more expensive to get it repaired if the warranty had expired than to buy a new one. Confident that buying a new graphics card would fix the problem, I bought a Leadtek WinFast GTX 550 Ti 2 GB. I got home, put it in, updated all the drivers to the latest versions, and checked the clocks. Instead of the 810/4008 MHz it's meant to run at, I'm seeing the same issue on a brand-new, non-faulty card: quite poor clocks of 900/1800 MHz.

So my question is: where has the other 2208 MHz gone? And 900/1800 MHz is not power-saving mode; power saving is 50/135 MHz. I really need help with this please, I have no idea where to look or how to pinpoint the problem. Cheers!
  1. Video cards clock down to match the workload; this saves power and electricity.

    Use a tool like MSI Afterburner. Set it to record the usage %, temps, clocks, and fan speed to a CSV so you can look at it in Excel after the fact.

    Fire up your favorite game and play a round or two.

    Open the log file; in Excel you should be able to see the clocks go up to max as the usage climbs to 80-90%. The temp should rise to around 80 and, as a result, the fan should go up to around 60.

    Then you should see all these things come back down, because you closed the game and opened Excel.
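    A quick sketch of what scanning that log for peaks might look like in a script instead of Excel. The column names and sample values here are assumptions for illustration -- Afterburner names columns after whichever sensors you enabled, so check your own log's header row:

```python
import csv
import io

# Minimal sketch of scanning an Afterburner-style CSV log for peak values.
# The column names and rows below are made-up examples -- a real log's
# header row will name the sensors you actually enabled.
sample_log = io.StringIO(
    "GPU usage,Core clock,Memory clock,GPU temperature,Fan speed\n"
    "5,50,135,34,30\n"      # idle on the desktop
    "92,900,2004,78,58\n"   # under load: game running
    "88,900,2004,81,62\n"
    "3,50,135,55,40\n"      # back to idle after closing the game
)

rows = list(csv.DictReader(sample_log))

def peak(column):
    """Highest value seen in a numeric column of the log."""
    return max(float(r[column]) for r in rows)

print("Max GPU usage %:", peak("GPU usage"))
print("Max core clock :", peak("Core clock"))
print("Max temperature:", peak("GPU temperature"))
print("Max fan speed %:", peak("Fan speed"))
```

    If the max core clock in the log never rises above the idle value even while usage is high, the card really is stuck at idle speeds rather than simply downclocking between frames.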
  2. Sadly I don't have Excel. I checked to see what it was like under full 3D load; that's where I got the 900/1800 MHz.
  3. You can read the output CSV file in MS WordPad. Afterburner is also cool because you can turn on the on-screen display and watch the clocks change in real time as you game.

    What program are you using to check the clocks?
  4. I use multiple programs: EXPERTool, CPU-Z, SpeedFan, NVWinfox, EasyBoost, and AMD Overdrive (which still shows my GPU clocks). I'll try out Afterburner and let you know what it reads. Would Battlefield 3 suffice?
  5. Okay, did some tests and print-screened some things for you.

  6. Here's the log you asked for.
  7. Post a TechPowerUp GPU-Z snapshot of when the GPU is under load.
  8. Wow, I have no idea why the video card is staying at idle speeds. My only thoughts:
    Do a clean install of the newest driver.
    Try a different PCIe slot on the board.
    Do you have any 3rd-party power management software installed that might be messing things up?
  9. Only have one PCI-E slot, unfortunately. Also, I've reinstalled the drivers many times. And no, no 3rd-party power software. I'll run something and then post a snapshot of the highest clocks from GPU-Z.
  10. These were the highest results in Shift 2 Unleashed on the highest graphics (I think).
  11. Where's the GPU-Z snapshot of the Graphics Card tab page to go along with your Sensors tab page snapshot?

    I want to see what is being detected as the Default Clock settings.
  12. Right here sorry. :P
  13. The Default Clock for both GPU and Memory matches exactly what Leadtek publishes in its specifications for the WinFast GTX 550 Ti 2G.

    Your GPU Clock is running at its default speed but the memory is overclocked by 150 MHz. Is this while it's running Shift 2 Unleashed?

    What's really weird is that the Sensors tab reports the memory speed as a maximum of 162.0 MHz, which is only 20% of the default memory clock speed.

    The Bus Interface is reporting PCI-E 1.1 x16 @ x16 1.1. Click on the ? button beside this display field and click on the Start Render Test button to see if the Bus Interface changes to PCI-E 2.0 x16 @ x16 2.0 since your motherboard and graphics card should both support it.
  14. Yeah this is while it was running Shift 2 Unleashed. And that's odd, because it said 4008 MHz Shader and either 800-900 MHz on the Memory. Yeah I overclocked the Memory by a bit.
  15. dGCyanide said:
    Yeah this is while it was running Shift 2 Unleashed. And that's odd, because it said 4008 MHz Shader and either 800-900 MHz on the Memory. Yeah I overclocked the Memory by a bit.

    With Fermi GPUs the Shader clock is, by default, set at twice the speed of the GPU Core clock.
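    That 2:1 relationship makes it easy to sanity-check the numbers being thrown around in this thread. A tiny sketch (the 900/1800 MHz figures are the ones from the cards discussed above):

```python
# On Fermi GPUs the shader (hot) clock is, by default, locked to
# twice the core clock, so either value can be derived from the other.
def shader_from_core(core_mhz):
    return core_mhz * 2

def core_from_shader(shader_mhz):
    return shader_mhz / 2

# The Gigabyte card's 900 MHz core implies an 1800 MHz shader clock,
# so a reported "900/1800" is core/shader -- not core/memory.
print(shader_from_core(900))
print(core_from_shader(1800))
```

    This is one likely source of the confusion: a tool showing 900/1800 MHz may be reporting core/shader, while the box specs quote core/effective-memory.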
  16. 4008 MHz memory* Even if it is 1800 MHz, how would it compare to my Gigabyte version? Here's the detailed comparison.

    Gigabyte GTX550Ti 1Gb (Factory Overclocked)
    Core clock - 900 MHz
    Memory clock - 4100 MHz
    Shader clock - 1800 MHz
    PCIE 2.0
    Power requirement - 400W Recommended.
    Memory bus - 192-bit

    Leadtek Winfast GTX550Ti 2Gb
    Core clock - 950 MHz
    Memory clock - 1900 MHz
    Shader clock - 1800 MHz
    PCI-E 2.0
    Power requirement - 500W Recommended
    Memory bus - 192-bit

    What do you think is better, the Gigabyte or the Leadtek? Also, I mainly do 3D graphic design rather than gaming. And in 3D Studio Max 2010 it says I only have 4 CUDA cores available for rendering; I'm not sure if that's just a bug/glitch or what.
  17. Swap out those 301.42 drivers for the newer 304 beta drivers. They fix problems with weird clock speeds. Do a clean install.

    For your memory, don't get confused by programs that show the effective rate (i.e. 4000 MHz) versus the actual rate (i.e. 2000 MHz or 1000 MHz). Since it's DDR memory, they're all the same speed, just expressed differently. In GPU-Z, multiply by 4 to get the effective speed. If you see, for example, 2000 MHz, multiply by 2, etc.

    Do you have multiple monitors hooked up?
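    The effective-vs-actual conversion described above can be sketched as follows. The 1002 MHz figure is an assumption chosen to match the card's published 4008 MHz effective spec:

```python
# GDDR5 transfers 4 bits per clock per pin, so the "effective" rate is
# 4x the actual clock that tools like GPU-Z report. Plain DDR-style
# figures use a multiplier of 2 instead.
def effective_rate(actual_mhz, multiplier=4):
    return actual_mhz * multiplier

def actual_rate(effective_mhz, multiplier=4):
    return effective_mhz / multiplier

# A GPU-Z reading of 1002 MHz corresponds to the 4008 MHz effective spec:
print(effective_rate(1002))
# Read as a DDR (x2) figure, the reported 1800 MHz works out to 900 MHz actual:
print(actual_rate(1800, multiplier=2))
```

    So a tool reporting "1800 MHz" and a spec sheet quoting "4008 MHz" are not necessarily contradicting each other; they may just be using different multipliers.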
  18. I thought the memory clock was weird too. I know they are different cards, but my 470's memory clock is 135 at idle. The second I launch any GPU task it jumps up to 1674 (3348).

    Question is why is the card stuck at idle speeds? And what is up with the board running in PCIe 1.1?

    Maybe flash the BIOS of the board? My old ASUS board did weird stuff with an HD 5850 until ASUS updated the BIOS.
  19. I used to have multiple monitors, but now I'm running a single one. First I'll try the 304 beta drivers; if that doesn't work, I'll flash the BIOS. And if it's running in PCI-E 1.1, would that explain the weird clocks? But yeah, which graphics card do you think is better, the Gigabyte or the Leadtek? Thanks for the help too guys, appreciate it a lot!
  20. And I did a render test and it still stayed in PCI-E 1.1
  21. How do I do a BIOS flash with a Gigabyte MA770-UD3 motherboard anyway? There's a button saying Q-Flash, would I press that?
  22. Okay, I found the BIOS flash, but apparently it's dangerous? What should I do?
  23. Okay, I'm getting PCI-E 2.0 x16 @ x16 2.0, and I'm getting my full GPU performance. Thank you so much guys for your help! I'd never ever have thought to do this, appreciate it a lot!
  24. dGCyanide said:
    Okay, I found the BIOS flash, but apparently it's dangerous? What should I do?

    Aren't you already running with the latest BIOS Version FKD?

    FKD is the version that was required for you to be able to use a Phenom II X4 840 processor on that motherboard.

    If you're already running with that BIOS version then there's no point in reflashing.
  25. Umm, possibly; it's an AM2+ motherboard running an AM3 processor. But it's all fixed. This whole time it was down to stupid drivers.