Or at least I haven't seen any. I'm wondering what the proper way is to overclock my EVGA GTX 275 FTW. The first method I tried was to set the fan to 100%, leave the shader and memory at default (713/1260), and then raise the core until it became unstable. Using GPUTOOL, I got the core stable at 750MHz with no artifacts reported for over 30 minutes. Next, I kept the core and memory at default (713/1260) and pushed the shader as high as it would go; I reached 1630MHz, and beyond that artifacts were reported. Finally, I got the memory up to 1300MHz without any artifacts. Once I had my maximum clocks, I set them all together (750/1630/1300) and ran GPUTOOL for over 30 minutes with no artifacts reported, at 85fps and a stable temperature of 81C.
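To make the procedure concrete, here's a toy Python simulation of the one-clock-at-a-time search I followed. Everything in it is made up for illustration: `passes_artifact_scan` stands in for a 30-minute GPUTOOL run, and the "silicon limits" are invented so the loop can actually execute; real testing obviously goes through the driver tool and GPUTOOL, not code like this.

```python
# Toy simulation of the one-domain-at-a-time overclock search described above.
# passes_artifact_scan() is a stand-in for a 30-minute GPUTOOL run, and
# TRUE_LIMIT is an invented "silicon limit" so the loop can run standalone.

TRUE_LIMIT = {"core": 752, "shader": 1632, "memory": 1304}  # pretend limits
STEP = {"core": 5, "shader": 10, "memory": 10}              # MHz per step

def passes_artifact_scan(clocks):
    """Stand-in for an artifact scan: 'stable' iff every clock domain
    is at or below its (simulated) limit."""
    return all(clocks[d] <= TRUE_LIMIT[d] for d in clocks)

def find_max(domain, defaults):
    """Raise one domain from its default while the others stay at default,
    then back off one step once artifacts appear."""
    clocks = dict(defaults)
    while passes_artifact_scan(clocks):
        clocks[domain] += STEP[domain]
    return clocks[domain] - STEP[domain]

# Core and memory defaults are the card's (713/1260); the shader default
# of 1584 is my guess at the FTW's stock shader clock.
defaults = {"core": 713, "shader": 1584, "memory": 1260}
maxes = {d: find_max(d, defaults) for d in defaults}
print(maxes)
```

The only point of the sketch is the shape of the search: one domain moves, the other two stay at default, and the reported maximum is one step below the first failing clock.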
Herein lies the problem: I play CS:S at those clocks for about 10 minutes, and then my screen freezes, flashes for a second, and all the textures get corrupted, at only 65C. I assume the core, shader, or memory is clocked too high. Repeating the same steps with CS:S as the test, I had to drop to (750/1600/1270) to avoid any freezing or artifacts. Since GPUTOOL is more graphically intensive (85fps compared to 180fps in CS:S), why do I have to use lower clocks for CS:S? And would those CS:S clocks cause a more demanding game like Crysis to freeze or artifact, just as CS:S did at the GPUTOOL clocks?
And on a related note, does raising the core, shader, or memory one at a time reduce the headroom available to the others? For example, if I get the GPUTOOL results above by leaving everything at default and overclocking one clock at a time, will setting all three clocks to their individually achieved maximums cause instability? What I'm trying to say is: I get stability with a core of 750 (everything else at default), and with a shader of 1600 (everything else at default). If I then set both the core (750) and the shader (1600) at the same time, is that an inaccurate result, since the core maximum was found with the shader and memory at default, and the shader maximum was found with the core and memory at default? I hope this question makes sense; if not, I'll try to restate it.
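Here's a toy model of what I suspect is happening, purely to illustrate the question (none of the numbers or the "shared headroom" idea come from real GPU behavior): each domain's overclock eats into a common budget, so each solo maximum passes on its own, but all three together do not, while slightly backed-off clocks do.

```python
# Toy "shared headroom" model illustrating how per-domain maxima found in
# isolation could fail when combined. Entirely invented, not a real GPU model.

DEFAULTS   = {"core": 713, "shader": 1584, "memory": 1260}  # 1584 shader is a guess
SOLO_LIMIT = {"core": 750, "shader": 1630, "memory": 1300}  # the solo maxima found
BUDGET = 2.5  # pretend the card tolerates 2.5 "units" of combined headroom, not 3

def stable(clocks):
    """Each domain consumes headroom proportional to how far it sits between
    its default and its solo maximum; the total must fit the shared budget."""
    used = sum((clocks[d] - DEFAULTS[d]) / (SOLO_LIMIT[d] - DEFAULTS[d])
               for d in clocks)
    return used <= BUDGET

print(stable({"core": 750, "shader": 1584, "memory": 1260}))  # solo core max
print(stable({"core": 750, "shader": 1630, "memory": 1300}))  # all three maxed
print(stable({"core": 750, "shader": 1600, "memory": 1270}))  # backed-off clocks
```

In this made-up model, each solo maximum uses 1.0 units (stable), all three maxed uses 3.0 units (unstable), and the backed-off combination fits again, which would match what I'm seeing if the real card behaves anything like this.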
If anyone can offer some detailed information on this, I would be greatly appreciative. I haven't found anything online that explains how to properly overclock the shader clock, or that lists software for testing it properly. I tried FurMark, which was good, but it doesn't report artifacts the way GPUTOOL does. Thanks for your help!