
Thinking about a 260; How much does Core Clock Matter?

November 28, 2008 1:17:03 PM

I'm currently sitting with a two-year-old 7950 GT KO from eVGA. I love eVGA's products and would like to either stick with them or go to XFX or BFG.

It seems like with every new chip, each company manufactures several tiers of cards. eVGA has something like 4 or 5 different versions of their GTX 260 according to how high they overclock the core. I believe stock for the GTX 260 is a 575 MHz core clock and a 1998 MHz memory clock...

eVGA has a nice price for a stock card at 575 MHz, or I could pay $60 more to get a card with a 650 MHz core clock...

Now, before you say that I could easily overclock it myself, I don't overclock things. I just don't. That's me.

So, going from 575 MHz to 650 MHz, am I really going to notice any difference in game performance? If so, how much? Is it worth the extra $60?

Do you guys like eVGA, or do you prefer a different brand?
November 28, 2008 1:33:17 PM

Buy a stock model and overclock it yourself, whether or not you typically do that; it's really not worth paying for in this case.

Also, buy eVGA; their warranty is fully inclusive and covers overclocking-related failures if you happen to push the card too hard.

I bought 2 GTX 260s and they both easily hit 666 core / 1404 shader / 2214 memory clocks. That likely saved me $75 per card versus buying a pre-overclocked edition.
November 28, 2008 1:36:19 PM

Overclocking yourself is always better than paying the manufacturer for it, although the factory-overclocked cards do have lower core VIDs, which decreases temps. But if you are buying a GTX 260, MAKE SURE IT IS A GTX 260 *CORE 216* model, as these are the newer and better versions, with 216 shader cores enabled instead of 192.
November 28, 2008 1:44:21 PM

ovaltineplease said:
I bought 2 GTX 260s and they both easily hit 666 core / 1404 shader / 2214 memory clocks. That likely saved me $75 per card versus buying a pre-overclocked edition.


How hot does it run when you have it at 666? I don't have any extensive cooling in my PC since I don't overclock, just a single 120mm fan in the side and an 80mm in the back. Nothing special. Would it still be alright? I mean, if I'd have to go out and buy more fans or a new case, why not just pay the extra for the card?

What's the performance increase going from 575 to 666?
November 28, 2008 4:27:21 PM

Just get the XFX GTX 260 XXX edition. It has 192 cores, but it's faster than all the other GTX 260s, even the ones with 216 cores. And imagine if you overclock it even more!! I've had this card since July and I am very, very, VERY satisfied! Specs are very close to the GTX 280.

In Crysis at 1680x1050, very high, 8X AA, I get 40.3 fps!!
My CPU is an E6850 with 4 GB of RAM.
November 29, 2008 8:11:53 AM

Here are the results, guys, from the latest Crysis benchmarks at 1680x1050:

GTX 260 = 33 fps
GTX 260 FTW = 36 fps
GTX 260 core 216 Superclocked = 38.44 fps (beats a stock 280)
GTX 280 = 37.97 fps
GTX 280 FTW = 39 fps

The GTX 260 core 216 Superclocked is only about 1 fps behind the GTX 280 FTW, which I think is going for $500 right now. That is insane. Go get the GTX 260 core 216 Superclocked; you will beat a stock GTX 280, and you can still OC it even more for extra performance. All this for about $200 less. I finally found my card. (Quick relative comparison after the link below.)

Here are the facts. http://www.youtube.com/watch?v=dcvGXPaQg-8
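
To put those numbers in relative terms, here's a quick Python sketch. This is just my own arithmetic on the fps figures quoted above, nothing more:

# Crysis @ 1680x1050 numbers quoted above, normalized to the stock GTX 260
results = {
    "GTX 260": 33.0,
    "GTX 260 FTW": 36.0,
    "GTX 260 core 216 Superclocked": 38.44,
    "GTX 280": 37.97,
    "GTX 280 FTW": 39.0,
}
base = results["GTX 260"]  # stock card as the baseline
for card, fps in results.items():
    print(f"{card}: {fps:.2f} fps ({(fps / base - 1) * 100:+.1f}% vs stock 260)")

By that measure the core 216 Superclocked sits about 16.5% above a stock GTX 260 and within about 1.5% of the GTX 280 FTW, in this one benchmark at least.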
November 29, 2008 8:33:13 AM

eVGA is what I would buy, and they don't run hot at all, really; something like 60-65°C at 100% load in a shader-heavy game with SLI running.

However, I have a case with exceptional airflow, and it's well tuned to make the most of that cooling.
November 29, 2008 8:35:03 AM

Yeah, I had a feeling it was the same guy... and Spitfire, that's another double post :lol: he's getting around that one...

Anyway, dude, we don't know the exact figures between 575 and 666 or whatever... it depends on the game, RAM, CPU, resolution, settings... the list goes on. There are too many variables to just say "15% better".
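
That said, you can at least put an upper bound on it: a core overclock can never buy you more than the clock ratio, and it only gets close to that in a game that is purely core-bound. A rough back-of-the-envelope sketch in Python, using only the clocks quoted in this thread:

# Best-case scaling from a core overclock: performance can't improve
# by more than the clock ratio, and is usually less, since memory
# bandwidth, the CPU, and resolution all impose their own limits.
stock = 575  # MHz, stock GTX 260 core clock as stated above
for oc in (650, 666):
    gain = (oc / stock - 1) * 100
    print(f"{stock} -> {oc} MHz core: at most ~{gain:.1f}% faster")

So 575 to 650 MHz is at most about 13% and 575 to 666 MHz at most about 16%, and real games will land somewhere below that.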