GTX 280...774mhz core! Need OC "test" button like Riva used to have.

robx46

Distinguished
Sep 28, 2006
115
0
18,680
Yup, my GTX 280 core clock hit 774mhz (775 to be exact). On modest air cooling.
Awesome to know I can hit that. However, as you can guess, things got a bit unstable. The mere fact that my temps were still safe and my gpu is still fine is feat enough!

You ask what awful software let my 280 hit 775mhz core? ATI Tool, of course, which so many people swear works fine with nvidia cards.

I tried ATI Tool because, like any responsible OC'er, I want to OC in increments, being able to do a quick test for each little increase (as opposed to testing for 2 hours for each 2mhz bump!).
Once it was clear ATI Tool wasn't a good idea, I didn't know where to turn. Yes, I know there are good stability tests out there, but like I said, don't we have any software for Vista that has the "test" button for an instant stability check?
That at least gave you a rough idea of where to go, then you could do more thorough testing from there.
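For what it's worth, the increment-and-quick-test routine those old "test" buttons automated boils down to something like this rough sketch. The `is_stable` callback here is a hypothetical stand-in for whatever short artifact scan your tool provides; no real OC utility's API is assumed:

```python
def find_max_stable(start_mhz, step_mhz, hard_cap_mhz, is_stable):
    """Raise the core clock in small steps, quick-testing each one.

    Returns the last clock that passed the quick test. A longer soak
    test (say, an hour of Crysis) should still confirm the result,
    since artifacts can take 20+ minutes to show up.
    """
    stable = start_mhz
    clock = start_mhz + step_mhz
    while clock <= hard_cap_mhz:   # never scan past a safety ceiling
        if not is_stable(clock):   # short artifact check, not hours
            break
        stable = clock
        clock += step_mhz
    return stable

# Simulated card that artifacts above 683mhz, like the one in this thread:
print(find_max_stable(602, 9, 800, lambda mhz: mhz <= 683))  # 683
```

The hard cap is the part ATI Tool's auto-scan apparently lacked: without it, the loop keeps climbing until the card blacks out.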

I just don't have the time to test for hours for each bump. I've already noticed that artifacts sometimes won't pop up for a good 20 minutes or so. I've been using Crysis, the game itself and its benchmark.

However, things look promising. I have a GTX 280 SSC (the evga version with sweet OC's that isn't over $700!). I was actually playing Crysis last night (again, modest air cooling at best) for one full hour at 702mhz core and 1440mhz memory; at the hour mark I started getting some real small artifacts.
Given how well the GTX 280 underclocks itself when idle, along with the fact that I rarely play games, including Crysis, for more than an hour, I think I might be getting close to my optimal OC. I also have a suspicion that it might be my RAM I need to bump down, while my core clock can go through the roof (since I've already seen it hit 774mhz without crashing).

Anyhow, yeah I'm wondering if there is some quicker way than running Crysis or Fur for an hour to test my increments. Using Vista. Not a huge deal if there isn't, since I do think I'm getting close.

Overall, I'm really happy with my 280 so far. Last night I was playing Crysis quite smoothly with maxed driver settings (other than AA) @ 1680x, DX10, Very High settings, and 4xAA.
I've seen those AMD benchmarks saying that the 280 should only get 24 fps on those settings, but without any AA. I can say that in my case that is definitely wrong. With a modest OC to FTW speeds, 30 fps was the average (perfect, huh?) running the Crysis benchmark with those settings, 5 loops. Without any OC at all I hit like 28 fps average.
And no quad core beast helping me out here. Just your run-of-the-mill, under-$200 C2D.

"A little off topic, but..." while people are complaining about this card, assuming this or that, or judging by benchmarks...I'm sitting here with a single gpu solution playing Crysis quite smoothly with everything maxed @ Very High settings and 4xAA, no game tweaks at all (although I might bump the AA down to 2x or no AA just to give a little FPS headroom, but it doesn't seem like like I NEED to do it). So how the hell can I complain? Temps have been well to me too. This card is doing exactly what I paid for it to do, period.
 

dagger

Splendid
Mar 23, 2008
5,624
0
25,780
Use the Fur Rendering stress test. The stress is so heavy that any amount of instability results in a driver crash before artifacts appear.
http://www.ozone3d.net/benchmarks/fur/
 

romulus47plus1

Distinguished
Feb 25, 2008
872
0
18,990


[strike]
Well me too, they charge me around like 50$ per month.
Not forgetting those stuffs we have to pay in school.[/strike]
 

robx46

Distinguished
Sep 28, 2006
115
0
18,680
I was using the "find max core speed" button, or whatever it's called. And it kept going, and going, and going. I left my computer assuming it would stop at a certain point; when I came back I rubbed my eyes seeing the core approaching 800mhz! Then, crashola!
So I guess there is no good alternative tool to use?

As for being a fanboy: too late for that! But isn't everybody a fanboy? And if you aren't a fanboy, then you must be a wagon jumper, latching onto whoever is in the spotlight at the moment rather than staying loyal to a particular brand through the ups and downs, which by the way is much more satisfying IMO.

Even wagon jumpers are fanboys, even if only for a short time, or until the next product gets hyped. Everybody takes a side in ATI v Nvidia, some due to low prices, some jump on the best performer.
Just like when I was a kid back in school, there was always that kid whose favorite NFL team was whoever won the Super Bowl the previous year! We all know that person! That said, this is a different circumstance because the choices you make have your money at stake. For that reason I have no problem with people jumping off one wagon onto another.

However, there is nothing wrong with those that stay loyal either. They believe in a name and will support it even if it dives into the ground. Nothing wrong with loyalty, and with feeling a bit of pride when the name you've supported for so long does great things.
In my case, I don't have the $$ to support something inferior. If ATI can put out a single gpu card that can outperform nvidia's high end for a notably lower price, I'm jumping ship (and I'm not interested in SLI, XF, or faux SLI/XF).

But for now, even though it was pricey, the 280 is doing things in single gpu that can't be found elsewhere. Last night I was playing Crysis DX10, Very High, 1680x, 4xAA. Other cards might be better bang for the buck, but they aren't doing that!

True, my pocketbook is a little hurt, but I did save up $$ for a guilty pleasure purchase. Also, thanks to nvidia's amazing success with the 8 series, and eVGA's efforts, I was able to sell my 8800 GTX KO (basically an Ultra with better cooling) for $350 despite being used. So while the 280 was expensive, I have nvidia to thank for putting out a card I could sell used for $350, which effectively let me buy the 280 on launch day for $300 out of pocket.
I feel like I got a hell of a deal! Once you invest in a high end card, which usually has very good resale value, it makes it a lot easier to keep buying high end. Now, going from a 6800GT to a GTX 280...different story.

Lastly, not sure why some people get so jealous. Before this I have never had the fastest card out there. Yeah, I had the 8800 GTX, but this was after the Ultra came out, so that doesn't count. It's always been that way for me, always having the mid range card or at best close to high end. But I never came onto forums heckling those that did own the best. Rather, I would be more inclined to give props to them. Because I know that "most" people do work for what they have.

I'm 29 and work 3 jobs, the main one paying minimum wage, the other 2 are side jobs building web sites and building/fixing computers for people. At the same time I'm trying to finish up college. I pray for the weekend just like everybody else, just because I don't work "as much" on the weekends. I don't get enough sleep, there aren't enough hours in the day. And no, my daddy isn't rich, far from it. Some people save up money and every blue moon buy something for themselves that they really want. This was my blue moon, I guess, since I couldn't go on a vacation this year (or the last 4 years).

So...maybe people shouldn't just assume that people buying a 280 (or whatever other high end product) are spoiled CEO's or spoiled by a rich daddy. It is pretty insulting when that couldn't be further from the truth.
 

robx46

Distinguished
Sep 28, 2006
115
0
18,680


I think I mentioned that I tried Fur. And again, it doesn't seem to work with the 280. I tried it with a massive OC (over 700mhz core) and it did, in fact, start to get artifacts, but no driver crash. And the artifacts didn't start popping up for a good half hour.
So that doesn't seem to be a good "quick test" option for me either.

Guess I'll just have to be patient and keep trying Crysis while bumping in increments.
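If each confirmation run really takes an hour, a binary search over the clock range cuts the number of long soaks way down compared to stepping up a few mhz at a time. A sketch, assuming stability is roughly monotonic (everything below a stable clock is also stable); the `is_stable` callback is hypothetical, standing in for an hour of Crysis at that clock:

```python
def bisect_max_stable(low_mhz, high_mhz, is_stable, step=1):
    """Binary-search the highest stable clock between low and high.

    Assumes low_mhz is already known-stable. Each is_stable() call is
    one long soak test, so the search needs only about log2(range)
    tests instead of one per increment.
    """
    while high_mhz - low_mhz > step:
        mid = (low_mhz + high_mhz) // 2
        if is_stable(mid):
            low_mhz = mid    # mid passed; search higher
        else:
            high_mhz = mid   # mid artifacted; search lower
    return low_mhz

# Simulated card stable up to 683mhz: 602..775 takes ~8 soaks, not ~20
print(bisect_max_stable(602, 775, lambda mhz: mhz <= 683))  # 683
```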
 

romulus47plus1

Distinguished
Feb 25, 2008
872
0
18,990
No, I was not pointing at you when I said the rich daddy thing; maybe I should delete that phrase.
I AM REALLY SORRY about those phrases, man.

As for the explanation, I am very satisfied with it. Yeah, I was an nVidia fanboy, now leaning a lil towards ATI due to the 4800s.
 

robx46

Distinguished
Sep 28, 2006
115
0
18,680


Don't worry, it didn't ruin my day! And in your defense, it isn't like there aren't people out there who have the high end and don't appreciate it because they didn't earn it; it was given to them. I find that annoying as well, but you just can't put everybody in that category, because on the flipside are people who really work hard to get what they have. You are big enough to realize that you were wrong in generalizing, which is all that matters. I've been plenty guilty of saying things that I didn't really think about before saying/writing.

Anyhow, while I don't like to comment too much on stuff I don't own or haven't tested in person, I think there is no doubt that the 4800 line is the right choice for many in that $200-$350 range, from all the things I've heard, at least at this moment until nvidia makes its next move.

As I mentioned, I don't blame anybody normally down with NV switching to ATI this summer (whether they go back or not). If I only had $200 to spend on a gpu, well, I guess I wouldn't upgrade my GTX KO! But otherwise the 4850 does seem like the top choice.

I guess what bugs me is that some people infer that since X card is more bang for the buck, it is the better card. For the price, yeah it is. However, if you want to play Crysis DX10 @ Very High 1680x w/ up to 4xAA like I was and still be quite playable, and be more likely to get better overclocks to help aid you in that...then you really only have one single gpu option. I don't care how much bang for the buck you get with other cards, they still won't be able to do some of the things that my 280 is doing.

And I don't want that to sound like bragging or pretentious, but it is the truth. I wanted to spend a lot for the 280 because I wanted the fastest single gpu in the world. I paid for it, but I got it.
You want bang for your buck, having a limited budget, then you need to get what you can afford. And luckily you have some great choices that can also play any game, even Crysis, to some extent.

Btw, with some of the talk about temps and OC'ing between the new NV & ATI cards (and I just mentioned the OC), I've got some good results to follow up on what I started in the first post.
First off, my idle temps are always below 50c, usually 48c or 49c at the most. This is likely thanks to how the 2xx cards underclock themselves when not in use, which I really like.
Now for the good stuff. On average air cooling, I am currently stable at 683mhz core! 90 minutes of Crysis @ Very High 16xAF 4xAA and no artifacts! So I think I'm staying there!

I'm not even done OC'ing the memory yet. I am stable right now at 2450mhz memory! And still climbing! I'm going to shoot for 2500mhz and beyond. Temps are fine in actual gaming. Even in Crysis I often don't even hit 80c load with these OC's when the fan is at 100%. Even on auto fan I've yet to see 85c while gaming. I have seen the highest temps while running FUR (in the 90's after a good 1/2 hour), but that is just a benchmark. If Crysis doesn't push much past 80c, I'm not at all worried about any other game.

It looks like my final OC will be 683mhz core & near or at the 2500mhz range on memory. That is one hell of a beast! And since I have an eVGA card, I don't even need to be paranoid! In fact, I think eVGA in particular is very confident that you can put these cards through the wringer and they'll still come out fine.
And I think this is why they even packaged their own OC utility, even with their cards that already have high factory OC's!

I've already found out for myself that these cards are very safe to torture. I mentioned in the first post that my core clock hit 775mhz! All that happened is that the display went black (I assume the card basically shut itself off to prevent damage), then I just rebooted and everything was fine. After that, I never felt safer about overclocking, knowing that my real OC on the core wouldn't go beyond 700mhz, let alone 775mhz.
I still don't recommend people shooting for 775mhz, but should something like that happen by accident (damn ATI Tool), you should be safe.

The card will still get artifacts with an unhappy OC, but you will know when this happens. Even though the card can probably keep going a long time with artifacts, as I've seen, you might want to exit your game and bump things down a notch just to be safe. But again, you don't need to have a panic attack either.

Finally, I mention this stuff because people keep talking only about price and performance. I also shelled out all that $$ for quality as well, such as OC's, safety, good temps, warranty...the things I just mentioned.
 

Dalyinx

Distinguished
May 20, 2008
88
0
18,630
There's really no reason to OC the memory on your card. If I were you, I'd try to shoot for the highest core and shader clocks possible.
 

romulus47plus1

Distinguished
Feb 25, 2008
872
0
18,990


I like the words at the end of Saving Private Ryan:
Earn it.
Yeah, have a nice day mate!
 
I was with you until you said football/Super Bowl. Hey, I like ATI, I even prefer the underdog. But I'll own nVidia till I see the 4xxx series card I'll buy. Being a fence jumper isn't at all like being a betrayer. If I changed my ball club, then bury me. My gfx card? Well, that's just common sense bang for the buck; nothing like a person who doesn't have loyalty towards their favorite ball club.

Not sure where you're coming from with those "AMD" benchmarks, as I wouldn't completely trust them, though when they did come out, I warned everyone what they were, who they were from, and what they'd portray. Even so, they were still understated, and that's according to all the reviews I've read.

It'll be interesting to see how your card fares against the new card coming out; some were saying 900mhz core on a 4870, by Diamond. This should be a good performance race. I'm sure that Diamond card will sell for more than $300, but it'll be a monster like your card. Can't wait to see how these cards fare against one another.