
MSI N275GTX Lightning: Fully Overclocked

Radeon HD 5770, Radeon HD 4890, And GeForce GTX 275 Overclocked
Now it’s time for our own little warning: overclocking can damage your hardware and void your warranty. However, there are some good rules of thumb to help you prevent frying your graphics board. Usually, you can increase a card’s default clock speed by another five to ten percent. Don’t jump to the highest setting immediately. Instead, increase the frequency in increments of 5 or 10 MHz and test stability. Also, don’t try to change all settings at once. Find the highest stable frequency for the GPU, then move on to the shader, followed by the memory.
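The stepwise approach described above can be sketched as a simple search loop. This is purely illustrative: there is no programmatic stress-test API here, so the `run_stress_test` callback is a hypothetical stand-in for manually applying a clock with a tool like EVGA's Precision and checking stability yourself.

```python
# Illustrative sketch of the incremental overclocking approach above.
# run_stress_test is a hypothetical probe: in practice you apply the
# clock with a vendor tool and watch a game or FurMark for instability.

def find_highest_stable(base_mhz, step_mhz=10, max_gain=0.10, run_stress_test=None):
    """Walk one clock domain (GPU, shader, or memory) upward in small
    steps, returning the last setting that passed the stability test."""
    ceiling = base_mhz * (1 + max_gain)   # rule of thumb: 5-10% over stock
    best = base_mhz
    clock = base_mhz + step_mhz
    while clock <= ceiling:
        if not run_stress_test(clock):    # instability: back off and stop
            break
        best = clock
        clock += step_mhz
    return best

# Example with a fake test that "fails" above 700 MHz
print(find_highest_stable(633, step_mhz=10, run_stress_test=lambda mhz: mhz <= 700))  # -> 693
```

Per the article's advice, you would run this search once per domain, in order: GPU first, then shader, then memory.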

We recommend testing your settings with a graphically demanding game that puts a lot of strain on the GPU. Anything built on Unreal Engine 3 should work well (Ed.: here at the Tom's Hardware US office, we like to use FurMark for stress testing). Select a high resolution (ideally 1920x1200), turn off anti-aliasing, and enable anisotropic filtering. If the game freezes, that usually means the GPU frequency is too high, while visual artifacts and rendering errors tend to mean the memory can’t cope with the selected clock speed. Lower the speeds immediately in either case. Finally, if you get a DirectX error, chances are your shader clock was set too optimistically.
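The troubleshooting rules above boil down to a simple symptom-to-clock mapping. The sketch below is our own illustration; the symptom labels are invented for the example, not output from any real utility.

```python
# Our own illustrative lookup for the diagnostic rules described above;
# the symptom keys are made-up labels, not codes from any real tool.
SYMPTOM_TO_FIX = {
    "freeze":        "GPU frequency too high -- lower the core clock",
    "artifacts":     "memory can't cope -- lower the memory clock",
    "directx_error": "shader clock set too optimistically -- lower it",
}

def diagnose(symptom):
    return SYMPTOM_TO_FIX.get(symptom, "unknown symptom -- revert all clocks to stock")

print(diagnose("artifacts"))
```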

MSI’s factory overclocked settings work flawlessly. However, we encountered several problems when we attempted to overclock the card further. Since MSI’s own OC tool wouldn’t let us adjust the shader clock, we used EVGA’s Precision tool instead, with the goal of achieving the highest clock speeds possible without a voltage tweak. That turned out to be more difficult than it sounds, since our combination of Windows Vista, ForceWare 191.07, and MSI’s Lightning Afterburner software proved to be, let’s say, a little touchy. If the overclocked settings don’t work, Windows Vista kills and restarts the graphics driver, after which the card only runs at half speed. The only way to get out of this 400 MHz mode is to reboot the system.

After half a day of experimenting with various clock speed combinations, we found 720/1,600/1,200 MHz (GPU/shader/memory) to be the safest, most stable settings. Interestingly, the 1,200 MHz memory speed corresponds to the highest setting in MSI’s Lightning tool.

In order to increase the GPU speed further, we had to perform a little ritual in a specific order. Start MSI’s OC tool, raise the voltage, then launch Precision to overwrite MSI’s clock speeds. This took several more hours, partly because MSI’s utility proved to be very dominant. Switching to another performance preset or using one of the Lightning utility’s frequency sliders immediately overwrites any settings selected through Precision. Then, if you choose a clock speed that is too high, the system will freeze or the graphics driver will reset itself, restarting in 400 MHz mode. Either way, you’re forced to reboot the system.

Raising the voltage a little from 1.0665 to 1.0790 V allowed us to push the GPU as far as 770 MHz. Remember, Nvidia’s reference speed is 633 MHz, while the GTX 275 Lightning has a stock clock of 700 MHz. This setting wasn’t without its problems, though, as the system would become unstable under load. Additionally, 1,600 MHz was no longer a viable option for the shader, and we had to reduce the memory frequency to 1,170 MHz as well. After lots of trial and error, we ended up at 770/1,550/1,170 (GPU/shader/memory). Since the lower shader and memory speeds negate any improvement achieved through a faster GPU clock, we decided to take a step back and stick with the settings achieved without a voltage tweak (720/1,600/1,200 MHz).

Clock Speeds in MHz              GPU    Percent   Shader   Percent   Memory   Percent
MSI N275GTX Lightning max OC     720    113.7     1,600    114.0     1,200    105.8
MSI N275GTX Lightning            700    110.6     1,404    100.0     1,150    101.4
MSI N275GTX Lightning no OC      633    100.0     1,404    100.0     1,134    100.0
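As a sanity check, the Percent columns can be recomputed from the MHz values relative to Nvidia's reference clocks (633/1,404/1,134 MHz). This short snippet is our own, not part of any MSI tool.

```python
# Recompute the table's Percent columns from the MHz values, relative
# to Nvidia's reference clocks for the GTX 275 (633/1,404/1,134 MHz).
reference = {"gpu": 633, "shader": 1404, "memory": 1134}
cards = {
    "max OC":  {"gpu": 720, "shader": 1600, "memory": 1200},
    "factory": {"gpu": 700, "shader": 1404, "memory": 1150},
    "no OC":   {"gpu": 633, "shader": 1404, "memory": 1134},
}
for name, clocks in cards.items():
    pct = {k: round(100 * v / reference[k], 1) for k, v in clocks.items()}
    print(name, pct)
# max OC comes out to 113.7 / 114.0 / 105.8, matching the table
```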


At MSI’s factory overclocked settings, the GPU runs 10.6 percent faster than a reference card. The 1.4 percent increase in memory speed is negligible, though, and the shader frequency isn’t changed at all. Overall, that gives our card a performance boost of 5.5 percent. According to the retail box, MSI’s overclocked N275GTX Lightning is supposed to perform on par with a GeForce GTX 285. We can’t really refute that claim, since it all depends on the benchmarks, CPU, and graphics driver version MSI used. With our own hardware, however, we still saw a performance gap of one to two percent.

Our maximum overclock made the situation much clearer, with the GeForce GTX 275 effortlessly reaching performance levels similar to those of a reference GeForce GTX 285. If MSI had also pre-overclocked the shader, this would be a really great card. As it is, the N275GTX Lightning is a card with a very quiet dual-fan cooler, a very potent graphics chip, and a dubious overclocking tool that doesn’t harness the full power of the hardware. If you buy the GTX 275 Lightning, we would actually recommend not installing the Lightning Afterburner tool at all and using the stock (overclocked) settings instead. If you still feel the need to tweak the memory and shader speeds, use EVGA’s Precision utility instead. Just be aware that you do so at your own risk.

To end on a positive note, we did like that overclocking did not affect the card’s 2D mode, letting the board idle at 300/600/100 MHz.

Graphics Card and Chip Class                       FPS       Percent
MSI N275GTX Lightning Max OC (GTX 275 1,792MB)     1,838.5   109.6
GeForce GTX 285 (1,024MB)                          1,795.0   107.0
MSI N275GTX Lightning (GTX 275 1,792MB)            1,769.1   105.5
MSI N275GTX Lightning No OC (GTX 275 1,792MB)      1,694.4   101.0
GeForce GTX 275 (896MB)                            1,677.1   100.0


Since MSI equips the GTX 275 Lightning with twice as much memory as a reference card, we also tested the card at Nvidia’s reference speeds (labeled No OC) to see whether there was a performance difference compared to cards with only 896MB of memory. As you can see in our table, the larger memory size offers no benefit over the standard configuration in overall performance. On the other hand, certain games (like Grand Theft Auto IV) can be run at higher detail settings with the larger frame buffer.

With the current generation of drivers, we also encountered some stuttering in Fallout 3, with the scene freezing for a short moment when you turn. Interestingly, this was not a problem with the older GeForce 186 generation, and only seems to plague the combined releases for Windows 7 and Vista.

Top Comments
  • 16 Hide
    wickedsnow , November 24, 2009 9:52 AM
    While I normally refrain form ever commenting on video card reviews. I could not resist this.

    I agree with falchard (to a degree) that while i don't think the review is biased, I do think something is not right about it. In most of the games listed, the 4890 (1024 version) is not only loosing to the gtx260 192 AND 216 versions, but loosing by a huge margin. I own both cards myself in 2 machines that are the same (except the videocards) and 99% of the time, the 4890 spanks my other rig. (with an evga SSC gtx260 core216).

    I'm not saying anything is biased (just a reminder) I am saying something just is not right. PSU not big enough, wrong drivers,... etc etc... no idea.
  • 12 Hide
    falchard , November 24, 2009 9:12 AM
    Benchmark suite is kind of gimped again. Every game selected except Fear 2 is designed to gimp ATI hardware. I guess its ok if you are comparing ATI to ATI, but when you say the GTX275 is a better buy over the HD4890 based on this review its completely biased. You didn't even come close to portraying the HD4890 in any sort of fair comparison.
  • 11 Hide
    quantumrand , November 24, 2009 6:12 AM
    I'm really disappointed that they aren't any benchmarks from the 5870 or 5850 series included. Why even bother with tha GTX 295 or 4870x2 and such without the higher 5-series Radeons?

    I mean if I'm considering an ATI card, I'm going to want to compare the 5770 to the 5850 and 5870 just to see if that extra cost may be justified, not to mention the potential of a dual 5770 setup.
Other Comments
  • 1 Hide
    amdgamer666 , November 24, 2009 5:22 AM
    Nice article. Ever since the 5770 came out I've been wondering how far someone could push the memory to relieve that bottleneck. Being able to push it to 1430 allows it to be competitive to it's older sibling and makes it enticing (with the 5700 series' extra features of course)
  • 1 Hide
    Onyx2291 , November 24, 2009 5:30 AM
    Damn some of these cards run really well for 1920x1200 which I run at. Could pick up a lower one and run just about anything at a decent speed if I overclock well. Good ol charts :) 
  • 9 Hide
    skora , November 24, 2009 5:47 AM
    If you're trying to get to the next cards performance by OCing, shouldn't the 5850 be benched also? I know the 5770 isn't going to get there because of the memory bandwidth issue, but you missed the mark. One card is compared to its big brother, but the other two aren't.

    I am glad to see the 5770 produce playable frame rates at 1920x1200. Nice game selection also.
  • 5 Hide
    presidenteody , November 24, 2009 6:26 AM
    I don't care what this article says, when the 5870 or 5970 become available i am going to buy a few.
  • 0 Hide
    kartu , November 24, 2009 6:27 AM
    Well, at least in Germany 4870 costs quite a bit less (30-40 Euros) compared to 5770. It would take 2+ years of playing to compensate for it with lower power consumption.
  • -3 Hide
    kartu , November 24, 2009 6:30 AM
    "Power Consumption, Noise, And Temperature" charts are hard to comprehend. Show bars instead of numbers, maybe?
  • -3 Hide
    arkadi , November 24, 2009 7:08 AM
    Well that put things in prospective. I was really happy with 260gtx numbers, and i can push my evga card even higher easy. To bad we didn't see the 5850 here, it looks like the optimal upgrade 4 gamers on the budget like my self. Grade article overall.
  • 0 Hide
    B16CXHatch , November 24, 2009 7:08 AM
    I got lucky with my card. Before, I had a SuperClocked 8800GT from EVGA. I ordered a while back, a new EVGA GeForce GTX 275 (896MB). I figured the extra cash wasn't worth getting an overclocked model particularly when I could do it myself. I get it, I try to register it. The S/N on mine was a duplicate. They sent me an unused S/N to register with. I then check the speeds under one utility and it's showing GTX 275 SuperClocked speeds, not regular speeds. I check 2 more utilities and they all report the same. I had paid for a regular model and received a mislabeled SuperClocked. Flippin sweet.

    Now they also sell an SSC model which is overclocked even more. I used the EVGA precision tool to set those speeds and it gave me like 1 or 2 extra FPS is Crysis and F.E.A.R. 2 already played so well without overclocking. So overclocking on these bad boys doesn't really do much. Oh well.

    One comment though, GTX 275's are HOT! Like, ridiculously hot. I open my window in 40 degree F weather and it'll still get warm in my room playing Team Fortress 2.
  • 3 Hide
    Anonymous , November 24, 2009 7:40 AM
    With the 5970 out there seems to be nothing else about graphic cards that interests me anymore :D  Its supposed to be the fastest card yet and beats Crysis too!
  • -3 Hide
    Anonymous , November 24, 2009 7:48 AM
    Excellent article [hindered by poor chart].
  • -8 Hide
    notty22 , November 24, 2009 10:21 AM
    The 5770 does not perform well. Its overpriced right now. All playable numbers, but the Nvdia cards spank it for the same money.
  • 0 Hide
    brisingamen , November 24, 2009 11:17 AM
    the 5770 has great overclocking potential with the stock cooler, with a good cooler the numbers could be phenominal and in crossfire situation really be nice, also not to mention it is direct x 11, and can do things both the 4890 and 275 cannot. deals will be availible on the 5770 sooner than any of the higher models. im considering getting two and overclocking the shinanigans out of them, the 275 spanks nothing with its old tech, IQ matters.
  • 0 Hide
    sparky13 , November 24, 2009 11:32 AM
    I think a better 4890 to use instead of the MSI would be the Gigabyte 4890 OC model I have in my system right now. That MSI cooler is decent but the way it's secured to the GPU is just pitiful.

    http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/24770-value-meets-performance-hd-4890-cards-gigabyte-msi-19.html

    The Gigabyte comes w/a Zalman cooler and is Factory OC'd to 900mhz. I pushed it to 975mhz it it didn't break a sweat. Idle temps hover around 30-34 C. Under load it rarely breaks 52 C. The Zalman is a beast. It stays quiet too, barely audible under my tricool LED fans on low setting. I reccommend it to anyone looking for a GPU in the 170.00 range.
  • 4 Hide
    scrumworks , November 24, 2009 12:00 PM
    falchard wrote: "Benchmark suite is kind of gimped again. Every game selected except Fear 2 is designed to gimp ATI hardware. I guess its ok if you are comparing ATI to ATI, but when you say the GTX275 is a better buy over the HD4890 based on this review its completely biased. You didn't even come close to portraying the HD4890 in any sort of fair comparison."

    I haven't seen a single review from the author that wouldn't be somehow made selectively nvidia biased. Last Remnant, HAWX DX10.0, no HD5870/HD5970 are just quick examples. Reviewers should stay absolutely neutral in these matters and arrange proper conditions for all parties.

    I won't analyze any deeper of the results but it seems like Radeon's don't perform quite as well as they should perform in many other reviews.
  • 1 Hide
    cinergy , November 24, 2009 12:08 PM
    notty22 wrote: "The 5770 does not perform well. Its overpriced right now. All playable numbers, but the Nvdia cards spank it for the same money."

    I guess they are if you don't care about DX11 and lower power consumption readings. I think AMD can easily drop HD5x00 prices after supply starts exceeding demand.
  • 7 Hide
    cknobman , November 24, 2009 12:44 PM
    nice article.............that is if your in the nvidia camp. Gotta love your sponsors right???

    Guess I need to go to anand or tweaktown to get a non nvidia biased review.

    Dont come back giving me some crap about you cant help what games favor nvidia because you can......dont include them in a damn review of ati cards!!!!!! How come you just so happened to exclude every game that favors ATI???

    Your a tool and a fool Kreiss!!
  • -4 Hide
    siliconchampion , November 24, 2009 1:56 PM
    Whoa, people, people!

    Perhaps in stead of flaming the author out about his choice of Nvidia favoring titles, perhaps you could make some helpful suggestions of game titles you would like to see benchmarked...

    Personally, I would love to see some CoD4 and MW2 benchies, but that's just me.