High-End Graphics Card Roundup

BFG GTX 275 (896 MB)


The BFG GeForce GTX 275 offers excellent performance per dollar in the high-end segment. Even without overclocking, the GeForce GTX 275 can keep up with the more expensive GeForce GTX 280 in games. Those who want more performance than the GeForce GTX 260 can deliver, while spending less than the cost of a GeForce GTX 280, would do very well with the 275 (we covered the launch of this board right here).

The GeForce GTX 260 owes much of its popularity to steep price cuts, but you always have to check whether a particular board has 192 or 216 stream processors, and whether it uses a 55 nm or a 65 nm GPU. If you buy a GeForce GTX 275, no such checks are needed: it's essentially a tweaked GeForce GTX 280 with 896 MB of RAM (on a narrower memory bus) instead of 1 GB, but with all 240 stream processors enabled and higher clock rates. It uses a 55 nm chip, and even the reference fan is pleasantly quiet.

The graphics chip supports DirectX 10, PhysX, and CUDA. The circuit board is 10.6" (27 cm) long, and BFG overclocks this model slightly. Reference clock rates on the GTX 275 are 633 MHz for the GPU, 1,404 MHz for the shader, and 1,134 MHz on the GDDR3 memory (for an effective rate of 2,268 MHz). BFG boosts these numbers to 648 MHz, 1,440 MHz, and 2 x 1,152 MHz, for a roughly 1.8% boost in overall performance.
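The factory overclock can be sanity-checked with a few lines of arithmetic (a sketch using only the figures quoted above; note that the roughly 1.8% figure in the text refers to measured overall performance, not clock speed):

```python
# Clock rates from the review, in MHz.
REFERENCE = {"core": 633, "shader": 1404, "memory": 1134}  # GTX 275 reference
BFG = {"core": 648, "shader": 1440, "memory": 1152}        # BFG factory overclock

def pct_gain(stock: float, oc: float) -> float:
    """Percentage increase of the overclocked rate over stock."""
    return (oc - stock) / stock * 100

for domain in REFERENCE:
    print(f"{domain}: +{pct_gain(REFERENCE[domain], BFG[domain]):.1f}%")

# GDDR3 transfers data on both clock edges, hence the "effective" rate
# (2 x 1,152 MHz = 2,304 MHz for the BFG card).
print(f"effective memory rate: {2 * BFG['memory']} MHz")
```

The per-domain clock gains (about 2.4% core, 2.6% shader, 1.6% memory) bracket the measured ~1.8% performance uplift, which is typical: real-world gains rarely exceed the smallest clock increase.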

The fan profile allows temperatures up to 92 degrees Celsius. Under load, our test sample measured 44.2 dB(A) at a distance of one meter, quieter than a GeForce GTX 280 at 45.5 dB(A). In 2D mode, it's barely audible at 36.8 dB(A).

Despite its 55 nm process, the GeForce GTX 275's higher clock rates keep its power draw from dropping below that of a GeForce GTX 280. The complete test system with the BFG card installed consumed 355 W at the wall socket, while the same setup with a GeForce GTX 280 drew 347 W, and with a GeForce GTX 260 (216 stream processors) it consumed 295 W. The GeForce GTX 275 requires two six-pin PCI Express (PCIe) power connectors, which plug into the rear edge of the card. In 2D desktop mode, the card clocks down to 300/100 MHz (GPU/RAM).
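The wall-socket figures above translate into simple card-to-card deltas (a sketch using the review's numbers; keep in mind that wall readings include PSU losses, so the differences at the card itself are somewhat smaller):

```python
# Full test system draw at the wall socket (W), per the measurements above.
system_draw = {
    "GTX 275 (BFG)": 355,
    "GTX 280": 347,
    "GTX 260-216": 295,
}

baseline = system_draw["GTX 275 (BFG)"]
for card, watts in system_draw.items():
    # Negative delta means that configuration drew less than the GTX 275 system.
    print(f"{card}: {watts} W ({watts - baseline:+d} W vs. GTX 275 system)")
```

So the GTX 275 system drew 8 W more than the GTX 280 system and 60 W more than the GTX 260-216 system, which is why the 55 nm shrink alone doesn't buy this card an efficiency win.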

BFG's bundle is quite minimalist. The card provides two dual-link DVI ports, and the box includes a power-splitter cable, a driver CD, and a discount coupon for games. There is no HDMI adapter or S/PDIF cable.

  • Only one ATi card? What happened to all those OC'd 4890s?
  • And those HAWX benchmarks look ridiculous. ATi should wipe the floor with Nvidia there. Of course you didn't turn DX10.1 support on. Bastard...
  • cangelini
    quarz: Only one ATi card? What happened to all those OC'd 4890s?
    These are the same boards that were included in the recent charts update, and are largely contingent on what vendors submit for evaluation. We have a review upcoming comparing Sapphire's new 1 GHz Radeon HD 4890 versus the stock 4890. It'll be up in the next couple of weeks, though.
  • ohim
    Am I the only one who finds this article awkward? Looking at the tests done on ATI cards in The Last Remnant makes me wonder what went wrong... I mean, it's the UT3 engine... why such low performance?
  • curnel_D
    Ugh, please tell me that The Last Remnant hasn't been added to the benchmark suite.

    And I'm not exactly sure why the writer decided to bench on Endwar instead of World In Conflict. Why is that exactly?

    And despite Quarz2's apparent fanboyism, I think HAWX would have been better benched under DX10.1 for the ATI cards, using the highest stable settings instead of dropping to DX9.
  • anamaniac
    The EVGA 295 is the stuff gods game with.

    I would love that card. I would have to replace my whole system to work it properly however.
    I want $1500 now... i7 920 (why get better? They all seem to be godly overclockers) and EVGA 295.

    How about a test suite of the EVGA GTX 295 in SLI for a quad-GPU configuration? I know there are driver issues, but it would be fun to see what it could do regardless. Along with seeing how far Toms can OC the EVGA GTX 295.
    Actually... Toms just needs to do a new system-building recommendation roundup. I find them useful personally, and would have used one myself had my cash source not lost his job...
  • Weird test:
    1) Where are the overclocking results?
    2) Bad choice of benchmarks: too many old DX9-based graphics engines (FEAR 2, Fallout 3, Left 4 Dead at >100 FPS), plus Endwar, which is capped at 30 FPS. Where is Crysis?
    3) 1900x1200 as highest resolution for high-end cards?
  • EQPlayer
    Seems that the cumulative benchmark graphs are going to be a bit skewed if The Last Remnant results are included in there... it's fairly obvious something odd is going on looking at the numbers for that game.
  • armistitiu
    Worst article in a long time. Why compare how old games perform on NVIDIA's high-end graphics cards? Don't get me wrong, I like them, but where's all the Atomic stuff from Sapphire? Asus and XFX had some good stuff from ATI too. So what... you just took the reference cards from ATI and tested them? :| That is just wrong.
  • pulasky
    WOW, what a piece of s********** this """"""review"""""" is. Noobidia pays well these days.