Best Of The Best: High-End Graphics Card Roundup

BFG GTX 275 (896 MB)

The BFG GeForce GTX 275 offers excellent price/performance in the high-end segment. Even without overclocking, the GeForce GTX 275 keeps up with the more expensive GeForce GTX 280 in games. Anyone who wants more performance than the GeForce GTX 260 can deliver, without paying GeForce GTX 280 prices, will do very well with the 275 (we covered the launch of this board right here).

The GeForce GTX 260 owes much of its popularity to a steep price decline, but you always have to check whether a particular unit has 192 or 216 stream processors, and whether it uses a 55 nm or a 65 nm GPU. If you buy a GeForce GTX 275, no such checks are needed: it is essentially a tweaked GeForce GTX 280 with 896 MB of RAM (on a narrower, 448-bit memory bus) instead of 1 GB, but with all 240 stream processors enabled and higher clock rates. It uses a 55 nm chip, and even the reference fan is nice and quiet.

The graphics chip supports DirectX 10, PhysX, and CUDA. The circuit board is 10.6" (27 cm) long, and BFG overclocks this model slightly. Reference clock rates on the GTX 275 are 633 MHz for the GPU, 1,404 MHz for the shader, and 1,134 MHz on the GDDR3 memory (for an effective rate of 2,268 MHz). BFG boosts these numbers to 648 MHz, 1,440 MHz, and 2 x 1,152 MHz, for a roughly 1.8% boost in overall performance.
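For the curious, the factory overclock works out to only a few percent per clock domain. A quick sketch of the arithmetic, using the clock figures quoted above:

```python
# Reference vs. BFG factory clocks (MHz) for the GeForce GTX 275,
# as quoted in the review.
reference = {"core": 633, "shader": 1404, "memory": 1134}
bfg = {"core": 648, "shader": 1440, "memory": 1152}

for domain in reference:
    pct = (bfg[domain] / reference[domain] - 1) * 100
    print(f"{domain}: {reference[domain]} -> {bfg[domain]} MHz (+{pct:.1f}%)")

# GDDR3 transfers data on both clock edges, so the effective rate
# is twice the memory clock.
print(f"effective memory rate: {bfg['memory'] * 2} MHz")
```

The per-domain bumps land between roughly 1.6% and 2.6%, which is why the overall performance gain stays under 2%.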

The fan profile extends all the way to 92 degrees Celsius. Our test sample measured 44.2 dB(A) at a distance of one meter, quieter than a GeForce GTX 280 at 45.5 dB(A). In 2D mode, it's barely audible at 36.8 dB(A).

Despite its 55 nm process, the GeForce GTX 275 doesn't draw less power than a GeForce GTX 280; its higher clock rates cancel out the process advantage. The complete test system consumed 355 W at the wall with the BFG card installed, versus 347 W with a GeForce GTX 280 and 295 W with a 216-shader GeForce GTX 260. The GeForce GTX 275 requires two six-pin PCI Express (PCIe) power connectors, which plug into the rear edge of the card. In 2D desktop mode, the card drops to 300/100 MHz (GPU/RAM).
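Because the rest of the test platform stays identical between runs, subtracting the wall-socket readings gives a rough estimate of each card's relative power cost. A minimal sketch using the figures above:

```python
# System wall-socket draw (W) with each card installed, per the review.
system_draw = {
    "GeForce GTX 275 (BFG)": 355,
    "GeForce GTX 280": 347,
    "GeForce GTX 260-216": 295,
}

gtx275 = system_draw["GeForce GTX 275 (BFG)"]
for card, watts in system_draw.items():
    # Positive delta: the GTX 275 setup draws that many watts more.
    print(f"{card}: {watts} W ({gtx275 - watts:+d} W vs. the GTX 275 setup)")
```

By this estimate, the overclocked GTX 275 costs about 8 W more than a GTX 280 and roughly 60 W more than a GTX 260-216 under load.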

BFG's bundle is quite minimalist. The card offers two dual-link DVI ports and ships with a power-splitter cable, a driver CD, and a discount coupon for games. Neither an HDMI adapter nor an S/PDIF cable is included.

    Top Comments
  • Only one ATi card? What happened to all those OC'd 4890s?
    23
  • Weird test:
    1) Where are the overclocking results?
    2) Bad choice of benchmarks: too many old DX9-based graphics engines (FEAR 2, Fallout 3, Left 4 Dead with >100 FPS) or EndWar, which is limited to 30 FPS. Where is Crysis?
    3) 1920x1200 as the highest resolution for high-end cards?
    16
  • Ugh, please tell me that The Last Remnant hasn't been added to the benchmark suite.

    And I'm not exactly sure why the writer decided to bench on Endwar instead of World In Conflict. Why is that exactly?

    And despite Quarz2's apparent fanboism, I think HAWX would have been better benched under 10.1 for the ATI cards, and used the highest stable settings instead of dropping off to DX9.
    10
  • Other Comments
  • And those HAWX benchmarks look ridiculous. ATi should wipe the floor with Nvidia there. Of course you didn't turn DX10.1 support on. Bastard...
    8
  • quarz: Only one ATi card? What happened to all those OC'd 4890s?

    These are the same boards that were included in the recent charts update, and are largely contingent on what vendors submit for evaluation. We have a review upcoming comparing Sapphire's new 1 GHz Radeon HD 4890 versus the stock 4890. It'll be up in the next couple of weeks, though.
    1
  • Am I the only one who finds this article awkward? Looking at the test results for the ATi cards in The Last Remnant makes me wonder what went wrong ... I mean, it's the UT3 engine ... why such low performance?
    4
  • The EVGA 295 is the stuff gods game with.

    I would love that card, but I would have to replace my whole system to run it properly.
    I want $1500 now... i7 920 (why get better? They all seem to be godly overclockers) and an EVGA 295.

    How about a test suite of two EVGA GTX 295s in SLI for a quad-GPU configuration? I know there are driver issues, but it would be fun to see what it could do regardless. Along with seeing how far Tom's can OC the EVGA GTX 295.
    Actually... Tom's just needs to do a new system-building recommendation roundup. I find them useful personally, and would have used one myself had my cash source not lost his job...
    -12
  • Seems that the cumulative benchmark graphs are going to be a bit skewed if The Last Remnant results are included in there... it's fairly obvious something odd is going on looking at the numbers for that game.
    4
  • Worst article in a long time. Why compare how old games perform on Nvidia's high-end graphics cards? Don't get me wrong, I like them, but where's all the Atomic stuff from Sapphire? Asus and XFX had some good stuff from ATI, too. So what... you just took the reference cards from ATI and tested them? :| That is just wrong.
    9
  • WOW, what a piece of s********** this """"""review"""""" is. Noobidia pays well these days.
    -7
  • OK, I tried playing The Last Remnant on my computer with my 4870 X2 and it failed hardcore >.< The game itself is ridiculously boring, too. Sooo why was it added to the benchmark list?? *shakes head* Makes me sad...
    -1
  • I find it lacking that these tests don't include the 3DMark Vantage suite.
    OK, there aren't many games using DX10, but some very good ones do!

    That's the reason I switched to Vista.

    And enough people did along with me to justify a proper DX10 benchmark.
    1
  • No mention of the GTX 285 2GB version? I'm planning on picking up three of these for a tri-SLI Core i7 build, all water-cooled and overclocked.
    -6
  • Well, I'm running a factory-overclocked GTX 285, only because I like solid drivers and DAAMIT doesn't seem to be able to provide those consistently. That's been my biggest problem in picking up an ATI card.

    This review, however, is terrible. The benchmark selection is dated, if nothing else. Even Tom's other recent reviews have used better benchmarks than this.
    -9
  • This benchmark is not fair for Ati !!!
    10
  • Let's see some 3DMark Vantage, pls
    3
  • Like a car review magazine (like the one my friend works for), I THINK they only have the cards that were submitted to them, and (not sure if this is the case with Tom's) they're only lent for a limited number of days.

    Although I'm not very satisfied (because of the lack of ATI cards in your possession), I thank you for the review with Fallout, Left 4 Dead, and The Last Remnant under DX9. Yup, I'm still using XP because the bog-down symptom with Vista is too noticeable on my rig.

    1920x1200 as the minimum threshold? Cool, as my 23" is limited to 1920x1080 anyway :P
    0
  • I think you guys should cut Tino Kreiss some slack; this is, I believe, his first publication? Saying things like "this is the worst article I've read in a long time" doesn't actually help. You can blame the choice of benchmark suites on the site's manager/editor, not the author, as he only writes what he is told to. So with that in mind............ Cangelini, your fired.

    I am curious, though: HAWX is a game sponsored by ATI, so why is the HD 4890 getting its backside tanned by the GTX 275? It's not just a few FPS behind, either; the difference is quite remarkable, and yes, I do realize the BFG GTX 275 is overclocked, but it's not overclocked by a lot.
    1
  • JeanLuc: Cangelini your fired.

    Perhaps you'll hire me as a copy editor for your posts instead? ;-)

    In all seriousness, Tino has been with Tom's German office for a long time. I've asked the staff responsible for testing there to drop in and provide some feedback on the products and benchmarks used here.

    Best,
    Chris
    0
  • How lame this article is... I was always wondering why they don't use the full potential of the GPU. If ATI is capable of using DX10.1 (and the game supports that technology), why not use it? It might not be fair to Nvidia, but FFS, I believe this kind of review should show the full potential of the products.
    Shame on TH!
    9