Nvidia GeForce GTX 560 Ti 448 Core Review: GF110 On A Diet

GeForce GTX 560 Ti 448 Core Specifications

It’s not often that a graphics card manufacturer goes to the trouble of launching a special, limited-run product just for the holiday season. But that’s exactly what Nvidia is doing with its GeForce GTX 560 Ti 448 Core.

Given the name, you might expect this new card to be an unlocked and enhanced version of Nvidia's existing GeForce GTX 560 Ti. But that's simply not so. Recall that the GF114 graphics processor used in the existing GeForce GTX 560 Ti is already unfettered. All of its 384 cores are functional, leaving no disabled hardware to turn on. Rather, the GeForce GTX 560 Ti 448 Core is equipped with a cut-back GF110.

This GPU was first seen on the company's GeForce GTX 580, slightly handicapped for use in its GeForce GTX 570, and now further trimmed back for the GeForce GTX 560 Ti 448 Core.

GeForce GTX 560 Ti 448 Core Specs:

Compared to the GeForce GTX 580, two Streaming Multiprocessors (SM) are disabled; the GeForce GTX 560 Ti 448 Core utilizes 14 of the GF110’s 16 available SMs. Each functioning SM has 32 shader cores and four texture units. Five of the six 64-bit ROP partitions are left enabled, each capable of handling eight 32-bit integer pixels per clock cycle.

All told, the card has 448 shader cores, 56 texture units, 40 ROPs, and a 320-bit memory interface. Not surprisingly, its power demands necessitate two six-pin PCIe power connectors. And because it's one of Nvidia's higher-end boards, the GeForce GTX 560 Ti 448 Core supports two-, three-, and four-way SLI through its pair of SLI connectors. You cannot pair it with a standard GeForce GTX 560 Ti, of course; it'll only cooperate with other 448-core models. So, if you'd like to run a multi-card configuration, buy these boards at the same time, since they're not expected to remain available for long.
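Those totals follow directly from the per-SM and per-partition counts given above. A quick back-of-the-envelope check (Python; all figures taken from this article):

```python
# Derived specs for the GeForce GTX 560 Ti 448 Core, using the
# per-unit counts from this article.
SMS_ENABLED = 14          # of GF110's 16 Streaming Multiprocessors
CORES_PER_SM = 32
TEX_UNITS_PER_SM = 4
ROP_PARTITIONS = 5        # of six 64-bit ROP partitions enabled
ROPS_PER_PARTITION = 8
BITS_PER_PARTITION = 64

shader_cores = SMS_ENABLED * CORES_PER_SM           # 14 * 32 = 448
texture_units = SMS_ENABLED * TEX_UNITS_PER_SM      # 14 * 4  = 56
rops = ROP_PARTITIONS * ROPS_PER_PARTITION          # 5 * 8   = 40
memory_bus_bits = ROP_PARTITIONS * BITS_PER_PARTITION  # 5 * 64 = 320

print(shader_cores, texture_units, rops, memory_bus_bits)
# → 448 56 40 320
```

Note how the 320-bit memory interface falls out of the same cut: disabling one of the six 64-bit ROP partitions trims both the ROP count and the memory bus together.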

The GF110 GPU, as found in the new GeForce GTX 560 Ti 448 Core

If this card's specs sound familiar, that's probably because they match Nvidia's now-defunct GeForce GTX 470. You might also notice that the GeForce GTX 560 Ti 448 Core is essentially a GeForce GTX 570 with one SM disabled. And speaking of the GeForce GTX 570, the new card has the same 732 MHz core, 1464 MHz shader, and 950 MHz GDDR5 memory frequencies.

Based on what we know from past reviews of Nvidia's existing cards, the GeForce GTX 560 Ti 448 Core should perform between the GeForce GTX 560 Ti and the GeForce GTX 570. For more information on the company's line-up, check out the following reviews:

• Nvidia GeForce GTX 560 Ti Review: GF114 Rises, GF100 Rides Off
• GeForce GTX 570 Review: Hitting $349 With Nvidia's GF110
• GeForce GTX 580 And GF110: The Way Nvidia Meant It To Be Played
• GeForce GTX 480 And 470: From Fermi And GF100 To Actual Cards!

| | GeForce GTX 560 Ti | GeForce GTX 470 | GeForce GTX 560 Ti 448 Core | GeForce GTX 570 |
|---|---|---|---|---|
| Shader Cores | 384 | 448 | 448 | 480 |
| Texture Units | 64 | 56 | 56 | 60 |
| Full Color ROPs | 32 | 40 | 40 | 40 |
| Graphics Clock | 822 MHz | 607 MHz | 732 MHz | 732 MHz |
| Shader Clock | 1644 MHz | 1215 MHz | 1464 MHz | 1464 MHz |
| Memory Clock | 1002 MHz | 837 MHz | 950 MHz | 950 MHz |
| GDDR5 Memory | 1 GB | 1280 MB | 1280 MB | 1280 MB |
| Memory Interface | 256-bit | 320-bit | 320-bit | 320-bit |
| Form Factor | Dual-slot | Dual-slot | Dual-slot | Dual-slot |
| Power Connectors | 2 x 6-pin | 2 x 6-pin | 2 x 6-pin | 2 x 6-pin |
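The memory clocks and bus widths in the table translate directly into peak bandwidth: GDDR5 transfers data at four times its reference clock, so bandwidth is clock × 4 × (bus width ÷ 8). A quick sketch using the table's figures (Python; card names abbreviated):

```python
# Peak memory bandwidth from the table above. GDDR5 is quad-pumped,
# so effective data rate is 4x the listed reference clock.
def gddr5_bandwidth_gbs(mem_clock_mhz, bus_bits):
    """Peak bandwidth in GB/s: MHz * 4 transfers/clock * bytes per transfer."""
    return mem_clock_mhz * 4 * (bus_bits / 8) / 1000

cards = {
    "GTX 560 Ti":          (1002, 256),
    "GTX 470":             (837,  320),
    "GTX 560 Ti 448 Core": (950,  320),
    "GTX 570":             (950,  320),
}
for name, (clock, bus) in cards.items():
    print(f"{name}: {gddr5_bandwidth_gbs(clock, bus):.1f} GB/s")
```

This works out to roughly 128 GB/s for the GeForce GTX 560 Ti, 134 GB/s for the GTX 470, and 152 GB/s for both the 448 Core card and the GTX 570, underscoring how closely the new card shadows the GTX 570 everywhere except shader and texture throughput.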

Nvidia made it clear to us that its GeForce GTX 560 Ti 448 Core isn’t a replacement for any existing product. A limited supply exists, and it’s exclusive to Asus, EVGA, Gainward, Gigabyte, Inno3D, Palit, MSI, and Zotac. This new card is only available in the USA, Canada, the UK, France, Germany, Russia, and the Nordics.

The circumstances of this board's birth are somewhat strange. Perhaps Nvidia has a small collection of GF110 GPUs with two bad SMs, precluding them from use on a GeForce GTX 570. Or, it could simply be a product intended to fill a gap right before the holidays. It could even be a test case of sorts to see if there's a market for something between the GeForce GTX 560 Ti and 570.

Pressed for more information, Nvidia let us know that our first two suspicions were dead-on. Like any chip manufacturer, Nvidia bins its processors, and it has a number of GF110s with 14 viable SMs. It chose to put them into a limited product to drum up sales over the holiday season, and ta-da: the GeForce GTX 560 Ti 448 Core. No matter how few of these boards end up hitting shelves, though, it'll stand or fall on its performance per dollar, just like any other graphics card.

  • borden5
    this one trades blows with the 6950 2GB and costs about $30 more, hm?
  • tmk221
    nice gpu but it's too expensive compared to the 6950...
  • Ernst56
    I just recently replaced an aging 8800 GTS with the 2GB Twin Frozr 560TI card. I have a large case with 7 fans and with a fan profile running the Twin Frozr at 70%, I can overclock to well past 570 performance.

    Since I got the card, with a game, for $249, I'm very happy. An hour of MW3 or SC2 at max settings shows a max temp of 53C.
  • nhat11
    In the battlefield 3 tests, why aren't the testers testing the settings on Ultra? I don't care about settings on high.
  • borden5
    thanks for great article, does anyone notice the 6950 1gb vs 2gb give same performance even tho at higher resolution ??
  • cleeve
    nhat11: In the battlefield 3 tests, why aren't the testers testing the settings on Ultra? I don't care about settings on high.
    Because none of these cards are fast enough to run on Ultra unless you're going to drop resolution, and nobody buys this class of card to run below 1080p.

    We try to make our benchmark settings realistic, not theoretical.
  • jimmy-bee
    Wow, I hate to see the death of 1920 x 1200 resolution monitor to be replaced by 1080P. But liked this benchmark since I have a 560Ti. Always used Tom's benchmarks to help me decide on video cards.
  • I'm with nhat11.

    I play my BF3 on Ultra settings and 1080p with the 6950 2GB. And this is not "theoretical". So if the framerate is 10 fps, everybody should know.

  • dontcrosthestreams
    "the nordics".......skyrim joke please.
  • wolfram23
    I always find it almost shocking that the 6950 1gb and 2gb models have basically identical framerates even at 2560x1600 in all of these super demanding games. Do we really need more than 1gb VRAM? I always think about going triple monitors, and always think my 1gb is going to be a drawback...