
Nvidia GeForce GTX 560 Ti 448 Core Review: GF110 On A Diet


Ready for a limited-edition graphics card? The GeForce GTX 560 Ti 448 Core just landed. Learn how it differs from the GeForce GTX 560 Ti, why its life will be short, whether it's a decent performer, and what we can do with this thing overclocked.

It’s not often that a graphics card manufacturer goes through the trouble of launching a special, limited-run product just for the holiday season. But that’s exactly what Nvidia is doing with its GeForce GTX 560 Ti 448 Core.

Given the name, you might expect this new card to be an unlocked and enhanced version of Nvidia's existing GeForce GTX 560 Ti. But that's simply not so. Recall that the GF114 graphics processor used in the existing GeForce GTX 560 Ti is already unfettered. All of its 384 cores are functional, leaving no disabled hardware to turn on. Rather, the GeForce GTX 560 Ti 448 Core is equipped with a cut-back GF110.

This GPU was first seen on the company's GeForce GTX 580, slightly handicapped for use in its GeForce GTX 570, and now further trimmed back for the GeForce GTX 560 Ti 448 Core.

GeForce GTX 560 Ti 448 Core Specs:

Compared to the GeForce GTX 580, two Streaming Multiprocessors (SM) are disabled; the GeForce GTX 560 Ti 448 Core utilizes 14 of the GF110’s 16 available SMs. Each functioning SM has 32 shader cores and four texture units. Five of the six 64-bit ROP partitions are left enabled, each capable of handling eight 32-bit integer pixels per clock cycle.
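
To make those numbers concrete, here is a minimal worked sketch (Python, purely illustrative; the constant names are ours, not Nvidia's) deriving the card's totals from the per-SM and per-partition figures above:

```python
# Per-unit figures for GF110, as described above (names are ours, for illustration)
CORES_PER_SM = 32         # shader cores per Streaming Multiprocessor
TEXTURE_UNITS_PER_SM = 4  # texture units per SM
ROPS_PER_PARTITION = 8    # 32-bit integer pixels per clock, per ROP partition
BITS_PER_PARTITION = 64   # memory interface width contributed by each partition

active_sms = 14           # 14 of GF110's 16 SMs are enabled
active_partitions = 5     # 5 of 6 ROP partitions are enabled

print(active_sms * CORES_PER_SM)               # 448 shader cores
print(active_sms * TEXTURE_UNITS_PER_SM)       # 56 texture units
print(active_partitions * ROPS_PER_PARTITION)  # 40 ROPs
print(active_partitions * BITS_PER_PARTITION)  # 320-bit memory interface
```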

All told, the card has 448 shader cores, 56 texture units, 40 ROPs, and a 320-bit memory interface. Not surprisingly, its power demands necessitate two six-pin PCIe power connectors. And because it's one of Nvidia's higher-end boards, the GeForce GTX 560 Ti 448 Core supports two-, three-, and four-way SLI through its pair of SLI connectors. You cannot match it up with a standard GeForce GTX 560 Ti card, of course. It'll only cooperate with other 448-core models. So, if you'd like to run a multi-card configuration, buy these boards at the same time, since they're not expected to remain available.

The GF110 GPU, as found in the new GeForce GTX 560 Ti 448 Core

If this card's specs sound familiar, that's probably because they match Nvidia's now-defunct GeForce GTX 470. You might also notice that the GeForce GTX 560 Ti 448 Core is essentially a GeForce GTX 570 with one SM disabled. And speaking of the GeForce GTX 570, the new card shares its 732 MHz core, 1464 MHz shader, and 950 MHz GDDR5 memory frequencies.

Knowing what we know from past reviews on Nvidia's existing cards, the GeForce GTX 560 Ti 448 Core should perform between the GeForce GTX 560 Ti and the GeForce GTX 570. For more information on the company's line-up, check out the following reviews:

Nvidia GeForce GTX 560 Ti Review: GF114 Rises, GF100 Rides Off
GeForce GTX 570 Review: Hitting $349 With Nvidia's GF110
GeForce GTX 580 And GF110: The Way Nvidia Meant It To Be Played
GeForce GTX 480 And 470: From Fermi And GF100 To Actual Cards!


| | GeForce GTX 560 Ti | GeForce GTX 470 | GeForce GTX 560 Ti 448 Core | GeForce GTX 570 |
| --- | --- | --- | --- | --- |
| Shader Cores | 384 | 448 | 448 | 480 |
| Texture Units | 64 | 56 | 56 | 60 |
| Full Color ROPs | 32 | 40 | 40 | 40 |
| Graphics Clock | 822 MHz | 607 MHz | 732 MHz | 732 MHz |
| Shader Clock | 1644 MHz | 1215 MHz | 1464 MHz | 1464 MHz |
| Memory Clock | 1002 MHz | 837 MHz | 950 MHz | 950 MHz |
| GDDR5 Memory | 1 GB | 1280 MB | 1280 MB | 1280 MB |
| Memory Interface | 256-bit | 320-bit | 320-bit | 320-bit |
| Form Factor | Dual-slot | Dual-slot | Dual-slot | Dual-slot |
| Power Connectors | 2 x 6-pin | 2 x 6-pin | 2 x 6-pin | 2 x 6-pin |
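
As a sanity check on the memory figures, peak bandwidth follows directly from the table: GDDR5 performs four data transfers per clock per pin, so a 950 MHz memory clock on a 320-bit bus gives the 448 Core card the same 152 GB/s as the GeForce GTX 570. A quick illustrative sketch of that arithmetic (Python; the helper function is ours, assuming the standard quad-pumped GDDR5 rate):

```python
def gddr5_bandwidth_gb_per_s(mem_clock_mhz: float, bus_width_bits: int) -> float:
    """Peak GDDR5 bandwidth in GB/s: four transfers per clock per pin."""
    transfers_per_second = mem_clock_mhz * 1e6 * 4  # effective data rate
    bytes_per_transfer = bus_width_bits / 8         # bus width in bytes
    return transfers_per_second * bytes_per_transfer / 1e9

print(gddr5_bandwidth_gb_per_s(950, 320))   # 152.0 (GTX 560 Ti 448 Core / GTX 570)
print(gddr5_bandwidth_gb_per_s(1002, 256))  # ~128.3 (standard GTX 560 Ti)
```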


Nvidia made it clear to us that its GeForce GTX 560 Ti 448 Core isn't a replacement for any existing product. A limited supply exists, and it's exclusive to Asus, EVGA, Gainward, Gigabyte, Inno3D, Palit, MSI, and Zotac. This new card is only available in the USA, Canada, the UK, France, Germany, Russia, and the Nordics.

The circumstances of this board's birth are somewhat strange. Perhaps Nvidia has a small collection of GF110 GPUs with two bad SMs, precluding them from use on a GeForce GTX 570. Or, it could simply be a product intended to fill a gap right before the holidays. It could even be a test case of sorts to see if there's a market for something between the GeForce GTX 560 Ti and 570.

Pressed for more information, Nvidia let us know that our first two suspicions were dead-on. Like any chip manufacturer, Nvidia bins its processors, and it has a number of GF110s with 14 viable SMs. It chose to put them into a limited product to drum up sales over the holiday season, and ta-da: the GeForce GTX 560 Ti 448 Core. No matter how few of these boards end up hitting shelves, though, it'll stand or fall based on its performance per dollar, just like any other graphics card.

Top Comments
  • 13 Hide
    zooted , November 29, 2011 2:52 PM
    This article just makes the 6950 1gb look very attractive.
Other Comments
  • -1 Hide
    borden5 , November 29, 2011 12:45 PM
    this one trades blows with the 6950 2GB and costs about $30 more, hm?
  • 15 Hide
    tmk221 , November 29, 2011 12:46 PM
    nice gpu but it's too expensive compared to the 6950...
  • -6 Hide
    Ernst56 , November 29, 2011 12:47 PM
    I just recently replaced an aging 8800 GTS with the 2GB Twin Frozr 560TI card. I have a large case with 7 fans and with a fan profile running the Twin Frozr at 70%, I can overclock to well past 570 performance.

    Since I got the card, with a game, for $249, I'm very happy. An hour of MW3 or SC2 at max settings shows a max temp of 53C.
  • -7 Hide
    nhat11 , November 29, 2011 12:49 PM
    In the battlefield 3 tests, why aren't the testers testing the settings on Ultra? I don't care about settings on high.
  • 0 Hide
    borden5 , November 29, 2011 12:50 PM
    thanks for the great article. Does anyone notice the 6950 1GB and 2GB give the same performance, even at higher resolutions?
  • 16 Hide
    cleeve , November 29, 2011 12:55 PM
    nhat11: In the battlefield 3 tests, why aren't the testers testing the settings on Ultra? I don't care about settings on high.


    Because none of these cards are fast enough to run on Ultra unless you're going to drop resolution, and nobody buys this class of card to run below 1080p.

    We try to make our benchmark settings realistic, not theoretical.
  • 12 Hide
    jimmy-bee , November 29, 2011 1:04 PM
    Wow, I hate to see the death of the 1920 x 1200 monitor, replaced by 1080p. But I liked this benchmark since I have a 560 Ti. I've always used Tom's benchmarks to help me decide on video cards.
  • 0 Hide
    Anonymous , November 29, 2011 1:16 PM
    I'm with nhat11.

    I play my BF3 on Ultra settings at 1080p with the 6950 2GB. And this is not "theoretical". So if the framerate is 10 fps, everybody should know.

  • 3 Hide
    dontcrosthestreams , November 29, 2011 1:27 PM
    "the nordics".......skyrim joke please.
  • 3 Hide
    wolfram23 , November 29, 2011 1:31 PM
    I always find it almost shocking that the 6950 1gb and 2gb models have basically identical framerates even at 2560x1600 in all of these super demanding games. Do we really need more than 1gb VRAM? I always think about going triple monitors, and always think my 1gb is going to be a drawback...
  • 1 Hide
    badtaylorx , November 29, 2011 1:35 PM
    it really bugs me when Nvidia does this crap!!!

    I'd like to see whether this thing is any better than a Sparkle GTX 560 Ti DF Calibre.

    I highly doubt it.


    the other thing that stands out here is AMD's ever increasing performance on the HD 6970!!!
  • 0 Hide
    lothdk , November 29, 2011 1:42 PM
    On the Zotac page you write

    Quote:
    Zotac’s option is based on its GeForce GTX 570 AMP! Edition card.
    ..
    The 448-core card doesn't get the designation of being one of Zotac's AMP! models


    yet in the conclusion on the same page you write

    Quote:
    The Zotac GeForce GTX 560 AMP! Edition has an MSRP of $299


    Either I am misunderstanding this, or one of those is wrong.
  • -3 Hide
    theconsolegamer , November 29, 2011 1:43 PM
    Ernst56: I just recently replaced an aging 8800 GTS with the 2GB Twin Frozr 560TI card. I have a large case with 7 fans and with a fan profile running the Twin Frozr at 70%, I can overclock to well past 570 performance. Since I got the card, with a game, for $249, I'm very happy. An hour of MW3 or SC2 at max settings shows a max temp of 53C.

    I've used a GTX 560 Ti in school with BF3 and it gets mid-50s Celsius with a room temp of 60F with A/C.
  • 7 Hide
    fulle , November 29, 2011 2:03 PM
    My favorite part of the review was how min FPS values were included for Batman, AND the comment that the game was unplayable in the first set of tests due to unacceptable min values.

    Too many times I see this sort of thing overlooked. Great job!
  • -1 Hide
    helpy , November 29, 2011 2:09 PM
    How the hell were those temperatures so low? I mean, 34C is the idle temp on my MSI R6950 TF3 PE/OC.
  • 5 Hide
    banthracis , November 29, 2011 2:17 PM
    wolfram23: I always find it almost shocking that the 6950 1gb and 2gb models have basically identical framerates even at 2560x1600 in all of these super demanding games. Do we really need more than 1gb VRAM? I always think about going triple monitors, and always think my 1gb is going to be a drawback...


    2560x1600 is 4MP, while three 1080p monitors are 6MP, a 50% increase. This makes a significant difference, especially if you enable AA options. On a CrossFire 5850 setup I used to run, several games simply would not run at all (Shogun 2, GTA IV, Crysis come to mind) at 3240x1920, but would run fine if I lowered the resolution. Switching to a 2x 2GB 6950 setup allowed 3240x1920 to run.

    Remember, most review sites simply do not do multi-monitor reviews. In the cases where they are done, like the HardOCP article below, there are very clear cases where VRAM walls are hit in triple-monitor gaming. In this specific case, tri-fire 6970s were able to beat tri-SLI 580s simply because the 580s didn't have sufficient VRAM, even with 1.5GB.

    Is VRAM an issue at 1080p? No. Don't bother worrying about it. However, if you're using multi-monitor setups, it makes a big difference.
    http://hardocp.com/article/2011/04/28/nvidia_geforce_3way_sli_radeon_trifire_review/2
  • 0 Hide
    Yuka , November 29, 2011 2:36 PM
    Ok, here are the ingredients:

    - Short lived "special" video card
    - XMas season
    - Option to unlock/OC to the next tier
    - Limited quantity

    OK, this might not be to everybody's liking to read, but I think it's a sadistic way to get more green juice out of fanbois. This card will be at GTX 570 levels or more (price-wise). Supply and demand tell me so.

    I don't know if we'll be able to recommend this card at all =/

    Cheers!
  • 0 Hide
    Anonymous , November 29, 2011 2:39 PM
    Why oh why is there never a good low-end, integrated, or at least reasonably priced GPU given for comparison in the charts? The numbers are quite unhelpful until I understand how much money I must spend to get N times the performance. People buy cards only very rarely, and mostly are coming from the low end or at least 2-3 generations back, never from another super card. So at least one low-end comparison would be nice to show what this amount of money can do.
  • 3 Hide
    banthracis , November 29, 2011 2:52 PM
    scatman: Why oh why is there never a good low-end, integrated, or at least reasonably priced GPU given for comparison in the charts? The numbers are quite unhelpful until I understand how much money I must spend to get N times the performance. People buy cards only very rarely, and mostly are coming from the low end or at least 2-3 generations back, never from another super card. So at least one low-end comparison would be nice to show what this amount of money can do.



    You'll have to define reasonably priced. For many enthusiasts, a $250 card is a reasonable price. As for integrated cards, there's no point including them in the test; they simply will not run the majority of these benchmarks. Adding the equivalent of a line saying zero for each of these tests is kinda silly.

    To get >0 for integrated cards you'd need much lower settings, which are no longer representative of the common settings used by gamers and wouldn't let the high-end GPUs distinguish themselves, since they wouldn't be stressed. You'd have shifted the bottleneck to the CPU, and at that point you'll essentially be looking at a CPU performance graph.
  • -9 Hide
    spookyman , November 29, 2011 2:52 PM
    So would a GTX 590 be able to beat it?