192-bit vs 256-bit

Last response: in Graphics & Displays
August 17, 2012 6:06:55 PM

This is a major noob question, but don't laugh - I'm new :p 

The new GTX 660 Ti memory interface is 192-bit, but the older GTX 560 Ti is 256-bit. I don't know what kind of performance difference this makes or how to read the benchmark tests that have been run - I'm very new to hardware. I'm working on my first build; all I have left is to pick a card that I'll be happy with for a few years. I'll mostly be using Photoshop, Illustrator, and InDesign (all at the same time) every day, but I can see myself getting back into PC gaming. From what I've read, the more CUDA cores, the better the performance. But with a smaller 192-bit interface, can I expect some noticeable performance hits? For example, if I were to start playing WoW again, would I be able to easily run everything on ultra?


August 17, 2012 6:17:12 PM


I have seen speculation that it really hurts performance with anti-aliasing.

For example, the 660 Ti has a narrower memory bus, and it looks like that shows up when anti-aliasing is used. When it's not used, it's closer in performance to the 670.
August 17, 2012 6:27:28 PM

From what I've seen, the smaller interface will cut down on memory bandwidth. But in everything else I've seen between the two cards, the 660 literally mops the floor with the 560.
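To put rough numbers on that bandwidth difference: peak memory bandwidth is just the bus width in bytes times the effective memory data rate. A quick Python sketch using the stock reference data rates (factory-overclocked cards will differ):

```python
def peak_bandwidth_gbs(bus_width_bits, data_rate_gtps):
    # Bytes per transfer (bus width / 8) x billions of transfers per second
    return bus_width_bits / 8 * data_rate_gtps

for name, width_bits, rate_gtps in [
    ("GTX 560 Ti", 256, 4.008),
    ("GTX 660 Ti", 192, 6.008),
    ("GTX 670",    256, 6.008),
]:
    print(f"{name}: {peak_bandwidth_gbs(width_bits, rate_gtps):.1f} GB/s")
```

So despite the narrower 192-bit bus, the 660 Ti's much faster GDDR5 still gives it more raw bandwidth than the 560 Ti (about 144 vs 128 GB/s), though well short of the 670.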
August 17, 2012 6:36:04 PM

michaeljhuman said:
I have seen speculation that it really hurts performance with anti-aliasing.

For example, the 660 Ti has a narrower memory bus, and it looks like that shows up when anti-aliasing is used. When it's not used, it's closer in performance to the 670.

Scary, anti-aliasing is obviously a huge concern in graphic design...certainly something to think about.

@gamerkila57 I was looking at those when I first started looking around, but the $300 660 Ti is already over the budget I've set for myself. That's why I was going back to the 560 Ti, which, like @swifty_morgan said, gets destroyed by the 660 Ti. I'm just not sure what to do here :( 
August 17, 2012 6:49:10 PM

Unless your video apps use CUDA (which is nVidia-only), take a look at the HD7850 for games. It's also a step up from the GTX560Ti. The higher your resolution, the less likely you'll be to "need" a lot of anti-aliasing, but if you want high settings there, the GTX660Ti is not the way to go. If your budget is lower, the GTX560Ti should do a nice job up to 1920x1080. I've read articles on gaming builds with a mere HD6850, and with just a few settings on High (rather than UltraMaxOhWow!), they get smooth and playable frame rates.
August 17, 2012 6:53:55 PM

A 660 Ti has the same CUDA core count and clocks as a 670, but with a crippled 192-bit memory bus. So a comparison of benchmarks between a 660 Ti and a 670 should tell you what you need to know: http://www.anandtech.com/bench/Product/647?vs=598

You can see the difference in frames per second in those benchmarks; that gives a rough idea of the difference the narrower memory interface makes. The 560 Ti is a fine card, though - you can use that same site to compare the 660 Ti to the 560 Ti and see if you want to upgrade. I think a 560 Ti is fine for most people.

But yeah, if you run WoW, a 660 Ti will run Ultra @1920x1080 no problem.
August 17, 2012 7:00:16 PM

The memory interface width largely dictates how fast the 660 Ti can move data in and out of its 2GB of GDDR5 VRAM. The width is set by the number of 64-bit memory controllers on the GPU, not by the amount of VRAM: the 660 Ti has three active controllers (3 x 64-bit = 192-bit), while the 670 keeps all four (256-bit). To hang 2GB off a 192-bit bus, NVIDIA uses a mixed-density memory arrangement, with the first 1.5GB interleaved across all three controllers and the remaining 0.5GB sitting on just one of them.

Just for comparison

7850 and 7870 reference cards = 256-bit - 2GB of GDDR5

GTX 670 = 256-bit interface width


The 192-bit memory interface will limit anti-aliasing and anisotropic filtering performance, and other bandwidth-heavy eye candy, and will become even more constricting when you move to multi-monitor setups while keeping the same AA and AF settings.

The card itself is fantastic, and has definitely shaken up the GPU competition, leaving the HD 7870s and 7850s in a tough spot, and becoming arguably the best value card on the market.

As far as graphic design goes, most high-end cards are more than sufficient to run any video-rendering/3D-modeling software well, but if you want to be absolutely sure, you can go for the $350 HD 7950, which offers a bit better performance than the 660 Ti at stock and a much wider memory interface (384-bit).
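A minimal sketch of that controller arithmetic (64 bits per GDDR5 memory controller is the standard building block on these GPUs; the controller counts below are the known configurations for each card):

```python
CONTROLLER_WIDTH_BITS = 64  # each GDDR5 memory controller is 64 bits wide

def bus_width_bits(active_controllers):
    # Total interface width is just active controllers x 64-bit
    return active_controllers * CONTROLLER_WIDTH_BITS

print(bus_width_bits(3))  # 192 -> GTX 660 Ti (one controller short of full)
print(bus_width_bits(4))  # 256 -> GTX 670 / HD 7850 / HD 7870
print(bus_width_bits(6))  # 384 -> HD 7950
```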
August 17, 2012 7:03:24 PM

Onus said:
Unless your video apps use CUDA (which is nVidia-only), take a look at the HD7850 for games. It's also a step up from the GTX560Ti. The higher your resolution, the less likely you'll be to "need" a lot of anti-aliasing, but if you want high settings there, the GTX660Ti is not the way to go. If your budget is lower, the GTX560Ti should do a nice job up to 1920x1080. I've read articles on gaming builds with a mere HD6850, and with just a few settings on High (rather than UltraMaxOhWow!), they get smooth and playable frame rates.



The 6850 won't render PhysX; perhaps that was one of the reasons?

AA is still needed at 1920x1080. 4x usually does it for me.

I would never buy a card with 1GB of VRAM for gaming above 1600 res (especially in DX10/11).
July 26, 2013 8:23:16 AM

Hey buddy, I'll say go with the GTX 560 Ti 2GB 256-bit. Great card, best performance, plays all games on ultra high settings ;) 

I have a Core i3 with 4GB RAM and a 560 Ti, and I've played the heaviest PC games (Battlefield 3, GTA 4, Crysis 3) on ultra high with 16x anti-aliasing.

So it's quite simple: go with the GTX 560 Ti.
July 30, 2013 8:00:29 AM

Just got my GTX 660 last week and I've been testing it since then. For resolutions up to 1920x1200, you're golden until you start applying 8xMSAA or texture mods, because with the 192-bit memory interface you can only make effective use of 1.5GB of the VRAM. The card will essentially throttle once you hit about 1510MB and try not to allocate any more, and you'll start getting lots of input lag (because the effective memory speed on that last chunk drops to a third).

However, no maxed stock game will ask for more than 1.5GB even at 1920x1200 unless you use 8xMSAA or higher, so the 192-bit GTXes are still great choices for gaming. If you have the means, go for the GTX 760, though.
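For what it's worth, that "3x" figure lines up with the asymmetric memory layout on the 192-bit 2GB cards, assuming the first 1.5GB is striped across all three 64-bit controllers and the last 0.5GB sits on a single one. A back-of-the-envelope sketch (reference memory clock assumed, not a measured figure):

```python
RATE_GTPS = 6.008  # effective GDDR5 data rate on a reference GTX 660

def bandwidth_gbs(bus_bits):
    # Bytes per transfer x billions of transfers per second
    return bus_bits / 8 * RATE_GTPS

fast = bandwidth_gbs(192)  # first 1.5 GB: striped across all 3 controllers
slow = bandwidth_gbs(64)   # last 0.5 GB: served by a single controller
print(f"{fast:.1f} GB/s vs {slow:.1f} GB/s ({fast / slow:.0f}x slower)")
```

That works out to roughly 144 GB/s dropping to 48 GB/s once allocations spill past 1.5GB, which matches the 3x slowdown described above.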
July 30, 2013 9:28:19 AM

swifty_morgan said:
From what I've seen, the smaller interface will cut down on memory bandwidth. But in everything else I've seen between the two cards, the 660 literally mops the floor with the 560.


On a single 1080p monitor that still rings true, no matter how high the AA is cranked up. ;) 