New GeForce GTX 260 to Feature 216 Shaders - Beats Radeon 4870

September 2, 2008 9:33:19 AM



Tuesday, 02 September 2008

The current GeForce GTX 260 has 192 shaders and it looks like the new one will have 216. The story is quite simple: the GT200 core has ten clusters, each featuring 24 shaders, and instead of the eight clusters enabled on the old GTX 260, Nvidia will enable nine.
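
As a quick sanity check, the cluster arithmetic works out. A minimal sketch in Python, assuming the article's figures of ten clusters at 24 shaders each:

```python
# Cluster math from the article above: GT200 carries 10 clusters of 24
# shaders each; the cards differ only in how many clusters are enabled.
SHADERS_PER_CLUSTER = 24

for enabled, card in [(8, "current GTX 260"),
                      (9, "rumoured new GTX 260"),
                      (10, "GTX 280")]:
    print(f"{enabled} clusters -> {enabled * SHADERS_PER_CLUSTER} shaders ({card})")

# 8 clusters -> 192 shaders (current GTX 260)
# 9 clusters -> 216 shaders (rumoured new GTX 260)
# 10 clusters -> 240 shaders (GTX 280)
```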

The new card should be available at some point in September, and it definitely wins over the Radeon 4870. That is the whole point, and it looks like the new GTX 260 will end up around $50 more expensive than the current one.

It looks like this is the first "new" product that will try to consolidate the roadmap in the war against ATI. ATI came out strong, and this time Nvidia has a better answer than simply renaming its current products: it will slightly alter them and market them as new.

http://www.fudzilla.com/index.php?option=com_content&task=view&id=9209&Itemid=1
September 2, 2008 10:25:20 AM

So, is this run of the GTX 260 on the newer 55nm process, or is it just more of the same with some new features enabled? Why didn't nVidia enable nine clusters with the initial release? At least they could make the MSRP the same as existing 260s. How much performance will an additional 24 shaders really buy for +/-$50? Whether it wins over the 4870 has yet to be seen. GIVE US THE REVIEWS AND BENCHES!
September 2, 2008 10:29:46 AM

No die shrink, no DX10.1 support, no faster RAM, slightly more shaders, and $50 more.

Doesn't seem worth it at all. I think Nvidia is just wasting time; they should get their team to work hard on the GTX 300 series instead. 40nm die shrink, DX10.1 support, etc.
September 2, 2008 10:29:56 AM

Call it the GTX 270, Nvidia! I've never heard anything more confusing than their lineup.
September 2, 2008 10:52:54 AM

Just curious: can't we just unlock the current GTX 260 with a simple BIOS flash?
September 2, 2008 11:23:12 AM

I agree with blackwidow_rsa. Please, for the love of all that is good, make it the GTX 270. The whole point of going over to this new naming scheme was to make things simpler. Simpler my ass! This will be like the 8800GTS all over again. Let's count the versions of the 8800GTS:

8800GTS 640MB G80
8800GTS 320MB G80
8800GTS 640MB SSC G80
8800GTS 512 G92

Not to forget that the 9800GTX is an 8800GTS 512 G92 with higher clocks, so then you could add:

9800GTX
9800GX2
9800GTX+

Nvidia, stop this BS! It's annoying, and it misleads customers who aren't extremely computer savvy.
September 2, 2008 12:19:14 PM

DX10.1 is nothing; there's no real difference between DX10 and 10.1. We already know that in a lot of games DX10 doesn't improve the graphics; only the support for SM4 increases performance.
September 2, 2008 12:27:57 PM

Make this a 55nm card for heat/noise's sake, and I will drop my buggy, hot, noisy 4870 like a stone.
September 2, 2008 12:29:50 PM

AuDioFreaK39 said:
http://www.fudzilla.com/images/stories/Logos/geforce.jp...

Tuesday, 02 September 2008

The current GeForce GTX 260 has 192 shaders and it looks like the new one will have 216. The story is quite simple: the GT200 core has ten clusters, each featuring 24 shaders, and instead of the eight clusters enabled on the old GTX 260, Nvidia will enable nine.

The new card should be available at some point in September, and it definitely wins over the Radeon 4870. That is the whole point, and it looks like the new GTX 260 will end up around $50 more expensive than the current one.

It looks like this is the first "new" product that will try to consolidate the roadmap in the war against ATI. ATI came out strong, and this time Nvidia has a better answer than simply renaming its current products: it will slightly alter them and market them as new.

http://www.fudzilla.com/index.php?option=com_content&ta...

:sarcastic:  Whatever.
September 2, 2008 12:38:20 PM

As stated above...
New name, please.

Changing the product but not the name is beyond words.
September 2, 2008 12:43:48 PM

HTDuro said:
DX10.1 is nothing; there's no real difference between DX10 and 10.1. We already know that in a lot of games DX10 doesn't improve the graphics; only the support for SM4 increases performance.


In this you are mistaken. DX10.1 is what DX10 was supposed to be, except M$ catered to nVidia, who won't/can't meet the spec. DX9 to DX10 looks so unimpressive because of this.
September 2, 2008 12:51:32 PM

Quote:
In this you are mistaken. DX10.1 is what DX10 was supposed to be, except M$ catered to nVidia, who won't/can't meet the spec. DX9 to DX10 looks so unimpressive because of this.

true

And another thing: the 4870 is faster than the GTX 280 with 8x AA in most games. I don't know about you, but I start with 8x AA and then go up; I don't even bother with 4x AA or lower. If you like your games looking jaggedy, then yes, the GTX 280 is faster. The GTX 270, if it were to be called that, would still be slower than the GTX 280, so why would I want to pay more to get less? So I can say I have an Nvidia card? I don't care about the name, only about the performance.

Nvidia lost this round as much as it won with the G80 chip.
September 2, 2008 12:58:17 PM

I think they should call it... the G280-minus! That'd at least be different. Then people would know it's not a G260; it's sorta like a G280, but isn't, sorta like the saying "same thing but different". And then, because they haven't sold anywhere near as many G280s as they'd thought they would, nor anywhere near the pricing they hoped they could, all they'd have to do is add a minus. Easily done. Maybe then they could recoup some of their projected monies from your pocket on the cheap. 8800GS. 8800GTS 640 with more shaders. I'm just glad for nVidia that most of their numbers are even; otherwise they'd really be in a pickle as to how to name this thing.
September 2, 2008 1:03:52 PM

invisik said:
just curious cant we just unlock the current gtx260 with a simple bios flash?

No :( 

I agree with everyone here, this card should be called the GTX270.

To the DX10.1 naysayers: better MSAA support with deferred rendering alone is worth the price of admission, IMO. That is, if developers would get on board... and stay (Ubisoft :fou:  ).
September 2, 2008 1:17:45 PM

The_Abyss said:
Make this a 55nm card for heat/noise's sake, and I will drop my buggy, hot, noisy 4870 like a stone.


Have you tried the driver mod for the fans? I have two 4870s in CrossFire; they idle at 50 and 45 degrees and are still virtually inaudible.
September 2, 2008 1:50:03 PM

eodeo said:
Quote:
In this you are mistaken. DX10.1 is what DX10 was supposed to be, except M$ catered to nVidia, who won't/can't meet the spec. DX9 to DX10 looks so unimpressive because of this.

true

And another thing: the 4870 is faster than the GTX 280 with 8x AA in most games. I don't know about you, but I start with 8x AA and then go up; I don't even bother with 4x AA or lower. If you like your games looking jaggedy, then yes, the GTX 280 is faster. The GTX 270, if it were to be called that, would still be slower than the GTX 280, so why would I want to pay more to get less? So I can say I have an Nvidia card? I don't care about the name, only about the performance.

Nvidia lost this round as much as it won with the G80 chip.


AA is not that important once you pass a certain resolution threshold. At around 1600x1200 (or 1680x1050 widescreen), more than 2x is really pointless. You are better off spending the processing power on things like HDR.
8x is absolutely useless unless you are playing at 800x600.
September 2, 2008 1:57:28 PM

crimsonfilms said:
AA is not that important once you pass a certain resolution threshold. At around 1600x1200 (or 1680x1050 widescreen), more than 2x is really pointless. You are better off spending the processing power on things like HDR.
8x is absolutely useless unless you are playing at 800x600.


Not true, it depends on your display too. Depending on your pixel pitch, you may notice a huge difference between 2x AA and 8x AA. This is especially true when using HDTVs. I'm currently using an HDTV, and the more AA I can push the better. Every game I play must have at least 2x AA for it to look halfway decent.

Also, like above posters have mentioned, there's a pretty big difference between DX10 and DX10.1. DX10.1 is what was promised to us by Microsoft: better performance plus better graphical features and quality. What we got was unified shaders with some extra lighting features, minus a ton of other things. In Assassin's Creed, we saw a huge performance increase for the HD 3870/3850 cards when using DX10.1. So far that's the only game to support it, because Nvidia was the first to release a DX10 GPU, so Microsoft went along with the way Nvidia was doing it, which isn't true to the actual spec.
September 2, 2008 2:14:16 PM

IndigoMoss said:
Not true, it depends on your display too.


Then that is a monitor issue. Your typical computer LCD does not have this issue; low-cost HDTVs do. Pixel pitch reduces perceived resolution, so my statement is still correct: at a certain resolution AA is inefficient. So if your monitor can't achieve that resolution....
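
For what it's worth, the pixel-pitch disagreement is easy to put numbers on. A minimal sketch (the two display sizes below are illustrative examples, not anyone's actual setup):

```python
import math

def ppi(width_px, height_px, diagonal_inches):
    """Pixels per inch from resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_inches

# A typical desktop LCD vs. a living-room HDTV of the era (example sizes).
print(f"20in 1600x1200 LCD:  {ppi(1600, 1200, 20):.0f} PPI")  # ~100 PPI
print(f"37in 1920x1080 HDTV: {ppi(1920, 1080, 37):.0f} PPI")  # ~60 PPI
```

At roughly 60 PPI the HDTV's pixels are nearly twice as large, so edge aliasing that 2x AA hides on a 100 PPI desktop panel can remain plainly visible; both posts can be right for their own displays.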


IndigoMoss said:
Also, like above posters have mentioned, there's a pretty big difference between DX10 and DX10.1.


No, there is not a big difference. It is hype plus a little fact that many ATi supporters keep pressing; it is making mountains out of molehills. The real issue is proper implementation of DX10. As programmers and devs write more efficient code, you will get better DX10 results.
September 2, 2008 2:21:44 PM

AuDioFreaK39 said:
and it definitely wins over the Radeon 4870.



fud... FUUUUD!!!!

Let's see some ACTUAL results, THEN I'll be impressed. Til then, screw you nvidia. (Give me good performance at a reasonable price and I'm all over that ****....)
September 2, 2008 2:39:26 PM

HTDuro said:
DX10.1 is nothing; there's no real difference between DX10 and 10.1. We already know that in a lot of games DX10 doesn't improve the graphics; only the support for SM4 increases performance.



The issue, as I see it, with DX10.1 is that it has not gained wide support from the game makers yet. And I say "yet" for a reason: given that DX11 is realistically at least two years away from consumers, DX10.1 is/will be the de facto API. Bet on seeing more games supporting DX10.1 within the next year.

The real shame about DX10 & DX10.1 is the poor perception and slowed adoption of Vista.

I haven't seen any real articles effectively comparing DX10 to DX10.1 to say whether one is better than the other. But I do know that Crysis running in Vista using DX10 is far more amazing to look at than Crysis in XP using DX9c. I'm actually considering another boot partition and a GPU upgrade just so I can replay Crysis with the eye candy.
September 2, 2008 3:14:00 PM

chunkymonster said:
The issue, as I see it, with DX10.1 is that it has not gained wide support from the game makers yet. And I say "yet" for a reason: given that DX11 is realistically at least two years away from consumers, DX10.1 is/will be the de facto API. Bet on seeing more games supporting DX10.1 within the next year.


Exactly. It is not DX10 vs. DX10.1.
The real issue is DX9 vs. DX10. It is the code. If you understand the difference between DX10 and DX10.1, then you KNOW the difference is not huge, at least when compared to DX9 vs. DX10.

Until devs get proficient with DX10, some people will have the wrong perception that somehow DX10.1 is a huge factor. It is not.
September 2, 2008 3:28:29 PM

I bet by the time this is out, ATI will have most or all of the lineup already released... the 4850 X2 and the 4870 XOC edition, the one with water cooling... I'M WAITING FOR THAT.

Who knows, they probably even have the 4870 X2 1GB edition ready. I hope I can afford this junk, lol.
September 2, 2008 3:28:39 PM

Doesn't DX10.1 only add performance once AA is applied?
September 2, 2008 3:36:21 PM

chunkymonster said:
I haven't seen any real articles effectively comparing DX10 to DX10.1 to say whether one is better than the other. But I do know that Crysis running in Vista using DX10 is far more amazing to look at than Crysis in XP using DX9c. I'm actually considering another boot partition and a GPU upgrade just so I can replay Crysis with the eye candy.


After reading people saying things like that, I think Microsoft owes the makers of Crysis some major $$$. What other compelling (gaming-related) reason is there to upgrade? I've already got Vista Home Premium so don't anyone start up with me on that... I'm just asking... what was the killer, must-have game that really required Vista to shine? Crysis is really all I'm hearing.
September 2, 2008 3:46:33 PM

blackwidow_rsa said:
Doesn't DX10.1 only add performance once AA is applied?
If developers take advantage of tessellation, we could see HUGE performance and quality increases.
September 2, 2008 4:02:04 PM

DX10.1 is what DX10 should have been, which Microsoft failed to achieve when they claimed a performance increase of over 2x or something.
And we all saw the performance difference with DX10.1: Assassin's Creed runs a LOT faster on a DX10.1 card than on DX10. I am obviously talking about the version with DX10.1 support.

September 2, 2008 4:02:31 PM

rodney_ws said:
After reading people saying things like that, I think Microsoft owes the makers of Crysis some major $$$. What other compelling (gaming-related) reason is there to upgrade? I've already got Vista Home Premium so don't anyone start up with me on that... I'm just asking... what was the killer, must-have game that really required Vista to shine? Crysis is really all I'm hearing.


You're right, I partially mentioned Crysis because it is the game everyone else mentions. Crysis in and of itself as a game is nothing super spectacular. HL and HL2 had a guy in a suit that gave him extra "powers", so nothing new there (but the idea of a nano-suit is friggin' bad-a$s), the gameplay in Crysis is/was nothing extraordinary compared to any other shooter, and the story line is/was just mediocre.

So, Crysis as a game is nothing super special; it's just the first game that has been popularized to showcase DX10 support, as well as all the hype nVidia and M$ threw behind it.

You're right, what other compelling reason, aside from gaming, is/was there to install/upgrade to Vista and DX10?! None that I can think of... I am still very disappointed that DX10/DX10.1 isn't supported by XP. But how else was M$ supposed to force Vista on the masses aside from OEMs?

As far as DX10 games go, Assassin's Creed looks fantastic, and so do Gears of War and Age of Conan.

I impatiently await the release of StarCraft 2...
September 2, 2008 5:30:37 PM

Heyyou27 said:
If developers take advantage of tessellation, we could see HUGE performance and quality increases.


Add to that better buffer management for deferred shading, material management, rendering from MSAA buffers, and virtualized memory, and almost all of the performance improvements we were expecting from DX10 ended up in DX10.1.

Remember when the Company of Heroes DX10 patch came out, most of us said, "WTF, did they forget all the performance tweaks?" It was little image changes and a great performance hit. IMO, DX10.1 offers that second part of the equation people wanted: improved graphics quality WITH IMPROVED PERFORMANCE.

But we still need devs to make use of them. It may take until the first DX11 games for it to be a killer app, but hopefully they can make good use of it before then.

September 2, 2008 5:41:36 PM

AuDioFreaK39 said:

The new card should be available at some point in September


Meaning their 55nm cards are far off. Obviously they are running into problems if their solution is to unlock some shaders on the still-expensive G200 chip rather than to start selling a significantly cheaper and faster 55nm refresh.

So all the hype/hope for 55nm by September seems unlikely now. The bigger question is when it will arrive, and whether it can do anything to challenge the X2; a small boost in shaders alone sure doesn't seem like it's going to be convincing as a later launch. It's more likely to continue the win-some/lose-some situation we have now, maybe breaking or creating a tie compared to before.

I'm also skeptical that adding a few shaders to the GTX 260 helps all that much. In the shader department the GTX 280 is below the HD 4850 (see the rough numbers below), so games that were shader- and bandwidth-bound will still favour the HD 4870, while those that are texture-, ROP- and VRAM-bound will still favour the GTX 260/280.
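
For a rough sense of the shader-department claim, here is a back-of-envelope comparison using the commonly quoted 2008 stock clocks and each vendor's own peak-FLOP counting convention (MAD+MUL = 3 flops/clock for Nvidia, MAD = 2 for ATI). These are marketing-style peak numbers, not game performance:

```python
# Theoretical peak shader throughput: shaders * shader clock (GHz) * flops/clock.
cards = {
    "GTX 260 (192 SP)":  (192, 1.242, 3),
    "GTX 260+ (216 SP)": (216, 1.242, 3),
    "GTX 280 (240 SP)":  (240, 1.296, 3),
    "HD 4850 (800 SP)":  (800, 0.625, 2),
    "HD 4870 (800 SP)":  (800, 0.750, 2),
}
for name, (sp, ghz, flops_per_clock) in cards.items():
    print(f"{name}: {sp * ghz * flops_per_clock:.0f} GFLOPS peak")

# GTX 260: ~715, GTX 260+: ~805, GTX 280: ~933, HD 4850: 1000, HD 4870: 1200
```

Even the unlocked 216-shader card stays below the HD 4850's peak on paper, which is why the extra cluster alone looks unconvincing.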

All this essentially does is make any tie or close battle a little bit better for the GTX260+ or whatever it's called, nothing revolutionary.
And for $50 more, would you even bother?

Seriously, this news is as boring as the GF8800GTS-SSC with 112 SPUs; the real news then was the G92, and it's the same situation here. Who cares about the 260+? The 55nm refreshes are what really matter; until then it's window dressing on a delay.
September 2, 2008 7:39:02 PM

invisik said:
Just curious: can't we just unlock the current GTX 260 with a simple BIOS flash?



In spite of what homerdog said, I'm willing to question this, as it's not the first time this has happened.

A GTX 260 is a GTX 280 with some shader units disabled, NOT removed completely.

Likewise, the GTX 260's clocks can ramp WAY higher than stock; actually, they can pretty much hit nearly any clock speed that the GTX 280 can, on average.

I'd be willing to bet that you could flash a GTX 260 with a GTX "260+" BIOS; unless the manufacturing process literally laser-cuts the shader groupings to damage them so that they aren't usable, I don't see why you couldn't. It's already been said that "defective" GTX 280 shader cores were in turn binned into GTX 260s.
September 2, 2008 7:39:59 PM

Ape, I know DX11 adds tessellation, and I also know the 4850/4870 has a tessellator. Will that make them DX11 compliant?
September 3, 2008 2:26:49 AM

ovaltineplease said:
I'd be willing to bet that you could flash a GTX 260 with a GTX "260+" BIOS; unless the manufacturing process literally laser-cuts the shader groupings to damage them so that they aren't usable, I don't see why you couldn't. It's already been said that "defective" GTX 280 shader cores were in turn binned into GTX 260s.

That's basically it, they're physically disabled and there is no way ever ever ever to bring them back. I'm afraid the good old days of unlocking 'pipes' are gone :( 
September 3, 2008 2:30:17 AM

rangers said:
Ape, I know DX11 adds tessellation, and I also know the 4850/4870 has a tessellator. Will that make them DX11 compliant?

Has it been confirmed that the tessellator in RV770 is DX11 compliant?
September 3, 2008 2:35:32 AM

homerdog said:
Has it been confirmed that the tessellator in RV770 is DX11 compliant?



That's what I would like to know.
September 3, 2008 3:37:32 AM

homerdog said:
That's basically it, they're physically disabled and there is no way ever ever ever to bring them back. I'm afraid the good old days of unlocking 'pipes' are gone :( 



I QQ endlessly

Ok, I feel better now.
September 3, 2008 9:00:49 AM

I posted elsewhere that DX10.1 is getting closer and closer, and eventually denying it will be to nVidia's disgrace, being seen as cavalier again. Even if the games coming out that'll use DX10.1 aren't uber games, there'll be more than just one instance for comparisons across manufacturers. With three or four such examples, the tide could turn quickly, and not having DX10.1 support could become ugly. If this card comes out at the same price as the current 260, and they lower the price of the 260, then there'd be real competition; but unless it does, it's another sticker-change/get-more-money attempt by nVidia.
September 3, 2008 12:18:59 PM

JAYDEEJOHN said:
I posted elsewhere that DX10.1 is getting closer and closer, and eventually denying it will be to nVidia's disgrace, being seen as cavalier again. Even if the games coming out that'll use DX10.1 aren't uber games, there'll be more than just one instance for comparisons across manufacturers. With three or four such examples, the tide could turn quickly, and not having DX10.1 support could become ugly. If this card comes out at the same price as the current 260, and they lower the price of the 260, then there'd be real competition; but unless it does, it's another sticker-change/get-more-money attempt by nVidia.

If DX10.1 could somehow be patched into the current UE3 games, I would go out and buy a 4870 today. Just imagine BioShock and Mass Effect with true MSAA that works right, doesn't have to be forced in the driver, and doesn't kill performance :love: 

S.T.A.L.K.E.R. too :sol: 
September 3, 2008 12:23:08 PM

Because ATI aren't about making money, advertising 10.1 when there's nothing really around that uses it, and people who upgrade every 12 months will have changed cards by the time there is.

I wish there would be some balance posted here sometimes.
September 3, 2008 1:28:42 PM

The_Abyss said:
Because ATI aren't about making money, advertising 10.1 when there's nothing really around that uses it, and people who upgrade every 12 months will have changed cards by the time there is.

I wish there would be some balance posted here sometimes.


People stick with a card as long as there's not a "good" upgrade for them; they're not bound to a fixed amount of time. There are non-price-sensitive fellows around, but I wouldn't say they're the majority.

DX10.1 has been out for a long time, can't deny it, but because nVidia was selling well and ATi wasn't, DX10.1 games were an illusion. Now that ATi is on top and nVidia has to struggle, we might see the tables turn and actually get some DX10.1 titles. Maybe not big ones, but titles to show off DX10.1 and its performance (like Crysis did for DX10).

Esop!
September 3, 2008 2:18:27 PM

I'll never understand the negative, nonchalant attitude towards DX10.1. Like was said, it was part of the original DX10 package, changed due to certain circumstances, added on later, and in the only example we have it shows high promise. Do people feel that way because nVidia doesn't have cards for it? Let's put the shoe on the other foot. Since Crysis is really the only true example of DX10, should we care about it as well, or forsake DX10 altogether? If Crysis were DX10.1, it might be playable today, certainly closer than what we have. Maybe if devs saw a desire in the gaming community for it, it would have some effect, being exclusive, which I think AC tried to do, but it was shot down. And by whom? Too much influence, and bad at that.
September 3, 2008 2:56:07 PM

JAYDEEJOHN said:
Let's put the shoe on the other foot. Since Crysis is really the only true example of DX10, should we care about it as well, or forsake DX10 altogether?


Most people couldn't care less about DX10 because Microsoft made it Vista-only. And if DX10 is irrelevant to most people because they're not running Vista, DX10.1 is pointless.

Eventually most people will end up on Vista or 'Windows 7', but it will still be a couple of years before DX10-only games make any sense to developers; and by then we'll probably have DX11 anyway (which, the way they're currently going, Microsoft will only release on 'Windows 7').
September 3, 2008 3:28:31 PM

Sorry to say, but XP is slowly losing out, and faster and faster as time goes by. So XP DX9 won't matter; it'll be history. How much longer can we deny DX10.1? Or Vista? Soon a few games will be out, and we will see how well it's accepted by the gaming community. I'm sure you know the whole story behind DX10, and you know that it's impossible to run it as-is on XP. What I don't understand is this: if you buy a new CPU, you may have to buy a new mobo for it. That's been accepted. I'm not an M$ fan, but I am all for going forward in tech; this isn't forward-looking. All I heard before was that nVidia would just muscle its way, raw power, yadda yadda. Now what's happened? ATI is out-muscling and out-hustling nVidia, and guess what? It still has DX10.1.
September 3, 2008 4:21:01 PM

MarkG said:
Eventually most people will end up on Vista or 'Windows 7', but it will still be a couple of years before DX10-only games make any sense to developers; and by then we'll probably have DX11 anyway (which, the way they're currently going, Microsoft will only release on 'Windows 7').

DX11 will be available on Vista.

It isn't just Vista keeping devs away from DX10, it's the hardware too. There are still a lot of DX9 cards out there.
September 3, 2008 4:45:30 PM

chunkymonster said:


As far as DX10 games go, Assassin's Creed looks fantastic, and so do Gears of War and Age of Conan.

I impatiently await the release of StarCraft 2...


Is AoC even DX10? I've got Vista, a DX10 GPU and AoC and I don't recall seeing any options for it. I don't play it any more, but I was curious.
September 3, 2008 4:47:00 PM

Never mind... I answered my own question. So there's one LESS reason to make the switch to Vista, LOL. Oh well, I've already got AoC AND Vista. Drat.

Funcom has regrettably announced that the DirectX 10 version of its MMO, Age of Conan, will not ship with the initial launch. Funcom says it has decided to ship only the DirectX 9 version and spend more time on building a DirectX 10 version "worthy of Microsoft's great vision for the future of PC gaming".

The extra development time will give Funcom time to implement more features in the DirectX 10 version of Age of Conan than originally planned.

The new, enhanced DX10 version will be premiered at Games Convention in Leipzig in August 2008. A special preview will be unveiled this summer at Nvidia's NVISION event in San Jose, California, August 25 - 27, 2008.
September 3, 2008 6:27:06 PM

JAYDEEJOHN said:
Sorry to say, but XP is slowly losing out, and faster and faster as time goes by. So XP DX9 won't matter; it'll be history.


XP will still be popular by the time 'Windows 7' is released, if Microsoft actually keep to their schedule (it's late 2009, isn't it?). Anyone releasing a DX10 game will either have to abandon the XP market completely, or write a DX9 renderer too.

Quote:
How much longer can we deny DX10.1? Or Vista?


Probably at least until 'Windows 7' comes out unless it's delayed as much as Vista was. Most graphics chips sold are either Intel or Nvidia, so an ATI-only DX10.1 is not a very interesting proposition for most games companies; if they already have to write a DX10 renderer and a DX9 renderer, why bother writing a DX10.1 renderer as well?

Quote:
Im sure you know the whole story behind DX10, and you know that its impossible to run it as is on xp.


LOL, I know a lot more than that. The only thing stopping DX10 running on XP is Microsoft; and not allowing it to run on XP was an absolutely colossal blunder on their part.
September 3, 2008 6:45:19 PM

Hi Guys

Which of these cards will give better performance with games yet to be released: the GTX 260 or the HD 4870? DX10.1/11 excluded.
September 3, 2008 8:35:47 PM

keyter said:
Hi Guys

Which of these cards will give better performance with games yet to be released: the GTX 260 or the HD 4870? DX10.1/11 excluded.

I would answer that but I seem to have misplaced my crystal ball.