Nvidia renames 2xx parts to 3xx

August 29, 2009 9:56:40 AM

If they made the GTX 260 through GTX 275 into midrange cards for the GTX 3xx series, with DX11, higher clocks, and better power efficiency, I personally wouldn't mind.
August 29, 2009 10:46:59 AM

If true, wow. I've been saying for a while that I don't see how Nvidia will handle the move to DX11 well. They never shrank the GT200 down to mainstream parts, and they never supported DX10.1. Their chips are already quite large, and they'd need to get even larger to add the parts DX11 requires. It's been proposed that Nvidia will only support DX11 at the high end, with their midrange/low-end parts being renamed G92/G94 parts.

If their yields are bad and they're just going to rename old parts as new parts to give themselves a full product lineup, then DX11 for the high end only might be true. Worrisome times for Nvidia.
August 29, 2009 10:53:25 AM

Pretty sad.

Even more confusing for the consumer.
August 29, 2009 12:39:21 PM

I saw who the author was and stopped reading.
August 29, 2009 2:17:22 PM

I think this deserves to be reposted considering the subject of the thread.

August 29, 2009 2:24:50 PM

invisik said:
If they made the GTX 260 through GTX 275 into midrange cards for the GTX 3xx series, with DX11, higher clocks, and better power efficiency, I personally wouldn't mind.


Well, then it wouldn't be a rebrand, because what you're describing requires more than a new sticker.
August 29, 2009 2:53:10 PM

invisik said:
If they made the GTX 260 through GTX 275 into midrange cards for the GTX 3xx series, with DX11, higher clocks, and better power efficiency, I personally wouldn't mind.


Well, that's technically not possible. DX11 is a superset of DX10.1, which those cards don't even support; to make a card able to run DX11 you need a hardware modification to make full use of its features (tessellation, for example).

Now, keeping in mind that the GTX 3xx series will be marketed as DX11 cards, it doesn't make sense to undermine that with a GTX 3xx card that doesn't support DX11, even if it's a midrange card. It wouldn't stand a chance against AMD's midrange DX11 cards.
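A concrete way to see why a renamed DX10-class chip can't just be labeled DX11: the Direct3D 11 runtime reports a per-device feature level, and a G92/GT200-class part will only ever enumerate as 10_0 or 10_1, since tessellation and the other 11_0 features simply aren't in the silicon. A minimal sketch using the standard D3D11 device-creation call (error handling trimmed for brevity):

```cpp
#include <windows.h>
#include <d3d11.h>
#include <cstdio>
#pragma comment(lib, "d3d11.lib")

int main() {
    // Ask the runtime for the highest feature level the hardware supports.
    const D3D_FEATURE_LEVEL requested[] = {
        D3D_FEATURE_LEVEL_11_0,  // true DX11 hardware (hull/domain shaders, CS 5.0)
        D3D_FEATURE_LEVEL_10_1,
        D3D_FEATURE_LEVEL_10_0,  // what a rebranded DX10-class chip would report
    };
    D3D_FEATURE_LEVEL got = D3D_FEATURE_LEVEL_10_0;
    ID3D11Device* device = nullptr;

    HRESULT hr = D3D11CreateDevice(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
        requested, (UINT)(sizeof(requested) / sizeof(requested[0])),
        D3D11_SDK_VERSION, &device, &got, nullptr);

    if (SUCCEEDED(hr)) {
        std::printf("Highest supported feature level: 0x%04x\n", (unsigned)got);
        device->Release();
    }
    return 0;
}
```

On 11_0-capable hardware this prints 0xb000; on a DX10 card it tops out at 0xa000/0xa100 no matter what the box says.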
August 29, 2009 3:02:27 PM

With a little hardware modification (DX11 support), wouldn't they be able to use the GTX 2xx series and rename it to the GTX 3xx series, or would the card have to be completely redesigned?
August 29, 2009 4:45:06 PM

invisik said:
With a little hardware modification (DX11 support), wouldn't they be able to use the GTX 2xx series and rename it to the GTX 3xx series, or would the card have to be completely redesigned?


The HD 2xxx through HD 4xxx had a built-in tessellator, but a non-functioning one due to the lack of support on the API side (DX10), and even then those cards wouldn't support the full DX11 feature set. A "little hardware modification" might be the right phrase in that case, but for the GTX 2xx it isn't. I'm not sure it would have to be completely redesigned either, but in both cases this is far from being a rebrand.
August 29, 2009 4:45:12 PM

Depends. If Nvidia has a tessellation engine in the G92 and simply never turned it on, then a "simple" tweak to make it DX11 compliant and active is all it would take. If they never included one, however, the chip would have to be redesigned to include what's needed to support DX11. If they do rename G92 parts as 300 cards, then the lower 300 series won't support DX11.
August 29, 2009 5:05:55 PM

"MX" up the cards is all I have to say.

If this is true, and we have DX10 still dragging its way through this gen, we get what we deserve.

I am, and will be, extremely discouraged if this comes about. It may be time to ditch PC gaming and find greener pastures, and no, not consoles either.
August 29, 2009 5:45:13 PM

'rant'
OK, I've heard enough, seen enough, watched enough, and basically, if we're stuck with DX10, and this is what we're given by the supposed leader in discrete gfx card sales, it's nothing but a shame.
Anyone who supports this action, defends it, well, you'd better kiss nVidia goodbye, because the only thing standing between LRB and everything else is the DX model.
This is my personal opinion, and some may disagree, everyone may, but it's the way I see things trending.
I don't like it either, but that's the way it goes. The only option each one of us has to move gaming along is through our pocketbooks.
They have plans for us: cloud gaming, all SW driven, eventually no discrete cards at all in the way we know them.
You may find yourself in a few years not being able to own a game, just a download.
It may not have better gfx, but it's good enough, it sells, everyone makes money, so... why not?
Maybe the enthusiasm has died in PC gaming, I just don't know, but things like this, rehashing old DX models that aren't even currently up to date, let alone the new one coming, well, to me, it speaks volumes.
I hope I'm wrong, I hope these rumors are also wrong, and we see PC gaming improving in the future, but I'm not so sure anymore. People are complacent about the way things have always gone, and yes, by a lot of claims here on Tom's, from many here, it'll be over 4 years until the new DX model is widely adopted.
In the end, if that's moving forwards, and if that's acceptable to people willing to spend hundreds of dollars per year, and literally thousands over that 4-year stretch, and all we get is to grudgingly inch forwards, one has to ask: is it worth it?

If people accept this, they'll get what they deserve.
end rant
August 29, 2009 6:56:33 PM

Maybe you need to ask them what cards they use, or possibly they're just giving their opinions.
Always post a link if you have one; at least then people can form a better opinion one way or the other.
August 29, 2009 7:00:50 PM

The latest rumors on true DX11 parts are that nVidia ships small quantities of G300 around the second week of November, whereas an ATI x2 product has been rumored for around that same time for a while now.
So it may mean no top-end crown for nVidia at all this time around, not even long enough before they'd have to drastically cut their prices by $200.
August 29, 2009 7:08:41 PM

4745454b said:
If true, wow. I've been saying for a while that I don't see how Nvidia will handle the move to DX11 well. They never shrank the GT200 down to mainstream parts, and they never supported DX10.1. Their chips are already quite large, and they'd need to get even larger to add the parts DX11 requires. It's been proposed that Nvidia will only support DX11 at the high end, with their midrange/low-end parts being renamed G92/G94 parts.

If their yields are bad and they're just going to rename old parts as new parts to give themselves a full product lineup, then DX11 for the high end only might be true. Worrisome times for Nvidia.

Yep, this very well could be the end of Nvidia. And even though I don't like them, we all kind of need them on the market, or else AMD/ATI has free rein over the graphics card market.
I mean, Nvidia still hasn't even gotten to GDDR4 yet and AMD is already at GDDR5.

Things are looking pretty grim...
August 29, 2009 7:11:13 PM

rambo117 said:

I mean, Nvidia still hasn't even gotten to GDDR4 yet and AMD is already at GDDR5.


GDDR4 offers little benefit so it makes sense that Nvidia is skipping it.
August 29, 2009 7:20:30 PM

turboflame said:
GDDR4 offers little benefit so it makes sense that Nvidia is skipping it.

Not necessarily. It's clocked a lot higher than GDDR3 and can OC much more smoothly, not to mention it's more power efficient.
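For what it's worth, the clock speed difference is most of the story on bandwidth. A rough back-of-the-envelope sketch using the usual peak-bandwidth formula (effective transfer rate times bus width divided by 8), with the stock GTX 285 and HD 4870 memory configurations as illustrative numbers:

```cpp
#include <cstdio>

// Peak memory bandwidth in GB/s: effective rate (MT/s) * bus width (bits) / 8 / 1000.
double bandwidth_gbs(double base_clock_mhz, int transfers_per_clock, int bus_width_bits) {
    const double effective_mts = base_clock_mhz * transfers_per_clock;
    return effective_mts * bus_width_bits / 8.0 / 1000.0;
}

int main() {
    // GTX 285: 1242 MHz GDDR3 (double data rate) on a 512-bit bus -> ~159 GB/s.
    std::printf("GTX 285 (GDDR3): %.0f GB/s\n", bandwidth_gbs(1242, 2, 512));
    // HD 4870: 900 MHz GDDR5 (effectively quad-pumped) on a 256-bit bus -> ~115 GB/s.
    std::printf("HD 4870 (GDDR5): %.0f GB/s\n", bandwidth_gbs(900, 4, 256));
    return 0;
}
```

The point being that GDDR5 gets comparable bandwidth out of a much narrower, cheaper bus, which is the kind of efficiency Nvidia gives up by sticking with GDDR3.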
August 29, 2009 7:44:24 PM

JAYDEEJOHN said:
The latest rumors on true DX11 parts are that nVidia ships small quantities of G300 around the second week of November, whereas an ATI x2 product has been rumored for around that same time for a while now.
So it may mean no top-end crown for nVidia at all this time around, not even long enough before they'd have to drastically cut their prices by $200.


I highly doubt that Nvidia isn't releasing a high-end card. However, WHEN they release it is another story. I think if they drag their feet and we see a 5890 X2 (or even just CrossFiring normal ones) before we see a 395, Nvidia is going to lose this round. Too many hardcore gamers and computer builders will switch to ATI; some will do it saying it's just to tide them over till the 395, but then they'll see how powerful and smooth ATI runs lately and may just make the switch for good.

They could lose a lot of the "fanboy high end" buyers with a move like that.

August 29, 2009 8:31:47 PM

Well, I may be the only one thinking this way, but here goes...

I don't think that nVidia or ATI is really bringing a game-changing card in the next generation. The reason is that today most producers are targeting multi-platform development so they can reach a greater share of the consumer market.

In fact, some of the new games are coming to the consoles first and to the PC later. In addition, I don't remember any game other than Crysis that was PC-exclusive and pushed hardware beyond its limits. I know that some RTSs and MMORPGs came out PC-only in the meantime, but I would not consider them game changers or a disruptive leapfrog in hardware/software technology.

If the above is correct, and knowing that even today's video cards are already far more powerful than the consoles, I don't believe developers are just waiting for super-duper DX11 cards to bring something new to the market.

Honestly, I think we will only see a real leap in games when we have a refresh in the console market, which in my opinion is far from happening. Current consoles require an HDTV and a home theater to show their full strength, and I don't believe the majority of homes, in any region of the world, are HDTV-ready today.

In the end, I believe that future PC games will have their core coded for the consoles, but run with some extra features and higher resolution, until something really drastic happens in the market.

The only thing (in my mind) that could push PC hardware a little further is 3D gaming. Since you basically need "double performance" to run a game in 3D versus in "2D", that would justify an increase in hardware performance by a factor greater than 2. However, this technology is not widely supported yet, still has some flaws, does not work with the currently installed base of hardware (regular LCD monitors), and has an expensive entry cost. But whoever hasn't tried the nVidia shutter glasses yet should check them out. It's really impressive and truly adds something that consoles can't do today.
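To put a rough number on that "double performance" point: with shutter-glasses 3D the display runs at 120 Hz and each eye gets its own rendered frame, so the GPU has to produce twice as many frames, each within half the frame-time budget of 60 Hz "2D". A tiny illustrative calculation (the 60/120 Hz figures are the typical 3D Vision setup, not something from this thread):

```cpp
#include <cstdio>

int main() {
    const double mono_fps   = 60.0;             // target refresh for ordinary "2D" play
    const double stereo_fps = 2.0 * mono_fps;   // one frame per eye on a 120 Hz panel

    // Frame-time budget the GPU has to hit in each mode, in milliseconds.
    std::printf("2D budget per frame:     %.1f ms\n", 1000.0 / mono_fps);    // ~16.7 ms
    std::printf("Stereo budget per frame: %.1f ms\n", 1000.0 / stereo_fps);  // ~8.3 ms
    return 0;
}
```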

I'm a proud owner of almost all gaming platforms available today (Xbox 360, PS3, Wii, one E8400 with 8800GT SLI hooked to a 50" DLP TV with a hi-def receiver and the 3D Vision glasses, and another i7 with GTX 275 SLI on a 24" monitor), and I can see advantages in every one of them. But unfortunately, when it comes to PC gaming today, I find myself asking why all of that tremendous processing power isn't being put to work as it should...
August 29, 2009 8:46:02 PM

OK, I'll say this. The only way devs will change is through money, our money.
If they insist on throwing crap our way, then ditch them and their backwards games.
Just wait till they drop in price; don't buy them till they do, wait 3 months.
That'll get their attention.
I'm tired of this. We want better? We need to somehow demand better, and delaying their planned early big-buck return on investment on these platform-only style releases is one way to do so.
August 29, 2009 8:50:47 PM

Boycotting DX10 cards is another. They're putting food in their mouths from all these cross-platform cut-downs; meanwhile, we need to put our money where our mouths are.
Don't like it? Change it.
August 29, 2009 9:02:31 PM

While most games come out across the board, the PC version is always different, especially when the game came out on the consoles first. They always have better graphics, usually (but not always) higher multiplayer player counts (until MAG comes out for the PS3, that is), and almost ALWAYS something different about them, such as harder difficulties, different multiplayer maps, more unlockables, more DLC, etc. Same thing that happens between the PS3 and 360 most of the time.

And I think the next gen of cards' main "ground-breaking event" is DX11. Even if it isn't going to be used for a while, there is a possibility that with some of its features we will see some great things added to gaming. If tessellation is actually used correctly, you'll see far greater detail in graphics coming.

Then there is Crysis. Even though I don't think it's all that amazing as a game, good but not amazing, it sparked something. So many people ran out and bought that game; so many people who didn't play PC games went out and upgraded their PCs or bought new ones just for Crysis.

That is all it takes to sell amazing hardware and games. If one game comes out that truly pushes the limits, then people buy hardware to play it, and since they have all that high-end gear they buy other games that look good as well. For instance, most people had upgraded for Crysis, so they also got Far Cry 2. That is how PC gaming has always been, and always will be: a handful of games that really need it push the rest.

Cards very rarely bring the game changers; it's the devs that do that.
August 29, 2009 9:04:39 PM

Bye Bye Nvidia, and your crappy overheating products.
August 29, 2009 9:21:06 PM

darkvine said:
Then there is Crysis.



That's really all you had to say... Crysis... the game that pushed Nvidia and ATI to make us more powerful graphics cards. I mean, a game from '07 is only now finally max-out-able after 2 years...

August 29, 2009 9:37:17 PM

As if ATI cards rely on new technology instead. The only reason ATI cards can now compete with Nvidia is the fast GDDR5, and that technology is already 2 years old now.

In my view that article is total crap. The author claims that "making an old 2xx with GDDR5 into a new 3xx chip doesn't make sense at all" when ATI did just that before. What's the difference between this and what ATI has already done? I guess this didn't bother the author; for ATI it apparently made a lot of sense, just because of the different brand. I despise fanboys, really; they only see what they want to see.

Anyway, what he says just doesn't make any sense.

Nvidia will have to change technology; the 2xx chips don't have hidden support for DX10.1 as someone else has suggested. To include DX11 the technology must be new (or heavily modded, but the chips are already so large), and they will have to release DX11-compliant hardware. It would make no sense at all to create a new technology just for the high-end sector, not to mention that Nvidia just gave an open interview where they stated that in the next 6 years GPU technology will advance twenty times over. Declaring that sort of thing without having anything new up their sleeve would be utter nonsense.

People, let's be objective, please. To think that Nvidia will just hand the GPU sector over to ATI is the delirious dream-world of a fanboy, nothing else.
August 29, 2009 10:06:05 PM

OK, details need to be explained. The best guess?
nVidia has mobile parts that are GDDR5; those same parts are now GDDR3 and labeled as G300 derivatives.
Why did they do this?
GDDR5 uses less power for the same BW. If this is true, then the GDDR5 mobile variants will most likely NOT be faster than the G300 GDDR3 desktop variants, as the mobile solutions will most likely have lower clocks, though they may also have higher BW.

If this is true, this is the way I see it playing out:
These cards are not DX11 capable, and they're lower-end models, as the story suggests.
Also, it appears nVidia partners have no DX11 mid- and low-end variants coming until at least the 1st quarter of next year, meaning maybe the 2nd quarter at the earliest, which fits nVidia's tendency to use last-gen models in those categories, which in this case are also non-DX11-compliant parts.
It makes sense if you break it down, as we saw no real G200 mid- or low-end cards, and their approach doesn't lend itself to that until there are enough defective or cut-down parts to actually fill the sales volume in those categories.
August 29, 2009 10:22:56 PM

selea said:
As if ATI cards rely on new technology instead. The only reason ATI cards can now compete with Nvidia is the fast GDDR5, and that technology is already 2 years old now.

In my view that article is total crap. The author claims that "making an old 2xx with GDDR5 into a new 3xx chip doesn't make sense at all" when ATI did just that before. What's the difference between this and what ATI has already done? I guess this didn't bother the author; for ATI it apparently made a lot of sense, just because of the different brand. I despise fanboys, really; they only see what they want to see.

Anyway, what he says just doesn't make any sense.

Nvidia will have to change technology; the 2xx chips don't have hidden support for DX10.1 as someone else has suggested. To include DX11 the technology must be new (or heavily modded, but the chips are already so large), and they will have to release DX11-compliant hardware. It would make no sense at all to create a new technology just for the high-end sector, not to mention that Nvidia just gave an open interview where they stated that in the next 6 years GPU technology will advance twenty times over. Declaring that sort of thing without having anything new up their sleeve would be utter nonsense.

People, let's be objective, please. To think that Nvidia will just hand the GPU sector over to ATI is the delirious dream-world of a fanboy, nothing else.


Objectively speaking, much of what is being said is based on things Nvidia has done before, and many things point to them doing it again. So while Nvidia should change technology, that doesn't mean they will; they've shown that in the past and let ATI catch up to them in sales. They need new technology and lower prices, or else the tables are going to turn.

Do I believe Nvidia will go out of the game like many people here and in other places are saying? Of course not; to think that they would go belly up is idiotic. AMD and Intel have both had crappy, overpriced products come out, but they are still around and kicking. However, I think the majority can and will flip, making ATI the top dog and Nvidia the one playing catch-up, if they don't make a move with the 300 series.
August 29, 2009 10:34:33 PM

^ don't forget Larrabee... xD
August 29, 2009 10:44:37 PM

Think about each company's approach.
nVidia uses a large monolithic design, thus we see an 8800GTX and an 8800GTS, and the same with the 9800GTX and GT; after a while there was room to use the cut-down chips to fill out the lineup, as in the 8600, 8400, etc. But with G200 we saw an even larger design, which makes that mid/low transition unavailable until possibly two shrinks later, a la 40nm.

Now take ATI's approach. Make your high end smaller, though roughly equivalent to your competitor's second-highest model below their flagship. Use an x2 product as your flagship to compete with your competition's large monolithic flagship design.
What that leaves ATI with is a less costly alternative to a forced cut-down part. In this instance, both were waiting on node changes: nVidia for the x2 product for their halo, and ATI for their mid-range 4770. But TSMC's 40nm process was somewhat bungled, so ATI dropped the 4850's price to fill the gap and made the 4890 as their refresh, which left less of a gap against nVidia's new 285 than there had been between the 4870 and the 280.
Doing the 40nm part served two purposes: it filled that void in their lineup, and, as it also turned out, it let them iron out any problems the 40nm node would bring heading towards the 5xxx series release.
So it's harder from nVidia's perspective to fill out an across-the-board segmentation than it is for ATI, simply due to each one's preferred approach.
nVidia's approach, if they insist on keeping their old gen for the low/mid end, will continually lead to renaming with node enhancements, like core speed improvements, shader clocks, etc., but unfortunately it does nothing toward supporting a new DX model.
If you're fine with this, that's anyone's prerogative; I just find it severely lacking, at a time when PC gaming is struggling to make enough of an impact as it is with DX enhancements, not to include them in your portfolio across the board.
August 29, 2009 10:45:00 PM

Yeah, I didn't forget them either, but I wasn't counting Larrabee because it's made by Intel, and after the last time they tried a graphics move I think it will take a lot of reviews and benchmarks for most people to switch over. Personally I'm hopeful that they do well, which it looks like they might.

In fact Larrabee might be what makes Nvidia lower their overall prices, as I'm sure Intel will price high like they do on all their products, which will be just another reason for some people to go ATI. Nvidia might decide to split the difference and sit midway between the two companies (other than with their x2 cards like the 295/395).
August 29, 2009 10:52:44 PM

Even though ATI currently offers better performance for the price, nVidia has a much larger market share. Even if they botch this gen, at least as badly as this article makes it sound anyway, they can still get away with it if they have enough marketing, it seems. It's sad, but that's how it is.

Anyway we'll just have to wait and see what comes out. I'm looking forward to the DX11 chips from ATI.
August 29, 2009 10:54:59 PM

Yeah, that's what will happen in a nutshell. If they mess it up, then all that will happen is a slip in market share, making ATI the king for a short time, but now with a third party coming in (Intel), market share will be further divided.
August 29, 2009 10:55:05 PM

The problem with LRB is, the devs may take to it like flies to leftovers.
If this happens, the console market could end up being owned by Intel, and any GPGPU apps will use it as the model, not nVidia.
Each app maker looks at their bottom line, and if it's easier to code on LRB, that's exactly what they'll do, even if nVidia's approach is somewhat faster.
This situation could also rear its ugly head in PC gaming, where the faster product just won't be worth it for the devs to code for, and they'll use an easier, less demanding model like LRB.
This is why I had my rant. The DX model makes development somewhat easier for the devs themselves and provides much more usability, as we see with DX10.1, which alone brings 10-30% gains, allowing for more slack, or more intensity, in our games, from the devs' POV, depending on cost issues.
If nVidia doesn't promote it, it's as though they're ultimately failing themselves, and PC gaming along with them.
August 29, 2009 10:59:59 PM

Oh, I already know LRB is going to fill that spot. Intel is, and always will be, a "household name" in the computer world. If nothing else they will get a large part of the market share from the everyday Joe who, when buying a computer, will see "Intel" and think it must be the best.

Even in the unlikely event that LRB sucks bricks, many, many OEMs such as Dell, HP, Gateway, maybe even Mac, will still stick them in their computers, and they will sell to some degree or another, and devs will see those deep pockets of Intel and switch over.

I don't think Intel will get control of the market, either in sales or in dev favor, anytime this gen, but soon it will shift one way or the other.
August 29, 2009 11:10:20 PM

Either way, the discrete market will shrink. AMD's Fusion and Intel's integrated LRB systems will shrink the discrete market.
Not sure how this is going to look in the near term, say 3 years or so from now, but it's going to happen, and while it does, nVidia is stepping away somewhat from PC gaming for its very survival, moving into GPGPU realms to make that a bigger part of their market.
Can't blame 'em, but don't have to like it either.
August 29, 2009 11:31:54 PM

darkvine said:
Yeah, I didn't forget them either, but I wasn't counting Larrabee because it's made by Intel, and after the last time they tried a graphics move I think it will take a lot of reviews and benchmarks for most people to switch over. Personally I'm hopeful that they do well, which it looks like they might.

In fact Larrabee might be what makes Nvidia lower their overall prices, as I'm sure Intel will price high like they do on all their products, which will be just another reason for some people to go ATI. Nvidia might decide to split the difference and sit midway between the two companies (other than with their x2 cards like the 295/395).


I know it seems a bit early to be talking about specs, but do you think Larrabee will actually be able to compete with the future offerings? I mean, I read the specs of what the 5870 is supposed to have and holy s***, that's gonna be a hell of a card!

Larrabee was reviewed back in '08, so I guess Intel is gonna have to veto its DX10 card idea then...?

August 29, 2009 11:38:07 PM

rambo117 said:
I know it seems a bit early to be talking about specs, but do you think Larrabee will actually be able to compete with the future offerings? I mean, I read the specs of what the 5870 is supposed to have and holy s***, that's gonna be a hell of a card!

Larrabee was reviewed back in '08, so I guess Intel is gonna have to veto its DX10 card idea then...?


I'm unsure how they will match up on specs, but if you look at specs alone then many ATI cards should be kicking Nvidia's ass. It comes down to how they design the card as well as its specs.

If you're asking me personally how I think they will compare, then I believe, as it stands, they will be somewhere in the midrange. At least at launch, maybe with a high end somewhere around the end of the next gen.

I'm not sure what Intel is even aiming at, despite any claims they are or will be making. While they are surely looking to compete in sales, I think they are aiming to cover a larger part of the market from high to low, and we also have to take into account that it will likely work amazingly well with their own CPUs.
August 30, 2009 12:03:36 AM

Forget about seeing a dual g300, sandwich board or not.

The reason the GTX 295 could be made at all was the shrink Nvidia made from 65nm to 55nm.

They are now attempting to move to 40nm (with a lot of issues, not all of them their fault, I must add). However, what this means is pretty simple.

There is no half-series nanometre drop coming to save them this time. The only thing that stopped ATI from running away completely with the last series was Nvidia getting onto the same process tech (55nm) halfway through. That let them double up the GTX 260 to make the GTX 295.

If the G300 cannot beat the 5870 X2 on release, Nvidia will never hold the fastest single-card position this round. There is no magical doubling of G300 coming. A 280W TDP is impossibly high and cannot be doubled onto a single PCI-E card at 40nm. They would need 28nm to become available to them first, and it isn't going to.
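For reference, that ceiling comes straight from the PCI Express power limits: 75W from the slot plus 75W from a 6-pin and 150W from an 8-pin connector caps an in-spec card at about 300W. A quick sketch of that budget (the 280W TDP is the rumored figure from the post above, not a confirmed spec):

```cpp
#include <cstdio>

int main() {
    // In-spec PCI Express power budget for a single card.
    const int slot_w      = 75;   // PCIe x16 slot
    const int six_pin_w   = 75;   // one 6-pin PEG connector
    const int eight_pin_w = 150;  // one 8-pin PEG connector
    const int max_board_w = slot_w + six_pin_w + eight_pin_w;  // 300 W ceiling

    const int rumored_g300_tdp_w = 280;  // single-GPU TDP cited in the thread

    std::printf("Max in-spec board power: %d W\n", max_board_w);
    std::printf("Two G300 GPUs alone:     %d W\n", 2 * rumored_g300_tdp_w);
    // 560 W against a 300 W ceiling: a dual-G300 card doesn't fit without a big
    // shrink or heavy downclocking and binning.
    return 0;
}
```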
August 30, 2009 12:19:29 AM

As for the point of the thread, haven't I been saying the same thing for months now?

Nvidia has nothing to fight ATI with at the low-to-mid range. All they can possibly do is release the current G200s at 40nm and hope they can compete with ATI's 40nm DX11 parts.

Think about what that means. Nvidia will have no sub-$100 DX11 parts. They will probably have no sub-$150 DX11 parts. Why do you think they are talking down DX11?

The only place Nvidia is "competing" is at the top end and immediately below, much like where they have been competing for the last year. The G300 will be faster than a 5870, but it will not be faster than a 5870 X2, and the difference this time around is that there will be no G300 sandwich board either.
August 30, 2009 3:18:25 AM

Is this the first time that a series of cards will have separate DX capabilities? I don't recall a time when a company's current lineup of cards supported different DX levels. Renaming the G92 parts as 200 cards worked because they all support DX10. But if the 300 cards will be DX11 and the "310" is only DX10, this will lead to nothing but confusion.
August 30, 2009 3:23:42 AM

The article is by Charlie Demerjian...aka the ATI fanboy/Nvidia basher.

I won't believe this article until other, more credible sources confirm it.
August 30, 2009 3:53:17 AM

4745454b said:
Is this the first time that a series of cards will have separate DX capabilities? I don't recall a time when a company's current lineup of cards supported different DX levels. Renaming the G92 parts as 200 cards worked because they all support DX10. But if the 300 cards will be DX11 and the "310" is only DX10, this will lead to nothing but confusion.

This happened one other time, and by whom? nVidia, with the GeForce4 MX, which was basically GeForce2-class tech.
August 30, 2009 3:54:26 AM

Oh, and by the way, that also held up the development of shader usage in PC gaming.
August 30, 2009 4:15:11 AM

Don't remember the exact times and dates, but 28nm and 28nm HKMG may happen sometime next year at TSMC.
I'm excited about HKMG; if it can be applied to gfx cards, we may see a further 30% jump in perf on top of the node shrink.

As for HKMG and the 28nm process, here's one link:
http://www.xbitlabs.com/news/other/display/200908251214...
August 30, 2009 9:04:13 AM

To quote myself from another website....

"Patiently waiting, and watching...

Rumours and propaganda are again rife, both in the GPU and CPU arenas. The truth will out, eventually, and personally I can afford to sit back, relax, suck some beers and patiently wait."
August 30, 2009 9:28:06 AM

JAYDEEJOHN said:
Don't remember the exact times and dates, but 28nm and 28nm HKMG may happen sometime next year at TSMC.
I'm excited about HKMG; if it can be applied to gfx cards, we may see a further 30% jump in perf on top of the node shrink.

As for HKMG and the 28nm process, here's one link:
http://www.xbitlabs.com/news/other/display/200908251214...


Yep, I read about that recently. Risk production in the 2nd quarter of 2010 means we're talking over a year before it's available, at best, and even then it would be available to ATI as well.

What I'm trying to say is, Nvidia has gotten used to a mid-cycle die shrink to reclaim the performance lead, and with both sides starting out even on 40nm this time, they have to be leading from the start.

Looking at some of the figures going around, like 6 TFLOPS for the 5870 X2, that's unmatchable by any single GPU no matter how big. I can't see Nvidia putting two of those onto one card. Even at 40nm the proposed G300 is going to draw more watts than the 65nm GTX 280. It will surely be a powerful chip, but doubling it up onto one card isn't doable without a die shrink.
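Where a figure like that comes from, roughly: peak shader throughput for these cards is usually quoted as stream processors times 2 ops per clock (multiply-add) times shader clock. A sketch using the numbers being passed around at the time (the 5870's 1600 SPs at ~850 MHz were still rumored specs at this point):

```cpp
#include <cstdio>

// Peak single-precision throughput in TFLOPS: shaders * 2 (multiply-add) * clock (GHz) / 1000.
double peak_tflops(int stream_processors, double shader_clock_ghz) {
    return stream_processors * 2.0 * shader_clock_ghz / 1000.0;
}

int main() {
    // HD 5870 (rumored): 1600 stream processors at ~850 MHz -> ~2.7 TFLOPS.
    const double hd5870 = peak_tflops(1600, 0.85);
    std::printf("HD 5870:  ~%.2f TFLOPS\n", hd5870);
    // Doubled onto an X2 board, that's the ~5-6 TFLOPS figure going around.
    std::printf("5870 X2:  ~%.2f TFLOPS\n", 2.0 * hd5870);
    // HD 4870 for comparison: 800 SPs at 750 MHz -> ~1.2 TFLOPS.
    std::printf("HD 4870:  ~%.2f TFLOPS\n", peak_tflops(800, 0.75));
    return 0;
}
```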