NVIDIA RELEASING 270, 290, GX2 - read

October 20, 2008 3:56:01 AM

HEY! I was doing some research and came across an interesting article...

*click the link and read*
http://www.guru3d.com/news/nvidia-270-290-and-gx2-in-no...

I'm currently looking to upgrade my video card, and I'm not sure if I should wait till November...

Has anyone heard anything about this?



October 20, 2008 4:01:56 AM

China Syndrome, eh? So should I expect this card to be cheap and break quickly?

"This part is what we were calling the GT200b in May, but the public code name is GT206. It is simply an optically shrunk GT200, so clock for clock, you won't get any speed boost out of it." - This is a killer here, I was at least expecting a boost out of revisions. Typical nvidia fashion.

Given the current price of the GTX cards, I'm gonna assume the GX2 one will be another 7950GX2 disaster, at $900.
October 20, 2008 4:22:19 AM

I'll agree with that. What a waste of Nvidia's resources; they should be thinking about the next GPU instead of just revisions and money hogs.
October 20, 2008 4:25:30 AM

I just want to add that there is so much speculation out there, and none of it has been proven yet.

Realistically, we should see these cards in Q1 2009. I doubt we will see anything in the next month or so.
October 20, 2008 4:35:39 AM

I'm pretty sure AIBs would be pretty pissed if nVidia was already trying to replace the 260 core 216 with the 270 as well as trying to drive the price of the 260 down further.
October 20, 2008 5:11:07 AM

megamanx00 said:
I'm pretty sure AIBs would be pretty pissed if nVidia was already trying to replace the 260 core 216 with the 270 as well as trying to drive the price of the 260 down further.


Palit and Gainward "jumping ship" is evidence of this. And a while ago there was a "rumor" that XFX had left Nvidia completely and EVGA would be doing both Nvidia and ATI; I don't know if there was any truth to that or not. Nvidia let a lot of vendors go because they can't afford it right now. And you hit the nail on the head: the LAST thing Nvidia needs to do right now is yet ANOTHER round of revisions.
October 20, 2008 3:39:58 PM

What's wrong with revisions? ATI did that with the X800 -> X850 and the X1800 -> X1900 -> X1950.
October 20, 2008 4:00:29 PM

What's wrong is that they can't afford it right now. They need to FOCUS, not work on bringing their current tech to 55nm plus low/midrange versions of the GTX 200s (called GT100s, I believe). They've been mainly renaming/revising since the 8800GTX. Besides, aren't they headed toward 40nm early next year, too? They should've just skipped 55nm and gone 1 1/2 steps.
October 20, 2008 5:49:43 PM

wahjahka said:
HEY! I was doing some research and came across an interesting article...

Has anyone heard anything about this?


Yeah, the first time, when it came out as the Inquirer article :p . And while the 55nm transition makes a lot of sense, the GX2 makes very, VERY little sense at all.

It's also going to have issues with cooling a chip that dumps that much heat right into the backplate of the card in front of it.

nVidia has to go 55nm to cut costs, but the GX2 seems like a waste of time compared to getting the G200B out there ASAFP.

That the 55nm refresh didn't hit before October leads me to believe they had problems in the transition. They definitely need it out by November to make Christmas sales, but depending on AMD's pricing reaction it may waste a lot of its benefits trying to once again sell below its best return price. And trying to sell a huge GX2 with 2 PCBs and 2 expensive chips (still much more expensive to make than the RV770) at a time when people are reconsidering their purchases/finances doesn't make much sense. Are you willing to pay ~$800 for something that would likely come close to mirroring the performance of 2 GTX280s in SLI? Especially considering that those will drop in price dramatically as the new stuff arrives.

Focusing on the next generation is a good idea, but unless they've abandoned the GTX series as too costly (which would make sense if you're rushing to the next generation), the 55nm shrink is needed to lower the cost per chip. Selling a lot of GTX260 and 260+ models still costs as much at the chip level as a GTX280 if you're not just selling castoffs.

At this point a GX2 card sounds similar to the first GX2 card and the GF7800Ultra: essentially just a card to fill a performance hole and retain the PR crown, but at a very high cost, a high price, and ultra-low availability. The difference is that nV was making a mint from their other cards when they did that; I think the GX2 would be an unneeded waste of resources compared to the benefits of the 55nm transition (which might have mobile opportunities).

Anywhoo, the only reason to wait for November is if you want an essentially 5-10% boost in performance from a 55nm GTX. It's not going to be anything groundbreaking.
October 20, 2008 5:50:00 PM

^^not so soon, late Q2 or early Q3

http://www.nordichardware.com/news,8261.html
Quote:

Sources have informed us of another chip that is in the works. NVIDIA is also working on the GT216, which is said to be the first chip to reach the market using TSMC's 40nm process. It will go up against AMD's RV870 chip and should hit the market at about the same time. That time is late Q2 2009, or early Q3. There are rumors that this chip will be DirectX 11 compatible, which would put it even with the RV870 in terms of DX support.

October 20, 2008 6:14:52 PM

Dekasav said:
They need to FOCUS, not work on bringing their current tech to 55nm plus low/midrange versions of the GTX 200s (called GT100s, I believe). They've been mainly renaming/revising since the 8800GTX.


The GT100s will be rebadged GF9 series cards. The midrange will likely keep the GT2xx naming, just lower down the list, and the lack of any buzz about them is not a good thing.

Quote:
Besides, aren't they headed toward 40nm early next year, too? They should've just skipped 55nm and gone 1 1/2 steps.


That doesn't make much sense. The 55nm shrink is needed to lower the cost of a huge, expensive chip. ATi can produce almost three times as many RV770s on a single wafer and are also said to have a higher yield rate on top of that. The G200 is 2.25 times the size, and the RV770 can fit more dies in the high-yield center and even in the spaces around the edges. I think the numbers are about 228 : 86 per wafer right now, and the 55nm shrink should bring it to just under 120 raw G200b dies per wafer (a 256-bit memory interface might yield about 135/wafer), which is still a long way from the RV770, which would still be near a 2:1 ratio, especially if yields don't improve. Right now the yields are supposedly making it even worse, closer to a 4:1 ratio, so they need to improve both dies per wafer and yield on top of that.
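For anyone who wants to sanity-check those dies-per-wafer figures, here is a minimal sketch using the standard 300mm-wafer approximation. The die areas (~576 mm^2 for the 65nm G200, ~256 mm^2 for the RV770, and roughly 470 mm^2 for a 55nm shrink) are my assumptions, so treat the outputs as ballpark numbers rather than exact counts.

import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    # Common approximation: gross dies minus an edge-loss term.
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(wafer_area / die_area_mm2 - edge_loss)

print(dies_per_wafer(576))  # 65nm G200, ~576 mm^2 -> ~94 raw dies
print(dies_per_wafer(256))  # RV770, ~256 mm^2 -> ~234 raw dies
print(dies_per_wafer(470))  # 55nm G200b (assumed ~470 mm^2) -> ~119 raw dies

The exact counts quoted in the thread differ a bit because real layouts depend on die aspect ratio and scribe lines, but the roughly 2.5:1 advantage for the smaller die is the point.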

As for 40nm, they can't magically skip to 40nm until it's ready, and until then they have to do something about the large cost of the G200s if they want to keep selling them.
An optical shrink isn't very much and should be pretty easy (barring unforeseen issues like crosstalk, heat, etc.). The delay, to me, implies they went the G92 route again and simplified things while they were at it, and that redesign took a bit of time on top of the previous August deadlines.

It'll be interesting either way, but I think it's going to struggle to be as profitable as the RV770, while almost definitely being better performing on a 1:1 basis.
October 20, 2008 6:27:09 PM

Do you think G200b can go mobile? And be a seller?
October 20, 2008 6:29:57 PM

Wait till 40nm before upgrading your current high-end GPUs. Anything from a GeForce 8800 or Radeon HD3850 up is more than enough to last till Q3 2009. I wonder if Nvidia or ATI is going to keep using a 512-bit memory interface at 40nm? I've been told you need a big die to have enough pins for 512-bit memory.
October 20, 2008 6:36:07 PM

It's a possibility you'll actually see more midrange cards going to 128-bit, and 512-bit may become a thing of the past as GDDR5 ramps up in speed.
October 20, 2008 9:03:29 PM

JAYDEEJOHN said:
Do you think G200b can go mobile? And be a seller?



You're kidding, right?

And on topic, I stated in another thread that I don't think we'll see 512-bit buses become mainstream (even in the high end) any time soon, if ever. GDDR5 is blazing fast even on a 256-bit bus, and I don't think they're going to slow down on making new memory faster. Buses have issues because of physical pins, something very difficult to overcome, but memory speed comes from much smaller, purely electrical changes rather than pins and routing 500-some traces on a PCB.
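To put rough numbers on why a narrower bus with GDDR5 keeps up, peak memory bandwidth is just bus width times effective data rate. The GTX 280 and HD 4870 figures below are the commonly cited specs, used purely as an illustration.

def bandwidth_gb_per_s(bus_width_bits, data_rate_gt_per_s):
    # Peak bandwidth = (bus width in bytes) * effective transfer rate.
    return bus_width_bits / 8 * data_rate_gt_per_s

print(bandwidth_gb_per_s(512, 2.214))  # GTX 280: 512-bit GDDR3 @ ~2.2 GT/s -> ~141.7 GB/s
print(bandwidth_gb_per_s(256, 3.6))    # HD 4870: 256-bit GDDR5 @ 3.6 GT/s -> 115.2 GB/s

So a 256-bit GDDR5 card already lands in the same ballpark as a 512-bit GDDR3 card, and faster GDDR5 closes the rest of the gap without adding a single pin.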
October 20, 2008 9:12:29 PM

Mobile? The GTX 260 requires 36 amps... how the hell they will ever make these chips mobile without basically turning them into an 8000/9000 series is beyond me.

Besides, with the latest Apple/HP shenanigans going on within Nvidia, the last thing they need to do is touch another mobile platform. I think they need to get their current **** working first.
October 20, 2008 9:30:43 PM

I'm just waiting for the generation after the next generation to upgrade... the 6870 or the GTX 480 lol
October 20, 2008 10:05:46 PM

Well, TGGA had mentioned it, and I've always wondered, but remember, this would be a slowed-down 55nm version. Maybe in a killer gaming mobile rig?
October 21, 2008 12:15:38 AM

Dekasav said:
You're kidding, right?

And on topic, I stated in another thread that I don't think we'll see 512-bit buses become mainstream (even in the high end) any time soon, if ever. GDDR5 is blazing fast even on a 256-bit bus, and I don't think they're going to slow down on making new memory faster. Buses have issues because of physical pins, something very difficult to overcome, but memory speed comes from much smaller, purely electrical changes rather than pins and routing 500-some traces on a PCB.



Last night I was just thinking we could be seeing dual-channel buses for video cards. 2x256-bit buses would rock! As for physical pins, I'm sure they may invent a new way to do buses, such as using lasers to transmit data :hello: 
October 21, 2008 12:46:41 AM

According to the Inquirer it costs Nvidia $260 to make the 260, so any you see under that price are being sold at a loss.
October 21, 2008 1:18:51 AM

rangers said:
According to the Inquirer it costs Nvidia $260 to make the 260, so any you see under that price are being sold at a loss.


What a clever name for the card :pt1cable: 

However doubtful that is, if it's indeed true then there really is no reason to even sell the card. $20 profit per card doesn't even cover the distribution/packaging cost.
October 21, 2008 3:25:00 AM

I've heard it's as high as $5000 a wafer for CPUs... Going by that, and what TGGA says, that's around $60 just for the core alone. Add in that massive, expensive PCB, almost a gig of RAM, the HSF, etc., plus shipping and packaging, then the retailer's cut as well as the partner's cut...
October 21, 2008 5:41:31 AM

You can knock the Inquirer, but they do get a lot of insider information that a lot of other sites won't publish.
October 21, 2008 6:32:28 AM

JAYDEEJOHN said:
Do you think G200b can go mobile? And be a seller?


Yeah on both counts.

Think of the monster that was the G80, and what they did to the G92 to bring it to mobile form: shrink it, declock it, and decrease the memory bus.

All of those things could come to the G200b. Of course you'd also likely need to downclock it a little more, but is it possible? Sure.
Just look at the SLi and quad-core mobile rigs out there.

Will it be a big seller? No, it'll be niche, just like the SLi GF9800s, but still, if they can bring out a crippled, mobilized version it helps defray the cost of development.

I'm not saying it's going to happen, but it's definitely something they hoped for and considered even if it never does get beyond that stage.

Quote:
I've heard it's as high as $5000 a wafer for CPUs... Going by that, and what TGGA says, that's around $60 just for the core alone. Add in that massive, expensive PCB, almost a gig of RAM, the HSF, etc., plus shipping and packaging, then the retailer's cut as well as the partner's cut...


And remember that's raw dies per wafer; subtract the number of defective chips and the price increases. At launch the numbers being tossed around were about 50% yield, meaning even before all the other costs it's about $90+ just for the chip.
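As a back-of-the-envelope check, here's a tiny sketch using the figures tossed around in this thread (roughly $5000 per wafer, ~86 raw G200 dies per wafer, ~50% yield; all of those are rumors/assumptions, not confirmed numbers):

wafer_cost = 5000.0   # rough per-wafer cost mentioned above (assumption)
raw_dies = 86         # raw G200 dies per wafer, figure quoted earlier in the thread
yield_rate = 0.5      # ~50% good dies, per the launch rumors

cost_per_raw_die = wafer_cost / raw_dies                   # ~$58
cost_per_good_die = wafer_cost / (raw_dies * yield_rate)   # ~$116
print(round(cost_per_raw_die), round(cost_per_good_die))

That's the chip alone, before packaging, testing, the PCB, the memory, the cooler, and everyone's margins, which is how a ~$60 raw die turns into a $90+ (or worse) effective chip cost.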
October 21, 2008 6:45:10 AM

spathotan said:

However doubtful that is, if it's indeed true then there really is no reason to even sell the card. $20 profit per card doesn't even cover the distribution/packaging cost.


You would sell the card simply to keep your name in the running, and to maintain all the other support systems that rely on people buying nV. You also have the halo effect of a high-end part influencing mid-level buyers, etc.

ATi went through the same problem with their HD2900, where they were coming nowhere near profitability, but they needed to sell something.

However, also remember that while the G200 may cost about $260 to build and get to market, it sells for near that price in a GTX260, but the GTX280 sells for more than that, which is somewhat of an equalizer.