**NVIDIA RELEASING 270, 290, GX2** read :)

wahjahka

Distinguished
Oct 19, 2008
517
0
18,990
HEY! I was doing some research and I came across an interesting article...

*click the link and read*
http://www.guru3d.com/news/nvidia-270-290-and-gx2-in-november/

I'm currently looking to upgrade my video card, and I'm not sure if I should wait till November...

Has anyone heard anything about this?



 

spathotan

Distinguished
Nov 16, 2007
2,390
0
19,780
China Syndrome, eh? So should I expect this card to be cheap and to break quickly?

"This part is what we were calling the GT200b in May, but the public code name is GT206. It is simply an optically shrunk GT200, so clock for clock, you won't get any speed boost out of it." - This is a killer here, I was at least expecting a boost out of revisions. Typical nvidia fashion.

Given the current price of the GTX cards, I'm gonna assume the GX2 will be another 7950GX2 disaster, at $900.
 

L1qu1d

Splendid
Sep 29, 2007
4,615
0
22,790
I just want to add that there is so much speculation, and none of it has been proven yet.

Realistically, we should see these cards in Q1 2009. I doubt we'll be seeing anything come next month or so.
 

spathotan

Distinguished
Nov 16, 2007
2,390
0
19,780


Palit and Gainward "jumping ship" is evidence of this. And a while ago there was a "rumor" that XFX had left Nvidia completely and that EVGA would be doing both Nvidia and ATI; I don't know if there was any truth to that or not. Nvidia let a lot of vendors go because they can't afford it right now. And you hit the nail on the head: the LAST thing Nvidia needs to do right now is yet ANOTHER round of revisions.
 

Dekasav

Distinguished
Sep 2, 2008
1,243
0
19,310
What's wrong is that they can't afford it right now. They need to FOCUS, not work on bringing their current tech to 55nm and on low/midrange versions of the GTX 200s (called GT100s, I believe). They've been mainly renaming/revising since the 8800GTX. Besides, aren't they headed toward 40nm early next year, too? Should've just skipped 55nm and gone 1 1/2 steps.
 


Yeah, the first time this came out it was as the InQuirer article :p; and while the 55nm transition makes a lot of sense, the GX2 makes very, VERY little sense at all.

It's also going to have issues with cooling a chip that dumps that much heat right into the backplate of the card in front of it.

nVidia has to go 55nm to cut costs, but the GX2 seems like a waste of time compared to getting the G200B out there ASAFP to lower costs.
That the 55nm refresh didn't hit before October leads me to believe they had problems with the transition. They definitely need it out by November to catch Christmas sales, but depending on AMD's pricing reaction it may waste a lot of its benefits trying, once again, to sell below its best-return price. And trying to sell a huge GX2 with two PCBs and two expensive chips (still much more expensive to make than the RV770) at a time when people are reconsidering their purchases and finances doesn't make much sense. Are you willing to pay ~$800 for something that would likely come close to mirroring the performance of two GTX 280s in SLI? Especially considering those will drop in price dramatically as the new stuff arrives.

Focusing on the next generation is a good idea, but unless they've abandoned the GTX series as too costly (which would make sense if you're rushing to the next generation), the 55nm shrink is needed to lower the cost per chip. Selling a lot of GTX 260 and 260+ models still costs as much at the chip level as a GTX 280 if you're not just selling castoffs.

At this point a GX2 card sounds similar to the first GX2 card and the GF7800 Ultra: essentially just a card to fill a performance hole and retain the PR crown, but at a very high cost, a high price, and ultra-low availability. The difference is that nV was making a mint from their other cards when they did that; I think the GX2 would be an unneeded waste of resources compared to the benefits of the 55nm transition (which might have mobile opportunities).

Anywhoo, the only reason to wait for November is if you want what is essentially a 5-10% performance boost from a 55nm GTX. It's not going to be anything groundbreaking.
 

Kari

Splendid
^^not so soon, late Q2 or early Q3

http://www.nordichardware.com/news,8261.html
Sources have informed us of another chip that is in the works. NVIDIA is also working on the GT216, which is said to be the first chip to reach the market using TSMC's 40nm process. It will go up against AMD's RV870 chip and should hit the market at about the same time. That time is late Q2 2009, or early Q3. There are rumors that this chip will be DirectX 11 compatible, which would put it even with the RV870 in terms of DX support.
 


GT100s will be rebadged GF9-series cards. The midrange will likely keep the GT2xx naming, just lower down the list, and the lack of any buzz about those parts is not a good thing.

Besides, aren't they headed toward 40nm early next year, too? Should've just skipped 55nm, and gone 1 1/2 steps.

That doesn't make much sense. The 55nm shrink is needed to lower the cost of a huge, expensive chip. ATi can produce almost three times as many RV770s on a single wafer and are also said to have a higher yield rate on top of that. The G200 is 2.25 times the size, and the RV770 can fit more dies in the high-yield center and even in the spaces around the edges. I think the numbers are about 228 : 86 per wafer right now, and the 55nm shrink should make it just under 120 raw G200b dies per wafer (a 256-bit memory interface might yield about 135/wafer), which is still a long way from the RV770, which would still be near a 2:1 ratio, especially if yields don't improve. Right now the yields are supposedly making it even worse, closer to a 4:1 ratio, so they need to improve both dies per wafer and yield on top of that.
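To put rough numbers behind that, here's a quick back-of-the-envelope sketch. The die areas are my guesses for illustration (not official figures), and the formula is the usual wafer-area-minus-edge-loss approximation, so it won't exactly reproduce the 228 : 86 and ~120 figures above, but the ratios land in the same ballpark:

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Back-of-the-envelope estimate: wafer area over die area, minus an
    edge-loss term. Ignores scribe lines, reticle layout and defects."""
    radius = wafer_diameter_mm / 2
    gross = math.pi * radius ** 2 / die_area_mm2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(gross - edge_loss)

# Assumed die areas, for illustration only:
#   RV770 ~256 mm^2, G200 (65nm) ~576 mm^2,
#   G200b treated as a pure optical shrink, area scaled by (55/65)^2.
for name, area in [("RV770", 256.0),
                   ("G200 65nm", 576.0),
                   ("G200b 55nm", 576.0 * (55 / 65) ** 2)]:
    print(f"{name:>10}: ~{dies_per_wafer(area)} candidate dies per 300mm wafer")
```

Under those assumed areas that's roughly a 2.5:1 gap in candidate dies, shrinking toward about 1.7:1 with the 55nm part, before yield differences are even counted.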

As for 40nm, they can't magically skip to 40nm until it's ready, and until then they have to do something about the large cost of the G200s if they want to keep selling them.
An optical shrink isn't very much and should be pretty easy (barring unforeseen issues like crosstalk, heat, etc.); the delay implies to me that they went the G92 route again and simplified things while they were at it, and that redesign took a bit of time on top of the previous August deadline.

It'll be interesting either way, but I think it's going to struggle to be as profitable as the RV770, while almost definitely being better performing on a 1:1 basis.
 

NewLCD123

Distinguished
Oct 7, 2008
157
0
18,680
Wait till 40nm before upgrading your current high-end GPUs. Anything from a GeForce 8800 or Radeon HD 3850 up is more than enough to last till Q3 2009. I wonder if Nvidia or ATI is going to keep using a 512-bit memory interface at 40nm? I've been told you need a big die to have enough pins for 512-bit memory.
 

Dekasav

Distinguished
Sep 2, 2008
1,243
0
19,310



You're kidding, right?

And on topic, I stated in another thread that I don't think we'll see 512-bit buses go mainstream (even in the high end) soon, if ever. GDDR5 is blazing fast even on a 256-bit bus, and I don't think memory makers are going to slow down. Buses have issues because of physical pins, something very difficult to overcome, whereas faster memory is a matter of much smaller, more electrical changes rather than adding pins and routing 500-some traces on a PCB.
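To put some numbers behind the bus-width point: peak bandwidth is just the bus width in bytes times the per-pin data rate, so faster memory buys you bandwidth without adding pins. The per-pin rates below are approximate period figures (roughly HD 4870 and GTX 280 class memory), used purely for illustration:

```python
def peak_bandwidth_gbs(bus_width_bits, data_rate_gbps_per_pin):
    """Peak memory bandwidth in GB/s = (bus width / 8) bytes * per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps_per_pin

# Approximate figures, for illustration only:
print(f"256-bit GDDR5 @ ~3.6 Gbps/pin: {peak_bandwidth_gbs(256, 3.6):.0f} GB/s")  # ~115 GB/s
print(f"512-bit GDDR3 @ ~2.2 Gbps/pin: {peak_bandwidth_gbs(512, 2.2):.0f} GB/s")  # ~141 GB/s
print(f"256-bit GDDR5 @ ~4.5 Gbps/pin: {peak_bandwidth_gbs(256, 4.5):.0f} GB/s")  # ~144 GB/s, same pin count
```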
 

spathotan

Distinguished
Nov 16, 2007
2,390
0
19,780
Mobile? The GTX 260 requires 36 amps... how they will ever make these chips mobile without basically turning them into an 8000/9000 is beyond me.

Besides, with the latest Apple/HP shenanigans going on within Nvidia, the last thing they need to do is touch another mobile platform. I think they need to get their current **** working first.
 
G

Guest

Guest
I'm just waiting for the generation after the next generation to upgrade... the 6870 or the GTX 480 lol
 

NewLCD123

Distinguished
Oct 7, 2008
157
0
18,680



Last night I was just thinking we could be seeing dual-channel buses for video cards. 2x256-bit buses would rock! As for physical pins, I'm sure they'll invent a new way to do buses, such as using lasers to transmit data :hello:
 

spathotan

Distinguished
Nov 16, 2007
2,390
0
19,780


What a clever name for the card :pt1cable:

However doubtful, if that is indeed true then there really is no reason to even sell the card. $20 profit per card doesn't even cover the distribution/packaging cost.
 
I've heard it's as high as $5,000 a wafer for CPUs... going by that, and what TGGA says, that's around $60 just for the core alone. Add in that massive, expensive PCB, almost a gig of RAM, the HSF, etc., plus shipping and packaging, then the retailers' cuts as well as the partners' cuts...
 


Yeah on both counts.

Think of the monster that was the G80, and what they did to the G92 to bring it to mobile form: shrink it, declock it, and narrow the memory bus.

All of those things could come to the G200b; of course you'd likely need to downclock it a bit more, but is it possible? Sure.
Just look at the SLI and quad-core mobile rigs out there.

Will it be a big seller? No, it'll be niche, just like the SLI GF9800s, but if they can bring out a cut-down mobile version it helps defray the development cost.

I'm not saying it's going to happen, but it's definitely something they hoped for and considered, even if it never gets beyond that stage.
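As a very rough illustration of why the shrink/declock route works, dynamic power scales roughly with frequency and with the square of voltage. This is a toy model with made-up numbers (the 180 W starting point and the scaling factors are not actual G200 specs):

```python
def scaled_dynamic_power(base_power_w, freq_scale, voltage_scale):
    """Toy model: dynamic power ~ C * V^2 * f. Ignores leakage and the
    capacitance reduction a die shrink would also bring."""
    return base_power_w * freq_scale * voltage_scale ** 2

# Made-up example: ~180 W desktop board power, clocks cut 30%, voltage cut 15%.
print(f"~{scaled_dynamic_power(180, 0.70, 0.85):.0f} W")  # ~91 W, before cutting units or bus width
```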

I've heard it's as high as $5,000 a wafer for CPUs... going by that, and what TGGA says, that's around $60 just for the core alone. Add in that massive, expensive PCB, almost a gig of RAM, the HSF, etc., plus shipping and packaging, then the retailers' cuts as well as the partners' cuts...

And remember, that's raw dies per wafer; subtract the defective chips and the price goes up. At launch the numbers being tossed around were about 50% yield, meaning even before all the other costs it's $90+ just for the chip.
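Putting the thread's own numbers together (~$5,000 per wafer, ~86 raw G200 dies, ~50% yield, all of them rumors rather than confirmed figures):

```python
def cost_per_good_die(wafer_cost_usd, gross_dies_per_wafer, yield_fraction):
    """Spread the wafer cost over only the dies that actually work."""
    return wafer_cost_usd / (gross_dies_per_wafer * yield_fraction)

print(f"raw die : ${5000 / 86:.0f}")                         # ~$58, the '~$60 per core' above
print(f"good die: ${cost_per_good_die(5000, 86, 0.5):.0f}")  # ~$116 at 50% yield
```

So under those assumptions the $90+ figure is, if anything, on the low side.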
 


You would sell the card simply to keep your name in the running and to maintain all the other support systems that rely on people buying nV. You also get the halo effect of a high-end part influencing mid-level buyers, etc.

ATi went through the same problem with the HD 2900, where they were coming nowhere near profitability, but they needed to sell something.

However, also remember that while the G200 may cost about $260 to build and get to market, it sells for near that price in a GTX 260, while the GTX 280 sells for more than that, which somewhat evens things out.