9800 GX2, and its effect on the current market.

The_Blood_Raven

Distinguished
Jan 2, 2008
2,567
0
20,790
To all of the ATI/AMD fanboys who have continuously ripped apart the 9800 GX2 since the news was released: stay out of this. I have just built a new computer. I'm not an experienced system builder, but I managed it well. That said, I am running a 7600 GT and must upgrade. I just bought a 680i; the new 700 series seems silly to me, as I will most likely not go SLI, but I wanted the gaming features. My question is: will the 9800 GX2 (expected at a price of $450) cause a drop, or even a plummet, in the price of Nvidia's 8 series, and if so, which card do you think will be most affected? I personally see the 30% performance increase over the Ultra as a GOOD thing; that's a lot of extra power, and almost exactly the same thing as two Ultras in SLI for supposedly less than one Ultra. I am not so sure about the expected price, but if it holds, I see no way the prices of all the 8800s, mainly the Ultra and GTX, will not plummet. If you can buy two Ultras in one card for $450, I cannot see the Ultra costing even $400, and the GTX at least must drop to $350, etc. Even though I'm very skeptical of the $450 price point, I must admit the card cannot cost much more, if not less, than the Ultra's current price of $650-$700. What do you think?
 

LAN_deRf_HA

Distinguished
Nov 24, 2006
492
0
18,780
I don't think it will drive prices down. I think its retail price will be $450-$550. The 9800 GTX will probably be set at $400-$420, and the 512 GTS and GT will most likely stick to $320 and $250 respectively (suggested).
 

korsen

Distinguished
Jul 20, 2006
252
0
18,780
The Ultra, and most likely the GTX, probably won't drop in price at all. For some reason, the flagship cards of each series never drop in price. The GT/GTS will drop, however, but probably not by much. If you're going to wait for it, you might as well get one 9800 GX2 and be happy about it.
 

Slobogob

Distinguished
Aug 10, 2006
1,431
0
19,280
As usual, the high-end cards will slowly drop until the manufacturer stops producing them in favor of a cheaper-to-produce, equally performing part. At the end of the 8800 GTX/Ultra's lifecycle it will still cost a fortune. As usual.
 

Granite3

Distinguished
Aug 17, 2006
526
0
18,980
When the 9800 GX2 actually hits retail, the price of new Ultras will hold for a short period, and everything on the second-hand market will plunge.

The last GX2 stayed at its then-current price for over six months before it dropped, even though it sucked compared to the 8800 GTX and GTS.
 

Xazax310

Distinguished
Aug 14, 2006
857
0
18,980
I'm still wondering why they continue with dual GPUs. If the 7950GX2, as well as the forgotten X1950PRO x2 (from Sapphire), is any indication, driver support is weak and game support is even weaker. Many games may support SLI with two graphics cards, but most don't see gains, and putting both GPUs on a single board is going to be the same: for some games there just isn't much of a gain. Dual GPUs require a lot more coding and support from developers, which I don't think is going to happen. The 7950GX2 had a lot of horsepower in there, but as the reviews showed, almost no games used it; running two of them in SLI created more problems, and only now is Nvidia supporting the 7950GX2 in Vista.

The 9800GX2, IMHO, will be a flop. I'm going to stick with a single 9800GTX; if the Inquirer is right about its specs, then I might actually be able to play Crysis :)
 
I agree with Slobogob. Look at the 7900GTX: it never dropped to $50, they just stopped making it. I expect the 8800GTX and Ultra to be available for $400-$500 for a few months and then disappear. Right now the only thing keeping them alive is that they support Triple SLI while the 8800GT and 8800GTS G92 512MB don't. Apart from that, I can't see why anybody would buy the GTX.
 


It's scary that both ATI and nVidia are going this way, with their next releases (February, March) being two cards sandwiched together and relying on SLI or Crossfire. I don't like this one bit. I don't think they are very happy about it either. I'm guessing they've run into some technical difficulties and had to go this way.
 

Slobogob

Distinguished
Aug 10, 2006
1,431
0
19,280

The 8800 GTX shows pretty clearly what kind of problem they have: Nvidia was unable to put the actual GPU together with the I/O unit on a single die. That was on 80nm, I think.

GPUs are getting more and more complicated, and thus the transistor count is rising; GPUs have left CPUs in the dust in that regard. Given that a GPU is basically a big parallel processor, why not chop it up into smaller independent units?
E.g., take a 6600GT core, shrink it to 55nm, and put three of them on a single die. If, during manufacturing, one of the cores turns out defective, it gets deactivated and you still get the equivalent of 6600GT SLI. Now place a neat interconnect on the same chip so you can easily link several of those chips in a row on a single PCB. Basically, that is already happening; it's just not called cores but shaders. It's a little more difficult than that, but any monkey can find the details on the net.
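The yield argument behind that "three small cores, one spare" idea can be sketched with a toy calculation. This is just an illustration with made-up numbers (the defect density, die areas, and the simple Poisson defect model are all assumptions, not real TSMC figures):

```python
import math

def die_yield(area_cm2, defects_per_cm2):
    """Poisson model: probability a die of the given area has zero defects."""
    return math.exp(-area_cm2 * defects_per_cm2)

D0 = 0.5          # assumed defect density in defects/cm^2 (made-up number)
big_area = 3.0    # one big monolithic die, in cm^2 (made-up number)
small_area = 1.0  # each of three smaller cores, in cm^2

# Monolithic die: any single defect kills the whole chip.
y_big = die_yield(big_area, D0)

# Three cores with one spare: the part still ships as long as
# at least 2 of the 3 cores are defect-free.
p = die_yield(small_area, D0)
y_redundant = p**3 + 3 * p**2 * (1 - p)

print(f"monolithic yield:       {y_big:.2f}")    # ~0.22
print(f"3 cores, 1 spare yield: {y_redundant:.2f}")  # ~0.66
```

With these numbers, roughly a fifth of the big dies survive, while about two thirds of the three-core parts are sellable, which is the economic pressure the post is describing.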

I think both companies are trying to get around the limitations of their manufacturing capabilities while trying to remain profitable. Neither ATI nor Nvidia could just invest a big chunk of money to get 45nm production running. If Nvidia had been able to produce the 8800 GTX on a proven 45nm process at the time it was designed, they would have done so. The reality looks different, though: Nvidia, and ATI before the AMD merger, are both fabless. Their production is outsourced, to TSMC for example, so the smallest process they can get is whatever those fabs offer.
To do it themselves, they would have needed to invest everything they had into a new, untested technology that might have yield problems, for a new product that could flop. That's a dangerous course of action, with a few parallels to what AMD did with Phenom. I'm not even sure there is enough money in the entire GPU market to start a 45nm process from scratch, or whether Nvidia and ATI together could keep a 45nm fab profitable and running at full load long enough to pay off its cost.

Instead, GPU manufacturers feed on the crumbs falling off the table of the big players like IBM and Intel, which can keep multiple fabs running at load and have a wide palette of products to manufacture there. Intel, for example, now uses its older 90nm process for chipsets until it switches that capacity to a newer process.

The whole SLI/xfire approach is a stop-gap measure until the companies can either get their chips done on a single die OR figure out how to improve scaling the way we see it with multicore CPUs today.

A neat example of the whole situation is S3, I think. They aren't really competing in the GPU market anymore, but they still produce chips, and even though those are targeted at a different market segment, they know they will run into the same trouble as the big players. Just look at the S3 Chrome series: the newest one will be based on a 90nm process, with another following later on 65nm, a production technology Nvidia and ATI won't even be using anymore by then. Still, S3 developed their own form of SLI/xfire, called MultiChrome.