Nvidia's 3rd-generation GeForce GTX260 features a new printed circuit board (PCB) design, set to reduce overall cost.
Expreview reports that the third-generation GTX260 design, codenamed P897/D10U-20, will feature changes that cut down manufacturing costs. These include changing the FBVDDQ power solution from two-phase to single-phase, reducing the overall PCB layer count from 10 to 8, and lowering the PCB height by 1.5cm while keeping the original length. The MOSFET package will see an alteration as well, changing from LFPAK to DPAK. It's also likely that the BIOS ROM will shrink from 1M to 512K, and the DVI connector will receive modifications to cut costs even further.
Compared to the P654 design, the newer P897 GeForce GTX260 is expected to save Nvidia around $10 to $15, although the design could be mistaken for the GeForce 9800GTX+ (due to the GT200 and NVIO2 chip). The earlier P654 layout had itself reduced the number of PCB layers from 14 to 10 relative to its predecessor, the P651, and removed the expensive Volterra chip to reduce cost. All versions use 55nm process technology.
According to Expreview, the new product will be available in the third week of this month. Currently, Chinese manufacturer Colorful is using the P897 layout for the iGame Series GeForce GTX260 card; the design replaces the TV-out connector with HDMI and adds a set of overclocking jumpers. PCB design modifications not only reduce costs for the manufacturer, but the savings should also trickle down to the consumer.
Yes, the boards cost $15 in parts to make. Never mind that the first board they made cost them $80 million, it's the cost to make one *today* that matters, right? In related news, would you like to be the person to buy the $80 million GeForce GTX360 when it comes out?
In case you were completely oblivious to everything around you (on this site especially), there isn't a whole lot of margin on graphics cards, since there is stiff competition both within the market (AMD vs Nvidia) and external pressure (Xbox, PS3, Wii) for the consumer's money.
The interesting part will be to see how much this potentially reduces the price for the consumer. I'd love to get into a new GTX 260 or better, but frankly my 8800GTS 512MB does fine for everything I play, and I can't see spending the money they want for the new GTX line.
Either way, this is potentially good news for us consumers.
Hmm, I see a flaw in this logic, but it is nVidia's product.
Although I have read in numerous reviews that Nvidia's 55nm designs are equally efficient in heat and power usage as ATI's 45nm chips, I still wouldn't mind that extra bit of improvement to really sweeten the deal.
Two-phase to single-phase power? This doesn't sound like a good idea to me.
I want prices to come down, but price drops for enthusiast level products should come from improved parts, improved packaging, and better processes, not cost cutting that degrades the actual end product. Maybe the real reason for the change is that they are now moving it down the scale, but I do not see $15 moving it down far enough to reach mainstream card buyers, which would be below $150 according to most charts I see.