GeForce GTX260 with New PCB Design

Nvidia's 3rd-generation GeForce GTX260 features a new printed circuit board (PCB) design, set to reduce overall cost.

Expreview reports that the third-generation GTX260 design, codenamed P897/D10U-20, will feature changes that cut manufacturing costs. These include changing the FBVDDQ power solution from 2-phase to single-phase, reducing the PCB layer count from 10 to 8, and lowering the PCB height by 1.5cm while keeping the original length. The MOSFET package will change as well, from LFPAK to DPAK. The BIOS ROM will also likely shrink from 1M to 512K, and the DVI connector will receive modifications to cut costs even further.

In comparison to the P654 design, the newer P897 GeForce GTX260 is expected to save Nvidia around $10 to $15 per card, although the design could be mistaken for the GeForce 9800GTX+ (were it not for the GT200 and NVIO2 chips). The earlier P654 layout had already reduced the PCB layer count from 14 to 10 relative to its predecessor, the P651, and removed the expensive Volterra chip to cut costs. All versions use 55nm process technology.

According to Expreview, the new product will be available in the third week of this month. Chinese manufacturer Colorful is already utilizing the P897 layout for its iGame Series GeForce GTX260 card; that design replaces the TV-out connector with HDMI and adds a set of overclocking jumpers. Modifications to PCB design not only reduce costs for the manufacturer; the savings can also trickle down to the consumer.

  • razor512
I bet each video card only costs about $15 to make, and by the time it flows through the company's greed, it ends up costing the consumer $500
    Reply
  • joebob2000
razor512: I bet each video card only costs about $15 to make, and by the time it flows through the company's greed, it ends up costing the consumer $500
    Yes, the boards cost $15 in parts to make. Never mind that the first board they made cost them $80 million, it's the cost to make one *today* that matters, right? In related news, would you like to be the person to buy the $80 million GeForce GTX360 when it comes out?

    In case you were completely oblivious to everything around you (on this site especially) there isn't a whole lot of margin on graphics cards since there is stiff competition both within the market (AMD vs Nvidia) and external pressure (Xbox, PS3, Wii) for the consumer's money.
    Reply
  • vertigo_2000
$15 per card might cover the cost of the parts themselves, but like most companies, there are a lot of other expenses that are essentially passed on to the customer: heating and maintenance of any buildings the company owns, interest on accounts payable, employee salaries and benefits, and the big one for nVidia - R&D of new products. These are just some examples of how a $15 graphics card becomes $400-$500 to the consumer.
    Reply
  • jerreece
Vertigo and joebob are correct. But let's keep in mind: the article says nVidia's cost would be REDUCED by $10 to $15. It didn't say that's the actual cost to manufacture.

    The interesting part will be to see how much this potentially reduces the price for the consumer. I'd love to get into a new GTX 260 or better, but frankly my 8800GTS 512MB does fine for everything I play, and I can't see spending the money they want for the new GTX line.

    Either way, this is potentially good news for us consumers.
    Reply
  • A Stoner
    We have successfully hamstrung our device such that it will be rare that a unit will overclock very much over stock and even if it does overclock, it will be totally unstable due to power fluctuations. Our chips are top notch, but the PCB powering the chip is basically cardboard with a little lead free solder. So, to our enthusiast base, who are the only target audience for this product, sorry, to everyone else, who would not buy this product at a $15 discount anyways, we invite you to buy the product.

    Hmm, I see a flaw in this logic, but it is nVidia's product.
    Reply
  • traviso
I am somewhat excited about this redesign, but considering I have a 65nm 8800GTS 512MB, this is a decent but only mildly tempting upgrade. I'm really holding out for 45nm before I upgrade my video card.

Granted, I have read in numerous reviews that Nvidia's 55nm designs are as efficient in heat and power usage as ATI's 45nm chips. Still, I do want that extra bit of improvement to really sweeten the deal.
    Reply
  • hairycat101
A Stoner: We have successfully hamstrung our device such that it will be rare that a unit will overclock very much over stock and even if it does overclock, it will be totally unstable due to power fluctuations. Our chips are top notch, but the PCB powering the chip is basically cardboard with a little lead free solder. So, to our enthusiast base, who are the only target audience for this product, sorry, to everyone else, who would not buy this product at a $15 discount anyways, we invite you to buy the product. Hmm, I see a flaw in this logic, but it is nVidia's product.
    + 1

2-phase to single-phase. This doesn't sound like a good idea to me.
    Reply
  • A Stoner
What the heck, I am sorry hairycat, I thought I was hitting the quote button. Any way to fix a vote?
    Reply
  • A Stoner
hairycat101: + 1. 2-phase to single-phase. This doesn't sound like a good idea to me.
Exactly right here. And while the normal video card buyer is not going to care much about 2-phase versus single-phase, the people who buy the high-end cards actually do care about these things.

I want prices to come down, but price drops for enthusiast-level products should come from improved parts, improved packaging, and better processes, not cost cutting that degrades the end product. Maybe the real reason for the change is that they are now moving it down the scale, but I do not see $15 moving it down far enough to reach mainstream card buyers, which would mean below $150 according to most charts I see.
    Reply
  • roofus
I could see them pulling something like that with the rebranded 9800GTX (formerly the 8800GTS) that is going to become the 2**, but not the 260. Glad I already got mine, sheesh.
    Reply