Nvidia's Fermi GTX480 is broken and unfixable

Tags:
  • Graphics Cards
  • Laptops
  • Nvidia
  • Graphics
  • Product
Last response: in Graphics & Displays
February 18, 2010 3:02:18 AM

http://www.semiaccurate.com/2010/02/17/nvidias-fermigtx...

Makes you think?

I personally will hold my thoughts till release.

Be warned: it is a Charlie Demerjian article.

And from what I remember it usually is like this:

Charlie :0c==============3 ATI

But like I said, I'm going to hold my thoughts for now, just because we have seen Nvidia's constant mistakes (one of which was the 8600M laptop line), but nothing as dumb as this.


February 18, 2010 3:43:16 AM

Kinda sad really. I'm a bit of an ATI fanboy but I like to keep an open mind. I would love to see Nvidia come out with something great in the next month or so, and then see what ATI has up its sleeve. Competition = lower prices = what most consumers care about.
February 18, 2010 4:12:16 AM

Yawn,
I think I read "Nvidia didn't do its homework" at least three times. OK, but he's got Fermi outperforming Cypress. What else matters? lol
February 18, 2010 4:13:59 AM

Yield issues, for a start... and the ability to launch a product within $50 of Cypress that performs similarly. I wish Nvidia luck... they'll need it, I think.
February 18, 2010 4:21:10 AM

This looks to cost nVidia the better part of another year.
February 18, 2010 4:26:45 AM

Man, if it's true, and I say if, lol. Nvidia really shot themselves in the foot, then decided to donkey punch themselves, then decided to put sandpaper on dildos and sit on them.

It will be fun to see what happens. But I hope this doesn't stall the 5830, since a delay means it won't have competition.
February 18, 2010 4:37:39 AM

...so much for seeing some samples show up at that convention in a few weeks' time...
February 18, 2010 5:41:47 AM

On one hand this is Charlie... on the other hand, so far he has been right about all the news regarding Fermi.
I hope things are not that bad for nVidia. We need healthy competition.
February 18, 2010 6:01:03 AM

Some things I remember about the article: I still believe that the GTX480 (or whatever they are calling it this week) will have 512 shaders. It came out a while ago that the workstation cards would have 448. This was assumed to be because they would be running CUDA code with all shaders working at 100%; by lowering the number, you cut down on the heat.

If the yields are as bad as claimed, this is horrible for Nvidia. You should see more GTS460/440s, and the GTX480 will be mostly a paper product. I'm more and more convinced that Nvidia will keep producing all the cards they have out now: GTX4xx will be DX11, GTX3xx will be DX10.1, and the GTX2xx cards will be DX10. The big question is who will be able to afford the GTX4xx.

Also bothersome was the line about AMD simply pricing the 5xxx so that Nvidia sells in the red. That will mean high prices for all of us; I don't think we'll see any price cuts for quite some time this time around. One thing not mentioned is that AMD's cost per card didn't really change much. They doubled the number of shaders, but moved to a smaller process. If you compare 5xxx to 4xxx, the die sizes should be similar; the 5xxx is slightly bigger, but not by much. This means, assuming equal yields, that a 5xxx costs about the same as a 4xxx. That is not the case with the GTX4xx. They could barely afford to produce the GTX280, so how long will the GTX480 stay around?
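The die-size/yield cost argument above can be sketched numerically. Everything here is an illustrative assumption: the wafer cost, defect density, and die areas are round numbers for the sake of the example, not actual TSMC or AMD figures.

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Classic dies-per-wafer approximation, including edge loss."""
    r = wafer_diameter_mm / 2
    return (math.pi * r**2 / die_area_mm2
            - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def poisson_yield(die_area_mm2, defects_per_cm2):
    """Simple Poisson yield model: fraction of dies with zero defects."""
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100)  # mm^2 -> cm^2

def cost_per_good_die(die_area_mm2, defects_per_cm2, wafer_cost=5000):
    """Wafer cost spread over the dies that actually work."""
    good = dies_per_wafer(die_area_mm2) * poisson_yield(die_area_mm2, defects_per_cm2)
    return wafer_cost / good

# Illustrative comparison: a ~334 mm^2 die (Cypress-sized) vs a ~530 mm^2 die
# (roughly Fermi-sized) at the same assumed defect density.
for area in (334, 530):
    print(area, round(cost_per_good_die(area, defects_per_cm2=0.5), 2))
```

The point of the sketch is that cost per good die rises much faster than linearly with die area: a bigger die gives you fewer candidates per wafer *and* each candidate is more likely to catch a defect.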
February 18, 2010 6:11:19 AM

Every manufacturer knows about defects. The big question is whether you choose to be prepared for those defects or just wing it and see what happens; that's the game Nvidia has been playing ever since their monolithic designs.

This page is a perfect description of defects in a manufacturing process for computer chips, be it CPU, GPU, or memory: http://www.anandtech.com/video/showdoc.aspx?i=3740&p=8 It is also a very good article.

In the past, Nvidia has been able to dodge the bullet that comes with a monolithic design: defects. Now, with TSMC not being the best at 40nm, it's killing them in the worst possible way. Picture this: right now Nvidia's top chips are about as scarce as the chips AMD is cherry-picking for the 5970s, maybe scarcer. This means that if Nvidia went forward with the release, the cards would be higher priced and harder to find than the 5890s.

Let's examine the sorting for a second. ATI makes the 5800s on the same die and same wafer, and bins them. First you apply low voltage to the chips and see which ones pass and can be used in the 5970 (very low yield). You then raise the voltage and pick the chips suitable for the 5870s (slightly higher yield). Do this X times... aka binning, and you get the variations in products from the exact same die and the exact same wafer.

The same fabrication issues apply to CPUs as well. Intel has its 32nm chips going, but what was the first one? A tiny dual-core chip, rather than an attempt at a monolithic six-core chip, and they also left the graphics and memory controller off of it (it's on the other chip). Improve the process first, then increase the size. Nvidia has a huge problem on their hands. ATI and Intel learned from their past mistakes; what will Nvidia learn? More importantly, how long will it take Nvidia to have a product to sell?

(hmm, sounds like the same boat AMD is in with their CPUs)
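The binning flow described above can be sketched as a toy sort: test each chip for the lowest voltage it runs stably at, and assign it to a SKU. The voltage thresholds and the SKU mapping here are purely hypothetical illustrations, not ATI's real test limits.

```python
# Hypothetical binning sketch: chips from the same wafer are sorted into SKUs
# by the lowest voltage at which they run stably (lower = better silicon).

def bin_chip(min_stable_voltage):
    """Assign a chip to a bin based on its minimum stable voltage (volts)."""
    if min_stable_voltage <= 1.00:   # best chips: stable at low voltage/heat
        return "5970"
    elif min_stable_voltage <= 1.10:
        return "5870"
    elif min_stable_voltage <= 1.20:
        return "5850"
    else:
        return "reject"              # fails even at the highest test voltage

# One wafer's worth of (made-up) test results.
chips = [0.98, 1.05, 1.15, 1.25, 1.08, 1.02]
bins = {}
for v in chips:
    bins.setdefault(bin_chip(v), []).append(v)
print(bins)
```

Same die, same wafer, different products: the yield of each SKU is just the fraction of chips that land in (or above) its bin.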

February 18, 2010 7:33:13 AM

What Intel did may or may not fit your scenario, as the markets usually drive what Intel chooses to do.
Another thing to consider here is that the 40nm process is becoming very expensive, since yields have been poor on everything, which is why we see the lower nVidia cards cut down and priced high as well, including ATI's lower solutions.
Add in the bigger yield problems on such a huge die, and there's room for a lot of trouble for the green team.
My biggest concerns are: we haven't actually heard a thing yet, and that's not good; for those who say this is what they also did with the G80, we knew more a month before that release than we do now with Fermi; and lastly, I finally heard my first rumor about Fermi's little brothers coming out, or at least taping out, but only one so far, which isn't good either.
February 18, 2010 8:16:44 AM

Nvidia should have improved the GT200 the way ATI is doing at present with R600 -> RV770 -> RV870.
The GT300 is reminiscent of the R600: ATI ran into a lot of problems with the PCB, pushed back the market launch, etc...
When the GT300 does launch, the card will be noisy, power-hungry, and huge, with the same performance as the HD 5870. I don't even want to imagine the price.
February 18, 2010 8:39:08 AM

clement4413 said:
When the GT300 does launch, the card will be noisy, power-hungry, and huge, with the same performance as the HD 5870. I don't even want to imagine the price.

The GT3xx cards have already been released, but there are no reports of them being noisy, having high power consumption or being huge, because they are OEM parts.
February 18, 2010 8:58:50 AM

I don't understand. GT300 = Fermi = GTX480, no?
February 18, 2010 9:04:41 AM

That's a good point, Jay. We are around a month from launch and still nothing. No real leaked benchies at all.
February 18, 2010 9:34:25 AM

Ah yes, I had forgotten, so I was talking nonsense. Sorry!
February 18, 2010 10:57:44 AM

Quote:
Here's a better equation. Easier to understand.

GT300 = Fermi = 0

How does GT300 = Fermi? :heink: 
February 18, 2010 11:10:20 AM

Some people just don't bother reading the whole thread.

I'm waiting for the release. I'm fed up with people claiming this and that based off a guess that because one thing did this, the new one should do that. I'm talking about the press here, not the forum.
I have read articles claiming it's the next coming of the silicon messiah, and then there are those who claim it's a complete waste of silicon.

So I'm just happy to wait now

Mactronix
February 18, 2010 11:17:01 AM

+1 mactronix. With only six weeks or so to go, waiting seems like a good idea.
Though for entertainment purposes, these threads get hilarious after about page 8.
February 18, 2010 12:10:28 PM

Looks like a complete disaster. Luckily nVidia doesn't depend solely on high-end graphics, but I see quarterly losses for a long time.

If their share price drops below $7, Intel may look to acquire them. We have a long way to go to get to $7. Intel can make 28nm processors and can probably figure out how to make Fermi, given enough time.
February 18, 2010 12:21:30 PM

KidHorn said:
Looks like a complete disaster. Luckily nVidia doesn't depend solely on high-end graphics, but I see quarterly losses for a long time.

If their share price drops below $7, Intel may look to acquire them. We have a long way to go to get to $7. Intel can make 28nm processors and can probably figure out how to make Fermi, given enough time.


Another post full of nonsense. LOL. It's ATI that's in danger of financial failure; Nvidia is doing fine. And if people want an example of hardware failure, there is one in progress with the continued 5700 debacle. Just look in these forums.
Nvidia swings to quarterly profit as sales more than double

http://www.marketwatch.com/story/nvidia-swings-to-profi...
SAN FRANCISCO (MarketWatch) - Nvidia Corp. said Wednesday it swung to a profit for its fiscal fourth quarter, as sales more than doubled thanks to strong demand for chips used in personal computers and workstations.

Santa Clara, Calif.-based Nvidia /quotes/comstock/15*!nvda/quotes/nls/nvda (NVDA 16.85, -0.99, -5.55%) said net income for the period ended Jan. 31 was $131.1 million, or 23 cents a share, compared to a loss of $147.7 million, or 27 cents a share in the same period a year earlier. Excluding special items, Nvidia said fourth-quarter earnings were 23 cents a share.

Revenue more than doubled to $982.5 million from $481.1 million.

Analysts polled by Thomson Reuters had expected Nvidia to post earnings of 20 cents a share, and $957.2 million in revenue.

"Nvidia's business continued to accelerate," Chief Executive Jen-Hsun Huang said in a prepared statement.
February 18, 2010 12:28:10 PM

This is going to be a failed chip regardless of performance, and Nvidia would agree to some extent, it being not what they had expected from their hard work. The 40nm process is just not enough, so a shrink is necessary to be successful at all in any price segment. I read somewhere, and will have to look it up later, that TSMC isn't doing a 32nm process and instead chose 28nm, so that is Nvidia's best bet: shrink the chip to below or equal to the G80's size in order to get past the yield problem. It is a shame that the first samples we may see won't be full spec, nor have the intended clocks. With the GT200 out of production, the only things left are those 40nm low-end cards and the G92 cards. For me, I will stay with my 9800GT 1GB cards. For ATI, I will wait till prices come down, but will continue to add to my collection.
February 18, 2010 12:30:13 PM

In my opinion nVidia will be OK even if the Fermi generation turns out to be a flop, but because of the delayed release the problem will get bigger when the 6xxx series comes from ATI.
Maybe JHH will be replaced then... one can only hope :D .
February 18, 2010 12:35:06 PM

Armchair engineers prophesying about a product they know about as much as ATI's advertising monkeys do. Why do these companies employ engineers and teams of people in research and development when they could get it all from geniuses on the internet?
February 18, 2010 12:37:26 PM

notty22 said:
Armchair engineers prophesying about a product they know about as much as ATI's advertising monkeys do. Why do these companies employ engineers and teams of people in research and development when they could get it all from geniuses on the internet?


They don't need to since they have PR people like you.
February 18, 2010 1:15:54 PM

OK, my two cents is this (yes, I know I said I would wait, but :) ).
The way I see it, Nvidia will get a product out, but I'm beginning to think they will have to settle for something similar to when ATI had the 2600 cards: something usable but not quite up to what the other team has.
What it should mean is that they will learn some big lessons this time, and my personal opinion is that the next generation will be what they hoped Fermi would be.
I can understand how it got this far, what with respins etc. and Nvidia just wanting to get it right first time.
For whatever reason (too many speculations to cover here) it just hasn't worked for them, and they stubbornly put themselves in a corner trying to seem as if nothing was wrong.
ATI admitted they couldn't compete and gave reasons why when the DX10 debacle happened, and stated they would need a generation or two to catch up again.
I don't think Nvidia are as far behind as ATI were then, but it's all just my take on things.

As I said, we won't know until it's released.

Mactronix
February 18, 2010 2:39:56 PM

Notty don't you get bored signing on your five other accounts in order to thumbs up yourself and thumbs down others?

@Mac Nvidia have never, ever admitted they did anything wrong and they won't start now.
February 18, 2010 2:58:35 PM

If we ignore Anand's article, I can see that this particular article from CD doesn't add anything. But since it aligns with Anand's article as well, it may be right.
If anyone's read Charlie lately, he's been pounding his gavel saying nVidia has been ignoring the physical limits, which also plays into the Anand article, if we assume that nVidia found these problems (I'm sure they did) but the higher-ups overrode the engineers (it happens), TSMC made promises (how many?), and it was all supposed to work out in the end, which then kept pushing out the release dates, from respin to respin.
Otherwise, I agree, Fermi should be a killer product. So should the 2900/3800 products have been, but once again the 80nm process sucked and the clocks never hit like they were supposed to; it wasn't till a die shrink and a few tweaks that we saw it shine. That may be exactly what's going on now, minus the tweaks (vias, transistor-length variances) and the die shrink (28nm).
February 18, 2010 2:59:32 PM

jennyh said:
Notty don't you get bored signing on your five other accounts in order to thumbs up yourself and thumbs down others?

@Mac Nvidia have never, ever admitted they did anything wrong and they won't start now.



As true as that is, that's probably the saddest thing. The one thing NV needs right now is a bit of humility and the balls to say they did something wrong. Oh, and in reply to KidHorn's comment about Intel acquiring NVidia: NV has about $1.4b in the bank with no debt right about now. AMD is in more serious danger of financial failure, and only announced a profit last quarter because of that huge settlement deal with Intel; if you subtract the $1b or whatever it was, AMD as a whole only made about $38m last quarter, hardly enough to sustain a company like that. Anyway, I really don't want to bash ATI or AMD; I simply used that example to give perspective on just how much Intel WON'T be acquiring NV anytime soon, or ever.
February 18, 2010 3:05:02 PM

Mousemonkey said:
How does GT300 = Fermi? :heink: 


Just his way of saying they both suck ass, is my guess.
February 18, 2010 3:24:23 PM

notty22 said:
Another post full of nonsense. LOL. It's ATI that's in danger of financial failure; Nvidia is doing fine. And if people want an example of hardware failure, there is one in progress with the continued 5700 debacle. Just look in these forums.
Nvidia swings to quarterly profit as sales more than double

http://www.marketwatch.com/story/nvidia-swings-to-profi...
SAN FRANCISCO (MarketWatch) - Nvidia Corp. said Wednesday it swung to a profit for its fiscal fourth quarter, as sales more than doubled thanks to strong demand for chips used in personal computers and workstations.

Santa Clara, Calif.-based Nvidia /quotes/comstock/15*!nvda/quotes/nls/nvda (NVDA 16.85, -0.99, -5.55%) said net income for the period ended Jan. 31 was $131.1 million, or 23 cents a share, compared to a loss of $147.7 million, or 27 cents a share in the same period a year earlier. Excluding special items, Nvidia said fourth-quarter earnings were 23 cents a share.

Revenue more than doubled to $982.5 million from $481.1 million.

Analysts polled by Thomson Reuters had expected Nvidia to post earnings of 20 cents a share, and $957.2 million in revenue.

"Nvidia's business continued to accelerate," Chief Executive Jen-Hsun Huang said in a prepared statement.


Debacle? What debacle? There is no 57xx series debacle, unless you are referring to the overblown and mostly cured driver errors that affected the entire 5xxx series.

Also, you forget that you are talking about AMD, not ATI. ATI is a division of AMD, and out of AMD's major product lines (CPUs, chipsets, and GPUs), only the GPUs, i.e. ATI, are doing well, with quick increases in market share.

You're right, though: anyone who thinks nVidia is going under if Fermi fails, or that their stock will plummet, needs to do some research. nVidia really doesn't need a desktop solution right now; they have Tegra, which is being adopted by many companies. They also have a much larger life-line than the competition. They could stop production of all desktop GPUs for a year and still be in business.

That said, those earnings you quoted come from nVidia's mostly non-GPU business, with workstations and Tegra pulling the way, a truly genius move by nVidia IMO.
February 18, 2010 3:32:37 PM

AMW1011 said:
Debacle? What debacle? There is no 57xx series debacle, unless you are referring to the overblown and mostly cured driver errors that affected the entire 5xxx series.

Also, you forget that you are talking about AMD, not ATI. ATI is a division of AMD, and out of AMD's major product lines (CPUs, chipsets, and GPUs), only the GPUs, i.e. ATI, are doing well, with quick increases in market share.

You're right, though: anyone who thinks nVidia is going under if Fermi fails, or that their stock will plummet, needs to do some research. nVidia really doesn't need a desktop solution right now; they have Tegra, which is being adopted by many companies. They also have a much larger life-line than the competition. They could stop production of all desktop GPUs for a year and still be in business.

That said, those earnings you quoted come from nVidia's mostly non-GPU business, with workstations and Tegra pulling the way, a truly genius move by nVidia IMO.

The only problem I have with this is that if you eliminate the GeForce end of the equation, the overall costs change.
Making chips that are used in mass production vastly pays down the R&D costs and creates a cheaper overall cost through process tweaks etc., which you don't see with exclusive, low-volume products; those carry a huge per-unit cost, eliminating the high returns nVidia now gets from its high-volume and specialty products together.
February 18, 2010 3:47:19 PM

AMW1011 said:

Also, you forget that you are talking about AMD, not ATI. ATI is a division of AMD, and out of AMD's major product lines (CPUs, chipsets, and GPUs), only the GPUs, i.e. ATI, are doing well, with quick increases in market share.


I have not forgotten. Since you brought this up, here is what I see happening, and it's not good news for ATI. When AMD gets their act together and introduces 32nm (on a new socket, it seems, LOL) with an onboard IGP, look for them to market the solution as having 'AMD graphics', separating it from ATI for various reasons. All of which will not be good news for the ATI division.
February 18, 2010 4:03:53 PM

You seem somewhat extra deluded today notty. ATI is AMD. AMD keeps the ATI brand for graphics and nothing else.
February 18, 2010 4:25:33 PM

Yeah, from what I've heard, very few original ATI employees still work for the ATI division. Basically, it is very plausible that AMD might just drop the ATI name altogether; it really doesn't matter, as it is all the same thing.
February 18, 2010 4:46:09 PM

AMD will start adding little bits of its own brand to all the ATI products. They already have a green AMD logo on the 5800 cards.

Right now the ATI brand is stronger than AMD, and AMD is basically hoping to piggyback on its reputation.
February 18, 2010 4:57:27 PM

I agree. I think the only reason the ATI brand hasn't been dropped entirely is that it would likely confuse customers who don't know about the AMD/ATI merger but know ATI by name.
February 18, 2010 4:59:33 PM

Looks like Nvidia needs to take their business to GF (GlobalFoundries).
February 18, 2010 5:00:18 PM

Oh, and to get back on topic: I very much doubt that Fermi is broken and unfixable. I expect a late-March to early-May paper launch with yields so poor that actually buying a card will be impossible for about another month or so after the launch, much like what happened to ATI, possibly slightly worse.
February 18, 2010 5:09:19 PM

I have said paper in March, trickle to market in May since Nov 09. :) 
February 18, 2010 5:14:24 PM

notty22 said:
Another post full of nonsense. LOL. It's ATI that's in danger of financial failure; Nvidia is doing fine. And if people want an example of hardware failure, there is one in progress with the continued 5700 debacle. Just look in these forums.
Nvidia swings to quarterly profit as sales more than double

http://www.marketwatch.com/story/nvidia-swings-to-profi...
SAN FRANCISCO (MarketWatch) - Nvidia Corp. said Wednesday it swung to a profit for its fiscal fourth quarter, as sales more than doubled thanks to strong demand for chips used in personal computers and workstations.

Santa Clara, Calif.-based Nvidia /quotes/comstock/15*!nvda/quotes/nls/nvda (NVDA 16.85, -0.99, -5.55%) said net income for the period ended Jan. 31 was $131.1 million, or 23 cents a share, compared to a loss of $147.7 million, or 27 cents a share in the same period a year earlier. Excluding special items, Nvidia said fourth-quarter earnings were 23 cents a share.

Revenue more than doubled to $982.5 million from $481.1 million.

Analysts polled by Thomson Reuters had expected Nvidia to post earnings of 20 cents a share, and $957.2 million in revenue.

"Nvidia's business continued to accelerate," Chief Executive Jen-Hsun Huang said in a prepared statement.


You may want to read more than the headlines. Their free cash flow is much lower in Q4 than in Q3, and yet they report an increase in GAAP earnings. Q4 included the Christmas season, so you would expect free cash flow to go up. They're playing games with their numbers to make a good headline, but apparently the market can see through this and has punished them today.

I never said they were going bankrupt. They have a lot of money and almost no debt, so yes, they're in better financial shape than AMD, but it looks like they have a bleaker future, and I would expect their share price to reflect this.
February 18, 2010 5:20:15 PM

NV has put so much focus on other products that I don't think this really came as a surprise. It seems fairly obvious to me that high-end desktop video cards are not their primary focus anymore.
February 18, 2010 6:01:25 PM

I honestly don't see Fermi doing too well even if it's released on schedule.

With the prices ATI has set, it will be tough competition for Nvidia, who often carries the Nvidia tax. Also, let's not forget we can expect the 6000 series from ATI to roll out sometime later next year, bringing the 5000 series down in price, which leaves Nvidia in an even worse scenario than now.

Then again, we may have failed to see something. What if this Fermi GPU is but a test? Seriously, think about it: what do you think would happen if shareholders were NOT given any results for the amount of time Fermi has been announced?

I predict Fermi to be the test hog, the same way Vista was in the operating-system arena: too advanced, or too powerful, for the current generation of hardware. What if Nvidia is just prepping Fermi to push the industry to the next level, to make way for their next generation of GPUs (past Fermi)? Push power consumption near 300W, and you can expect the PCI-E certification body to consider a revision of the specifications, the same way Vista pushed the hardware market to cope with software advances like 64-bit.
Nvidia has the capital to invest in this kind of risky business, and the payoff is a near monopoly of the high-end market. Push a GPU powerful enough to rival the next 1-2 generations of GPUs and you can easily monopolize the high end.
February 18, 2010 6:16:18 PM

AsAnAtheist said:

I predict Fermi to be the test hog, the same way Vista was in the operating-system arena: too advanced, or too powerful, for the current generation of hardware. What if Nvidia is just prepping Fermi to push the industry to the next level, to make way for their next generation of GPUs (past Fermi)? Push power consumption near 300W, and you can expect the PCI-E certification body to consider a revision of the specifications, the same way Vista pushed the hardware market to cope with software advances like 64-bit.
Nvidia has the capital to invest in this kind of risky business, and the payoff is a near monopoly of the high-end market. Push a GPU powerful enough to rival the next 1-2 generations of GPUs and you can easily monopolize the high end.


I can see how people will be forced to buy more powerful PSUs to run future cards, but the real problem is heat dissipation. 300 watts is almost impossible to deal with unless you have a big case, like a full tower or a large mid tower, or liquid cooling.

Long term, the trend is toward smaller electronics that use less power. I doubt nVidia planned on creating a power-hungry GPU. Also, there's a lot of money in gaming systems like the PS and Xbox, and if I worked for Sony, Nintendo or MS, I would steer clear of nVidia. A large, power-hungry GPU with unpredictable yields is definitely not what they want in their hardware.
February 18, 2010 6:21:11 PM

notty22 said:
I have not forgotten. Since you brought this up, here is what I see happening, and it's not good news for ATI. When AMD gets their act together and introduces 32nm (on a new socket, it seems, LOL) with an onboard IGP, look for them to market the solution as having 'AMD graphics', separating it from ATI for various reasons. All of which will not be good news for the ATI division.

You remind me of the political commercials: throw around as much crap as you can and see if any of it sticks to the wall, mostly complete nonsense. AMD has already announced that Bulldozer is planned for AM3 sockets. Fusion is an entirely different story; just like Intel's i3s, it requires a motherboard with a VGA output.

Oops, I found your picture...

February 18, 2010 6:36:47 PM

Nvidia will soon join the dead legendary GPU makers like 3dfx.
February 18, 2010 6:38:20 PM

KidHorn said:
I can see how people will be forced to buy more powerful PSUs to run future cards, but the real problem is heat dissipation. 300 watts is almost impossible to deal with unless you have a big case, like a full tower or a large mid tower, or liquid cooling.

Long term, the trend is toward smaller electronics that use less power. I doubt nVidia planned on creating a power-hungry GPU. Also, there's a lot of money in gaming systems like the PS and Xbox, and if I worked for Sony, Nintendo or MS, I would steer clear of nVidia. A large, power-hungry GPU with unpredictable yields is definitely not what they want in their hardware.


Three hundred watts is a lot of power; however, you must understand there are various methods of cooling the leakage.

Heat sinks took hold because of cheap manufacturing costs, not to mention that R&D costs for heat-sink design are so low it's quite frankly pathetic.

Lapping heat sinks straight from the factory could easily shave off several degrees, though it would get slightly expensive; in addition to lapping, higher-performance thermal compounds would also knock off a few degrees. Add dual-fan designs, and you shed even more heat. Sacrifice aesthetics for practicality: a few degrees more. There are many ways to keep GPUs cooler now. Hell, there are already HD 4850s using passive cooling that are not too shabby, by PowerColor and Gigabyte, I believe.

Combine that with the increasing performance of today's computer cases, and you have one hell of a cooling recipe. The cooling in cases nowadays is near overkill compared to older cases. I don't expect 300W to be an actual limit, more of a nuisance; look at two GTX 295s requiring 1000W+ PSUs. Something that is a nuisance to cool, but nowhere near impossible. Those GTX 295s can be placed in a high-performance mid tower these days with little modification (mostly modifications done to the GPUs).

As stated by that article, Fermi's leakage is going on big time, which puts heat production at an all-time high, which lowers performance, which then leads to higher voltages being required, etc. (thus we get 280W power consumption, and who knows what the heat-dissipation requirements are). Once the manufacturing techniques are refined, more high-clock bins will be produced and the leakage will drop, so you can expect power consumption around 220W, which is totally acceptable considering the HD 5870 uses 180W on its own; 40W is not a significant increase (maybe not with Fermi, but most definitely with the next generation).
As far as heat dissipation goes, look at the HD 5970, whose reference cooler is specified for 300W of heat dissipation.

With the press toward 28nm lithography, many issues like leakage and pathetic bins may disappear in the next generation.

@Trends: Yes, smaller, less power-hungry electronic devices are the trend. However, we're not speaking of consumer electronics; we are speaking of computer hardware. Smaller and smaller lithography steps will be required before Fermi's architecture can realistically work: smaller lithography lowers power consumption and, as a byproduct, heat. However, I still have not seen GPUs getting physically smaller. Why is that? Because the small market share that mid-to-high-end products hold demands powerful components. Energy efficiency is just a bonus, not a priority.
The HD 5970 is already 12-13 inches depending on the manufacturer, and the HD 5870 is pushing past the size of the HD 4890. With every passing generation I have only seen the mid-to-high-end GPUs increase in PCB size while lowering power consumption.
February 18, 2010 6:43:17 PM

noob2222 said:
You remind me of the political commercials: throw around as much crap as you can and see if any of it sticks to the wall, mostly complete nonsense. AMD has already announced that Bulldozer is planned for AM3 sockets. Fusion is an entirely different story; just like Intel's i3s, it requires a motherboard with a VGA output.

Oops, I found your picture...

http://www.retardcentral.com/wp-content/uploads/2009/07/you_win_the_prize.jpg


OK, lol, AMD said, AMD said. You're just plain gullible and too dumb to know it. Parroting FUD as he goes along while AMD loses money. Do you find yourself singing showtunes as well?


http://www.fudzilla.com/content/view/17731/1/
Nvidia blames TSMC for supply issues

The company believes it lost $100 to $150 million due to 40nm shortages.
However, Nvidia claims TSMC is doing a "fabulous job" and that it has improved its yields

That sounds like everything is going according to schedule to me, and demand is HIGH for their GT 240 parts.
