Closed

Has FERMI failed this early in the DX11 game?

Last response: in Graphics & Displays
August 20, 2010 5:45:49 AM

Nvidia has already claimed a loss for the second quarter. ATI's consumer GPU sales have been rising, and their prices haven't dropped drastically, as they started out very reasonable to begin with.

Have Fermi and the GTX 4xx line failed this early?
August 20, 2010 6:09:45 AM

There is still hope for their GF104. I guess it's about time for them to change strategy and go for value for money instead; I'd consider purchasing it when I upgrade my machine.
August 20, 2010 6:20:58 AM

I think the best-selling Fermi card is the GTX 460...
August 20, 2010 6:32:26 AM

The only problem with the GTX 460 is that it's pricey, starting at $189, versus ATI's 5770 starting at $139.
August 20, 2010 6:33:25 AM

I don't feel this way at all. The GTX 470 isn't that much more expensive than the 5850, and if you look at DX11/tessellation, the 5850 isn't even close. The 470 can come close to the 5870, while the GTX 480 is near(ish...) the 5970, depending on the game of course. I was surprised at the price Nvidia is selling the GTX 470/480 for. The biggest knock on those cards is the heat and power, and I bet Nvidia will fix this in the next go-around.

While the GTX 460 is a great card, one card does not (or should not) make or break a product lineup. So far we have seen three cards; it's a bit early to say how good or bad Fermi will end up being. It will be nice to see some new lower-end cards in Nvidia's lineup. G92 shouldn't have a five-year life span.
August 20, 2010 6:35:00 AM

We have barely even seen Fermi yet. The GF100 architecture was a cut-down version of the 'real' Fermi. I honestly believe Nvidia released it simply to keep the shareholders happy and avoid pushing back another release date. GF104 is the beginning of the real launch, IMO. So far, Nvidia has pretty much failed: ATI released better cards and has reaped the rewards. But I sense GF104 versus Southern Islands is going to be the real test, and Nvidia could easily push back to the top spot.

I'm looking forward to seeing both, and to a good old-fashioned price war between competitors, instead of prices artificially inflated by a lack of competition...
August 20, 2010 6:47:59 AM

For $428 I got two 460s in SLI that run circles around the 480 and the comparable ATI offerings in CrossFire.

Every time I boot up Metro or any other game and it's just melting through them, I'm sitting there thinking, boy, what an epic fail!

But seriously, ATI fanboys want it to fail and are praying it will. With ATI's shoddy drivers and horrible scaling in CrossFire, the GTX 460 is the best bang-for-your-buck card, hands down. Not trying to sound like I'm totally green; I had a 480 and swapped it for two 460s.

You can say the 480 was and is a failure, but that's not the only Fermi chip they have.
August 20, 2010 7:01:13 AM

Comparing mainstream cards, the GTX 460 is a bit faster than ATI's 5770, and Nvidia's drivers are better, true. But there's a $50 price difference between the cards for slightly more performance. Is it worth the extra $50?
August 20, 2010 7:05:01 AM

Good, they need it. The GTX 460 is a bit pricey for the general public.
August 20, 2010 7:08:09 AM

Dip is right on this one. The GTX 460 is the only card in Nvidia's current lineup worth buying, IMO, but it's also one of the best cards on the market. If you have an SLI-capable board it's the best choice by far. I just hope Nvidia pulls off the same thing with the rest of the GF104 chips.

In many ways I would like the new 6 series to dominate once again when it releases, but since that would mean a similar bout of price stagnation, I'm now rooting for Nvidia to get back on form.

I don't care about stupid fanboy arguments; I want the best performance for the dollar. ATI offered that, but they held prices too high for too long for my liking. Hopefully more competition will remedy that.
August 20, 2010 7:18:27 AM

Both are good companies; it's just that at the moment Nvidia's cards seem a bit pricier pound for pound. And DirectX 11 is only available on the 460 and up? Or does the GTX 260 do DirectX 11? ATI does a good job of providing DirectX 11 on even some of their lower-end cards, which Nvidia doesn't do. Which seems kind of odd. Not taking sides by any means, just looking for the best value.
August 20, 2010 7:28:30 AM

Perhaps true at the moment, but down the road when newer models are released, who knows.
August 20, 2010 8:53:22 AM

Dip, I'm not sure if you've ever read much on DX11 (and I won't hold that against you), but once the API is implemented at an engine level, it will actually allow for BETTER performance.

DX11 has multiple features that can improve performance as well as visual fidelity. One of the best is tessellation. Currently all it does is add load to a GPU, but when implemented correctly it will add only a tiny load to a card and, more importantly, can be scaled on the fly to meet demand. There are plenty of other performance-enhancing features, which means we will eventually see games that not only look better in DX11 but run better too.

Which is why it will be great to have on low-end cards too.

I think saying Nvidia has failed is far too premature. We are only early into an API that will last for years. It's only just being implemented, and not a single game has been built around it yet. By the time we see games like that, we will have Southern Islands and GF104 competing head to head, and I'm sure things will be close.
August 20, 2010 9:54:35 AM

Fair enough, but you are only attesting to your own ignorance. I don't know who said what about DX10, but the fact is it was a marginal improvement over DX9, whereas 11 offers a number of exciting features to developers. It really is a big deal, and one of the reasons ATI and Nvidia see this as such a crucial generation.
August 20, 2010 10:47:35 AM

It's way, way too early to tell how the GTX 4 series will pan out financially; we will need the next quarter's revenue to see whether it's made a difference or not.
I'm in the UK, and over here the 5770 is about £125, with the 768MB versions of the 460 starting at £150. To my mind this pricing makes the 460 a no-brainer, unless you are on a tight budget or just want ATI rather than Nvidia.
As has been posted already, there is a 450 coming soon which is meant as direct competition for the 5770, and as such it should end up at the same price point, I would think.
I would think there are quite a few people around with 4770s and 4850s who are itching to upgrade for one reason or another, and the 460, in my opinion, is the first card that makes sense to upgrade to. I'm basing that on price/performance at UK pricing. It may well be a lot different in other parts of the world, but over here, if you go to most hardware sites, the sort-by-popularity pages are mostly filled with 460s.
The 450 will use a single 6-pin connector according to reports, so that will be the one for people with pre-built systems and weak-ish PSUs to look at, as long as the thermals and performance are OK, of course.
I think the outlook is pretty good for Nvidia at the moment. With the Southern Islands chips only being a small bump in performance, it should all add up to some decent competition and prices that are better for us consumers.

Mactronix
August 20, 2010 11:15:52 AM

Quote:
I know as much about dx11 as u do


Obviously not.

Quote:
uses dx11 nicely without hammering performance nothing can be said about its future


Game development cycles run many years. The games being released now would have already been partly built when DX11 launched; to expect full utilization would be ridiculous.

Quote:
.if dx11 donot become mainstream by then it will also be an epic failure


Another stupid statement. DirectX builds are based on one another; a new version retains all the features of the previous builds. We could be on DX22, but as long as the features implemented in DX11 are being used in a way that benefits the industry, it is a success.

Quote:
API depends upon how well developer accepts it and uses in games and also by how well general public accepts it.And if microsoft keeps on launching new OS with new dx version every 2 to 3 yrs no dx version will be as popular as 9.Take my word on that.


Again, you don't seem to realize that new DX versions are simply enhanced versions of their predecessors. The point is that the features of DX11 allow for better performance on cards capable of utilizing them, which is why lower-end cards can make just as much use of it. It doesn't matter what OS or DX release we are on, so long as the card is capable of using DX11 or later and is taking advantage of hardware tessellation or other features as a way of boosting performance. If correctly implemented, a GTS 450 or lower-end card could easily outperform its DX10 equivalent, which is why you can't call it useless on such cards.
August 20, 2010 1:07:25 PM

You're both right, you know. DX11 on a low-end card isn't much use with the implementation where it is at the minute; the horsepower needed to use it as it stands is quite a bit more than a low-end card has.
However, once games turn up that were built on DX11 from the ground up, it becomes useful, as the way the DX11 tessellator works allows different results to come from the same data, so a lower card can scale down the detail/quality of the output to match the hardware available.
By the time that happens we will be at least two GPU generations down the road, so yes, today's low-end DX11 GPUs are little more than a technical exercise as far as the end user is concerned. This is important for the companies involved, though. Notwithstanding the fact that DX11 has quite a few hardware features and as such will be on the chip regardless of where it sits in the lineup, the info/feedback they get from this generation all goes to improve the next.

Mactronix :) 

August 20, 2010 5:12:20 PM

The 460 1GB OC is beating ATI handily from what I've seen, at both local stores and online. The new Gigabyte GTX 470 Super OC has very impressive performance and may be the best choice now. The SC2 release will also help Nvidia more than ATI, but it will take time to get PC OEMs to pick Nvidia again.

http://www.tomshardware.com/reviews/gv-n470SO-13I-super...

Nvidia's GPUs are good, but it looks like they don't know how to make a good reference card. Their partners are helping them out.
August 20, 2010 5:37:44 PM

Nvidia was late to the party and they are paying for it. The 460 is doing great, and is probably the only good card in the Fermi lineup as of now. It is still too early to count Nvidia out, but they have a lot of catching up to do.
August 20, 2010 6:03:33 PM

Nvidia is actually doing better in DX11 and tessellation in terms of performance. The places where Nvidia failed this time around are as follows:

Timing: ATI got a nice six-month jump on them
Power/heat: ATI cards require much less power and produce less heat
Pricing: basically a tie relative to performance

In my opinion, more people are jumping on the ATI bandwagon because ATI came through this series and has kept producing lately. Next time around it will be a much closer race, as Nvidia will surely step up the pace on getting their product out in a timely manner, and ATI will step up their game on improving DX11 features and tessellation. This will be VERY good for all of us if they are close; it's all a matter of who steps up their game further...

I think that Nvidia will eventually edge out ATI due to the computational strengths of the Fermi architecture...
August 20, 2010 7:25:20 PM

Why does everyone suddenly hate the GTX 470? It has all the SLI scaling benefits, overclockability, and even more tessellation performance, with an additional 20-25% performance lead over the GTX 460 for only $60-$70 more. Seems like a damn good card to me.
August 20, 2010 7:47:37 PM

The reason is that the 470/480 eat electricity :p  Not to offend anybody, but compared to the 5850 and 5870, the power draw of the 470/480 is just a lot higher. Also, don't even think about SLI and CF for a second. The 5850 is one of the better deals because it can be OC'd to the level of a stock 5870. As far as I'm concerned, you really don't need anything more than an OC'd 5850 or an OC'd 460. Also, the reason most people look toward the 5770 is not only that you save some dough, but that at lower resolutions the 5770 performs great; the 460 would be overkill and money you don't need to spend. If you're on a budget and using any resolution higher than 1680x1050 (1600x900), then go with the 460 :p
August 20, 2010 7:51:18 PM

One thing that many people don't realize is that the number of cards sold is only part of the profit equation. You still have to make money on each card sold.

Fermi is more costly to produce, so they may not be making as much profit per card as the 5000 series.

You also have to consider that their real moneymakers are the budget cards priced closer to $100, like the ones in all these prebuilt machines.
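A minimal sketch of bystander's point, with invented numbers (none of these are real 2010 prices, costs, or volumes):

```python
# Illustrative only: profit depends on margin per unit, not just units sold.
# All figures below are invented for the example.
def profit(units_sold, price, unit_cost):
    return units_sold * (price - unit_cost)

# A cheaper-to-produce chip can out-earn a bigger seller at lower volume.
high_volume_low_margin = profit(units_sold=100_000, price=200, unit_cost=180)
lower_volume_high_margin = profit(units_sold=60_000, price=200, unit_cost=120)

print(high_volume_low_margin)    # 2000000
print(lower_volume_high_margin)  # 4800000
```

So even if Fermi cards sell well, a higher per-die cost eats directly into whatever nVidia makes on each one.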
August 20, 2010 8:21:26 PM

Yeah, bystander is onto the biggest issues for nV.

They are price competitive now, but they aren't transistor competitive.

The GTX 460 is a bigger die than the full Cypress HD 5870 die, and the Juniper die is less than half that size, so production-wise there's a major difference (you still need board cost, etc.).

Also, yields are lower on the nV parts, even the GTX 460, so that's not helping.

And there is no $150-and-lower part, which is still a huge chunk of the market, especially the one many people here seem to keep forgetting... the OEM market, which is where the disparity is huge.

And a GTX 450 may be able to help, but it still takes that more expensive die to sell a cut-down part, so price competitiveness is really going to be tough if AMD does shift its pressure in that segment.

One of the things hurting OEM positioning is the heat and power issues. As I've mentioned many times before, enthusiasts don't care, but OEMs take those two issues very seriously because they have to base their designs heavily on those two factors. This is an area where it's hard for even the GTX 460 to be as attractive.

None of this means Fermi failed; it just means it's not as attractive a design. It's likely still going to generate a ton of money for nV, just not at the same rate and margin as AMD's parts.

I think TSMC's woes are actually helping nV a lot, because if AMD were able to ramp up production to meet demand, you might see some very different competition out there.

For now they (nV) have succeeded in getting capable DX11 parts out there, and in pricing them competitively without losing a lot of money (and really, a big chunk of their losses are write-downs, not sales issues).

The only failures have really been in delivery and strategy; IMO, not having a $100-150 part ready for back-to-school, and instead competing against the SI repositioning in the next Christmas buying season, is a strategic failure where they could've taken a lot more sales and prevented more AMD sales this fall.

To me Evergreen is a success; but Fermi isn't a failure, just not as bright as promised/hyped/hoped.
August 20, 2010 8:48:01 PM

What people don't realize here is that whatever Nvidia does for this series, first impressions are the most important thing for the average consumer, and they are hard to improve. I don't believe Nvidia can do a lot to improve their image now; the hope is for the next generation.
August 20, 2010 9:05:31 PM

bystander said:
One thing that many people don't realize is that the number of cards sold is only part of the equation to making a profit. You still have to make money on each card sold.

The fermi is more costly to produce, so they may not be making as much profit per card as the 5000 series.

You also have to consider that their real money makers are the budget cards priced closer to $100, like all these prebuilt machines have.


No consumer buys a GPU chip; they buy a card. Considering the 460 card is a quarter shorter than the 5850, which means fewer components are needed to build the card, I don't think the ATI 5000 series has much of a price advantage.
August 20, 2010 9:54:50 PM

IMO, the importance of GPU die size is overblown. There is another article claiming the GF104 is actually smaller.
Quote:
The GF104 die is measured at 13.7x24.2mm, i.e. 331.54mm2. Note that some publications incorrectly mentioned die size being 366mm2 - thus, bigger than AMD's current high-end chip. That is not correct, if only by a small margin [332 vs. 337 mm2]. Direct competitor: AMD Cypress, 337mm2.
http://www.brightsideofnews.com/news/2010/8/9/nvidia-fe...
The cost of doing business is more than just dies per wafer. Everything else is assumed: that Nvidia and ATI pay the same amount per wafer, etc. Does anyone not think that Nvidia spends more on developers and advertising, with programs like "The Way It's Meant to Be Played", etc.? A poor financial quarter is not inspiring, but it reflects a period when Nvidia wasn't competing with any DX11 parts. Which could be called poor planning.
August 20, 2010 10:05:34 PM

notty22 said:
Imo, this gpu die size importance is overblown. There is another article claiming the g104 is actually smaller.
Quote:
The GF104 die is measured at 13.7x24.2mm, i.e. 331.54mm2. Note that some publications incorrectly mentioned die size being 366mm2 - thus, bigger than AMD's current high-end chip. That is not correct, if only by a small margin [332 vs. 337 mm2]. Direct competitor: AMD Cypress, 337mm2.
http://www.brightsideofnews.com/news/2010/8/9/nvidia-fe...
The cost of doing business is more than just dies per wafer. Everything else is assumed, that nvidia and ati pay the same amount per wafer etc. Does anyone not think that Nvidia spends more on developers and advertising. Programs like , The way its meant to be played , etc. A poor financial quarter is not inspiring, but it reflects a time when Nvidia wasn't competing with any dx11 parts. Which could be called poor planning.


If you compare the two companies' businesses, AMD loses much more money each quarter than almost any other IT company does.
August 20, 2010 10:14:00 PM

yanje03 said:
no consumer buy GPU chip but a card. consider 460 card is 1/4 shorter than 5850 which means they use less components to build a card, I don't think ATI 5000 series has much advantage on price wise.


A smaller card does not mean fewer components, and it does not mean lower cost either. In the tech world, smaller (with the same number of parts) costs more.

The Fermi GPU has a much higher transistor count, which results in more defects and fewer GPUs per wafer.

Edit: to be clear, I don't know how much the card itself costs, just that the GPU chip, the most costly part of the card, costs more to produce. I'm also just saying that the size of the card doesn't determine whether it costs more or less; there are other factors involved.
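The die-size/defects argument can be sketched with a standard back-of-the-envelope Poisson yield model; the wafer size, die areas, and defect density below are illustrative assumptions, not actual TSMC figures:

```python
import math

def good_dies(wafer_diameter_mm, die_area_mm2, defects_per_cm2):
    """Rough good-dies-per-wafer estimate using a Poisson yield model."""
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    # Crude gross-die count, ignoring edge loss and scribe lines.
    gross = wafer_area // die_area_mm2
    # Poisson yield: probability a die lands with zero defects.
    yield_frac = math.exp(-defects_per_cm2 * die_area_mm2 / 100)
    return int(gross * yield_frac)

# Assumed: 300 mm wafer, 0.4 defects/cm^2 (made-up defect density).
print(good_dies(300, 334, 0.4))  # Cypress-sized die -> more good dies
print(good_dies(300, 530, 0.4))  # GF100-sized die  -> far fewer
```

The bigger die loses twice: fewer candidates fit on the wafer, and each one is more likely to catch a defect, which is exactly why per-chip cost climbs faster than die area.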
August 20, 2010 10:15:19 PM

yanje03 said:
if you compare two company business. AMD lose much more money each quarter than any IT company does.



Not in the Graphics division.
August 20, 2010 10:19:50 PM

liquidsnake718 said:
Nvidia has already claimed a loss for the 2nd quarter. Ati has been rising in sales for its consumer GPU's and their price hasnt gone down drastically as they started out very reasonably to begin with.

Has Fermi and the GTX4xx line failed this early?


1. The GTX 460 is a resounding success.
2. In spite of increasing market share for Radeon cards, AMD has been a money loser for years and continues to be.
3. Nvidia has a billion-dollar-plus war chest; a quarter or two of losses isn't going to bankrupt them.


jonpaul37 said:
Not in the Graphics division.


They've been unified for some time now. AMD as a corporation continues to lose money.
August 20, 2010 10:46:58 PM

bystander said:
A smaller card does not mean less components. It does not mean less costs either. In the tech world, smaller (with the same amount of parts), costs more.

The Fermi GPU has a much higher transistor count, which results in more defects and fewer GPU's per wafer.

Edit: to be clear, I don't know how much it costs for the card itself, just that the GPU chip, the most costly part of the card, costs more to produce. I also am just saying that size of the card does not mean it costs more or less. There are other factors envolved.


In this case, a smaller PCB design does mean using fewer components to support the GPU. Do you think AMD's partners are stupid enough to make a bigger card that won't fit some computer cases? That's all because they have no choice.
August 20, 2010 10:54:39 PM

jonpaul37 said:
Not in the Graphics division.


Well, looks like ATI is saving AMD then. An acquired company/division normally means more politics and less production in the long run.
August 20, 2010 10:55:09 PM

jeffredo said:
1. GTX 460 is a resounding success.
2. In spite of increasing market share for Radeon cards AMD has been a money loser for years and continues to be.
3. Nvidia has a billion dollar + warchest, a quarter or two of loses isn't going to bankrupt them.




They've been unified for some time now. AMD as a corporation continues to lose money.




I know this, everyone knows this; that is why I said "division". AMD's graphics division is in the positive, and thus helping AMD out of the hole they have been in for quite some time...
August 20, 2010 11:04:05 PM

Quote:
Fermi = Fail, GTX 460 = Poor Mans Card

5870 and above = FTW. Bringing the power and bragging rights to you since Oct 2009.

http://i71.photobucket.com/albums/i145/Soldier36/DSC05079.jpg



5870 CrossFire = buggy as hell in games = not compatible with many motherboards. I guess whoever buys an ATI card normally sticks to just one card.
August 20, 2010 11:11:40 PM

I think Nvidia is bleeding slowly from having such a large vacuum at the lower end of their lineup. Nobody is going to buy an old 2xx-series model when ATI offers much better solutions.
I hope to see Nvidia get a little more aggressive on pricing as time goes on. The problem is, they can make sales to early adopters at higher prices before lowering them; until the hype of a new product wears off, they aren't going to make it any cheaper.
It would be nice to see some lower-end cards from them right now, but we don't, because they haven't shipped any.

Nvidia probably would if they could.

The economic landscape just isn't friendly to high-end graphics companies.
August 20, 2010 11:14:08 PM

Honestly, I think there are numerous engineering issues with Fermi, not just what is at the surface. As for ATI, I believe it is the same: either the drivers have yet to fully mature, or there is some inefficiency in the shaders that keeps the chip from delivering its full power. Clocks are poor except on the GF104, and heat will kill most of these cards after a year or two. High power consumption always ends up working against a card, resulting in a short life compared to other cards that survive long enough to be hopelessly obsolete. The full-spec GF100 requires two 8-pin connectors just to run and only delivered a 5% improvement on a per-clock basis. Can't forget the titanic cooler it requires just to run. In short, the 40nm process isn't going to deliver the next big thing.

The Radeon 6k series could end up just as hot if they go with the original spec that was meant for the R870, including Sideport. I don't even want to think about what could be replacing GF100.
August 20, 2010 11:17:05 PM

Quote:
Fermi = Fail, GTX 460 = Poor Mans Card

5870 and above = FTW. Bringing the power and bragging rights to you since Oct 2009.

http://i71.photobucket.com/albums/i145/Soldier36/DSC05079.jpg


And yet two GTX 470s will beat two 5870s for almost $200 less...

Sorry, but things have changed. Fermi is now here and on top, but it's just a bit too late. In a few months it will all be irrelevant, because ATI will be back on top again with SI.

And don't give me the power consumption BS. If you can't power the cards, then you get the ones you can with lower performance. It's as simple as that: as usual, the best performance comes at a price, and with Fermi that price is energy, while Fermi is very competitive on actual cost.
August 20, 2010 11:20:41 PM

nforce4max said:
Then clocks are crap except on the 104 and heat will kill most after a year or two. High power consumption always end up working against the card resulting in a short life compared to other cards that survive long enough to be hopelessly obsolete.


Wow, are you serious? So the GTX 4xx series is going to die in a year or two? Odd that G80 and 8800 GT cards are still working even though they ran almost as hot. I'm going to politely call BS on that one.
August 20, 2010 11:26:16 PM

The thing you forget, AMW, is that it's down to the individual to decide whether heat and power are an issue for them. For me, and many others, they would be.

There is no way I would ever buy a GTX 480, and I would only buy a 470 if it was a superb deal. A 5850 or 5970 would be my go-to choice at the higher end, or two 460s, assuming a capable mobo.

GF100 is just too hot and inefficient, and I don't want it. Based on the lower sales (still good, but not great), I would say I'm not in a minority.

Just because two 5870s are $200 more expensive doesn't make the 470 a better card, especially when the 5970 exists for the exact purpose of replacing that setup. When you get similar performance with MUCH less heat and power draw, it seems like a no-brainer to me.

Though on a new build, I'd sure as hell be going for a GTX 460 SLI setup. I'm sure GF104 will be what puts Nvidia back on top.
August 20, 2010 11:27:47 PM

AMW1011 said:
Wow, are you serious? So the GTX 4xx series are going to die in a year or two? Odd that the G80 and 8800 GT cards are still working even though they were almost as hot. I'm going to politely call BS on that one.



Not the 460, but the rest are just too expensive for Nvidia to produce and maintain in production at the current yield rates. Also, down the road the heat and power consumption are going to work against them. Don't forget that 28nm is on the way, and both Nvidia and ATI will be quick to capitalize on the new process. Fermi won't rock the gaming world like the G80 did, and later the G92. The older cards didn't run hot enough to slow-roast a 5lb brisket: the G92 barely used 100W on a maxed-out GT and barely any more on a GTX, and the G80 only topped out at 189W when maxed out. Fermi blows them both out of the water at stock clocks, at ~250W. They draw more power than the old R600 did, and that was a hot GPU for its time.
August 20, 2010 11:47:36 PM

nforce4max said:
Not the 460 but the rest are just to expensive to produce for Nvidia to maintain production with the current yield rates. Also down the road the heat and power consumption is going to work against them. Don't forget that 28nm is on the way and both Nvidia as well ATI will be quick to capitalize on the new process. Fermi won't rock the gaming world like the G80 did and later the G92. The older cards didn't run hot enough to slow roast a 5lb brisket. The G92 barely used 100w on a nuked GT and barely any more on a GTX. The G80 only topped out at 189w when maxed out. The Fermi blows then both out of the water at stock clocks at 250w~ They draw more power than the old R600 did and that was a hot gpu for the time.


Read this new review before you decide the 470 is too hot.

http://www.tomshardware.com/reviews/gv-n470SO-13I-super...
August 20, 2010 11:57:23 PM

nforce4max said:
Not the 460 but the rest are just to expensive to produce for Nvidia to maintain production with the current yield rates. Also down the road the heat and power consumption is going to work against them. Don't forget that 28nm is on the way and both Nvidia as well ATI will be quick to capitalize on the new process. Fermi won't rock the gaming world like the G80 did and later the G92. The older cards didn't run hot enough to slow roast a 5lb brisket. The G92 barely used 100w on a nuked GT and barely any more on a GTX. The G80 only topped out at 189w when maxed out. The Fermi blows then both out of the water at stock clocks at 250w~ They draw more power than the old R600 did and that was a hot gpu for the time.


Read another new review before you decide the 480 is too hot.

http://www.tomshardware.com/reviews/gtx-480-amp-edition...

You guys need some up-to-date info. The only thing I see with Nvidia is that this company doesn't know how to make a good reference card.
August 20, 2010 11:58:48 PM

yanje03 said:
read another new review before you think 480 is too hot

http://www.tomshardware.com/reviews/gtx-480-amp-edition...

you guys need some info up to date. The only thing i saw on nVidia is this company don't know how to make a good reference card.



Let me guess, you are one of those people who make $300k a year and don't have to worry about paying bills?
August 21, 2010 12:15:25 AM

nforce4max said:
Let me guess you are one those people that make $300k a year and don't have to worry about paying bills?

Huh, isn't your topic about the heat issue now?
Isn't someone just saying "GTX 460 = Poor Man's Card"? The 480 is $100 more than the 5870 but provides better performance in SLI and fewer bugs in games, so we can believe some people do choose the 480.
August 21, 2010 12:35:50 AM

yanje03 said:
huh, isn't your topic about hot issue now?
isn't someone just saying " GTX 460 = Poor Mans Card ". 480 is $100 higher than 5870 but provides better performance when SLI and provides less bug in game. so we can believe someone do select 480.



Anyone who can SLI two 480s isn't worrying too much about heat or electricity costs. The reason most people stay away from the 480 or 470 is the cooling/power draw requirements.

The 480 draws so much power that I personally wouldn't want to buy it, since my electricity bill would skyrocket, whereas a single or CrossFired 5870 doesn't have this issue.
August 21, 2010 12:55:51 AM

notty22 said:
Imo, this gpu die size importance is overblown. There is another article claiming the g104 is actually smaller.


An article from Theo? Can I post an article from Charlie as the 'counter-proof', since neither has any image to support their claims? :evil:

PS: the Cypress die size has been measured accurately and with photos (since it's a non-capped die, anyone can do it) at 334mm², not 337 (18.27mm x 18.27mm = 333.7929). Methinks Theo moved the seven over a decimal place and dropped a 3, because even rounding to 18.3 gives 335, not 337.
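The decimal-slip claim is easy to verify; a quick check using only the dimensions quoted in this thread:

```python
# Verify the die areas argued about above (dimensions as quoted in the thread).
cypress_side_mm = 18.27            # measured Cypress edge length
gf104_dims_mm = (13.7, 24.2)       # GF104 dimensions per the BSN article

cypress_area = cypress_side_mm ** 2
gf104_area = gf104_dims_mm[0] * gf104_dims_mm[1]

print(round(cypress_area, 4))  # 333.7929 -- ~334, not 337
print(round(gf104_area, 2))    # 331.54
```

Either way, the two dies come out within a few mm² of each other, so on area alone GF104 vs Cypress is close to a wash; the cost argument then rests on yields and what each company pays per wafer.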

For nVidia, die size and yields matter greatly, especially since, as you and the others seem to forget... NVIDIA DOESN'T SELL CARDS, THEY SELL DIES TO AIBs, who sell cards!! :hello:

Quote:
A poor financial quarter is not inspiring, but it reflects a time when Nvidia wasn't competing with any dx11 parts.


So are you saying the GTX 470/480 are not DX11 parts, or forgetting that the quarter in question covered the period well after the GTX 470/480 launch (and even included a few weeks of the GTX 460)? :heink:
Score
0
a b U Graphics card
a b Î Nvidia
August 21, 2010 1:16:25 AM

TheGreatGrapeApe said:

For nVidia die size and yields matter greatly, especially since as you and the others seem to forget... NVIDIA DOESN'T SELL CARDS, THEY SELL DIES TO AIBs who sell cards !! :hello: 


And your point is? No, I don't forget this point. It's just as important to ATI. The 5830 die is the same size as the 5870's, along with the 5850's. They don't sell them to the AIBs at the same price. They study yields and build a business plan around that.
The end result is a $400 5870 or a $200 5830.
I get a good lol at the fanboys who critique these corporations' business models while clinging to assumptions that suit their own point of view.
Score
0
a b U Graphics card
August 21, 2010 1:24:11 AM

welshmousepk said:
The thing you forget, AMW, is that it's down to the individual to decide whether the heat and power are an issue for them. For me, and many others, they would be.

There is no way I would ever buy a GTX480, and I would only buy a 470 if it was a superb deal. A 5850 or 5970 would be my go-to choice at the high end, or two 460s, assuming a capable mobo.

GF100 is just too hot and inefficient, and I don't want it. Based on the lower sales (still good, but not great), I would say I'm not in a minority.

Just because two 5870s are $200 more expensive doesn't make the 470 a better card, especially when the 5970 exists for the exact reason of replacing that setup. When you get similar performance with MUCH less heat and power draw, it seems like a no-brainer to me.

Though on a new build, I'd sure as hell be going for a GTX460 SLI setup. I'm sure GF104 will be what puts Nv back on top.


Okay, the heat thing is completely overblown; there is no heat issue, especially with custom fan profiles.

The power consumption is not that much worse, and may, MAY, cost you an additional $20 a year if you game a lot in your free time. Remember, most of the time these cards will be at idle, where both sip very little power. The "power issue" is ridiculously overblown, just like the noise and heat.

You're telling me that you would pay almost $200 more to save a few dollars a month on your electric bill? And you're complaining about the heat when you have a well-cooled case. It doesn't make sense; your points are pure sensationalism of some minor issues.

If your power supply can't handle dual GTX 470s but can handle dual 5870s, then you are going to pay more for less performance. You can get a quality 850-1000W PSU for the difference and get the extra performance. There is nothing else to it. If you must settle for dual GTX 470s or dual 5850s, then you should get dual GTX 460s, because they will be as fast, if not faster, than two 5850s for less money.
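The "$20 a year" figure above can be sanity-checked with back-of-the-envelope arithmetic. The wattage delta, daily gaming hours, and electricity price below are illustrative assumptions, not measured figures:

```python
# Rough annual electricity cost of a GPU's extra power draw under load.
# All inputs are assumed example values, not measurements from any review.
extra_watts = 100          # hypothetical load-power gap between two cards
hours_per_day = 3          # assumed daily gaming time
price_per_kwh = 0.12       # USD per kWh; varies a lot by region

kwh_per_year = extra_watts / 1000 * hours_per_day * 365
annual_cost = kwh_per_year * price_per_kwh
print(f"~${annual_cost:.2f} per year")
```

With these numbers the difference works out to roughly $13 a year, which is at least in the same ballpark as the $20 claim; double the hours or the rate and it doubles accordingly.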

nforce4max said:
Not the 460, but the rest are just too expensive to produce for Nvidia to maintain production at the current yield rates. Also, down the road the heat and power consumption are going to work against them. Don't forget that 28nm is on the way, and both Nvidia and ATI will be quick to capitalize on the new process. Fermi won't rock the gaming world like the G80 did, and later the G92. The older cards didn't run hot enough to slow-roast a 5lb brisket. The G92 barely used 100W on a nuked GT and barely any more on a GTX. The G80 only topped out at 189W when maxed out. Fermi blows them both out of the water at stock clocks, at ~250W. They draw more power than the old R600 did, and that was a hot GPU for its time.


I'm not talking about any of that; I was just pointing out that GF100 chips are not going to suddenly die in a year or two. If you remember correctly, an 8800 GTX, 8800 GT, 9800 GX2, 3870 X2, and 4870 X2 would all reach similarly high temperatures, and I've seen no mention of those cards dying so quickly. Also, you have to take into account that nVidia had the power consumption and heat in mind when designing these; if they started dying abnormally soon, nVidia would get sued, and they obviously don't want that.
Score
0
a b U Graphics card
a b Î Nvidia
August 21, 2010 1:31:03 AM

yanje03 said:
For this case, a smaller PCB design does mean using fewer components to support the GPU. Do you think AMD's partners are stupid enough to make a bigger card that won't fit in some computer cases? That's all because they have no choice.


Or maybe they chose to space out their parts further to generate less heat and provide better power characteristics? Or maybe making the board bigger means they don't have to use as many layers.

You're pretty naive if you think a PCB is like a raw material where size alone is all that matters. Even for a die, which is essentially silicon, it's not just about the area, as mentioned before, and it's even less linear for a PCB, which includes things like more wire traces for the wider memory channels on Fermi and more power-regulation hardware for the higher-power-consuming cards/parts.

So really, you think that because the GTX480 board is shorter it's cheaper to make? :pt1cable: 

yanje03 said:
Well, looks like ATI is saving AMD then. An acquired company/division normally means more politics and less production in the long run.


Well, most people like myself have said that AMD's CPU division is a boat anchor on their GPU division, which means they can't be as competitive as they used to be when they were ATi... yet they are still managing to do better this time around despite the burden of the traditional AMD bloat.

As for less production in the long run: production has improved in the long run; in the short run it definitely suffered greatly... but a lot of that was due to the R600 more than politics. However, the AMD bloat likely meant a slower correction of those issues, unlike the R580's correction of the R520/original 500 series.

That doesn't make nV look better; it makes them look lucky that ATi was saddled with AMD.


yanje03 said:
Read another new review before you think the 480 is too hot:

http://www.tomshardware.com/reviews/gtx-480-amp-edition...

You guys need some up-to-date info. The only thing I see from nVidia is that this company doesn't know how to make a good reference card.


Did you bother to read that review? :heink: 
It doesn't show the GTX480 as the cooler GPU/card of the two; it shows it still being worse than the HD5870 in both power consumption (which is directly related to heat production) and temps (which is heat management), both reference vs reference and best AIB design vs best AIB design.

As for 'too hot', that's all in the eye of the beer-holder, but it's still a major concern for OEMs, who need to build computers that deal with those issues, so it becomes a more limiting factor for the Fermi cards.
Score
0