
NVIDIA possibly abandoning mid and high end market

Last response: in Graphics & Displays
October 7, 2009 12:08:00 AM

http://www.semiaccurate.com/2009/10/06/nvidia-kills-gtx...

According to Charlie, the GTX 260, 275 and 285 are all EOL within weeks, if not already. Take it with a grain of salt, obviously. Also, expect NVIDIA PR to deny the whole thing, and for said PR response to show up on Fuddo.

IMO Charlie is milking the "success" of his previous article and using the opportunity to send NVIDIA's public image through the floor.

EDIT: Nice contradictory article right before it too: http://www.semiaccurate.com/2009/10/06/nvidia-will-crat... :lol: 

I love Charlie, he certainly knows how to take the "professional" out of the journalist profession.
October 7, 2009 12:14:52 AM

LMAO

Chuck really has no limits does he?
October 7, 2009 12:21:52 AM

I think he got excited that NVIDIA had some real bad press for once and dropped every rumour he could possibly concoct onto his front page even if they weren't consistent just to keep the ball rolling. What next? JHH had an affair with Ruby?
October 7, 2009 7:18:13 AM

OK, so what specifically is your problem with his theory? Other than it's not good for nVidia?

I'd like a counterpoint, not just Charlie-bashing. Especially since this is something we talked about way back at the GTX280's launch: that it wouldn't be feasible to keep using a huge chip with low yields to compete with cheaper chips.

This is exactly why they need something new in the mid-range, especially if 40nm isn't going well (how long has the G2xx 40nm been delayed?).
October 7, 2009 7:38:45 AM

If nVidia would make a move like that, it means they have something to replace the current line-up. Possibly a new 2xx card to compete with the 48xx or even the 5850. If they don't have something and just want to stop producing those huge low-yield chips, that would create a gap in their offerings and will keep 58xx prices up until Fermi. :??: 
October 7, 2009 7:42:18 AM

TheGreatGrapeApe said:
OK, so what specifically is your problem with his theory?

I don't have a problem with his theory, just that he writes 2 pages of text on one topic saying that prices are going down and there's no shortage of supply, then an hour later "word reaches [him]" that suddenly the cards are EOL. At least he should try and verify his info before writing up two articles that oppose each other that close together. Only Fuad has managed to do better (contradictory articles within 10 minutes of each other), but he writes 150 words, not 2 pages.

Charlie's articles are beyond the length that my brain will allow me to read in their entirety without losing interest. So I didn't get through the whole thing, and therefore it may not have been as contradictory as I think, had I read it all.
October 7, 2009 8:00:30 AM

Quote:
At $1 per year, Jen-Hsun is overpaid.


That I agree with :D .
Charlie is really an ATI fanboy :na:  . I for one don't think nVidia is doing that bad, but I guess time will tell.
October 7, 2009 8:02:39 AM

When I first read this
http://www.improbableinsights.com/2009/08/27/whats-up-w... I wondered exactly what he was trying to say.
Now, it may be more apparent
"Still, it’s an odd place for Nvidia to be. The company’s marketing strategy once revolved around putting out high end products that offered either better performance or a more robust feature set than competitors – or both, in some cases. Then the company would push the new tech down into the mainstream and low end.

Now, their key successes seem to revolve around high volume, low margin products."


Now, I was thinking to myself, this doesn't sound like nVidia. Or does it?
October 7, 2009 8:29:25 AM

randomizer said:
I don't have a problem with his theory, just that he writes 2 pages of text on one topic saying that prices are going down and there's no shortage of supply, then an hour later "word reaches [him]" that suddenly the cards are EOL. Only Fuad has managed to do better (contradictory articles within 10 minutes of each other), but he writes 150 words, not 2 pages.

Charlie's articles are beyond the length that my brain will allow me to read in their entirety without losing interest. So I didn't get through the whole thing, and therefore it may not have been as contradictory as I think, had I read it all.


You make me laugh.
You say "At least he should try and verify his info before writing up two articles that oppose each other that close together," but are quite prepared to post a thread having a pop at Charlie without, by your own admission, having actually bothered to verify your own facts. :pfff: 
If you had read both articles fully, you would see that the one is just a follow-up of the other; both are related, and reported in such close proximity to each other because Charlie was acting on news just in, if you will.
As TGGA said a counter point would be nice.
Mactronix
Anonymous
October 7, 2009 8:39:16 AM

this article is one big lie about nvidia and i can just smile about it, first this doesnt sound like nvidia and doesnt come from nvidia.
October 7, 2009 8:45:34 AM

Quote:
this article is one big lie about nvidia and i can just smile about it, first this doesnt sound like nvidia and doesnt come from nvidia.


Hi John. Welcome back. I really missed your entertaining posts :bounce: 
October 7, 2009 9:07:58 AM

I have to agree to some extent with TGGA here. While I can't find any links to yield results from TSMC, I do have some rumours that they are having some difficulty dropping to a lower node. Since both AMD and Nvidia use TSMC as their primary (only?) fab, a higher transistor count for either company can't be good for yields, let alone operating temps. Obviously Nvidia's proposed x300 card, at a rumoured 3 billion transistors, will lower that yield even further if TSMC can't transition to a smaller node size.

Time will tell, but it looks as if AMD is having a hard time feeding the channel with its x58xx series as we speak.
October 7, 2009 9:19:07 AM

If Nvidia cannot sell the G200 line for profit, then it makes sense to EOL them.


Why actively go out of your way to lose more money?


More worrying is the size comparison of Fermi to Cypress. Nvidia need it to be competitive per mm^2 of die space. If not, then they will only haemorrhage more money.
October 7, 2009 9:51:03 AM

I don't know what to believe, but this sounds even more outlandish than usual, even for Charlie. Nvidia are in a bad way, but surely not this bad?

If this is true, then AMD have conjured up the most amazing victory in the history of computing. They have simply outsmarted Nvidia on strategy.
October 7, 2009 10:29:56 AM

Mousemonkey said:
The shortage issue may not be an Nv exclusive.



Quote:
A number of recent media publications suggest that ATI and Nvidia have started to reduce supplies of previous-generation high-end offerings, e.g., ATI Radeon HD 4870 X2 or Nvidia GeForce GTX 285, as well as previous-generation 55nm graphics chips in general.



AMD have a new generation inbound, some of which has already arrived. It makes sense to dump the older stuff (made on a less economical node) off the market.



Nvidia...?
October 7, 2009 10:30:51 AM

Well, it's clear ATI are having issues with delivering the 5x series, but the 4x series is probably almost sold out.

There could just be a shortage of cards because there really is a shortage of cards at the moment, from both sides.
October 7, 2009 11:35:22 AM

I can't believe anyone would actually believe both that the number of GPU sales is going up AND that Nvidia and ATI are sitting on the chips to stop them being sold :??:  just plain crazy thinking in my book, but please enlighten me if you think I'm wrong.
We know that yields are being reported as low, and we also know that it's only sensible that ATI should want to reduce the number of 4-series cards at this time.
What we don't know is how much internal politics and bull is being passed around.
I also don't believe that the board partners are in a position where they can demand what price they will pay for the chips. If they are, then changing your stock policy can't make a difference as far as I can see.

Mactronix
October 7, 2009 11:40:57 AM

It kind of makes sense. Nvidia are losing money on every card they sell, so why make more? Just finish off what they have and hope Fermi is out the door fast; that is their plan.

http://channel.hexus.net/content/item.php?item=20566

More confirmation. AMD say they are not having issues; Nvidia issue a 'no comment'.

This all adds up. Why would Nvidia keep selling cards at a loss? They are handing the market to ATI, but the alternative is to take heavy losses on G200 *and* the people who buy those wouldn't buy one of the (possibly cheaper to produce) Fermi models in a few months.

EDIT - I don't believe Nvidia are dying out or anything like that, but I do believe that they have decided not to compete with ATI's 40nm because... well, it's no competition, is it? ATI should now own the middle to high end market for the next 4 months at least.
October 7, 2009 11:45:55 AM

Yes, I know the general trend and what's been reported about Nvidia makes sense. I should have said, I was referring to the Xbit article when I posted. I just can't see how you can have it both ways. In the past, with the 8800GTX etc., I could have believed it as a way of making the price go up because the part was rare.
With AMD and its make-them-cheap, sell-in-volume practice, I just can't see it being right. Sure, they are in a position to do that, but at the price point they introduced the cards at, I don't believe they intend to wring every last penny out of us for the cards.

Mactronix
October 7, 2009 11:56:55 AM

Keep in mind I'm a pro AMD guy.

With that much doom and gloom in the articles, I'm sure this is a fanboi post. I would say the odds of things being that bad for Nvidia are low, aren't they? I don't see why they would just drop out of the mid/high end market. Isn't there some way for them to cut costs? I'm sure John_ will come in and explain how, after he was $uck!ng JHH, he saw the plans for Nvidia success.

It will be funny if this is true. They had the large die size problem with G200, and seemingly didn't take any real steps to change it. Now with G300 coming, it's possible that the same thing will happen again. I know chips are designed years before we know of them; I hope Nvidia is working on something smaller.
October 7, 2009 12:10:45 PM

Basically put, AMD can make 2 GPUs for the cost of Nvidia's 1 GPU, and it's been that way for the past year.

The cost of Juniper compared to, say, the GTX 275 that it will compete with on benchmarks just isn't funny. ATI can probably make 4-5 of these for every GTX 275.

It just isn't sensible for Nvidia to even fight against that. Think about it: ATI are making money on these GPUs, so they can afford to drop prices. Nvidia could keep making their high end 200s, but why? ATI would just squeeze them more and more by lowering prices further.

If Nvidia are taking a $10 hit on each GPU sold, that's already bad. What if ATI decide to drop prices further? It's unsustainable. Nvidia have given up because both the 4000 and 5000 series cards are so much cheaper to make.
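That 2-for-1 claim is easy to sanity-check with a back-of-the-envelope calculation. Here is a rough sketch, assuming approximate die areas (GT200b at roughly 470 mm^2, RV770 at roughly 256 mm^2), a 300 mm wafer, and an illustrative Poisson defect model with a made-up defect density; this is not real TSMC yield data, just a way to see why a big die hurts twice (fewer candidate dies per wafer AND a lower fraction of them good):

```python
import math

def dies_per_wafer(die_mm2, wafer_d_mm=300):
    # Classic dies-per-wafer approximation: wafer area over die area,
    # minus a correction for partial dies lost at the wafer edge.
    r = wafer_d_mm / 2
    return int(math.pi * r**2 / die_mm2
               - math.pi * wafer_d_mm / math.sqrt(2 * die_mm2))

def poisson_yield(die_mm2, d0_per_cm2=0.4):
    # Poisson defect model: the chance a die has zero fatal defects
    # falls exponentially with die area. D0 here is purely illustrative.
    return math.exp(-d0_per_cm2 * die_mm2 / 100)

for name, area in [("GT200b (~470 mm^2)", 470), ("RV770 (~256 mm^2)", 256)]:
    candidates = dies_per_wafer(area)
    good = candidates * poisson_yield(area)
    print(f"{name}: {candidates} candidate dies, ~{good:.0f} good per wafer")
```

With these assumed numbers, the smaller die comes out ahead by well over 2x in good dies per wafer, before even accounting for the price each die can command. The exact ratio depends entirely on the defect density you plug in, but the direction of the effect does not.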
October 7, 2009 12:34:47 PM

4745454b said:
I hope Nvidia is working on something smaller.


Nvidia's problem is simple:

For every transistor they pack in dedicated to CUDA, that is one transistor more they have to pay for to compete with ATi.


Basically, they are running a crap technical model at the minute. No doubt PR is dictating to engineering how to build the GPUs, and no-one has the balls to tell dear leader he is talking out his ar$e.
October 7, 2009 1:20:23 PM

The way I see it is, ATI came out as soon as they could, at ramp-up, not after it, and that's why there are so few currently available, and EOLing their old cards only makes sense.

Normally, on a good hard launch, a company will have stock dispersed all around in good quantities, but several things made this release different. The economy, for one, keeping things tighter than normal, with low inventories, and also waiting to get the feel of the market in this economy. But even more than that, it was to get as early a jump on things as possible: make impressions, show up the competition, and have a decent entry price, which will hold up just that much longer.

As for nVidia? I'm not sure. I do know that the G200s do cost more, and I saw a somewhat accurate listing on pricing between the G200 and the 4xxx series. Margins are already slim, and having to lower them for last gen's top card to compete on down, the new gen only puts more pressure on everyone involved, and nVidia's partners, as well as nVidia, don't want to get stuck with a lot of inventory they just can't sell.
I've heard there are a lot of design wins in the lower end and mobile for them, which is fine, but the high end has been a disaster for them, as their margins were cut right away and have sunk much further since. Their backbone, the G92, is but a mobile solution today, which again is selling OK from what I hear, though I'm sure they've taken a beating because of the "bumpgate" scenario.

What worries me the most here is that it's gone from "just wait and see what nVidia has coming" to, exactly what DO they have coming? This isn't a fanboy rant against them, and I really do hope they have a few surprises up their sleeves, because they need them at this point. I've said before, I like my ATI and all things red, but wish no great ill on nVidia. It may be that this whole going-large, single-chip strategy has really hurt them, especially since there's no sign of shrunk-down DX11-type G200s at 40nm.

Sorry for the wall of text, but this does concern me, and I'm hoping nVidia has something decent coming soon, just not "too" decent, heheh
October 7, 2009 1:37:11 PM

Well, they say they (Nvidia) are strong in the low end chipset and mobile markets, and they are going after the GPGPU sector in a big way. So it could be that they are opting out of the mainstream GPU market just for now while they concentrate on where they are winning. Recession and all, it could just be that the powers that be (accountants) basically told the board that it's this or go bankrupt. There's nothing stopping them making a revised mainstream GPGPU part further down the line, is there?
Maybe they were worried about competing with Intel with two very similar products?

Mactronix
October 7, 2009 1:37:49 PM

When is the 5XXX mobility series due?

Surely with the low idle power - that will put a massive dent in Nvidia's mobile "design wins".
October 7, 2009 1:54:02 PM

"This puppy here, is Fermi"
October 7, 2009 1:54:30 PM

Sorry, double post somehow, it's been years!
October 7, 2009 2:49:57 PM

randomizer said:
I don't have a problem with his theory, just that he writes 2 pages of text on one topic saying that prices are going down and there's no shortage of supply, then an hour later "word reaches [him]" that suddenly the cards are EOL. At least he should try and verify his info before writing up two articles that oppose each other that close together.


Read harder, Homer, they don't conflict.
It's because of the production problems and the cost of the chips that they are forced to reduce prices to sell chips (instead of leaving them on shelves); now, instead of continuing this money-losing practice, they will cut production.

If you look at article 1, the theory behind article 2 is embedded right there in the middle:

"So to minimize price protection costs, the chip makers balance the number of parts in the channel keeping them to a bare minimum around product or price transition times. It is an art, half black art, half luck with a sprinkle of science topped off with a bit of competitive intelligence."

Also remember, people, the EOL issue is not about completely holding back chips; it's more about not letting AIBs stockpile in order to get bulk discounts and undercut AMD & nV's profits. They are trying to low-ball AMD & nV, and finally both of them said, ENOUGH! We're giving you just-in-time delivery. I doubt anyone who's had a good relationship with the two and didn't try this tactic (essentially trying to only sell old parts, not new ones) is getting better treatment, but anyone who sandbags their production/sales to increase their end and greatly damage the IHVs' implementation and development strategies gets dropped. I have no sympathy for them. Customers might like low-balling (or free-balling :whistle:  ) but I prefer technology and product advancement, thanks.

Quote:
Charlie's articles are beyond the length that my brain will allow me to read in their entirety without losing interest. So I didn't get through the whole thing, and therefore it may not have been as contradictory I think had I read it all.


And to me that's the problem: just like with reviews, no one reads anymore, but everyone has strong opinions. Those 150 words are like the graphs in a review, with no substance to indicate whether the person is just spouting FUD or BS, or whether there is reasoning behind the statements. I'd always rather have more information than less, but then again I write long posts too, for the same reason: to make sure it's clear.


Quote:
Only Fuad has managed to do better (contradictory articles within 10 minutes of each other), but he writes 150 words, not 2 pages.


I think Charlie should just delete his titles and force people to read the articles; fewer issues that way, and fewer people who only cursorily (if that's a word) read his stuff and get annoyed at the gist of it. I also prefer Charlie's 60% success rate to FUDO's 40% (at best) success rate. But I wouldn't invest my money in what either says; I listen to the changing wind, and the voices in my head, which change pitch with the wind. :lol: 
October 7, 2009 3:53:03 PM

"Cursory" is the word you were looking for.
October 7, 2009 6:10:25 PM

I agree with ape here.

Quote:
OK, so what specifically is your problem with his theory? Other than it's not good for nVidia? I'd like a counterpoint, not just Charlie-bashing.


A theory has been put forth. It might be outlandish, but it also might possibly be true. Other than attacking the source, can anyone attack the logic? Is there some flaw in the theory that can be pointed out? Yes, it comes from Charlie, but show us where the math is wrong. I believe it paints a picture too bleak for Nvidia, so it's probably a fanboi rant. Certainly the leaders of a company as great as Nvidia wouldn't allow it to come to this. However, I'm also smart enough to read what Charlie wrote and say "wait a second, could this be true?"
October 7, 2009 6:14:15 PM

Nvidia couldn't dare do such a thing. Why? They have all those crappy game manufacturers building their games with optimizations to run t3h Geforces. >_>
October 7, 2009 8:08:12 PM

While I don't think this is true, I don't remember ever seeing an AMD ad on Semi-Accurate.
October 7, 2009 8:15:34 PM

jennyh said:
None of the ads on Semi-accurate are paid for by AMD.

One fanboy says its nonsense <> it's nonsense.

There are AMD ads on it now, and Zardon at DriverHeaven isn't an Nvidia fanboy; they have AMD skins and work with AMD all the time. They got a quote from Nvidia and posted it.

What a weird forum this is.
October 7, 2009 8:15:56 PM

jennyh said:
None of the ads on Semi-accurate are paid for by AMD.

One fanboy says its nonsense <> it's nonsense.


Dekasav said:
While I don't think this is true, I don't remember ever seeing an AMD ad on Semi-Accurate.


Apart from this one :- [ad screenshot]
and this one :- [ad screenshot] Unless it's a different way of spelling Nvidia and Inhell. :lol: 
October 7, 2009 8:30:02 PM

Nvidia is DOOMED!!!!! unless they release something soon.
October 7, 2009 8:43:59 PM

Those ads aren't paid for by AMD, if you mouse over or click them you'd see that. :p 
Anonymous
October 7, 2009 8:48:44 PM

Charlie is one big liar and he is paid to screw nvidia...
October 7, 2009 9:13:53 PM

So let me clarify. I wanted to help you guys get information on this so you could educate yourselves on Nvidia's stance. You have basically just told me to go die. Clearly, loyalty to specific brands just makes people capable of wanting other people dead.

Thanks guys, lovely forum you got here, and top job from the moderators !
October 7, 2009 9:32:48 PM

extremedition said:
So let me clarify. I wanted to help you guys get information on this so you could educate yourselves on Nvidia's stance. You have basically just told me to go die. Clearly, loyalty to specific brands just makes people capable of wanting other people dead.

Thanks guys, lovely forum you got here, and top job from the moderators !


And I simply educated you on the facts regarding the 'AMD ads' which are actually ads for a reseller and not from AMD.

That alone brings the whole post into question. I mean come on, what do you really expect Nvidia to say?
October 7, 2009 9:54:04 PM

Well, at least in two months you will know if Nvidia are lying. I guess if they release a card capable of nailing the 5870 it won't matter to you anyway. I used to think Nvidia fanboys on some of their home sites were mentally challenged, but it's clear the guys in red have a little inbreeding going on as well.

Funny as hell though, these forums are hilarious. It's like tech for teenage girls :p 
October 7, 2009 10:37:29 PM

Don't expect things to get any better soon, not with high manufacturing costs due to metal prices and the poor economy. The reduction in product is deliberate: it's not just high costs, but, as stated in previous posts, Nvidia and ATI are trying to limit available stock in spite of demand from companies like EVGA. Even the solder joints now cost more, thanks to the rise in silver prices, and gold for the interconnects is at an all-time high. So in short, the cards cost more to manufacture than they can recover for a profit, much less break even after other expenses.

As for the GT300, Nvidia has to target all price points, or at least the mid range for the consumer and professional, to stay afloat. ATI, however, so long as it maintains its current dominance in the budget and mid range, will do very well, while the high end is just icing on the cake. Intel will learn what it is like to be AMD when chips don't sell like they once did, but for AMD a further shrink in market share will most likely happen. Note it will not be much fun when international trade gets hit with a major setback in a year's time.
October 7, 2009 11:55:44 PM

Thanks for the link. Wow, what a shocker: Nvidia says it isn't so. I did see some interesting things.

Quote:
Charlie also goes on to say : "With the cancellation of the GTX285, GTX275, and GTX260, possibly the GTX295 too, Nvidia is abandoning the entire high end and mid-range graphics market. Expect a reprise in January on the low end.


So Charlie expects them to abandon the mid and high end now, followed by the low end later. What are they supposed to sell then? Does Charlie expect them to leave the business because AMD came out with a couple of good cards?

Quote:
a few of them said they have quite low supplies of GTX285 and GTX295. This however can not be simply interpreted as a 'worrying' issue for Nvidia users because as many of our readers will know they actually have a new range of cards out very shortly to compete against the current range of DX11 boards from AMD


Could this be what's going on? Charlie hears that stockpiles are low, so he assumes they are pulling out due to the cost/profit they get on the cards? He knows they are bleeding, so perhaps this is the assumption he's making. (Never mind all the money they probably have, seeing as they've been selling G80 in one form or another for how many years now?) They might even be EOLing the GTX cards only because they have the G300 coming. If that's the case, G300 had better get here soon. (Rumoured to be how many months away?)
October 8, 2009 12:17:10 AM

TheGreatGrapeApe said:
And to me that's the problem, just like reviews, no one reads anymore, but everyone has strong opinions. Those 150 words are like the graphs in a review, no substance to give an indication if the person is just spouting FUD or BS, or is there reasoning behind the statements. I'd always rather more information than less, but then again I write long posts too, for the same reason, to make sure it's clear.

I always read reviews, I just have issues with tl;dr on a single page. If you split 3000 words over 6 pages it is easier to get through than all on one page, at least for me. Damn subconscious laziness!
October 8, 2009 5:18:45 AM

Anyone see this?
"Nvidia has confirmed that the company has essentially placed its Nforce chipset line on hiatus, given the legal wrangling between itself and Intel.

According to Robert Sherbin, the lead corporate communications spokesman for Nvidia, Nvidia will "postpone further chipset investments".

Sherbin also dismissed a report that Nvidia was pulling out of the mid-range and high-end GPU market as "patently untrue". But Nvidia's recent chip introductions do imply a shift in the graphics company's traditional stance is underway."
http://www.extremetech.com/article2/0,2845,2353936,00.a...

More of the same.
Now, since Charlie's already been attacked and condemned here, we've seen article after article confirming things. Plus, with Ape's info about the sellers trying to do an end-around on both nVidia and ATI, I'm sure at this point it could be anything, because it looks like anything could happen.

I've been following Charlie more lately, not because I'm a fanboy, nor because I like controversy over fact, but, like Ape said, I too follow the wind, and it's been pushing me toward Charlie's writings lately.
And, again like Ape, it's not something that you can stick with, but right now, and for a short while, Charlie's been very right about a lot of things.
Of all the people I read, there are maybe 2 leads I'll really pay attention to, and that's it, though there are tons of others who have been right on for short durations, as their contacts are hitting it head on.
October 8, 2009 5:30:58 AM

JAYDEEJOHN said:
Sherbin also dismissed a report that Nvidia was pulling out of the mid-range and high-end GPU market as "patently untrue".

Surprise, surprise. :D 