NVIDIA possibly abandoning mid- and high-end market

randomizer

Champion
Moderator
http://www.semiaccurate.com/2009/10/06/nvidia-kills-gtx285-gtx275-gtx260-abandons-mid-and-high-end-market/

According to Charlie, the GTX260, 275 and 285 are all EOL within weeks if not already. Take with a grain of salt, obviously. Also, expect NVIDIA PR to deny the whole thing, and for said PR response to show up on Fuddo.

IMO Charlie is milking the "success" of his previous article and using the opportunity to send NVIDIA's public image through the floor.

EDIT: Nice contradictory article right before it too: http://www.semiaccurate.com/2009/10/06/nvidia-will-crater-gtx260-and-gtx275-prices-soon/ :lol:

I love Charlie, he certainly knows how to take the "professional" out of the journalist profession.
 

randomizer

Champion
Moderator
I think he got excited that NVIDIA had some real bad press for once and dropped every rumour he could possibly concoct onto his front page even if they weren't consistent just to keep the ball rolling. What next? JHH had an affair with Ruby?
 

djcoolmasterx

Distinguished
Grain of salt?

largesaltpile.jpg
 
OK, so what specifically is your problem with his theory, other than that it's not good for nVidia?

I'd like a counterpoint, not just Charlie-bashing. Especially since this is something we talked about way back at the GTX280's launch: that it wouldn't be feasible to keep using a huge chip with low yields to compete against cheaper chips.

This is exactly why they need something new in the mid-range, especially if 40nm isn't going well (how long has the G2xx 40nm been delayed?).
 

hallowed_dragon

Distinguished
If nVidia were to make a move like that, it would mean they have something to replace the current line-up. Possibly a new 2xx card to compete with the 48xx or even the 5850. If they don't have anything and just want to stop producing those huge, low-yield chips, that would create a gap in their offerings and keep 58xx prices up until Fermi. :??:
 

randomizer

Champion
Moderator

I don't have a problem with his theory, just that he writes two pages of text on one topic saying that prices are going down and there's no shortage of supply, then an hour later "word reaches [him]" that suddenly the cards are EOL. At the very least he should try to verify his info before writing two articles that oppose each other so close together. Only Fuad has managed to do better (contradictory articles within 10 minutes of each other), but he writes 150 words, not two pages.

Charlie's articles are beyond the length that my brain will allow me to read in their entirety without losing interest. So I didn't get through the whole thing, and it may not have been as contradictory as I think, had I read it all.
 
When I first read this
http://www.improbableinsights.com/2009/08/27/whats-up-with-nvidia/ I wondered exactly what he was trying to say.
Now, it may be more apparent
"Still, it’s an odd place for Nvidia to be. The company’s marketing strategy once revolved around putting out high end products that offered either better performance or a more robust feature set than competitors – or both, in some cases. Then the company would push the new tech down into the mainstream and low end.

Now, their key successes seem to revolve around high volume, low margin products."


Now, I was thinking to myself: this doesn't sound like nVidia, or does it?
 


You make me laugh. You say "At least he should try and verify his info before writing up two articles that oppose each other that close together", but you're quite prepared to post a thread having a pop at Charlie without, by your own admission, having actually bothered to verify your own facts. :pfff:
If you had read both articles fully, you would see that the one is just a follow-up to the other; both are related, and reported in such close proximity because Charlie was acting on news just in, if you will.
As TGGA said, a counterpoint would be nice.
Mactronix
 
Guest

Guest
This article is one big lie about NVIDIA and I can just smile about it. First, this doesn't sound like NVIDIA, and it doesn't come from NVIDIA.
 

croc

Distinguished
I have to agree to some extent with TGGA here. While I can't find any links to yield results from TSMC, I have heard rumours that they are having some difficulty dropping to a lower node. Since both AMD and Nvidia use TSMC as their primary (only?) fab, a higher transistor count for either company can't be good for yields, let alone operating temps. Obviously Nvidia's proposed x300 card, at a rumoured 3 billion transistors, will lower that yield even further if TSMC can't transition to a smaller node size.

Time will tell, but it looks as if AMD is having a hard time feeding the channel with its x58xx series as we speak.
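The die-size/yield argument is easy to put on a back-of-envelope footing. Here's a minimal sketch; the wafer size, the simple Poisson defect model, and the defect density are all assumptions (real TSMC numbers aren't public), and the die areas are approximate figures from press coverage:

```python
import math

def dies_per_wafer(die_mm2, wafer_d_mm=300):
    # Standard approximation: gross dies from wafer area, minus edge loss
    r = wafer_d_mm / 2
    return int(math.pi * r**2 / die_mm2 - math.pi * wafer_d_mm / math.sqrt(2 * die_mm2))

def yield_poisson(die_mm2, d0_per_mm2):
    # Simple Poisson defect model: probability a die catches zero defects
    return math.exp(-d0_per_mm2 * die_mm2)

D0 = 0.002  # defects per mm^2 (0.2 per cm^2) -- pure guess, for illustration

for name, area in [("GT200b (~470 mm^2)", 470), ("RV770 (~256 mm^2)", 256)]:
    gross = dies_per_wafer(area)
    y = yield_poisson(area, D0)
    print(f"{name}: {gross} dies/wafer, yield {y:.0%}, good dies ~{gross * y:.0f}")
```

Under those guesses the smaller die gets roughly three times as many good dies per wafer, and a big die gets hit twice: fewer candidates per wafer *and* a lower fraction of them working.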
 

Amiga500

Distinguished
If Nvidia cannot sell the G200 line for profit, then it makes sense to EOL them.


Why actively go out of your way to lose more money?


More worrying, is the size comparison of Fermi to Cypress. Nvidia need it to be competitive per mm^2 of die space. If not, then they will only haemorrhage more money.
 

jennyh

Splendid
I don't know what to believe, but this sounds even more outlandish than usual, even for Charlie. Nvidia are in a bad way, but surely not this bad?

If this is true then AMD have conjured up the most amazing victory in the history of computing. They have simply outsmarted Nvidia on strategy.
 

Amiga500

Distinguished



A number of recent media publications suggest that ATI and Nvidia have started to reduce supplies of previous-generation high-end offerings, e.g., ATI Radeon HD 4870 X2 or Nvidia GeForce GTX 285, as well as previous-generation 55nm graphics chips in general.


AMD have a new generation inbound, some of which has already arrived. It makes sense to dump the older stuff (made on a less economical node) off the market.



Nvidia...?
 

jennyh

Splendid
Well, it's clear ATI are having issues delivering the 5x series, but the 4x series is probably almost sold out.

There could just be a shortage of cards because there really is a shortage of cards at the moment, from both sides.
 
I can't believe anyone would actually believe both that the number of GPU sales is going up AND that Nvidia and ATI are sitting on the chips to stop them being sold :??: just plain crazy thinking in my book, but please enlighten me if you think I'm wrong.
We know that yields are being reported as low, and we also know that it's only sensible for ATI to want to reduce the number of 4-series cards at this time.
What we don't know is how much internal politics and bull is being passed around.
I also don't believe the board partners are in a position where they can demand what price they will pay for the chips. If they are, then changing your stock policy can't make a difference as far as I can see.

Mactronix
 

jennyh

Splendid
It kind of makes sense. Nvidia are losing money on every card they sell, so why make more? Just finish off what they have and hope Fermi is out the door fast, that is their plan.

http://channel.hexus.net/content/item.php?item=20566

More confirmation. AMD say they are not having issues, Nvidia issue a 'no comment'.

This all adds up. Why would Nvidia keep selling cards at a loss? They are handing the market to ATI, but the alternative is to take heavy losses on G200 *and* the people who buy those wouldn't buy one of the (possibly cheaper to produce) Fermi models in a few months.

EDIT - I don't believe Nvidia are dying out or anything like that, but I do believe they have decided not to compete with ATI's 40nm because... well, it's no competition, is it? ATI should now own the mid-to-high-end market for the next 4 months at least.
 
Yes, I know the general trend and what's been reported about Nvidia makes sense. I should have said, I was referring to the Xbit article when I posted. I just can't see how you can have it both ways. In the past, with the 8800GTX etc., I could have believed it as a way of making the price go up because the part was rare.
With AMD and its make-'em-cheap, sell-in-volume practice, I just can't see it being right. Sure, they are in a position to do that, but at the price point they introduced the cards at, I don't believe they intend to wring every last penny out of us.

Mactronix
 

4745454b

Titan
Moderator
Keep in mind I'm a pro AMD guy.

With that much doom and gloom in the articles, I'm sure this is a fanboi post. I would say the odds of things being that bad for Nvidia are low, aren't they? I don't see why they would just drop out of the mid/high-end market. Isn't there some way for them to cut costs? I'm sure John_ will come in and explain how, after he was $uck!ng JHH, he saw the plans for Nvidia's success.

It will be funny if this is true. They had the large die size problem with G200 and seemingly didn't take any real steps to change it. Now, with G300 coming, it's possible the same thing will happen again. I know chips are designed years before we hear of them; I hope Nvidia is working on something smaller.
 

jennyh

Splendid
Basically put, AMD can make two GPUs for the cost of one of Nvidia's, and it's been that way for the past year.

The cost of Juniper compared to, say, the GTX275 it will compete with on benchmarks just isn't funny. ATI can probably make 4-5 of these for every GTX275.

It just isn't sensible for Nvidia to even fight against that. Think about it: ATI are making money on these GPUs, so they can afford to drop prices. Nvidia could keep making their high-end 200s, but why? ATI would just squeeze them more and more by lowering prices further.

If Nvidia are taking a $10 hit on each GPU sold, that's already bad. What if ATI decide to drop prices further? It's unsustainable. Nvidia have given up because both the 4000 and 5000 series cards are so much cheaper to make.
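That 4-5x ratio can be sanity-checked with a rough sketch. Everything here is assumed for illustration: the wafer price and defect density are pure guesses (TSMC pricing isn't public), and the die areas (~470 mm² for the 55nm GT200b, ~166 mm² for Juniper) are approximate figures from press coverage:

```python
import math

WAFER_COST = 5000       # USD per 300 mm wafer -- assumed, not a real TSMC quote
DEFECT_DENSITY = 0.002  # defects per mm^2 -- assumed

def cost_per_good_die(die_mm2, wafer_d_mm=300):
    # Gross dies per wafer (area ratio minus edge loss), scaled by Poisson yield
    r = wafer_d_mm / 2
    gross = int(math.pi * r**2 / die_mm2 - math.pi * wafer_d_mm / math.sqrt(2 * die_mm2))
    good = gross * math.exp(-DEFECT_DENSITY * die_mm2)
    return WAFER_COST / good

gt200b = cost_per_good_die(470)   # GTX275/285-class die
juniper = cost_per_good_die(166)  # upcoming mid-range die
print(f"GT200b ~${gt200b:.0f}/good die, Juniper ~${juniper:.0f}/good die, "
      f"ratio {gt200b / juniper:.1f}x")
```

Under these guesses the big die comes out roughly 5-6x more expensive per good die, which is the same shape as jennyh's 4-5x claim, and the gap widens further if the defect density is higher than assumed.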
 

Amiga500

Distinguished
Jul 3, 2007
631
0
18,980


Nvidia's problem is simple:

For every transistor they pack in dedicated to CUDA, that is one transistor more they have to pay for to compete with ATi.


Basically, they are running a crap technical model at the minute. No doubt PR is dictating to engineering how to build the GPUs, and no-one has the balls to tell dear leader he is talking out his ar$e.