
Question 1: Nvidia’s Low- To Mid-Range Products

Talking Heads: VGA Manager Edition, September 2010
By Andrew Ku

Question: AMD is currently benefiting from a visible gap in Nvidia’s discrete product portfolio. At the moment, it lacks Fermi-based DX11 products below the $200 GeForce GTX 460. In the next three months, do you expect to see Nvidia address the low-end to mid-range desktop graphics space with derivative DX11 products?

  • The Fermi-based DX11 low/mid-range products from Nvidia should be ready soon. Nvidia will definitely need to launch new products to get back some market share.
  • For channel [sales], maybe GF108 needs to wait until Q4.
  • The GeForce GTX 460 just launched last month and received exceptional feedback from customers and the media. Nvidia is working at full speed to roll out the rest of its DX11 family. As you may know, the GTS 450 is going to launch in September, and other new SKUs like GF108 are following very shortly. We will see a much better product line-up from Nvidia in the next two quarters.
  • DX11 is becoming increasingly important to users with the uptake of Windows 7 and new DX11 applications. We anticipate Nvidia rolling DX11 out into the mainstream space in order to remain competitive, as it is already available with ATI-based products.
  • Detailed visuals for gaming and home theater are a must for users. After the success of the GeForce GTX 460, Nvidia is well positioned to bring new DX11 products to the mid-range segments in Q4.
  • Nvidia has done a great job designing the GF104 used in the GTX 460. They've shown the ability to take the DX11 Fermi architecture and develop a more power-efficient variant of a performance card. I think we'll see new products in the mid-range of the lineup for sure. The low-end may or may not show up, which in my opinion is not a big deal given the performance you need to truly take advantage of a DX11 title.

One hundred percent of our respondents expect to see Fermi-based entry-level and mid-range offerings before the end of the year. We never expected Nvidia to retreat from this market space. It is even more important to note that AMD-exclusive vendors are in agreement on when Nvidia is coming to market, sometimes right down to the month. One has to wonder if everyone just happens to be feeling the ripples in the water. Then again, a lot of money can be made in the lower- and mid-range group, just by way of volume.

We should make it clear that our question was always about time-to-market. Was Nvidia going to try to fight back this quarter or in early 2011? We know now that it is going to happen in the first half of Q4. How much market share Nvidia retakes depends on the performance and pricing of GF108, said to follow the already-popular GF104 and the soon-to-come GF106. We’ll cover that variable in-depth once we get hardware into the labs. No doubt this will be an interesting fight, especially with talk of an AMD-based retaliation before the end of 2010.

The fight for top spot is constantly up in the air. When you’re “King of the Hill,” it often doesn’t matter if your lower-end products perform subpar versus the competition. With all the press generally focused on performance, the two companies often enjoy good sales from buzz generated by flagship products. At least, this was the thinking in previous years. This doesn’t really seem to be true anymore, primarily because we are now seeing more and more educated buyers.

Another pressure at play here is the effect of economic factors on discretionary spending. In this context, customers may conceivably care less and less about the top spot. Whereas an enthusiast might normally purchase a GeForce GTX 480 or a Radeon HD 5870, they might instead purchase something in the sub-$200 range. Some retailers have noticed this trend, and we imagine it translates into the quarterly sales reports.

At the moment, it certainly seems that AMD is better situated to address the economic climate. This is not to suggest AMD planned for these pressures and Nvidia didn’t. Strategic time-to-market and product plans often occur years before economic conditions are known. We won’t rule anything out, but what we have seen is AMD’s market approach. It attacked the mid- to upper-range space first, and then branched out to the highest and lowest spaces through its Sweet Spot strategy.

Compare this to Nvidia’s monolithic GPU approach: introducing the GeForce GTX 480, and then working its way down the ladder. The company is committed to a top-down plan. As a result, it is 6-12 months “late” in bringing a competitive DX11 architecture (Fermi) to the mid-range and low-end spaces. Nvidia’s current position means it needs to step up its game if it really wants to retake market share. The pricing of its mid-level and low-end products might actually be more important than the performance they deliver, assuming they can hold their own against AMD’s existing lineup. We can only speculate how well they will sell, since Q4 earnings won’t be out until 2011.

With some serious price slashing, Nvidia could really hack chunks of the market from an already-entrenched AMD during the holiday buying season. At a recent Microsoft partner conference, we were informed that the average age of the business PC is the highest it has been in a long time (now at 4.4 years). Sure, business computers have become more powerful and are expected to have a longer lifespan. But the current economic conditions seem to be amplifying this effect. We have already seen it in the surge in low-end server sales noted by early quarterly reports.

It would be logical to deduce that the consumer market is likewise affected. The fact that both Newegg and Best Buy are still selling AGP cards at a semi-decent clip seems to reinforce our point. Compared to Nvidia, AMD’s product offerings are much stronger from $50 to $125. The Radeon HD 3650 is still a very compelling value here (~$60), performing reasonably well even against the 3850, 4650, and 4570 (~$70 to $100). Whereas AMD took the time early in the Radeon HD’s lifespan to invest in bridge chips, Nvidia took a different approach, and the ramifications of that are showing today.

One might argue that this isn’t a problem, considering there aren’t really any DX11 parts for AGP (nor are there plans for them, as far as we are aware). In fact, 2011 may be the year we finally see AGP disappear from the channel. However, if demand for the interface persists through the end of 2011, Nvidia’s current GeForce 6200 (~$40) isn’t going to cut it. The only place the company has AMD beat is pricing on the 6200 and 5200, already pegged at rock-bottom. Considering all new AGP purchases are likely to be upgrades, there isn’t much to be seen in unit sales. Oddly enough, the low-end market space is an area the company historically handles well (e.g., the 7200 GS and 8400 GS at ~$30). Yet, one person in the survey suggested the lowest end of Nvidia’s DX11 graphics lineup might never see the light of day. This makes us wonder if perhaps Nvidia is anticipating the strength of Sandy Bridge and Llano, and is making space for those products.

On the PCI Express side, one generation back, Nvidia’s DirectX 10 offerings still fall short in terms of performance given their current pricing strategy, which is why the Q4 launch is so important. However, Fermi-based DX11 cards under the GeForce GTX 460 really need to launch at an attractive price if they’re to hit the ground running in 2011. AMD is already shifting its argument towards Nvidia’s heat and power consumption, but we’re not sure how successful it’ll be now that the GeForce GTX 460 has shown that Fermi doesn’t have to be hot or loud. Besides, the average consumer doesn’t think about heat and fan speed when making a graphics card purchase. The only time power really becomes an issue is when the PSU’s limits are considered. Unless AMD shows us evidence that its cards can save some serious money versus the new Fermi-based DX11 competition, it’s probably not going to drive that point home. Plus, based on our results, it’s not even quite clear Nvidia’s architecture is always the most power-hungry. As it stands, we are already starting to see price drops from the GeForce GT 240 through the GTX 470. Make no mistake: Nvidia is wising up for the Q4'10 and 2011 battle.

Comments
  • 21 Hide
    Anonymous , September 3, 2010 6:40 AM
    Who knows, By 2020, AMD would have purchased Nvidea and and renamed Geforcce to GeRadeon... And talk about considering integrating RAM, Processor, Graphics and Hard drive in Single Chip and name it "MegaFusion"... But there will still be Apple selling Apple TV without 1080p support, and yeah, free bumpers for your Ipods( which wont play songs if touched by hands !!!)
  • 20 Hide
    Kelavarus , September 3, 2010 6:50 AM
    That's kind of interesting. The guy talked about Nvidia taking chunks out of AMD's entrenched position this holiday with new Fermi offerings, but seemed to miss on the fact that most likely, by the holiday, AMD is going to already be starting to roll out their new line. Won't that have any effect on Nvidia?
  • 2 Hide
    TheStealthyOne , September 3, 2010 6:58 AM
    I've been waiting for this article! Yes :D 
  • 23 Hide
    Anonymous , September 3, 2010 7:50 AM
    The problem I see is while AMD, Intel, and Nvidia are all releasing impressive hardware, no company is making impressive software to take advantage of it. In the simplest example, we all have gigs of video RAM sitting around now, so why has no one written a program which allows it to be used when not doing 3d work, such as a RAM drive or pooling it in with system RAM? Similarly with GPUs, we were promised years ago that Physx would lead to amazing advances in AI and game realism, yet it simply hasn't appeared.

    The anger that people showed towards Vista and its horrible bloat should be directed at all major software companies. None of them have achieved anything worthwhile in a very long time.
  • 3 Hide
    corser , September 3, 2010 9:21 AM
    I do not think that including an IGP on the processor die and connecting them means that discrete graphics vendors are dead. Some people will have graphics requirements that overwhelm the IGP and will connect an 'EGP' (External Graphics Processor). Uhmmmm... maybe I created a whole new acronym.

    Since the start of that idea, I have believed that an IGP on the processor die could serve to offload math operations and complex transformations from the CPU to the IGP, freeing CPU cycles for what it is intended to do.

    Many years ago, Apple did something similar with their Quadra models, which sported a dedicated DSP to offload some tasks from the processor.

    My personal view on all this hype is that we're moving to a different computing model: from a point where all the work was directed to the CPU, in small steps toward specialized processors around the CPU doing part of its work (think of the first fixed-function graphics accelerators, sound cards that offload the CPU, PhysX, and others).

    From a standalone CPU -> SMP -> A-SMP (Asymmetric SMP).
  • 1 Hide
    silky salamandr , September 3, 2010 9:30 AM
    I agree with Scort. We have all this firepower sitting on our desks and it means nothing if there's no software to utilize it. While I love the advancement in technology, I really would like devs to catch up with the hardware side of things. I think everybody is going crazy adding more cores and having an arms race as a marketing tick mark, but there are no devs stepping up to write for it. We all have invested so much money into what we love, but very few of us (not me at all) can actually code. With that being said, most of our machines are "held hostage" in what they can and cannot do.

    But great read.
  • 2 Hide
    corser , September 3, 2010 9:55 AM
    Hardware always arrives well before software starts to take advantage of it. It has been like this since the start of computing.
  • 7 Hide
    Darkerson , September 3, 2010 10:17 AM
    Very informative article. I'm hoping to see more stuff like this down the line. Keep up the good work!
  • 1 Hide
    jestersage , September 3, 2010 10:30 AM
    Awesome start for a very ambitious series. I hope we get more insights, and soon.

    I agree with Snort and silky salamandr, we are held back by developments on the software side. Maybe because developers need to take backwards compatibility into consideration. Just take games for example: developers would like to keep the minimum and even recommended specs down so that they can get more customers to buy. So we see games made tough for the top-end hardware but, thru tweaks and reduced detail, can be played on a 6-year old Pentium 4 with a 128mb AGP card.

    From a business consumer standpoint, and the fact that I work for a company that still uses tens of thousands of Pentium 4s for productivity related purposes, I figure that adoption of the GPU/CPU in the business space will not happen for another 5-7 years AFTER they launch. There is simply no need for an i3 if a Core2 derivative Pentium Dual Core or Athlon X2 could still do spreadsheet, word processing, email, research, etc. Pricing will definitely play into the timelines as the technology ages (or matures) but both companies will have to get money to pay for all that R&D from somewhere, right?
  • -4 Hide
    smile9999 , September 3, 2010 12:01 PM
    Great article, btw. Out of all this, what I got is that the hybrid CPU/GPU model seems more of a gimmick than an actual game changer. The low-end market has a lot of players; IGPs are a major player in that field and they are great at it, and if that wasn't enough there are still the Nvidia and ATI offerings. So I don't think it will really shake the water as much as they predict.
  • -1 Hide
    smile9999 , September 3, 2010 12:03 PM
    smile9999: great article btw, out of all this what I got seems that the hyprid model of cpu/gpu seems more of a gimmick that an actual game changer, the low end market has alot of players, IGPs are a major player in that field and they are great at it and if that wasnt enough there still is nvidia and ati offerings, so I dont think it will really shake the water much as they predict.


    I meant nvidia and ati dedicated low end GPUs
  • 2 Hide
    Onus , September 3, 2010 12:54 PM
    I also agree with scort. Part of the cause, however, I believe is nVidia's unwillingness to coexist. Whether it's their x86/chipset issues, or disallowing PhysX (minus 3rd party hacks) if there's an AMD card in the system, I suspect developers are averse to putting a lot of work into something that may not ever be standardized. I think GPGPU could give us the AI needed for vastly more immersive experiences, but it won't happen until the hardware vendors agree on (and/or accept) some standards.
  • 2 Hide
    NotYetRated , September 3, 2010 1:22 PM
    Death of the graphics card business? No. Way. Not for a long long time. The sound card only began to die because modern processors are in no way fazed by the hit sound takes on the system. Sound cards have become almost obsolete for most except for the audio enthusiast. (Though, I am no enthusiast and still have an X-Fi...) We are nowhere near ready for CPUs to be able to handle graphics without taking very much of a hit. In fact, graphics cards themselves cannot keep up with the pace that software is pushing them to. I think it will be a long, long time before we begin to see the phase-out of dedicated graphics cards. At least for the gamers/engineers/creative professionals out there.... If we ever even do see the phase-out.
  • -2 Hide
    hundredislandsboy , September 3, 2010 1:43 PM
    If I can still get the same gaming experience, the same visual quality with the same smooth, fast framerates from a CPU/GPU, then go ahead and pour huge investments into R&D to make this happen. Why? Because it's a win-win for everyone: users pay less, cases get less heat, electric bills get lower, etc, etc...
  • 2 Hide
    Aerobernardo , September 3, 2010 2:08 PM
    I have two ideas for this kind of topic:

    1 - Tom's starts a poll on which questions we want to ask.
    2 - I kinda hoped to hear something about what's next. We know about NDAs, but it would be nice to hear whether AMD intends to adopt new features like a proprietary 3D solution, or what both AMD and Nvidia think a next-generation VGA should do (or which architecture it should have) to be successful from today's perspective.
  • 0 Hide
    LORD_ORION , September 3, 2010 2:43 PM
    If Nvidia can bring out the best $150 card by Black Friday, I think they will be fine.

    Basically, they need something that blows the 5770 out of the water and costs the same or less... because AMD will lower the 5770 prices when that happens, so it can't perform on par.
  • 0 Hide
    rocky1234 , September 3, 2010 2:44 PM
    I do not think they will be getting rid of standalone graphics cards any time soon. The reason is that if they did, it would set performance back at least 10 to 12 years. Maybe by 2017 they will have tech that can do it, but right now it is not going to happen. Well, it could, I guess, if all of the save-the-planet people have their way; look at what the automakers are starting to do, rolling out four-bangers in mid-sized cars. GM just said that is what they plan to do, so I guess if enough pressure is applied, anything can happen. Which is too bad, because all it does is affect the consumers: we get less and pay more for that less.
  • 4 Hide
    Moores Law , September 3, 2010 2:56 PM
    Nvidia better come up with something to counter the new AMD cards soon or this is going to get even uglier.
  • 1 Hide
    porksmuggler , September 3, 2010 2:57 PM
    Very good premise for an article, and an excellent write-up on the companies involved. I haven't read any articles from Andrew Ku before; is Tom's keeping the one guy who can spell hidden away? Beware, he may get criticism from the TL;DR crowd though. Those were great questions, but let's be honest *cough NDAs*, the answers were generic PR we already know. Without Ku's analysis, this series would sink. Thanks for putting the legwork into this.

    oh, and the 100% pie chart for question 1 is too funny.
  • 3 Hide
    kilthas_th , September 3, 2010 3:13 PM
    High phone bill? If only there were some way to have our conversations or even conferences routed effectively and cheaply over the internet ;) 