
Talking Heads: VGA Manager Edition, September 2010

How are Intel, AMD, and Nvidia shaping up for Q4'10 and 2011? Fourteen R&D insiders talk to us about the future of discrete graphics. If you’re curious about the business side of things, we have their thoughts on Sandy Bridge, Llano, and the upcoming Fermis.

Tom’s Hardware generally focuses on its bread and butter: giving you access to reviews of the latest PC-oriented hardware to help you make the right buying decisions. Today, however, we’re introducing a new feature that we’d like to publish on a more regular basis. Instead of going hands-on with graphics cards, CPUs, and motherboards, we’re presenting business-oriented content emphasizing the hot issues in technology.

It’s funny: because we spend so much time going hands-on with hardware, the same analyst firms that everyone likes to cite are the ones who call us asking about this processor generation versus that one, how AMD’s graphics match up to Nvidia’s, and so on. We’ll give those companies our assessment, and you’ll often read about it later. The difference is that business analysts are often looking at larger trends from a quarter-to-quarter or year-to-year perspective. Compare this to the reviewer’s job: comparing one product to the next or last big technology to run through the labs. For example, IT administrators want to know if AMD’s new Opteron offerings provide better value than Nehalem-based Xeons. Meanwhile, business-focused readers want to know where, and to what degree, AMD can retake server market share.

So, on an early morning in June, Chris Angelini and I started tossing ideas back and forth, trying to figure out how to bring business content to Tom’s Hardware (he had already embarked on this quest after building a Xeon 5600-based machine). One of the ideas that I suggested stuck: start an industry dialogue about some of the most prevalent business trends and strategies.

However, we don’t want to just talk about our own opinions. That is an old and tired approach, and our insight isn’t necessarily straight from the horse’s mouth. Instead, our idea was to bring together leading industry figures directly involved in R&D, as well as the early product deployment process, to talk about hot topics.

We should make clear that these are not marketing representatives sent to evangelize certain agendas (and if they are, they’re pulling double duty as product managers). The primary duty of public relations is to get good press, and it is sometimes hard to get those folks out of that mode without resorting to alcohol (Chris and I agree that doing so would probably be unwise, anyway).

We specifically chose to talk to people in charge of the technical aspect of their company’s graphics business. Depending on the organization, we carefully selected GMs, VPs, heads of departments, and R&D engineers. It is important to note that these are people from headquarters, meaning they bring us their ideas from a global perspective.

There were no barriers in our quest. If we needed to use another language to find the people we wanted, we used it (that’s the beauty of working for a global media company). Distance did not deter us, and if you saw our international phone bill, you’d understand our dedication to this project. No stone was left unturned to find the people we needed. To our participants out there, we extend our most gracious thanks and sincerest apologies for the constant pestering.

Ultimately, we see this as a way to foster a better sense of industry dialogue, answer a lot of your questions, end a lot of speculation, and hit you with surprising insights into current and upcoming industry trends.

Comments
  • Anonymous, September 3, 2010 6:40 AM (+21)
    Who knows? By 2020, AMD will have purchased Nvidia and renamed GeForce to GeRadeon... and will be talking about integrating RAM, processor, graphics, and hard drive into a single chip named "MegaFusion"... But there will still be Apple, selling Apple TV without 1080p support, and yeah, free bumpers for your iPods (which won't play songs if touched by hands!!!)
  • Kelavarus, September 3, 2010 6:50 AM (+20)
    That's kind of interesting. The guy talked about Nvidia taking chunks out of AMD's entrenched position this holiday with new Fermi offerings, but seemed to miss the fact that, most likely, AMD will already be rolling out its new line by the holiday. Won't that have any effect on Nvidia?
  • TheStealthyOne, September 3, 2010 6:58 AM (+2)
    I've been waiting for this article! Yes :D 
  • Anonymous, September 3, 2010 7:50 AM (+23)
    The problem I see is that while AMD, Intel, and Nvidia are all releasing impressive hardware, no company is making impressive software to take advantage of it. In the simplest example, we all have gigs of video RAM sitting around now, so why has no one written a program that allows it to be used when not doing 3D work, such as a RAM drive or pooling it in with system RAM? Similarly with GPUs: we were promised years ago that PhysX would lead to amazing advances in AI and game realism, yet it simply hasn't appeared.

    The anger that people showed towards Vista and its horrible bloat should be directed at all major software companies. None of them have achieved anything worthwhile in a very long time.
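As an aside, the VRAM-as-storage idea above is straightforward to prototype in user space with a GPGPU API. Below is a minimal sketch, assuming a CUDA-capable card and the CUDA runtime; the vram_store_* helper names are hypothetical, and a real RAM drive would need a kernel-mode block driver rather than these user-space copies.

    // Hypothetical sketch: using spare VRAM as a simple user-space block store.
    #include <cuda_runtime.h>
    #include <stdio.h>

    #define BLOCK_SIZE 4096

    static char *vram_pool;    // the backing store lives in GPU memory
    static size_t pool_blocks;

    // Reserve `blocks` * 4 KB of otherwise idle VRAM.
    int vram_store_init(size_t blocks) {
        pool_blocks = blocks;
        return cudaMalloc((void **)&vram_pool, blocks * BLOCK_SIZE) == cudaSuccess ? 0 : -1;
    }

    // Copy one 4 KB block from host memory into VRAM.
    int vram_store_write(size_t block, const void *src) {
        if (block >= pool_blocks) return -1;
        return cudaMemcpy(vram_pool + block * BLOCK_SIZE, src, BLOCK_SIZE,
                          cudaMemcpyHostToDevice) == cudaSuccess ? 0 : -1;
    }

    // Copy one 4 KB block from VRAM back to host memory.
    int vram_store_read(size_t block, void *dst) {
        if (block >= pool_blocks) return -1;
        return cudaMemcpy(dst, vram_pool + block * BLOCK_SIZE, BLOCK_SIZE,
                          cudaMemcpyDeviceToHost) == cudaSuccess ? 0 : -1;
    }

    int main(void) {
        char out[BLOCK_SIZE] = "cached in video memory";
        char in[BLOCK_SIZE] = {0};
        if (vram_store_init(1024) != 0) return 1;  // 4 MB demo pool
        vram_store_write(0, out);
        vram_store_read(0, in);
        printf("%s\n", in);                        // data round-trips through VRAM
        cudaFree(vram_pool);
        return 0;
    }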
  • corser, September 3, 2010 9:21 AM (+3)
    I do not think that putting an IGP on the processor die and connecting the two means that discrete graphics vendors are dead. Some people will have graphics requirements that overwhelm the IGP and will connect an 'EGP' (External Graphics Processor). Uhmmmm... maybe I created a whole new acronym.

    Since the start of that idea, I have believed that an IGP on the processor die could serve to offload math operations and complex transformations from the CPU, freeing CPU cycles for what the CPU is intended to do.

    Many years ago, Apple did something similar with its Quadra models, which sported a dedicated DSP to offload some tasks from the processor.

    My personal view on all this hype is that we are moving to a different computing model: from a point where all the work was directed at the CPU, we are taking small steps so that specialized processors around the CPU do part of its work (think of the first fixed-function graphics accelerators, sound cards that offload the CPU, PhysX, and others).

    From a standalone CPU -> SMP -> A-SMP (Asymmetric SMP).
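For what it’s worth, the offload model corser describes is exactly what GPGPU APIs already expose. Below is a minimal sketch, assuming an Nvidia card and the CUDA toolkit: a bulk multiply-add is moved off the CPU, which stays free for other work while the kernel runs.

    // Minimal sketch of CPU-to-GPU offload: the GPU does a bulk
    // multiply-add so the CPU doesn't have to.
    #include <cuda_runtime.h>
    #include <stdio.h>
    #include <stdlib.h>

    __global__ void muladd(const float *a, const float *b, float *c, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n)
            c[i] = a[i] * b[i] + c[i];   // the math the CPU no longer does
    }

    int main(void) {
        const int n = 1 << 20;
        const size_t bytes = n * sizeof(float);
        float *ha = (float *)malloc(bytes);
        float *hb = (float *)malloc(bytes);
        float *hc = (float *)malloc(bytes);
        for (int i = 0; i < n; i++) { ha[i] = 1.0f; hb[i] = 2.0f; hc[i] = 3.0f; }

        float *da, *db, *dc;
        cudaMalloc((void **)&da, bytes);
        cudaMalloc((void **)&db, bytes);
        cudaMalloc((void **)&dc, bytes);
        cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
        cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);
        cudaMemcpy(dc, hc, bytes, cudaMemcpyHostToDevice);

        // The launch is asynchronous: the CPU could do other work here.
        muladd<<<(n + 255) / 256, 256>>>(da, db, dc, n);

        // This copy waits for the kernel to finish before reading results.
        cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);
        printf("c[0] = %f\n", hc[0]);  // expect 5.0

        cudaFree(da); cudaFree(db); cudaFree(dc);
        free(ha); free(hb); free(hc);
        return 0;
    }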
  • silky salamandr, September 3, 2010 9:30 AM (+1)
    I agree with Scort. We have all this fire sitting on our desks and it means nothing if there's no software to utilize it. While I love the advancement in technology, I really would like devs to catch up with the hardware side of things. I think everybody is going crazy adding more cores and having an arms race as a marketing tick mark, but no devs are stepping up to write for it. We have all invested so much money into what we love, but very few of us (not me at all) can actually code. With that being said, most of our machines are "held hostage" in what they can and cannot do.

    But great read.
  • corser, September 3, 2010 9:55 AM (+2)
    Hardware has to be around a good while before software starts to take advantage of it. It has been like this since the start of computing.
  • Darkerson, September 3, 2010 10:17 AM (+7)
    Very informative article. I'm hoping to see more stuff like this down the line. Keep up the good work!
  • jestersage, September 3, 2010 10:30 AM (+1)
    Awesome start for a very ambitious series. I hope we get more insights, and soon.

    I agree with Snort and silky salamandr: we are held back by developments on the software side. Maybe that is because developers need to take backwards compatibility into consideration. Just take games, for example: developers would like to keep the minimum and even recommended specs down so that they can get more customers to buy. So we see games made tough for the top-end hardware but, through tweaks and reduced detail, playable on a six-year-old Pentium 4 with a 128 MB AGP card.

    From a business consumer standpoint, and given that I work for a company that still uses tens of thousands of Pentium 4s for productivity purposes, I figure that adoption of the combined GPU/CPU in the business space will not happen until 5-7 years AFTER launch. There is simply no need for an i3 if a Core 2-derivative Pentium Dual-Core or Athlon X2 can still handle spreadsheets, word processing, email, research, etc. Pricing will definitely play into the timelines as the technology ages (or matures), but both companies will have to get the money to pay for all that R&D from somewhere, right?
  • smile9999, September 3, 2010 12:01 PM (-4)
    Great article, btw. Out of all this, what I got is that the hybrid CPU/GPU model seems more of a gimmick than an actual game changer. The low-end market has a lot of players; IGPs are a major force in that field and they are great at it, and if that wasn't enough, there are still the Nvidia and ATI offerings. So I don't think it will really shake the water as much as they predict.
  • smile9999, September 3, 2010 12:03 PM (-1)
    (Replying to my own comment above:) I meant Nvidia and ATI dedicated low-end GPUs.
  • Onus, September 3, 2010 12:54 PM (+2)
    I also agree with scort. Part of the cause, however, I believe, is Nvidia's unwillingness to coexist. Whether it's their x86/chipset issues, or disallowing PhysX (minus 3rd-party hacks) if there's an AMD card in the system, I suspect developers are averse to putting a lot of work into something that may never be standardized. I think GPGPU could give us the AI needed for vastly more immersive experiences, but it won't happen until the hardware vendors agree on (and/or accept) some standards.
  • NotYetRated, September 3, 2010 1:22 PM (+2)
    Death of the graphics card business? No. Way. Not for a long, long time. The sound card only began to die because modern processors are in no way fazed by the hit sound processing takes on the system; sound cards have become almost obsolete for everyone except the audio enthusiast. (Though I am no enthusiast and still have an X-Fi...) We are nowhere near ready for CPUs to handle graphics without taking a major hit. In fact, graphics cards themselves cannot keep up with the pace that software is pushing them to. I think it will be a long, long time before we begin to see the phase-out of dedicated graphics cards, at least for the gamers/engineers/creative professionals out there... if we ever even do see it.
  • hundredislandsboy, September 3, 2010 1:43 PM (-2)
    If I can still get the same gaming experience, the same visual quality with the same smooth, fast framerates, from a CPU/GPU, then go ahead and pour huge investments into R&D to make this happen. Why? Because it's a win-win for everyone: users pay less, cases run cooler, electric bills get lower, etc., etc...
  • Aerobernardo, September 3, 2010 2:08 PM (+2)
    I have two ideas for this kind of topic:

    1 - Tom's could start a poll on which questions we want asked.
    2 - I kinda hoped to hear something about what's next. We know about NDAs, but it would be nice to hear whether AMD has any intention of adopting new features like a proprietary 3D solution, or what both AMD and Nvidia think a next-generation VGA should do (or which architecture it should have) to be successful from today's perspective.
  • LORD_ORION, September 3, 2010 2:43 PM (0)
    If Nvidia can bring out the best $150 card by Black Friday, I think they will be fine.

    Basically, they need something that blows the 5770 out of the water and costs the same or less... because AMD will lower the 5770 prices when that happens, so it can't perform on par.
  • rocky1234, September 3, 2010 2:44 PM (0)
    I do not think they will be getting rid of standalone graphics cards any time soon, because losing standalone cards altogether would set performance back at least 10 to 12 years. Maybe by 2017 they will have tech that can do it, but right now it is not going to happen. Well, it could, I guess, if all of the save-the-planet people have their way; look at what the automakers are starting to do, rolling out four-bangers in mid-sized cars (GM just said that is what they plan to do). So I guess if enough pressure is applied, anything can happen. Which is too bad, because all it does is affect the consumers: we get less and pay more for it.
  • Moores Law, September 3, 2010 2:56 PM (+4)
    Nvidia better come up with something to counter the new AMD cards soon or this is going to get even uglier.
  • porksmuggler, September 3, 2010 2:57 PM (+1)
    Very good premise for an article, and an excellent write-up for the companies involved. I haven't read any articles from Andrew Ku before; is Tom's keeping the one guy who can spell hidden away? Beware, he may get criticism from the TL;DR crowd, though. Those were great questions, but let's be honest (*cough* NDAs), the answers were generic PR we already know. Without Ku's analysis, this series would sink. Thanks for putting the legwork into this.

    Oh, and the 100% pie chart for question 1 is too funny.
  • kilthas_th, September 3, 2010 3:13 PM (+3)
    High phone bill? If only there were some way to have our conversations or even conferences routed effectively and cheaply over the internet ;) 