
Nvidia Tech Demonstration Reveals CPU Focus

By - Source: Tom's Hardware US | 24 comments

This week Nvidia held a press event here at Computex to show off its technology and talk about where it is headed next.

The interesting thing, though, is that the company spent little to no time on its GPU tech. Nvidia CEO Jen-hsun Huang said hardly a word about any upcoming GPUs in the company's pipeline.

What Nvidia did spend most of its time talking about, though, is its CUDA technology, which focuses entirely on general-purpose, highly parallel computing. Much of what Nvidia has talked about over the last year has centered on CUDA, and the company has received some industry criticism recently for not putting enough emphasis on its GPUs.
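
To make that concrete: CUDA programs are written in a C dialect and launched across thousands of GPU threads at once. The short program below is our own minimal sketch, not anything shown at Nvidia's event (the kernel name, sizes, and values are arbitrary), illustrating the kind of general, data-parallel work CUDA targets: scaling and adding two arrays with one GPU thread per element.

    // saxpy.cu: computes y = a*x + y with one GPU thread per array element.
    // Minimal illustrative sketch; compile with nvcc.
    #include <cuda_runtime.h>
    #include <stdio.h>
    #include <stdlib.h>

    __global__ void saxpy(int n, float a, const float *x, float *y)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;  // unique global thread index
        if (i < n)                                      // guard the partial last block
            y[i] = a * x[i] + y[i];                     // no loop; parallelism replaces it
    }

    int main(void)
    {
        const int n = 1 << 20;                          // one million elements
        size_t bytes = n * sizeof(float);
        float *hx = (float *)malloc(bytes);
        float *hy = (float *)malloc(bytes);
        for (int i = 0; i < n; i++) { hx[i] = 1.0f; hy[i] = 2.0f; }

        float *dx, *dy;                                 // device copies of the arrays
        cudaMalloc(&dx, bytes);
        cudaMalloc(&dy, bytes);
        cudaMemcpy(dx, hx, bytes, cudaMemcpyHostToDevice);
        cudaMemcpy(dy, hy, bytes, cudaMemcpyHostToDevice);

        // Launch enough 256-thread blocks to cover all n elements.
        saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, dx, dy);
        cudaMemcpy(hy, dy, bytes, cudaMemcpyDeviceToHost);

        printf("y[0] = %f (expect 4.0)\n", hy[0]);      // 2.0*1.0 + 2.0
        cudaFree(dx); cudaFree(dy);
        free(hx); free(hy);
        return 0;
    }

Every element gets its own thread, and the hardware schedules those threads across however many cores the GPU has; that scaling story is the heart of Nvidia's CUDA pitch.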

In one press meeting, we even overheard several journalists saying that "Nvidia hasn't had a major GPU breakthrough since the 8800 GTX."

One thing's certain, Nvidia is putting a lot of emphasis on CUDA.

Despite this, Nvidia did make an attempt at showing off its 3D glasses technology. Unfortunately, the tech demo didn't quite work right, and many in the crowd were left wondering whether Nvidia even realized the demo had failed. No one spoke up; the audience instead offered hesitant applause.

Could this really mean that AMD and Intel should safeguard their respective markets a bit more aggressively? Absolutely.

This thread is closed for comments
  • -7
    Anonymous , June 2, 2009 4:15 PM
    Nvidia is good at badge engineering.


    Hey wait a minute! There is another company that was all into selling the same product under different names.

    Don't think that went too well for them.

    Maybe Nvidia needs to go Ch 11 so they can actually start to create new products.
  • 0
    JeanLuc , June 2, 2009 4:28 PM
    I'm glad at least one journalist pressed Nvidia on the GPU situation. When are Nvidia going to put their arses into gear and get a 300-series GPU onto the market?

    I'm guessing the 8-series fiasco, with the broken GPUs and being sued by every major distributor in the world, is taking its toll on them financially.
  • 3
    zodiacfml , June 2, 2009 4:45 PM
    Indeed, but it's the only way to grow. Very difficult for Nvidia, since Intel's Larrabee is based on the x86 architecture and AMD's solution is cheaper and simpler.
  • 8
    norbs , June 2, 2009 4:47 PM
    This article is kinda bad: it says CPU in the title, then starts talking about GPUs, and it has misspellings.

    I honestly thought nVidia was finally starting to get into the CPU market.
  • 2
    doomtomb , June 2, 2009 5:54 PM
    Quote:
    In one press meeting, we even overheard several journalists saying that "Nvidia hasn't had a major GPU breakthrough since the 8800 GTX."

    What do you call the GTX 280? Came out a year ago but it's still essentially the top offering from Nvidia (GTX 285 is basically the same thing).

    I do feel the impatience the journalists are feeling, though, and it is well known that Nvidia hasn't been doing much of anything in the GPU market. The only thing on the horizon is the GT-300 series, which will hopefully be released at the end of the year. That's still a pretty long time since the GTX 285/295 launched at the very beginning of the year.
  • 0
    antilycus , June 2, 2009 5:58 PM
    NVIDIA has to do something to compete against the big boys. Getting bought by the king bully Intel is NOT NOT NOT an option. If NVDA can put some good processors out to compete with the rest, at a desired price point, it has the ability to become a strong contender in the market and grow unbelievably large, putting Intel in the weak spot, which it totally deserves.
  • 2
    Anonymous , June 2, 2009 6:02 PM
    They've been very busy with the ION platform, which is just amazing if you look at the mobo: smaller than a human hand with everything on it!
  • 7
    sublifer , June 2, 2009 6:06 PM
    This is starting to piss me off... no, not Nvidia, this site, Tom's. Every time I look at it I have to reload the page because it didn't render completely or correctly, and it's getting irritating. Just now I read the article thinking it was all fine (after numerous reloads on other articles earlier) but then got to the comments section and it was all jacked up. Anyone else seeing this?
  • 4
    Dax corrin , June 2, 2009 7:40 PM
    Yeah, it's a mess here too. Not rendering correctly.
  • 3
    kschwarz88 , June 2, 2009 7:47 PM
    I thought it might have something to do with these damn ads on the side. Dunno
  • 4
    Vettedude , June 2, 2009 8:04 PM
    Ads? Firefox and AdBlock+ are my best friends. :D 
  • 0
    Vettedude , June 2, 2009 8:11 PM
    joeman42: Nvidia reminds me of Yahoo. Arrogantly insistent on a path in the face of contradicting truths, and destined to wither and eventually fail. They, not ATI, would have been the optimal merger partner if not for their CEO's ego. Worse, their two-front war against IBM and AMD on CPUs is as likely to succeed as Yahoo against Google and Microsoft (or Bush vs Iraq and Afghanistan).

    You mean Intel, right?

    I would say Nvidia has their head in their A$$, but their head is too big to fit. The 6 Series was my last Nvidia GPU for a while.
  • 3
    scryer_360 , June 2, 2009 9:15 PM
    Nvidia needs to break into the CPU market though. Face it, Intel will now be offering discrete graphics, AMD already bought ATI for discrete graphics, and IBM has had a hold on the CPU market for industrial and mobile applications for some time.

    Nvidia doesn't want to be the one company sitting back just doing GPUs when everyone else is doing both. The reason we aren't seeing so much in the way of GPU advancement from them, I think, is that they might be in the basement building a CPU to take on Intel and AMD. I know some of you may laugh at that, but think about it: if you were the only chipmaker doing JUST graphics, and not doing them well enough to be outstanding (really, ATI's offerings satisfy much of the market in ways Nvidia only sort of brushes up against), and now the biggest player in the CPU market walks into your backyard with a boomstick in hand (Intel and Larrabee), what do you do?
  • 0
    kakkoii , June 2, 2009 10:27 PM
    Yeah, Nvidia hasn't had a breakthrough since the 8800s. But the GT300 is going to be a MAJOR fucking breakthrough.

    Read these articles and you'll know why it's going to be a revolutionary GPU chip:
    http://brightsideofnews.com/news/2009/5/18/nvidia-geforce-gtx380-clocks-leak-out.aspx

    http://brightsideofnews.com/news/2009/5/16/nvidia-g(t)300-already-taped-out2c-a1-silicon-in-santa-clara.aspx

    http://brightsideofnews.com/news/2009/5/12/nvidias-gt300-is-smaller2c-faster-than-larrabee.aspx

    http://brightsideofnews.com/news/2009/5/5/gt300-to-feature-512-bit-interface---nvidia-set-to-continue-with-complicated-controllers.aspx

    http://brightsideofnews.com/news/2009/4/22/nvidias-gt300-specifications-revealed---its-a-cgpu!.aspx


    Switching from SIMD to MIMD on its cores is going to open up a whole load more performance. Not to mention double the cores of the GT200.
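
    To give a rough idea of why MIMD matters (my own toy kernel, nothing from those links): on today's SIMD-style hardware, the threads in a warp share a single instruction stream, so a data-dependent branch forces the warp to execute both sides, masking off the inactive threads each time. MIMD-capable cores could let each path run at full rate.

        // Toy CUDA kernel with a divergent branch. Under SIMD-style (SIMT)
        // execution, the warp serializes: it runs the sinf path with the
        // even-numbered threads masked off, then the cosf path with the
        // odd-numbered threads masked off.
        __global__ void divergent(float *out, int n)
        {
            int i = blockIdx.x * blockDim.x + threadIdx.x;
            if (i >= n) return;
            if (i & 1)                    // adjacent threads take different sides
                out[i] = sinf((float)i);  // odd-numbered threads
            else
                out[i] = cosf((float)i);  // even-numbered threads
        }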
  • 3
    starryman , June 3, 2009 12:51 AM
    Put the CPU, GPU, and 12GB of RAM onto a single die. Charge me $600. I'll buy two.
  • 0
    TheMan1214 , June 3, 2009 12:54 AM
    It's not a good thing to battle on two different industry fronts, both with strong competitors.
  • -4
    Tindytim , June 3, 2009 1:24 AM
    starryman: Put the CPU, GPU, and 12GB of RAM onto a single die. Charge me $600. I'll buy two.

    That's idiotic.

    It increases the price of a single purchase, makes customization difficult, and makes upgrading much more expensive and less worthwhile.

    That's why both Intel and Nvidia have idiotic ideas about these all-in-one solutions for things they aren't that great at. Intel makes the best-performing CPU, and Nvidia makes the best-performing GPU; I shouldn't have to decide whether I want one or the other, I should just get both and customize.
  • -5
    goose man , June 3, 2009 3:33 AM
    @Tindytim

    You may be right, but think of it like this:

    If every system had at least this kind of hardware (CPU+GPU+12GB), developers could do their work more efficiently, the code could be more streamlined and efficient, and they would not have to worry about how their program would run on Joe's machine that only has an integrated GPU with a slow CPU and 128 MB of RAM...

    In the end, software would be much more optimized, efficient and faster, and that would benefit us all. No more "Can it play Crysis?" question :-D
  • -1
    Tindytim , June 3, 2009 3:51 AM
    goose man: If every system had at least this kind of hardware (CPU+GPU+12GB), developers could do their work more efficiently, the code could be more streamlined and efficient

    A) Not everyone would jump on that ship. You're essentially making something like a game console (the original Xbox was off-the-shelf parts). And the fact of the matter is, even if you got everyone to buy this, you'd force people who don't need 12GB of RAM and huge processing power for writing school papers into spending money they currently wouldn't have to.

    goose man: they would not have to worry about how their program would run on Joe's machine that only has an integrated GPU with a slow CPU and 128 MB of RAM... In the end, software would be much more optimized, efficient and faster, and that would benefit us all. No more "Can it play Crysis?" question :-D

    That ruins the whole concept of being a hardware enthusiast.

    The fact of the matter is, under the current system I can spend more if I need more performance and less if I don't, rather than being a victim of someone else's idea of what ratio of GPU to CPU performance is best for every application.

    I pride myself on understanding and learning the performance of components, and there are many other people who feel the same way. Milking out the best performance is a hobby for many people, not to mention competitive gamers.

    That concept is why we no longer have the computer systems we had in the '80s.
  • -3
    zodiacfml , June 3, 2009 4:04 AM
    I agree, brother. AMD's solution, CPU+GPU, is not far from old CPUs that had integrated math co-processors.