Nvidia Tech Demonstration Reveals CUDA Focus
This week Nvidia held a press event here at Computex to show off its technology and talk about its future direction.
The interesting thing, though, is that the company spent little to no time on its GPU tech: Nvidia CEO Jen-hsun Huang spoke hardly a word about any upcoming GPUs in the company's pipeline.
What Nvidia did spend most of its time on, though, was its CUDA technology, which focuses entirely on general-purpose, highly parallel computing. Much of what Nvidia has been talking about over the last year has been CUDA, and the company has received some industry criticism recently for not putting enough emphasis on its GPUs.
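For context, CUDA lets developers run ordinary data-parallel C code on the GPU rather than just graphics work. As a rough illustration (our own minimal sketch, not anything demonstrated at the event), here is the classic SAXPY computation written as a CUDA kernel, where every array element gets its own GPU thread:

    #include <stdio.h>
    #include <stdlib.h>
    #include <cuda_runtime.h>

    // Each GPU thread computes one element: y[i] = a * x[i] + y[i] (SAXPY).
    __global__ void saxpy(int n, float a, const float *x, float *y)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n)
            y[i] = a * x[i] + y[i];
    }

    int main(void)
    {
        const int n = 1 << 20;
        const size_t bytes = n * sizeof(float);

        // Host buffers.
        float *hx = (float *)malloc(bytes);
        float *hy = (float *)malloc(bytes);
        for (int i = 0; i < n; ++i) { hx[i] = 1.0f; hy[i] = 2.0f; }

        // Device buffers; copy the inputs over to the GPU.
        float *dx, *dy;
        cudaMalloc((void **)&dx, bytes);
        cudaMalloc((void **)&dy, bytes);
        cudaMemcpy(dx, hx, bytes, cudaMemcpyHostToDevice);
        cudaMemcpy(dy, hy, bytes, cudaMemcpyHostToDevice);

        // Launch one thread per element: 4096 blocks of 256 threads each.
        saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, dx, dy);

        cudaMemcpy(hy, dy, bytes, cudaMemcpyDeviceToHost);
        printf("y[0] = %f\n", hy[0]);   // expect 4.000000 (2*1 + 2)

        cudaFree(dx); cudaFree(dy);
        free(hx); free(hy);
        return 0;
    }

The same pattern scales across thousands of threads, which is the "general, highly parallel computing" Nvidia keeps pitching.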
In a press meeting, we even overheard several journalists saying that "Nvidia hasn't had a major GPU breakthrough since the 8800 GTX."
One thing's certain: Nvidia is putting a lot of emphasis on CUDA.
Despite this, Nvidia did attempt to show off its 3D glasses technology. Unfortunately, the demo didn't quite work, and many in the crowd were left wondering whether Nvidia even realized it had failed. No one spoke up; the audience instead offered hesitant applause.
Could this really mean that AMD and Intel should safeguard their respective markets a bit more aggressively? Absolutely.
Hey, wait a minute! There was another company that was all about selling the same product under different names.
I don't think that went too well for them.
Maybe Nvidia needs to file for Chapter 11 so they can actually start creating new products.
I'm guessing the 8 series fiasco, with the broken GPUs and getting sued by every major distributor in the world, is taking its toll on them financially.
I honestly thought nVidia was finally starting to get into the CPU market.
What do you call the GTX 280? It came out a year ago, but it's still essentially Nvidia's top offering (the GTX 285 is basically the same thing).
I do feel the impatience the journalists are feeling, though, and it is well known that Nvidia hasn't been doing much of anything in the GPU market. The only thing on the horizon is the GT300 series, which will hopefully be released at the end of the year. That's still a pretty long wait since the GTX 285/295 released at the very beginning of the year.
You mean Intel, right?
I would say Nvidia has their head in their A$$, but their head is too big to fit. The 6 Series was my last Nvidia GPU for a while.
Nvidia doesn't want to be the one company sitting back just doing GPUs when everyone else is doing both. The reason we aren't seeing much in the way of GPU advancement from them, I think, is that they might be in the basement building a CPU to take on Intel and AMD. I know some of you may laugh at that, but think about it: if you were the only chipmaker doing JUST graphics, and not doing them well enough to be outstanding (really, ATI's offerings satisfy much of the market in ways Nvidia only sort of brushes up against), and now the biggest player in the CPU market walks into your backyard with a boomstick in hand (Intel and Larrabee), what do you do?
Read these articles and you'll see why the GT300 is going to be a revolutionary GPU chip:
http://brightsideofnews.com/news/2009/5/18/nvidia-geforce-gtx380-clocks-leak-out.aspx
http://brightsideofnews.com/news/2009/5/16/nvidia-g(t)300-already-taped-out2c-a1-silicon-in-santa-clara.aspx
http://brightsideofnews.com/news/2009/5/12/nvidias-gt300-is-smaller2c-faster-than-larrabee.aspx
http://brightsideofnews.com/news/2009/5/5/gt300-to-feature-512-bit-interface---nvidia-set-to-continue-with-complicated-controllers.aspx
http://brightsideofnews.com/news/2009/4/22/nvidias-gt300-specifications-revealed---its-a-cgpu!.aspx
Switching from SIMD to MIMD on its cores is going to open up a whole lot more performance, not to mention double the cores of the GT200.
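To see why that could matter, here's a rough sketch of the issue (my own illustration, not taken from the linked articles). On today's SIMD-style GPU cores, the 32 threads of a warp share a single instruction stream, so a data-dependent branch forces the hardware to run both paths back to back with half the lanes masked off each time; truly independent MIMD cores wouldn't have to serialize like that:

    // Hypothetical kernel showing warp divergence on SIMD hardware.
    __global__ void divergent(const int *flag, float *out, int n)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i >= n)
            return;

        if (flag[i] & 1)
            out[i] *= 2.0f;     // odd-flagged threads take this path...
        else
            out[i] += 1.0f;     // ...even-flagged threads take this one.
        // A SIMD warp executes BOTH branches one after the other,
        // masking off the inactive lanes; MIMD cores could run each
        // thread's own branch concurrently instead of serializing.
    }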
That's idiotic.
It increases the price of a single purchase, makes customization difficult, and makes upgrading much more expensive and less worthwhile.
That's why both Intel and Nvidia have idiotic ideas about these all-in-one solutions for things they aren't that great at. Intel makes the best-performing CPU and Nvidia makes the best-performing GPU; I shouldn't have to decide whether I want one or the other. I should just get both and customize.
You may be right, but think of it like this:
If every system had at least this kind of baseline (CPU + GPU + 12GB), developers could do their work more efficiently and the code could be more streamlined; they would not have to worry about how their program would run on Joe's machine, which only has an integrated GPU, a slow CPU, and 128 MB of RAM ...
In the end, software would be much more optimized, efficient, and faster, which would benefit us all. No more "Can it play Crysis?" question :-D
Not everyone would jump on that ship. You're essentially making something like a game console (the original Xbox was off-the-shelf parts). And the fact of the matter is, even if you got everyone to buy this, you'd force people who don't need 12GB of RAM and huge processing power just to write school papers into spending money they currently wouldn't have to.
That ruins the whole concept of being a hardware enthusiast.
The fact of the matter is, under the current system, I can spend more if I need more performance and less if I don't, rather than being a victim of someone else's idea of what ratio of GPU to CPU performance is best for every application.
I pride myself on understanding and learning the performance of components, and there are many other people who feel the same way. Milking the best performance is a hobby for many people, not to mention competitive gamers.
That concept is why we no longer have the computer systems we had in the '80s.