NVISION '08 - What to Expect

Up until now, Nvidia has spread its marketing efforts across separate channels. LAN parties under the GeForce LAN moniker would happen once a year or so, and press conferences would happen when necessary. This year, however, NVISION 2008 looks to unify the community and press around one all-encompassing event. NVISION will have something for everybody, from the Electronic Sports World Cup (ESWC) and a potentially record-breaking LAN party (Guinness will be on hand to verify) to presentations and exhibits for HD enthusiasts to a wide variety of speeches and keynotes.

Is NVISION the new E3? In some ways. Nvidia is keen on demonstrating its new technology, and vendors like it are finding it more prudent to set up their own shows instead of participating in conglomerate events. On the other hand, NVISION isn't intended to be a big party bash, but rather a showcase for cutting-edge graphics technology, and then some.

This time around, Nvidia has timed its show to coincide with a pivotal point in its strategy. As frequent Tom's Hardware readers know, Nvidia is pushing into the general-purpose processing realm with technology like CUDA. Application acceleration on the GPU is becoming a serious topic, even more so than dedicated physics processing was when it first entered the public consciousness.
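
For readers who haven't followed the GPGPU story, CUDA lets developers run ordinary data-parallel C code on the GPU. The listing below is a minimal, hypothetical sketch (a simple vector addition, not anything Nvidia has announced) meant only to illustrate what "application acceleration on the GPU" means in practice: the host copies data to the card, launches thousands of lightweight threads, and copies the result back.

    #include <cstdio>
    #include <cuda_runtime.h>

    // Each GPU thread computes one element of the output array.
    __global__ void vecAdd(const float* a, const float* b, float* c, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) c[i] = a[i] + b[i];
    }

    int main() {
        const int n = 1 << 20;                  // one million elements
        const size_t bytes = n * sizeof(float);

        // Host-side buffers, filled with known values.
        float* ha = (float*)malloc(bytes);
        float* hb = (float*)malloc(bytes);
        float* hc = (float*)malloc(bytes);
        for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

        // Device-side buffers on the graphics card.
        float *da, *db, *dc;
        cudaMalloc((void**)&da, bytes);
        cudaMalloc((void**)&db, bytes);
        cudaMalloc((void**)&dc, bytes);
        cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
        cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

        // Launch enough 256-thread blocks to cover all n elements.
        int threads = 256;
        int blocks = (n + threads - 1) / threads;
        vecAdd<<<blocks, threads>>>(da, db, dc, n);
        cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);

        printf("c[0] = %f (expected 3.0)\n", hc[0]);

        cudaFree(da); cudaFree(db); cudaFree(dc);
        free(ha); free(hb); free(hc);
        return 0;
    }

The same offload pattern, scaled up, is what Nvidia is pitching for video encoding, physics, and scientific workloads.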

The most important presentation to look forward to is the keynote from Nvidia chairman Jen-Hsun Huang, who should shed some light on CUDA and GPU acceleration for applications, as well as what we should expect down the road for graphics cards. Other presenters and special guests include former Apollo astronaut Buzz Aldrin, "Battlestar Galactica" star Tricia Helfer, "MythBusters" stars Jamie Hyneman and Adam Savage, and Digg founder Kevin Rose.

Celebrities aside, the star of the show is still Nvidia's graphics platforms, and with CUDA and CUDA 2.0 being aggressively pushed as of late, we should expect some sort of announcement regarding the GPU acceleration technology, as well as a possible list of supported applications. Also, it would not be a complete surprise if talk of the next generation of graphics cards came up (GTX 380, anyone?), but that might be wishful thinking. Expect some news from the myriad of hardware developers who will also be in attendance.

More to follow shortly.

  • I'm honestly surprised that they're throwing away PAX to do this. PAX is expected to draw 50-60k people this year, and even if nVidia still wants to do its own self-hyping pity party, it should have at least come to PAX to leverage that huge consumer contact. It really seems petty and shortsighted.

    For the record, there's no way in hell that a show put on by one company for its own self-aggrandizement will replace E3. The days of a game industry trade show having any real meaning are over. PAX has shown that a show that speaks to consumers directly is many times more effective, and it can't seem to avoid growth. It has doubled every year, is now the largest in the hemisphere, and in 2010 there will be one on each coast at different times of the year just to try to keep pace with demand! Nvision is doomed to be nothing more than a cheap attempt to hawk a product. It won't have a fraction of the community that PAX does.
  • bf2gameplaya
    I would like to see, in addition to this self-promotion, far more actual human resources from nVidia acting as liaisons to developers and other concerns such as motherboard makers. Bring tools and talent to the experience makers; help them help you.

    I have no opinion on CUDA and no clear crystal ball on how parallelism is going to be successfully adopted, but it stands to reason that if you really want to move the industry in the direction you think it should go, you dedicate manpower to that task, not just throw a party to impress bored billionaires and cross your fingers that things go your way.
  • jaragon13
    I want to see a GTX 380 and a GTX 360, in fact.

    The 380 will be (pure speculation, if they are to make a good comeback to the 4870X2 "ultimate graphics") 512-bit GDDR5, with at least 128 shader units, 560 stream processors, and a bunch of crap that no one else cares about. At least 45nm, but 40 or 32nm would be plain outright BS.

    The GTX 360 would be 384-bit GDDR5, with like 96 shader units and 320 stream processors.

    Extremely, outrageously expensive graphics cards aside, I've seen the advertisements for Nvision. Seriously. WHO is actually going there? I still see ads for people to come for more prizes, yet I thought all the fanboys would be flocking? No?

    Ahh, well, an overclocked GTX 280 would be nice. I'd need to underclock it, though, so my computer wouldn't die from my morbidly not-800-watt power supply with three 24A +12V rails.
  • mr roboto
    Mysteriously absent in photo 24 are the 4850, 4870, and 4870X2; instead, Jen-Hsun Huang is comparing all of Nvidia's major cards from the last two years to an older-generation ATI 3870X2. Wuss.

    I still love my 8800GTX, so until they release something that pounds it into the ground, I'm good. Anyways, how many good games really put a hurtin' on the 8800 GTXs? Not too many. Alright, that's my rationalization for not upgrading and I'm sticking to it!
  • kansur0
    This is a blatant attempt to create an environment where there is no competition. Too bad they have to resort to using benchmarks from 3-6 months ago, before the ATI 4870 came out. Pathetic.

    A note on the GPU rendering front... nVidia would have liked ATI to adopt CUDA and then yank the rug out from under them a year later. The smart move was going with OpenCL. Huge. I can easily see ATI growing with Apple. Snow Leopard leans heavily on OpenCL. I can envision an Apple Mac Pro using ATI GPUs talking through OpenCL: CPUs will run the main app, while all program functionality and effects can be driven through OpenCL. This will probably be the key to realizing full HD 1080p realtime effects that even the biggest dual quad-core system sweats over.
  • cruiseoveride
    I want to hear whether there's any "Open Source Strategy" in the works.
  • thomasxstewart
    Seems we are entering an era of stream processors on the graphics card, in the integrated processor, and on the CPU.

    Signed: PHYSICIAN THOMAS STEWART VON DRASHEK M.D.