
CUDA And Parallel Processing

PCI Express 3.0: On Motherboards By This Time Next Year?

We are entering the age of the desktop supercomputer. We have access to massively parallel graphics processors, along with power supplies and motherboards that can support as many as four cards at the same time. Nvidia’s CUDA technology is transforming the graphics card into a tool for programmers working not only on games, but on science and engineering. The programming interface has already played an instrumental role in solutions for fields as diverse as medical imaging, mathematics, and oil and gas exploration.

Nvidia’s CUDA showcase
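For readers who haven't seen GPU compute code, a minimal sketch (below; the kernel name, scaling operation, and array size are illustrative choices of ours, not taken from any of the applications above) shows how CUDA lets ordinary C-style code run across thousands of GPU threads:

    // Minimal CUDA sketch: scale a vector on the GPU.
    // Kernel name and problem size are illustrative only.
    #include <cuda_runtime.h>
    #include <cstdio>

    __global__ void scale(float *data, float factor, int n)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n)
            data[i] *= factor;   // each thread handles one element
    }

    int main()
    {
        const int n = 1 << 20;                         // one million floats
        float *d_data;
        cudaMalloc((void **)&d_data, n * sizeof(float));
        cudaMemset(d_data, 0, n * sizeof(float));

        // Launch enough 256-thread blocks to cover the whole array.
        scale<<<(n + 255) / 256, 256>>>(d_data, 2.0f, n);
        cudaDeviceSynchronize();

        cudaFree(d_data);
        printf("kernel finished\n");
        return 0;
    }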

I asked OpenGL programmer Terry Welsh, from Really Slick Screensavers, for his thoughts on PCI Express 3.0 and GPU processing. Terry told me “PCI Express was a great boost, and I'm happy with them doubling the bandwidth anytime they want, as with 3.0. However, for the types of projects I work on, I don't expect to see any difference from it. I do a lot of flight-sim stuff at work, but that's mostly bound by memory and disk I/O; the graphics bus isn't a bottleneck at all. I can easily see [PCI Express 3.0] being a big boost, though, for GPU compute applications, and people doing scientific viz on large datasets.” 

The explosion Easter Egg in Terry Welsh’s Skyrocket Screensaver
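Terry's point about large datasets is easy to check on your own machine. The sketch below (a rough illustration; the 256 MB buffer size is an arbitrary choice of ours) times a single host-to-device copy with CUDA events, which is exactly the kind of transfer a wider PCI Express pipe speeds up:

    // Sketch: time a host-to-device copy to see how much the PCIe bus matters.
    // The 256 MB buffer size is an arbitrary choice for illustration.
    #include <cuda_runtime.h>
    #include <cstdio>

    int main()
    {
        const size_t bytes = 256u << 20;               // 256 MB test buffer
        float *h_buf, *d_buf;
        cudaMallocHost((void **)&h_buf, bytes);        // pinned host memory for full bus speed
        cudaMalloc((void **)&d_buf, bytes);

        cudaEvent_t start, stop;
        cudaEventCreate(&start);
        cudaEventCreate(&stop);

        cudaEventRecord(start);
        cudaMemcpy(d_buf, h_buf, bytes, cudaMemcpyHostToDevice);
        cudaEventRecord(stop);
        cudaEventSynchronize(stop);

        float ms = 0.0f;
        cudaEventElapsedTime(&ms, start, stop);
        printf("Host-to-device: %.2f GB/s\n", (bytes / 1e9) / (ms / 1e3));

        cudaEventDestroy(start);
        cudaEventDestroy(stop);
        cudaFreeHost(h_buf);
        cudaFree(d_buf);
        return 0;
    }

On a PCIe 2.0 x16 link that figure typically lands around 5 to 6 GB/s with pinned memory, which is exactly the ceiling a 3.0 slot would raise for transfer-bound compute workloads.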

The ability to double transfer speeds for mathematics-intensive workloads is sure to enhance both CUDA and Fusion development. This is one of the most promising areas for the upcoming PCI Express 3.0 interface.
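As a back-of-the-envelope check on that doubling, the short sketch below (our own illustration, using only the nominal signaling rates and line encodings) computes per-direction throughput for a x16 slot:

    // Back-of-the-envelope PCIe throughput, per direction, ignoring protocol overhead.
    #include <cstdio>

    int main()
    {
        // Gen 2: 5 GT/s with 8b/10b encoding (8 payload bits per 10 transferred).
        // Gen 3: 8 GT/s with 128b/130b encoding (128 payload bits per 130 transferred).
        const double gen2_lane = 5e9 * (8.0 / 10.0) / 8.0;      // bytes per second per lane
        const double gen3_lane = 8e9 * (128.0 / 130.0) / 8.0;

        printf("PCIe 2.0 x16: %.1f GB/s\n", gen2_lane * 16 / 1e9);   // ~8 GB/s
        printf("PCIe 3.0 x16: %.1f GB/s\n", gen3_lane * 16 / 1e9);   // ~15.8 GB/s
        return 0;
    }

The switch from 8b/10b to 128b/130b encoding is why PCIe 3.0 nearly doubles usable throughput while raising the signaling rate only from 5 to 8 GT/s.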

Comments
  • 8 Hide
    cmcghee358 , August 3, 2010 6:34 AM
Good article with some nice teases. Seems us regular users of high-end machines won't see the benefit until 2012. Just in time for my next build!
  • 23 Hide
    tony singh , August 3, 2010 6:43 AM
What the..... PCIe 3 already developed & most games' graphics are still at GeForce 7 level, thank you consoles..
  • 5 Hide
    darthvidor , August 3, 2010 7:03 AM
Just got PCI-E 2.0 back in 2008 with my X58... time's flying
  • 9 Hide
    iqvl , August 3, 2010 7:13 AM
Good news for people like me who haven't spent any money on a PCIe 2.0 DX11 card due to Nvidia's delay in shipping the GTX 460.

Can't wait to see PCIe 3.0, native USB3/SATA3, DDR4, quad channel, and faster & cheaper SSDs next year.

In addition, I hate unreasonably priced, buggy HDMI and would also like to see Ethernet-cable-based (cheap, fast, and exceptional) monitors as soon as possible.
    http://www.tomshardware.com/news/ethernet-cable-hdmi-displayport-hdbaset,10784.html

    One more tech that I can't wait to see: http://www.tomshardware.com/news/silicon-photonics-laser-light-beams,10961.html

WOW, so much new tech to look forward to next year!
  • -6 Hide
    Casper42 , August 3, 2010 7:33 AM
I haven't read this entire article, but on a related note, I was told that within the Sandy Bridge family, at least on the server side, the higher-end products will get PCIe 3.0.

And if you think the Core i3/5/7 desktop naming is confusing now, wait till Intel starts releasing all their Sandy Bridge server chips. It's going to be even worse, I think.

    And while we're talking about futures, 32GB DIMMs will be out for the server market most likely before the end of this year. If 3D Stacking and Load Reducing DIMMs remain on track, we could see 128GB on a single DIMM around 2013, which is when DDR4 is slated to come out as well.
  • 3 Hide
    JonnyDough , August 3, 2010 7:43 AM
    Quote:
    After an unfortunate series of untimely delays, the folks behind PCI Express 3.0 believe they've worked out the kinks that have kept next-generation connectivity from achieving backwards compatibility with PCIe 2.0. We take a look at the tech to come.


It's nice to see backwards compatibility and cost being key factors in the decision making, especially considering that devices won't be able to saturate it for many years to come.
  • 9 Hide
    rohitbaran , August 3, 2010 7:50 AM
    Quote:
    Nothing in the world of graphics is getting smaller. Displays are getting larger, high definition is replacing standard definition, the textures used in games are becoming even more detailed and intricate.

    Even the graphics cards are getting bigger! :lol: 
  • -3 Hide
    iqvl , August 3, 2010 7:53 AM
rohitbaran: Even the graphics cards are getting bigger!

    I believe that he meant gfx size per performance. :) 
  • -3 Hide
    Tamz_msc , August 3, 2010 7:57 AM
    Quote:
    We do not feel that the need exists today for the latest and greatest graphics cards to sport 16-lane PCI Express 3.0 interfaces.
Glad you said today, since when Crysis 3 comes along it's all back to the drawing board, again.
  • 0 Hide
    rohitbaran , August 3, 2010 7:58 AM
iqvl: I believe that he meant gfx size per performance.

Still, the largest cards today are a bit too large! Aren't they?
  • -8 Hide
    qhoa1385 , August 3, 2010 9:42 AM
    NO!
    I HATE YOU TECHNOLOGY!

    lol
  • 7 Hide
    descendency , August 3, 2010 10:02 AM
rohitbaran: Even the graphics cards are getting bigger!

    And thanks to NVidia, hotter.
  • 8 Hide
    LordConrad , August 3, 2010 10:49 AM
    "After an unfortunate series of untimely delays..."

    A series of unfortunate events? That sounds familiar...
  • 14 Hide
    shortbus25 , August 3, 2010 10:49 AM
    NVidia=Global Warming?
  • 2 Hide
    Anonymous , August 3, 2010 10:53 AM
    Very pleased with all this, looks like 2012 Q1/2 will be my new PC build, should all come together nicely then!
  • 1 Hide
    ta152h , August 3, 2010 11:02 AM
    This article could have been written in a sentence. PCI-E 3.0 will be out in 2011 and will be faster.

Perhaps you could have explained why CUDA would benefit from this, or what types of apps that use it could. Fusion makes no sense to me, since the GPU and CPU will not be connected using PCI Express; they'll be on the same die. Maybe you could explain why these things are going to benefit.

    Also, according to the visual, latency will be lowered. Bandwidth is essentially irrelevant in many situations, since it's only rarely fully used, but latency could make itself felt in virtually anything.

You also could have included the extra power this extra speed will require. It almost certainly will draw more, all other things being equal. That's a huge consideration. If I have to add, say, 15 watts to my motherboard, is it worth it for a technology that might not be relevant in many situations, in the relative near term? If it's one or two watts, it's a no-brainer, but if it's a lot higher (which I suspect it might be), people need to really ask if they need this technology, or if it's better to wait until the next purchase, when it might have more value.
  • -6 Hide
    Mousemonkey , August 3, 2010 11:19 AM
    Quote:
    And thanks to NVidia, hotter.

    With a bit of help from ATi of course.
  • 1 Hide
    cmartin011 , August 3, 2010 11:38 AM
They should integrate Intel's new optic technology into it and give it twice the bandwidth on top of that, 64 GB/s or more.
  • -4 Hide
    hardcore_gamer , August 3, 2010 1:30 PM
    finally.........