AMD Has A One-year Exclusive On PCIe External Connector


Santa Clara (CA) - Last week, we learned that AMD has put the finishing touches on its external graphics platform, XGP for short, and aims to roll out products based on the specification soon. Now we hear that AMD has secured a one-year exclusive right to the technology with JAE, the manufacturer of the connectors, and we expect AMD to squeeze every possible penny out of XGP.

Based on the initial PCIe External specification, AMD’s Graphics Products Group (GPG, ex-ATI) began developing its own PCIe External connector some time ago with the intent of creating a notebook-friendly graphics product. As was the case with GDDR3, GDDR4 and GDDR5, AMD has been spending its own time and money to create a spec that will eventually be used by many, but this time around, the company made sure it got a head start.

To protect its intellectual property (IP), AMD struck a one-year exclusive agreement with JAE, the manufacturer of the XGP connectors, and is looking to take advantage of this scenario in any way it can. It seems that the company has finally decided to become more aggressive about advertising graphics, and this is where XGP really shines. However, XGP can be used for much more than external graphics.

XGP essentially integrates PCI Express Gen2, and you can connect any device supporting the standard to it. There is not much beyond graphics at this time that would make sense as an extension to a notebook, and it is clear that some notebooks could use improved performance. Plus, the technology allows you to get a decent portable graphics system without the need to carry around a 12-pound notebook. ATI also added wires for an embedded USB hub, which may or may not be used.

According to Ogi Brkic, AMD’s product manager for notebook graphics, AMD intends to build an ecosystem in which partners will be able to build various external boxes containing one or more graphics modules, DisplayLink chips, USB hubs, wireless USB hubs, Wi-Fi routers or network-attached storage (NAS) boxes.

In its current shape, XGP supports the majority of modern notebook chipsets. At the moment, XGP can only accelerate displays connected to the external box, while support for accelerating the notebook's own display is expected in late Q3. That could turn out to be a serious issue in the short term, as the majority of users will want to play games on the actual notebook screen. It is not common to have a secondary display lying around, unless you use your notebook as a desktop replacement, in which case it is almost certain you have more than just one display.

The first company to adopt XGP for its notebook series was Fujitsu Siemens, which said it is developing consumer and professional XGP products. Will we see external FireGL boxes? We certainly think so, but we will have to wait until Siggraph to see what such a solution will look like.

It will be interesting to see how Taiwan reacts to this technology: There is money to be made on external graphics boxes, and since most vendors in the graphics business are strapped for cash, XGP parts offer an opportunity to increase revenue and operating margins.

  • eagle07
    This is spending money on a solution that can be carried from one notebook to the next...
    Instead of having a shared rendering tower at a work site, employers can have a shared FireGL... take your laptop, plug in and render.
  • woodco2001
    Incredibly cool concept if I understand it right. 1 graphics card for all my boxes. Sweet.... Saves that $100/year upgrade on each. Right now that is 4 times. Each 3400 ATI class card is at least 40 bucks. Buying a hub like this is $160 USD in the bank each time I upgrade. I totally love it. Go AMD go ATI go go go

    Seti and Utopia lifer
  • I
    I suspect the market for this is very small, remembering most systems are sold with integrated video just to reduce the cost of video, and that buying a laptop with a better GPU in it is inevitably going to be cheaper than an external box plus power supply plus video card.

    IMO, putting the video card in the monitor itself would make more sense for most, or just leaving it like it is now.