
ASUS Creates Upgradeable Graphics Cards

Source: Tom's Hardware US | 21 comments

 

Taipei (Taiwan) - The vision of upgradeable graphics cards goes back to the late 1990s, when Micron Technology was experimenting with removable sockets. In 2006, both MSI and Gigabyte showcased upgradeable graphics cards, but their concepts, which were based on GeForce Go MXM boards, never took off. Earlier this year, Asus introduced a single board with three MXM slots for ATI Mobility Radeon 3850 or 3870 cards (upgradeable with future parts), and has now unveiled its single-MXM product.

Called Splendid HD 3850M, this card doesn't look like anything special until you remove the dual-slot cooler. What you see then is an MXM card with an RV670 chip and 512 MB of memory, attached to a PCB that contains the Splendid HD video processor. The video processor features 12-bit gamma correction, 7-region color enhancement and a dynamic contrast engine.

The graphics chip is clocked at 668 MHz, while the 512 MB of GDDR3 memory operates at 828 MHz DDR (1.65 GT/s). According to Asus, this MXM card will score around 600 3DMarks (3DMark06) more than ATI's own reference design. But what really sets it apart is that the product is significantly shorter than ATI's Radeon 3850 or 3870 reference design.
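As a sanity check on those figures, here is a minimal Python sketch of the bandwidth math. The 256-bit bus width is the reference RV670 specification and an assumption on our part, since the article only quotes the clocks:

    # Peak memory bandwidth sketch for the Splendid HD 3850M figures.
    # The 256-bit bus is the reference RV670 spec (assumed, not quoted).
    mem_clock_mhz = 828            # GDDR3 command clock, per the article
    transfers_per_clock = 2        # DDR: two transfers per clock
    bus_width_bits = 256           # assumed RV670 reference bus width

    effective_gtps = mem_clock_mhz * transfers_per_clock / 1000.0
    bandwidth_gbps = effective_gtps * bus_width_bits / 8

    print(f"{effective_gtps:.2f} GT/s")        # 1.66 GT/s, matching the quoted 1.65
    print(f"{bandwidth_gbps:.1f} GB/s peak")   # ~53.0 GB/s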


The Asus Trinity card has three MXM slots. The company is currently selling the card with three modules based on the Radeon HD 3850.

Thanks to the modular design, you will be able to upgrade to upcoming MXM modules, including ones based on ATI's RV770 and RV870 chips (Radeon HD 4800 and 5800 series). Interestingly, there should be no issue putting an Nvidia-GPU-based MXM module onto this card, since there is no logic to prevent it.

Using this design, you can imagine a future where users upgrade their graphics experience simply by buying a small module. If you only had to buy the GPU and memory, this approach could actually save money, since you would not need to buy a complete card over and over again.

This new line of products appears to be much more than an engineering exercise. We hope to see future designs incorporate HDMI-in on graphics cards too, just like on the much-anticipated professional sound card, the Xonar AV1.

Asus is now on track to do something new, something that could put the company clearly ahead of the competition.

This thread is closed for comments
  • Zerk, May 8, 2008 11:34 PM
    Sweet card, that will be cool, I guess?
  • RADIO_ACTIVE, May 8, 2008 11:45 PM
    Interesting...
    Anything that would make Graphics Cards cheaper gets an A+ in my book.
    We will have to wait and see if this product takes off or not.
    I like that it is small; I have 2x 8800GTX cards and they take up so much room. My wife has an 8800GTS 512 and it takes up even more room.
  • N19h7M4r3, May 9, 2008 12:08 AM
    I think this should have existed on a commercial basis for a very long time already... a graphics card is almost a small computer... so why haven't we been able to buy just a new GPU or memory? I know getting everything to work will be hard, but just imagine what it was like to make the first motherboard with removable slots for everything...
  • inglburt, May 9, 2008 12:09 AM
    If it actually works out like they want, I'm all for it. It sucks spending $300-500 on new video cards every couple of years or so.
  • Turas, May 9, 2008 12:18 AM
    OK I see "The Asus Trinity card has three MXM slots. The company is currently selling the card with three modules based on the Radeon HD 3850. " in the article. Please tell me where one could buy such a card.
  • miahallen, May 9, 2008 12:31 AM
    Quote:
    Using this design, you can imagine a future where users will upgrade their graphics experience simply by buying a small module. If you would have to buy just the GPU and memory, this approach would actually lead to less money being spent, since you don’t need to buy the complete card over and over again.


    That's the stupidest thing I've ever heard. An MXM module IS a complete card! THIS WILL NOT SAVE THE END USER MONEY!
  • lopopo, May 9, 2008 4:02 AM
    I read a review of this on TechReport; turns out it sucks because of the drivers. What is up with people nowadays: they advertise and want you to buy 3-way SLI, and three of this and four of that, and the drivers aren't ready yet, or no software can take advantage of it... if I pay $, I want scalability.
  • virtualban, May 9, 2008 8:10 AM
    I agree that it might, and probably will, offer some angle where less money is spent on an upgrade (that's just the idea behind this, isn't it?), but surely it will hold back better products in the long run, as the already limited budget of ATI, for example (but even if it were Intel), would be diverted to compatibility with sockets, previous designs, buses and such, meaning more money for the same performance for the rest of us.
  • KyleSTL, May 9, 2008 1:32 PM
    Turas: "The Asus Trinity ... Please tell me where one could buy such a card."

    You won't be able to. According to news reports, ATI will make ~10 units. It's like asking when you can buy a Ferrari Enzo FXX: never going to happen, unless you have 'connections' or are on a very short, hand-picked list.
  • mf_fm, May 9, 2008 3:43 PM
    ASUS FTW, you just can't go wrong with an ASUS board.

    After 15+ years of usage, my first ASUS board is still in working condition.

    Ever since then, every mobo and graphics card that I bought has been from ASUS, with almost zero problems. Almost.

    Still, ASUS is an A+ brand, a strong brand which I will recommend without embarrassment.
  • maxinexus, May 9, 2008 5:00 PM
    This idea is not new!!! But why not make a socket and upgrade just the GPU, exactly like we do with a CPU on a motherboard?
  • kittle, May 9, 2008 7:40 PM
    And ditto for the GPU memory.
    The gamer wants mass processing power. The artist wants lots of memory. The game developer wants both. So we don't need three cards, we need one card with options.
    I'd love to see this take off, although the initial outlay for the upgradeable card will probably run quite a bit.
  • hcforde, May 10, 2008 1:55 AM
    Back in the middle of February, I made this post:
    http://www.tomshardware.com/forum/248601-33-cheaper-crossfire-build

    It basically addressed what ASUS is doing now, and I was wondering why it wasn't being done.
  • groo, May 11, 2008 3:39 AM
    MXM is just PCIe x16 for laptops.

    Laptop graphics are usually several months behind and more expensive.
  • nachowarrior, May 12, 2008 4:48 AM
    To mf_fm: I loved ASUS as well... but their recent socket AM2 board in my price range had a very low rating, mostly due to units arriving DOA. I'd think they'd have fixed that by this point in its life... but alas, no. So for an AM2 board at $100 or less, I ordered Gigabyte, because the best one from ASUS at that price was just too junky. I also ordered a Gigabyte video card, just because it was super cheap at the time compared to all the others, had a great feature list, and came with a game. I'm happy with it, to say the least. "Rock solid" would be what I'd say, but that term is coined by ASUS...

    I would still go for an ASUS board in the next generation if it fit my needs, but after testing out some of this "Ultra Durable" tech, I think that's the way to go: super stable even at overclocked speeds. Gigabyte is rocking the cradle. ASUS needs to re-focus on the mid-to-high-range enthusiast again... none of their recent products really has anything to impress me at the price tag they sit at. It's just too much to pay for the name when I know another name that works just as well, if not better.

    Anyway, having said that, this 'product' is the type of reason I still look at ASUS as a viable company to buy from, despite their recent plunge in AMD motherboard quality. Keep it up, and put this b*tch on the shelf for the right price and I'll buy one.
  • nachowarrior, May 12, 2008 4:58 AM
    To virtualban...

    That's just the thing... a GPU doesn't do anything but interface with GDDR and ship information off to the motherboard, to my understanding... so technically they could install something to scale power input per installed GPU/memory, and just put an inordinate number of lanes from the memory to the GPU... (ever notice there are more memory chips the wider the memory interface, for the same amount of memory?) Basically, that makes it upgradeable without worrying about bus speeds... just increase the lanes from GDDR to GPU and there's no problem, provided it's still on PCIe 1.0 or 2.0... As for smaller 'processes' and 'socket compatibility', GPUs don't really have the same problems that CPUs do... they're made for one specific purpose, and they do it well... something like this will only not be seen on shelves because it would provide less instant profit, and alienate even more mainstream consumers from upgrading their graphics cards... because then they'd have to know how to apply thermal paste and a heat sink OOOOOOOO omg noooo!!!! haha. :-p
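    (As a rough sketch of the chips-versus-bus-width observation in the parentheses above, assuming the 32-bit per-chip I/O that is typical for GDDR3; the configurations are illustrative, not from the comment:)

        # Total GPU memory bus width = number of chips x per-chip I/O width,
        # so the same 512 MB built from more, smaller chips yields a wider bus.
        def bus_width_bits(num_chips, io_bits_per_chip=32):  # 32-bit I/O: typical GDDR3
            return num_chips * io_bits_per_chip

        print(bus_width_bits(8))   # 8 x 64 MB chips  -> 256-bit bus
        print(bus_width_bits(4))   # 4 x 128 MB chips -> 128-bit bus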
  • nachowarrior, May 12, 2008 5:17 AM
    And to athagulus... or whatever your name is...

    Why is it that I can pull out my 65 W CPU and drop in a 125 W CPU and have no problems? Why can't that work on another board, one that functions more independently than any other piece of the PC? Oh yeah... that's because you don't realize that... hmmm... the PCIe interface can carry up to 75 watts... so if the card consumes 125 watts, where does the rest come from? Oh yeah, the secondary power input on the back of your card... GPUs can scale power just as well as, if not better than, CPUs. Argh... people keep forgetting that, depending on how you look at it, a GPU has a ridiculous advantage over a CPU... IF they weren't apples to oranges, I'd say GPUs win out... But what the hell do I know?
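    (The 75-watt figure matches the PCIe slot limit; here is a minimal power-budget sketch, with typical 6-pin and 8-pin auxiliary connector ratings filled in as assumptions, since the comment only mentions the slot:)

        # Power available to a card = slot + auxiliary connectors.
        SLOT_W = 75        # PCIe x16 slot limit, per the PCIe spec
        SIX_PIN_W = 75     # typical 6-pin auxiliary connector rating (assumed)
        EIGHT_PIN_W = 150  # typical 8-pin auxiliary connector rating (assumed)

        def power_budget_w(six_pins=0, eight_pins=0):
            return SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

        print(power_budget_w())            # 75 W: slot alone, not enough for 125 W
        print(power_budget_w(six_pins=1))  # 150 W: covers a 125 W GPU with headroom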
  • Anonymous, May 12, 2008 4:34 PM
    I have been wondering why they haven't done this already. I just wish they would separate the memory from the GPU module too. All the recent generations of cards are using 512 MB, so why keep buying the 512 MB again and again?

    It would be even nicer if they managed to make memory sockets, so the memory could be upgraded or more could be added at a later date.

    I think it's all about money: the gfx card manufacturers make money on every bit of the card (the board, the memory, the connectors, the power-transforming circuitry and the GPU). It's in their interest to sell the whole enchilada over and over again.
  • KyleSTL, May 12, 2008 5:45 PM
    Quote:
    I have been wondering why they haven't done this already. I just wish they would separate the memory from the GPU module too. All the recent generations of cards are using 512 MB, so why keep buying the 512 MB again and again?

    It would be even nicer if they managed to make memory sockets, so the memory could be upgraded or more could be added at a later date.

    I think it's all about money: the gfx card manufacturers make money on every bit of the card (the board, the memory, the connectors, the power-transforming circuitry and the GPU). It's in their interest to sell the whole enchilada over and over again.

    It's because the memory modules that motherboards use have a 64-bit bus width, since the memory controller addresses each stick as a bank instead of each memory chip individually. Going back to a 64-bit bus width would have a massive effect on 3D performance. However, I understand your point, and I'm sure this could be implemented (with an increase in access time, because of the increase in trace lengths) with a different module design. Conversely, memory prices for DDR2 have plummeted and are so commoditized that the decrease in part price would be negligible compared to the R&D and production costs of the PCBs (number/length of traces, and the additional socket).
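    (To put numbers on the bus-width point, a small sketch using the article's GDDR3 data rate; the 64-bit versus 256-bit comparison is illustrative, not from the comment:)

        # Peak bandwidth at the same data rate: 256-bit on-card bus
        # versus a 64-bit, DIMM-style bus (the scenario described above).
        def peak_gbps(effective_gtps, bus_bits):
            return effective_gtps * bus_bits / 8

        gddr3_gtps = 1.656                               # from the article's 828 MHz DDR
        print(f"{peak_gbps(gddr3_gtps, 256):.1f} GB/s")  # ~53.0, memory soldered on-card
        print(f"{peak_gbps(gddr3_gtps, 64):.1f} GB/s")   # ~13.2, with 64-bit modules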
  • spearhead, May 12, 2008 8:34 PM
    It would be cool if I could acquire some fast and extremely scarce MXM modules this way and place one inside my notebook, to have $3,000 notebook graphics for a mere $200 :D