
Nvidia Quadro FX 4800 Pro Goodness

Source: Tom's Hardware | 15 comments

Nvidia has launched the Quadro FX 4800 professional graphics card, featuring 192 processor cores and 1.5 GB of memory.

Nvidia has been hard at work lately pushing out new high-end products for the professional market, last month releasing the Tesla personal supercomputer as well as the world's first 4 GB graphics card. While not quite as exciting as those releases, Nvidia has started December off with the launch of its new Quadro FX 4800 ultra-high-end professional graphics card. The new graphics card features 192 CUDA parallel processor cores, 1.5 GB of GDDR3 frame buffer memory and 76.8 GB/s of memory bandwidth.

Designed for applications such as digital content creation and high-performance computational analysis, the Quadro FX 4800 appears to be the replacement for the aging Quadro FX 4600, both of which are currently priced at $1,999. With its 192 parallel processor cores, the new Quadro FX 4800 has even more cores than the Quadro FX 5600, while carrying a lower price and requiring less power. According to benchmarks on Nvidia's website, the Quadro FX 5600 and Quadro FX 4800 perform identically.
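
The computational-analysis pitch rests on the fact that the 192 CUDA cores are programmable for general-purpose work through Nvidia's CUDA toolkit, not just for graphics. As a rough illustration only (the kernel and every name below are hypothetical examples, not anything tied to this card), a minimal CUDA program that spreads a simple per-element computation across those cores could look like this:

```cuda
// Minimal CUDA sketch: scale an array on the GPU, one thread per element.
// All names here (scale_kernel, N, etc.) are illustrative only.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void scale_kernel(float *data, float factor, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // global thread index
    if (i < n)
        data[i] *= factor;
}

int main()
{
    const int N = 1 << 20;               // 1M floats, a few MB of the frame buffer
    float *host = new float[N];
    for (int i = 0; i < N; ++i) host[i] = 1.0f;

    float *dev = 0;
    cudaMalloc(&dev, N * sizeof(float));
    cudaMemcpy(dev, host, N * sizeof(float), cudaMemcpyHostToDevice);

    // 256 threads per block, enough blocks to cover all N elements;
    // the hardware schedules these threads across the card's processor cores.
    int threads = 256;
    int blocks = (N + threads - 1) / threads;
    scale_kernel<<<blocks, threads>>>(dev, 2.0f, N);
    cudaDeviceSynchronize();

    cudaMemcpy(host, dev, N * sizeof(float), cudaMemcpyDeviceToHost);
    printf("first element after scaling: %f\n", host[0]);  // expect 2.0

    cudaFree(dev);
    delete[] host;
    return 0;
}
```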

Card           | Processor Cores | Memory Size | Memory Bandwidth | Memory Interface | Power Consumption | Price
Quadro FX 5800 | 240             | 4 GB        | 102 GB/s         | 512-bit          | 189 W             | $3,499
Quadro FX 5600 | 128             | 1.5 GB      | 76.8 GB/s        | 384-bit          | 171 W             | $2,999
Quadro FX 4800 | 192             | 1.5 GB      | 76.8 GB/s        | 384-bit          | 150 W             | $1,999
Quadro FX 4600 | 112             | 768 MB      | 67.2 GB/s        | 384-bit          | 134 W             | $1,999
Quadro FX 3700 | 112             | 512 MB      | 51.2 GB/s        | 256-bit          | 78 W              | $799
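
For what it's worth, the bandwidth column is simply the memory interface width (in bytes) multiplied by the effective memory data rate. A quick sanity check is sketched below; the per-card data rates are back-calculated from the table rather than taken from Nvidia's spec sheets:

```cuda
// Host-side sanity check of the table's bandwidth figures.
// Peak bandwidth = (bus width in bytes) * (effective data rate in GT/s).
// The data rates below are inferred from the table, not official numbers.
#include <cstdio>

static double bandwidth_gbs(int bus_width_bits, double data_rate_gtps)
{
    return (bus_width_bits / 8.0) * data_rate_gtps;  // GB/s
}

int main()
{
    printf("FX 5800: %.1f GB/s\n", bandwidth_gbs(512, 1.6));  // ~102 GB/s
    printf("FX 5600: %.1f GB/s\n", bandwidth_gbs(384, 1.6));  //  76.8 GB/s
    printf("FX 4800: %.1f GB/s\n", bandwidth_gbs(384, 1.6));  //  76.8 GB/s
    printf("FX 4600: %.1f GB/s\n", bandwidth_gbs(384, 1.4));  //  67.2 GB/s
    printf("FX 3700: %.1f GB/s\n", bandwidth_gbs(256, 1.6));  //  51.2 GB/s
    return 0;
}
```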

The Quadro FX 4800 comes equipped with two DisplayPort connectors and a single dual-link DVI connector, and supports DirectX 10, OpenGL 3.0 and Shader Model 4.0. The card occupies two slots, consumes 150 watts of power and, assuming a system can handle two of these cards, supports SLI frame rendering.

15 Comments
This thread is closed for comments
  • jaragon13, December 3, 2008 8:03 PM
    What's the point of the Quadro 5600 then? Or is the data table wrong?
  • ckthecerealkiller, December 3, 2008 8:20 PM
    Quote (jaragon13): What's the point of the Quadro 5600 then? Or is the data table wrong?

    As far as I know, the only difference is the 5600 has 2 dual-link DVI ports. Whoopdie freakin do.
  • NuclearShadow, December 3, 2008 9:14 PM
    What the heck is Nvidia thinking with the 5600? It has fewer processor cores than the 4800 and it requires more power! There is no excuse for it being priced that much higher, no matter how many DL DVI ports it has.
  • emp, December 3, 2008 10:21 PM
    Well, at least now we know what happened to all the old 192SP GT200 chips meant for the GTX 260.
  • Anonymous, December 4, 2008 1:39 AM
    The point of the 5600 is that it's the old card. Just like the 4800 replaced the 4600, the 5800 replaced the 5600.
  • Tindytim, December 4, 2008 8:43 AM
    I'd be interested in seeing someone get a few 5800s, flash them to the GTX 280 BIOS (they're the same hardware other than one having more memory and different ports), and then see what sort of gaming performance they would get versus the normal hardware. But I don't know many people who would do that; normally, as you go up the price ladder, the performance gains shrink. So why not get a tri-SLI setup with 12 GB of RAM?
  • eodeo, December 4, 2008 1:52 PM
    5600 = 8800 Ultra (with more video RAM)
    5800 = GTX 280 (with more video RAM)
    4800 = GTX 260 (with more video RAM)
    4600 = 8800 GTX (same memory, and the shader count is the same @ 128, not 112 like the article says)
    3700 = 8800 GT = 9800 GT

    The only things different between them are the drivers and the price tag. Be aware that you can find all of these cards, save for the GTX 280, with the same or more memory than the Quadro model has.
  • antilycus, December 4, 2008 3:32 PM
    Gamers don't need this. 3D rendering is much more intensive than your games. However, until someone makes a DLL or API or backend to use the 192 cores as processors for computations, this is useless for 3D rendering. BTW, I can't see the chat window, so I apologize if there are typos.
  • wavebossa, December 4, 2008 6:57 PM
    I can't see the window either... Oh well. eodeo, you are a fool, I'm sorry to say. But those cards are NOT the same. At all. They may look similar on the outside, but they are made to specialize in certain apps. Quadro specializes in 3D rendering, and up until the 5800, they couldn't render a game worth shit. GTX is the opposite. It can run CAD, but nowhere close to the level the 5600 does.
  • wavebossa, December 4, 2008 6:58 PM
    See, this is why we need the comment window back... I shoulda just typed it in Word and copy pasted.
  • Tindytim, December 4, 2008 11:00 PM
    You'd think they were different, but you'd be wrong. In fact, there are multiple guides on hacking the BIOS to change a regular desktop card into a workstation card. Other than the BIOS, the memory, and the display ports, the cards are exactly the same. The reason they render better in those apps is because of the drivers, and that's what you end up paying for: drivers. Graphics card companies spend large amounts of cash developing drivers for various applications to squeeze the most out of their cards, so they make that back by making the cards ridiculously expensive.

    Read this:
    http://www.nvworld.ru/docs/sqe.html

    They are the same things, you're just paying for drivers.
  • eodeo, December 4, 2008 11:45 PM
    wavebossa:

    Quote:
    eodeo, you are a fool, I'm sorry to say. But those cards are NOT the same.

    Silly thing, calling me a fool on a matter you clearly know nothing about. You should have at least researched a bit before calling my post wrong.

    Antilycus:

    Quote:
    Gamers don't need this. 3D rendering is much more intensive than your games.

    Wrong. 3D DCC programs can't use 30% of modern GPU power. Well, that's also wrong. It's not that they can't use it, it's that people using 3D don't use it. 8800+ cards are capable of displaying 100 million polygons in real time; it's just that most of us 3D users don't need them to display that many.

    The only trouble is if you're using a non-Quadro card in a non-DirectX program. OpenGL is artificially limited in non-Quadro drivers to run at minimal speed. Since most new programs have DirectX, only poor Mac/Linux users are stuck needing a Quadro, but they too can simply softmod their card to trick the drivers into thinking it's a Quadro and boom: all the performance is there even under OpenGL with a non-Quadro card.

    How is this possible, you might be asking yourself (wavebossa)? Simple: both cards use the exact same chip. See my post above for the exact pairings.

    Tindytim:

    Quote:
    They are the same things, you're just paying for drivers.

    Exactly. You can get more RAM on partner-made modifications of the baseline chip; I've seen 8800 GTS cards being sold with 2 GB of RAM. Still, it's true that I haven't seen any with 4 GB, but you're HIGHLY unlikely to need that much video RAM, more so when you consider that current CPUs bottleneck even 1 GB cards.
  • eodeo, December 4, 2008 11:48 PM
    How do I know which (silly named) Quadro uses which chip?

    http://en.wikipedia.org/wiki/Nvidia_Quadro

    excerpt:

    FX 5800 | GT200GL | Chip also used in GeForce GTX 280 (240 shaders)
  • Tindytim, December 5, 2008 11:14 PM
    Quote (eodeo): How do I know which (silly named) Quadro uses which chip? http://en.wikipedia.org/wiki/Nvidia_Quadro, excerpt: FX 5800 | GT200GL | Chip also used in GeForce GTX 280 (240 shaders)

    Hmm. I can't seem to find the guide I was using.

    But just check the specs of the cards; they should be the same, or very similar.
  • Thor, December 7, 2008 9:38 AM
    I would like to see, just for fun, a benchmark that compares these video cards against the gaming cards (Radeon HD 4870 X2, GTX 280).