Look Who's Here: Matrox With Two Graphics Cards
Matrox has built two graphics cards for the professional market, both of which carry AMD GPUs.
Matrox is back! Okay, not really — they've been working in the professional space for a while now — but that's what everyone shouts whenever a new Matrox product comes along, right? Matrox has announced two new graphics cards for the professional market: the C420 LP and the C680.
The C420 LP is a low-profile graphics card capable of driving four individual displays at resolutions of up to 2560 x 1600. It's a low-power card and can therefore be passively cooled. The C680 is a single-slot graphics card capable of driving up to six 4K displays. It's not a low-power card, but that's hardly surprising, because six 4K monitors is a lot of pixels to push. Do note that the C680 will only drive those 4K panels at 60 Hz if three or fewer displays are attached; with more than three, they will operate at 30 Hz.
Both cards are based on AMD GPUs, and both carry 2 GB of GDDR5 graphics memory. Matrox gave no details on how many shaders these GPUs have or what clocks they run at, although they do support DirectX 11.2, OpenGL 4.4 and OpenCL 1.2. The cards connect over a PCI-Express x16 interface and support the latest Windows operating systems as well as Linux distributions.
You're probably wondering who these cards are aimed at, but that's simple: folks who need a lot of screen real estate and want a very reliable graphics card to power it. Working in the professional space, Matrox is set on making its cards as reliable as possible. The cards can also be used for digital signage, and to prevent tearing or artifacts between the various displays, they come with a frame-lock feature that refreshes all attached displays simultaneously, even if you're using more than one of these cards.
Matrox did not reveal pricing, though you probably don't want to know anyway.
Niels Broekhuijsen thinks Matrox is cool, and that they should make their way onto the consumer market once again. Follow him at @NBroekhuijsen. Follow us @tomshardware, on Facebook and on Google+.
Well, maybe it is attacked; who knows what kind of diabolical things that AMD-based card is doing to those poor monitors!
I miss Glide too....
*sigh* remember when Matrox tried to get into gaming... that was fun times!
My guess is that these cards aren't worth a flip at gaming; that's the biggest difference. On the other hand, it shows me that both of the major desktop graphics card companies are either holding back or thinking inside a box that prevents them from truly innovating. The hardware press tells us we'll have to be satisfied with a slower GPU development cycle because of the difficulties involved in continued die shrinks, but, frankly, I think they just got lazy and depended on die shrinks so they could keep a smaller R&D budget.
I don't buy it for a second.
And we can certainly make use of serious performance boosts. 2560x1440 @ 144Hz. 4K @ 60Hz. Now is not the time for Nvidia and AMD to drag their proverbial feet.
I only used their framegrabbers but I never had a complaint about their performance or the company's service when I needed it.
More info and images on C420 and C680
http://www.matrox.com/graphics/en/products/graphics_cards/c-series/c420/
http://www.matrox.com/graphics/en/products/graphics_cards/c-series/c680/#close
That poor display...
Yes, I said that in my post (though you did go into more detail).
I suspect those who down-voted me didn't thoroughly read what I wrote.
This points to the kind of thing I'm talking about.
If the power can't come directly from the GPU, then find another way to do it. There is no reason at all that separate RAMDACs for gaming graphics can't be designed and built to assist the GPU in transferring more data faster (there is actually some of that going on in current GPUs made by Nvidia and AMD).
Instead of thinking "well it can't be done with current hardware designs", I want them to think "How CAN we do it?"
Congratulations to AMD on the progress.