Look Who's Here: Matrox With Two Graphics Cards

Tags:
  • Matrox
  • Graphics Cards
September 11, 2014 2:55:44 PM

Matrox has built two graphics cards for the professional market, both of which carry AMD GPUs.


September 11, 2014 3:13:17 PM

It says "if you've only got three displays attacked." I'm pretty sure it's supposed to be attached.
September 11, 2014 4:19:10 PM

Quote:
It says "if you've only got three displays attacked." I'm pretty sure it's supposed to be attached.



Well, maybe it is attacked; who knows what kind of diabolical things that AMD-based card is doing to those poor monitors!
September 11, 2014 4:19:31 PM

They should make a 4K SextupleHead2Go.
September 11, 2014 4:46:14 PM

Now if only 3dfx would reappear and buy itself back from Nvidia...
September 11, 2014 4:46:43 PM

beetlejuicegr said:
Now if only 3dfx would reappear and buy itself back from Nvidia...


I miss Glide too....
September 11, 2014 5:50:42 PM

Given that they're using AMD graphics chips, and that AMD's own cards with those chips are probably cheaper, I don't see what they're hoping to gain. If they wanted to limit themselves to uses like this anyway, why didn't they just continue their own line? It doesn't need to be competitive with high-end GPUs, but if they still wanted in on this low-end stuff, why not at least have their own product?
September 11, 2014 6:10:01 PM

Man, I miss Matrox! My first real GPU setup was a G550 paired with an RT2500; that thing was a 2D beast back in 2001, and I remember having trouble finding HDDs fast enough to feed those cards for video editing. To think that my phone now has more video-editing capability than that costly, power-hungry setup.

*sigh* Remember when Matrox tried to get into gaming... those were fun times!
September 11, 2014 6:19:17 PM

If Matrox can get these things to drive that many displays at 60 Hz, and in one case do it with a passive cooler, why can't AMD and Nvidia? You can't drive three 4K displays at 60 Hz with their top-end consumer-grade dual-GPU solutions, and those require beastly coolers.

My guess is that these cards aren't worth a flip at gaming; that's the biggest difference. On the other hand, it shows me that both of the major desktop graphics card companies are either holding back or thinking inside a box that keeps them from truly innovating. The hardware press tells us we'll have to settle for a slower GPU development cycle because of the difficulties of continued die shrinks, but frankly, I think they just got lazy and leaned on die shrinks so they could keep a smaller R&D budget.

I don't buy it for a second.

And we can certainly make use of serious performance boosts: 2560x1440 @ 144 Hz, 4K @ 60 Hz. Now is not the time for Nvidia and AMD to drag their proverbial feet.
September 11, 2014 6:59:41 PM

These cards aren't aimed at gamers. Most of what Matrox does is 2D, which is quite easy to render at 4K. Even my R7 240 can handle 2D at 4K... the "depth of field" is what kills a graphics card trying to render a 3D scene at 4K.
September 11, 2014 7:34:27 PM

$$$$Thousands. And they will be lapped up by the target audience simply because Matrox's professional line of products is awesome.

I've only used their framegrabbers, but I never had a complaint about their performance or the company's service when I needed it.
September 11, 2014 10:15:21 PM

For those asking about 60 Hz vs. 30 Hz, etc., and why our current batch of consumer cards doesn't do this: it comes down to the RAMDACs on the cards. We long since moved from analogue video signals to digital, so "RAMDAC" is really a misnomer now; it's just the chip that converts the contents of the video frame in memory into a digital signal going out to an individual display, usually over TMDS. The speed and quality of the RAMDAC determine the maximum output resolution and signaling rate. What Matrox has done is take a typical dGPU and put several high-end professional RAMDACs on it. There are still limitations, which is why you get 3x60 or 6x30 at maximum resolution; I'm willing to bet there are three RAMDAC chips capable of running in split-channel mode.
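The 3x60 vs. 6x30 split described above can be sanity-checked with some back-of-the-envelope pixel-clock math. This is a rough sketch, not Matrox's actual design: the ~5% blanking overhead (roughly CVT reduced blanking) and the 340 MHz single-link TMDS ceiling are assumed figures typical of HDMI 1.4-era hardware.

```python
# Rough pixel-clock math behind a 3x60 Hz vs 6x30 Hz trade-off.
# Assumptions (illustrative, not from the article): ~5% blanking
# overhead, and a ~340 MHz ceiling per single TMDS link.

def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=0.05):
    """Approximate pixel clock in MHz for a given display mode."""
    return width * height * refresh_hz * (1 + blanking_overhead) / 1e6

TMDS_LINK_MAX_MHZ = 340  # approx. single-link ceiling, HDMI 1.4 era

for hz in (30, 60):
    clk = pixel_clock_mhz(3840, 2160, hz)
    verdict = "fits" if clk <= TMDS_LINK_MAX_MHZ else "exceeds"
    print(f"4K @ {hz} Hz needs ~{clk:.0f} MHz pixel clock; "
          f"{verdict} one {TMDS_LINK_MAX_MHZ} MHz TMDS link")
```

On these assumptions, 4K at 30 Hz (~260 MHz) fits a single link while 4K at 60 Hz (~520 MHz) does not, which is consistent with either halving the refresh rate to double the display count or fitting beefier per-output transmitters.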
September 11, 2014 11:34:35 PM

A Wall Street graphics card.
September 12, 2014 12:17:02 AM

Quote:
It says "if you've only got three displays attacked." I'm pretty sure it's supposed to be attached.
This would be correct. I must've felt a subconscious need to attack a display.

September 12, 2014 1:30:30 AM

Quote:
Quote:
It says "if you've only got three displays attacked." I'm pretty sure it's supposed to be attached.
This would be correct. I must've felt a subconscious need to attack a display.


That poor display...
September 12, 2014 6:08:33 AM

I actually do want to know the price :p  I need something like this for a security-camera PC I'm looking at building for a client.
September 12, 2014 10:09:34 AM

Interesting!
m
0
l
September 12, 2014 10:19:27 AM

sykozis said:
These cards aren't aimed at gamers. Most of what Matrox does is 2D, which is quite easy to render at 4K. Even my R7 240 can handle 2D at 4K... the "depth of field" is what kills a graphics card trying to render a 3D scene at 4K.


Yes, I said that in my post (though you did go into more detail).

I suspect those who down-voted me didn't thoroughly read what I wrote.

September 12, 2014 10:44:50 AM

palladin9479 said:
For those asking about 60 Hz vs. 30 Hz, etc., and why our current batch of consumer cards doesn't do this: it comes down to the RAMDACs on the cards. We long since moved from analogue video signals to digital, so "RAMDAC" is really a misnomer now; it's just the chip that converts the contents of the video frame in memory into a digital signal going out to an individual display, usually over TMDS. The speed and quality of the RAMDAC determine the maximum output resolution and signaling rate. What Matrox has done is take a typical dGPU and put several high-end professional RAMDACs on it. There are still limitations, which is why you get 3x60 or 6x30 at maximum resolution; I'm willing to bet there are three RAMDAC chips capable of running in split-channel mode.


This points to the kind of thing I'm talking about.

If the capability can't come directly from the GPU, then find another way to do it. There is no reason at all that separate RAMDACs for gaming graphics can't be designed and built (there is actually some of that going on in current GPUs from Nvidia and AMD) to help the GPU move more data faster.

Instead of thinking "well it can't be done with current hardware designs", I want them to think "How CAN we do it?"

September 12, 2014 2:35:38 PM

First time I'm hearing "power-efficient" and "AMD" in the same sentence.

Congratulations to AMD on the progress.
September 14, 2014 8:42:17 AM

Quote:
First time I'm hearing "power-efficient" and "AMD" in the same sentence.

Congratulations to AMD on the progress.


You must not recall the HD 5000 and 6000 series that trumped their Nvidia counterparts in power efficiency, or the Athlon X2 vs Pentium 4 days
September 14, 2014 2:12:35 PM

zhunt99 said:
Quote:
First time I'm hearing "power-efficient" and "AMD" in the same sentence.

Congratulations to AMD on the progress.


You must not recall the HD 5000 and 6000 series that trumped their Nvidia counterparts in power efficiency, or the Athlon X2 vs Pentium 4 days


The old Nvidia FX chips were awful on power as well. So were the 8800 and 9800 GTX+.
September 14, 2014 3:11:17 PM

Alec Mowat said:
zhunt99 said:
Quote:
First time I'm hearing "power-efficient" and "AMD" in the same sentence.

Congratulations to AMD on the progress.


You must not recall the HD 5000 and 6000 series that trumped their Nvidia counterparts in power efficiency, or the Athlon X2 vs Pentium 4 days


The old Nvidia FX chips were awful on power as well. So were the 8800 and 9800 GTX+.


If you mean the FX as in the Nvidia 5000 series, those things were pretty darn weak and hot to begin with. The fastest one in the series had only an 8-pipeline configuration, and it was a really hot-running card that needed extra power, so it had to be horribly inefficient. It was kind of the dark days of graphics cards: first the era of good quality, good performance for the time, and low power consumption, from the Voodoo 3 and GeForce through the GeForce 4 and the start of the ATI Radeon cards; then the Nvidia FX 5000 series, when both vendors seemed to lag in performance and power consumption; then the Nvidia 6000 series and the ATI X800 cards, and all of a sudden performance was top-notch and power was low again.
September 14, 2014 4:32:02 PM

Yup, I'm using the FX 5950 Ultra in my PIII rig (a bit overkill, really; see sig for details).

It's also one of the earliest dual-slot cards, and the first to use the current naming conventions.
September 16, 2014 5:13:49 PM

Well, maybe it is attacked; who knows what kind of diabolical things that AMD-based card is doing to those poor monitors!

Someone's an Nvidia fanboy...
September 16, 2014 6:43:59 PM

morgan payne said:
Well, maybe it is attacked; who knows what kind of diabolical things that AMD-based card is doing to those poor monitors!

Someone's an Nvidia fanboy...


LOL! I may be using an Nvidia card right now, but I actually prefer AMD. Learn to take a joke!
September 16, 2014 11:45:29 PM

PandaButtonFTW said:
morgan payne said:
Well, maybe it is attacked; who knows what kind of diabolical things that AMD-based card is doing to those poor monitors!

Someone's an Nvidia fanboy...


LOL! I may be using an Nvidia card right now, but I actually prefer AMD. Learn to take a joke!


I can take a joke; I was just pointing it out.