Look Who's Here: Matrox With Two Graphics Cards
Tags:
- Matrox
- Graphics Cards
Matrox has built two graphics cards for the professional market, both of which carry AMD GPUs.
Look Who's Here: Matrox With Two Graphics Cards : Read more
-
Reply to N.Broekhuijsen
Given that they are using AMD graphics chips, and AMD's own cards with those chips are probably cheaper, I don't see what they are hoping to gain. If they wanted to limit themselves to uses like this anyway, why didn't they just continue their own line? It doesn't need to be competitive with high-end GPUs, but if they still want to be in this low-end space, why not at least have their own product?
Man, I miss Matrox! My first real GPU setup was a G550 paired with an RT2500; that thing was a 2D beast back in 2001, and I remember struggling to find HDDs fast enough to feed those cards for video editing. To think that my phone now has more video-editing capability than that costly, power-hungry setup.
*sigh* Remember when Matrox tried to get into gaming... those were fun times!
Marcus52
September 11, 2014 6:19:17 PM
If Matrox can get these things to drive that many displays @ 60Hz, and in one case do it with a passive cooler, why can't AMD and Nvidia? You can't drive 3 4K displays at 60 Hz with their top end consumer grade dual-GPU solutions, and those require beastly coolers.
My guess is that these cards aren't worth a flip at gaming, that's the biggest difference. On the other hand it shows to me that both the major desktop graphics card companies are either holding back or thinking inside a box that prevents them from truly innovating. We are told by the hardware press that we are going to have to be satisfied with a slower GPU development cycle because of the difficulties involved in continued die shrinks, but, frankly, I think they just got lazy and depended on the die shrinks so they could have a smaller R&D budget.
I don't buy it for a second.
And we can certainly make use of serious performance boosts. 2560x1440 @ 144Hz. 4K @ 60Hz. Now is not the time for Nvidia and AMD to drag their proverbial feet.
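For a rough sense of the scan-out rates being asked for here, a quick back-of-the-envelope calculation helps (this is a sketch assuming 24 bits per pixel and ignoring blanking intervals and link-encoding overhead, so real link requirements are somewhat higher):

```python
def raw_bandwidth_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Uncompressed video bandwidth in Gbit/s, ignoring blanking and link encoding."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

# The two targets mentioned above:
print(round(raw_bandwidth_gbps(2560, 1440, 144), 1))  # 12.7
print(round(raw_bandwidth_gbps(3840, 2160, 60), 1))   # 11.9
```

Both targets need roughly 12 Gbit/s of raw pixel data per display, which is why each one pushes the limits of a single DisplayPort 1.2 link (about 17.3 Gbit/s effective).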
Innocent_Bystander
September 11, 2014 7:34:27 PM
The last time I heard of Matrox was around 2000. Boy, does time fly.
More info and images on C420 and C680
http://www.matrox.com/graphics/en/products/graphics_car...
http://www.matrox.com/graphics/en/products/graphics_car...
For those asking why 60 vs 30 Hz, etc., and why our current batch of consumer cards don't do this: it comes down to the RAMDACs on the cards. We long since moved from analogue video signals to digital, so "RAMDAC" is something of a misnomer now; it's really just the chip that converts the contents of the video frame in memory into a digital signal for an individual display, usually with TMDS. The speed and quality of the RAMDAC determine the maximum output resolution and signaling rate. What Matrox has done is take a typical dGPU and put several high-end professional RAMDACs on it. There are still limitations, which is why you get 3x60 or 6x30 at maximum resolution; I'm willing to bet there are three RAMDAC chips capable of running in a split-channel mode.
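The 3x60 / 6x30 trade-off described above can be sketched with some quick arithmetic (figures assume 4K displays at 24 bits per pixel; blanking and encoding overhead are ignored):

```python
def output_gbps(width, height, refresh_hz, bits_per_pixel=24):
    # Raw per-display scan-out bandwidth in Gbit/s (no blanking/encoding overhead)
    return width * height * refresh_hz * bits_per_pixel / 1e9

uhd_60 = output_gbps(3840, 2160, 60)  # ~11.9 Gbit/s per display
uhd_30 = output_gbps(3840, 2160, 30)  # ~6.0 Gbit/s per display

# Halving the refresh rate halves the per-output rate, so three 60 Hz outputs
# and six 30 Hz outputs consume the same total scan-out bandwidth:
assert abs(3 * uhd_60 - 6 * uhd_30) < 1e-6
```

In other words, if the display hardware can sustain roughly 36 Gbit/s of total scan-out, it can split that as three fast outputs or six slower ones, which matches the advertised modes.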
Marcus52
September 12, 2014 10:19:27 AM
sykozis said:
These cards aren't aimed at gamers. Most of what Matrox does is 2D, which is quite easy to render at 4K. Even my R7 240 can handle 2D at 4K... the "depth of field" is what kills a graphics card trying to render a 3D scene at 4K.

Yes, I said that in my post (though you did go into more detail). I suspect those who down-voted me didn't thoroughly read what I wrote.
Marcus52
September 12, 2014 10:44:50 AM
palladin9479 said:
For those asking why 60 vs 30 Hz, etc., and why our current batch of consumer cards don't do this: it comes down to the RAMDACs on the cards. We long since moved from analogue video signals to digital, so "RAMDAC" is something of a misnomer now; it's really just the chip that converts the contents of the video frame in memory into a digital signal for an individual display, usually with TMDS. The speed and quality of the RAMDAC determine the maximum output resolution and signaling rate. What Matrox has done is take a typical dGPU and put several high-end professional RAMDACs on it. There are still limitations, which is why you get 3x60 or 6x30 at maximum resolution; I'm willing to bet there are three RAMDAC chips capable of running in a split-channel mode.

This points to the kind of thing I'm talking about.
If the power can't come directly from the GPU, then find another way to do it. There is no reason at all that separate RAMDACs for gaming graphics can't be designed and built (there is actually some of that going on in current GPUs from Nvidia and AMD) to help the GPU transfer more data faster.
Instead of thinking "well, it can't be done with current hardware designs," I want them to think "how CAN we do it?"
zhunt99 said:
Quote:
First time I'm hearing "power-efficient" and "AMD" in the same sentence.
Congratulations to AMD on the progress.

You must not recall the HD 5000 and 6000 series that trumped their Nvidia counterparts in power efficiency, or the Athlon X2 vs. Pentium 4 days. The old Nvidia FX chips were awful on power as well. So were the 8800 and 9800 GTX+.
Alec Mowat said:
zhunt99 said:
Quote:
First time I'm hearing "power-efficient" and "AMD" in the same sentence.
Congratulations to AMD on the progress.
You must not recall the HD 5000 and 6000 series that trumped their Nvidia counterparts in power efficiency, or the Athlon X2 vs. Pentium 4 days. The old Nvidia FX chips were awful on power as well. So were the 8800 and 9800 GTX+.

If you mean the FX as in the Nvidia 5000 series, those things were pretty weak and hot to begin with. The fastest one in the series only had an 8-pipeline configuration, and it was a really hot-running card that needed extra power, so it had to be horribly inefficient. It's kind of like the dark days of graphics cards: after the era of good performance and low power consumption that ran from the Voodoo 3 through the GeForce, GeForce 2, and GeForce 4, alongside the first ATI Radeon cards, you get the Nvidia FX 5000 series, and for a while both vendors lagged in performance and power consumption. Then the Nvidia 6000 series and the ATI X800 cards arrive, and all of a sudden performance is top-notch and power is low again.
morgan payne
September 16, 2014 11:45:29 PM
PandaButtonFTW said:
morgan payne said:
well, maybe it is attacked, who knows what kind of diabolical things that amd based card is doing to those poor monitors!
someone's an Nvidia fanboy.....
LOL! I may be using an Nvidia card right now, but I actually prefer AMD. Learn to take a joke!

I can take a joke; I was just pointing it out.
I have a need for something like this in a security-camera PC build I'm putting together for a client.