Solved

HYBRID CrossFire ... Please "Set Me Straight".

March 30, 2010 9:22:45 PM


Regarding the ASUS 890 HYBRID CrossFire mobo(s) ...

QUESTION: What is the "most powerful" discrete 16X GPU option that can fully leverage HYBRID CrossFire?

ALSO: How do GIGABYTE's 890 offerings compare?

. . . Do more advanced 790 chipset boards offer the same?

. . . DISregarding any CrossFire option ... Can nVidia or high-end Radeon discrete options "gang" with the onboard GPU for multiple monitors? (under Win7)?

... Just trying to nail it all down, in my mind ... for the sake of better consulting, if not for my own curiosity.

Thanks for sharing your knowledge.

= Alvin =

March 30, 2010 9:38:05 PM

The only way I'm aware of to use both the integrated GPU and a separate GPU is Hybrid Crossfire. I haven't researched it lately because combining a low-end integrated GPU with a low-end video card doesn't really interest me. I'll look up the possible configurations again if I get some time at work.

http://en.wikipedia.org/wiki/ATI_Hybrid_Graphics

Edit: Here's ATI's Hybrid Crossfire site:

http://game.amd.com/us-en/crossfirex_hybrid.aspx

According to that site, Hybrid Crossfire only works with "an ATI Radeon™ HD 2400 Series, ATI Radeon™ HD 3400 Series or ATI Mobility Radeon™ HD 3400 Series graphics processor".

I would imagine that site hasn't been updated in a while. Here's a site that claims the new "Cedar" chips (HD 5400 and 5500 series) can be run in Hybrid Crossfire with the 890GX.

http://www.hardwarecanucks.com/forum/hardware-canucks-r...
March 30, 2010 9:52:35 PM

Well ... It's a "FEATURE" ... y'know? ... And I agree, like "who would want to?" Right?

BUT ...

(1) I have seen articles (gKay??) on using integrated + discrete for multi-monitor setups. It would be nice to know how many 1200(1080)/30P monitors can do full HD streams and how/what Win7 can do with all that, natively. It seems like the onboard graphics could function as "just another GPU card" ... I mean ... with no onboard GPU, can't I put in one each of nVidia/Radeon/Matrox cards and have Win7 use them all? Natively?
... so ... DOES the 890 onboard GPU pitch in for multi-vendor discrete configs?

(2) Wondering what type/level of gameplay experience might be enjoyed if said "FEATURE" is fully exploited ... and also just to leverage the onboard resources toward budget gaming. ... If I already paid for an integrated GPU and can donate 512MB of system RAM to it ... How can I get the biggest boost "INTO" a discrete option? Would the resolution/fps gained even be worth it for the shooter/FPS genre?

Because to build a correct budget game rig ... one must know these things.

= Alvin =



Best solution

March 30, 2010 10:09:13 PM

If you plug any other card into the PCIe slot that's NOT a hybrid crossfire card (like shortstuff said - only the 24xx, 34xx and 43xx cards and, supposedly, the 54xx cards on the 890GX), the onboard card will be disabled. Your onboard will support multi-monitor (up to 2) using either the DVI, VGA or HDMI - not 3. But putting any other card in there will disable the onboard and you'll be using that card instead. Hybrid crossfire with the onboard 890GX chipset and a 4350 will get you roughly 4550 speeds - hardly worth it. Throwing in a 5450 might get you up close to 4650 speeds, which is hardly worth it considering another $20 will get you a 5570 that will completely blow that away (relatively speaking).

The only thing hybrid crossfire is good for is using onboard to save power until you need a bump, and then ONLY if you happen to have a 4350 laying around that isn't costing you anything. You'll be lucky to get 30fps in any modern game at 1280x1024 with a hybrid crossfire setup.
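The pairings claimed in this thread could be captured as a quick lookup sketch. Note the series lists and chipset names below come only from the posts above (24xx/34xx/43xx generally, plus the 54xx on the 890GX), not from any official AMD compatibility table, so treat them as assumptions:

```python
# Hybrid CrossFire pairings as described in this thread -- NOT an
# official AMD table, just the posters' claims.
HYBRID_XFIRE_SERIES = {
    "780G": {"24xx", "34xx", "43xx"},
    "785G": {"24xx", "34xx", "43xx"},
    "790G": {"24xx", "34xx", "43xx"},
    "890GX": {"24xx", "34xx", "43xx", "54xx"},  # 54xx "supposedly" works
}

def can_hybrid_crossfire(chipset: str, card_series: str) -> bool:
    """True if this thread's posts say the card series pairs with the IGP.

    Any other card in the PCIe slot simply disables the onboard GPU.
    """
    return card_series in HYBRID_XFIRE_SERIES.get(chipset, set())
```

Anything outside these sets falls into the "onboard gets disabled" case dkapke describes.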
March 30, 2010 10:16:56 PM

That pretty much sums it up.
March 30, 2010 11:02:25 PM

Best answer selected by Alvin Smith.
March 30, 2010 11:09:39 PM

Yeah ... So for HTPC and Hoyle Bridge/Poker/Chess and even light SD editing ... Just use the onboard. Also remember the conclusions of one article stated that it is no surprise that integrated ATi was no good for FPS games ... discrete FPS is ATi's bread-n-buttuh.

THERE WAS AN(other) ARTICLE (Batuchka or gKay) posted that showed a 4-monitor setup using a combo of a discrete 4350 (I think, maybe) and the integrated graphics. I think it was a Tom's HWG article ... 4 monitors using integrated plus discrete.

= Alvin =
March 30, 2010 11:12:35 PM

The system in that article would have had Hybrid Crossfire turned on. Like dkapke said, if you try to use the integrated GPU and a stand-alone GPU without turning on Hybrid Crossfire the integrated GPU will be turned off.
March 30, 2010 11:13:19 PM

Would be nice if SOME EXPERT would compose/maintain a list of various GPU requirements for various games at various rez ... I can see that WoW does not require anything like HALO (not even close). Would be nice for non-gamers to have a better idea of which games need which GPU at which rez.

= In a slightly more perfect world =


Anonymous
April 6, 2010 12:02:47 AM

I finally got an ATI Radeon 3470 card to go with my onboard 4200 in Crossfire. It only cost $25 so I wanted to try it.
At first, on some tests, it actually performed worse with Crossfire enabled than with the 4200 alone. But I had a bad overclocking utility that came with my Gigabyte board that was messing up the overclocking from the BIOS; BIOS overclocking works MUCH better, as you can boost the IGP to REALLY high MHz if you want, plus you can O.C. the discrete 3470. After that I was getting a lot better performance. I can run Warhammer Online on the highest settings and it holds 20-30 fps in towns with lots of people; same with Oblivion - I can run that at the highest video quality with good framerates. With newer games like Dragon Age (haven't tried any new first-person shooters on it yet) it works a lot better - I can run it on medium to high detail with 2x anti-aliasing, and at 1400x900 resolution it works great... Before I put the 3470 in, with just the onboard 4200 running, I had to use a much lower resolution and no anti-aliasing whatsoever to get decent framerates.

So all in all it actually boosted my performance in Dragon Age considerably for $25, and it runs Warhammer great. I guess if I put a first-person shooter on medium to low graphics I could play it online with decent frames per second (30ish)... On older games like Warcraft, though, like someone else in this thread was talking about, I could probably get 100+ frames a second on some settings, so no worries about running that stuff - or pretty much any online MMO for that matter. This Crossfire config is really cheap and will run any of those games at top quality.

It is kind of a hassle and not that much of an increase, but now I can run anything my Xbox can and can try any new game I want at decent performance. I couldn't really do that before with just the onboard.

4200 Onboard and 3470($25) Crossfire on a 785G chipset.

WoW = 100+ fps
Warhammer = 30+ fps
Dragon Age = 17-25 fps (never goes under 15 on high settings)
Oblivion = 50+ fps (super high settings; water looks awesome)

Anything under 15 fps is when you start to see choppy animation; I would say anything over that is OK, but the higher the better.
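The rule of thumb at the end of that post (under 15 fps looks choppy, anything over is OK, higher is better) could be sketched as a tiny helper. The thresholds and the 30 fps "comfortable" cut-off are just this poster's rough numbers, not any standard:

```python
def playability(fps: float) -> str:
    """Classify a framerate with the poster's rough thresholds:
    under 15 fps looks choppy, 15-30 is OK, 30+ is comfortable."""
    if fps < 15:
        return "choppy"
    if fps < 30:
        return "ok"
    return "comfortable"

# The poster's measured results on the 785G + HD 3470 hybrid setup
# (taking the low end of each reported range):
results = {"WoW": 100, "Warhammer": 30, "Dragon Age": 17, "Oblivion": 50}
labels = {game: playability(fps) for game, fps in results.items()}
```

By that rule everything in the list is at least "ok", with Dragon Age on high settings sitting right at the playable floor.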
Anonymous
April 6, 2010 12:11:15 AM

I forgot to say that before, the rendering time for Maya movies was pretty quick with just the IGP, but now with the Crossfire it renders 3D movies and publishes things in all kinds of different applications WAY faster... The discrete card I got only has 256MB of RAM on it, but it is DDR3 and none of the other stuff on my motherboard is DDR3, so that really helps a lot... Plus the load times on new zones, or when you load a saved game in any video game, are almost instant now, compared to a 30sec-1minute wait to load the same games before. The actual frame rates aren't THAT much of an increase on the newer games, but the load times are WAY faster now with the Crossfire.
April 6, 2010 12:34:58 AM

Cron,

VERY glad to hear all that and we DO thank you very much for your report !

I will feel much better, now, recommending (and "for what") . . .

. . . One question still remains and, someone might have answered it, but it has not sunk in (yet) ...

Question: What is the highest level of discrete Radeon GPU (x16 card) which can "FULLY LEVERAGE" all benefit from the latest 890 integrated Hybrid CrossFire?

... I really wanna know-oh ... for shore !

= Alvin =
April 6, 2010 1:23:20 AM

The 890G is nothing more than an overclocked 785G - same specs and same basic chips, just different clocks. The only hardware difference is that the 890G adds DX10.1 support, which the 785G didn't have. So, all of the cards that were compatible with Hybrid Crossfire on the 780G, 785G and 790G will work with the 890G. That said, and I'm unsure why they haven't put this out there, the fastest card you can pair up with it now is the 5450. However, you'll still only have DX10.1 (not DX11 like the 5450), and you'll still be slower than a 5570, which only costs about $20 more than a 5450. Heck, it'll still be slower than a 4650, and that's the same cost. You're settling for DX10.1 in Hybrid Crossfire anyway.

Bottom line is if you can get a $25 3470, then that's not a bad combination and it just depends on the games you play, but no matter what card you pair up with an onboard 890G (or 780/785/790) you'll NEVER be as fast as a $50 HD4650 or a $60 HD4670, and nowhere near as fast as a $70 HD5570. If you're playing 2-3 year old games at 1280x1024 (or 1400x900) then you'll probably be fine so long as you don't get carried away with detail settings. Anything more and it's a wasted $25.
April 6, 2010 2:29:13 AM


What about that south-bridge and the BIOS call to it (et al.)?

And ...

What about SATA3 and USB3 resource handling and "full saturation" rates ?

I was under the (admittedly somewhat fuzzy) impression that some things were "done better" in terms of "features" (with the 890), and that ASUS added core-unlocking and very slightly better (more modern) integrated graphics ...

... I also remember the reviewer was "not particularly impressed" with any differences between the 790 and 890.

= FWIW =
April 6, 2010 2:41:47 AM

Quote:
NO, the 890GX is 790GX (what 785G is to 780G), with added UVD2 and DX10.1 support..

Actually, no, that's not the case. The 780G was an HD3200. The 790G was an upgraded 780G - the HD3300. The 785G was new - referred to as the HD4200 - and the 890G is an HD4290. So, no, the 890G is an upgraded HD4200, not a 3xxx series chip. It's clocked the same as a 790G (700MHz by default) but it's NOT the HD3300. The 785G came out a year after the 780G/790G and is not the same chip at all. Of course, I could be wrong - it wouldn't be the first time.
April 6, 2010 2:48:22 AM

Alvin Smith said:
What about that south-bridge and the BIOS call to it (et al.)?

And ...

What about SATA3 and USB3 resource handling and "full saturation" rates ?

I was under the (admittedly somewhat fuzzy) impression that some things were "done better" in terms of "features" (with the 890), and that ASUS added core-unlocking and very slightly better (more modern) integrated graphics ...

... I also remember the reviewer was "not particularly impressed" with any differences between the 790 and 890.

= FWIW =

Now, if you're talking about all that, the 890G is a FAR better chipset than anything AMD has come out with previously. USB3 and SATA3 are native and don't suck PCIe lanes from the graphics card (like ALL of the Intel 1156 chipsets do). But no ... the graphics aren't any better than any of the other 78x/79x chipsets, other than DX10.1 and UVD2 support. It isn't going to run a single game any faster and it isn't going to run Hybrid Crossfire any faster.
April 6, 2010 3:14:03 AM

Quote:
only SATA6 is native, USB3 requires an add on chip..

Correct...but it doesn't take PCIe lanes away from the graphics card like an 1156 chipset does.
April 6, 2010 4:59:14 AM

What about X58 !?

. . . I just read that Intel's implementation of PCIe only gives full PCIe 2.0 bandwidth to the x16 slot(s), and ALL the other lanes are PCIe 1.x and top out at 250MB/sec. (Which, BTW, is only ONE HALF the fully saturated USB3 spec) ... BUT ...

... the 890 (at least) gives full PCIe 2.x bandwidth to ALL PCIe lanes and, therefore, one PCIe lane will support one fully saturated USB3 port, @500MB/s (less protocol).

. . . It is a "no-brainer" ... The latest AM3 boards are a "superior break-out-box" for your GPU/compute core ... certainly for that much less money (all totaled).
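The arithmetic behind that claim is simple enough to sketch. The figures below are the thread's round numbers (250MB/s per PCIe 1.x lane, 500MB/s per PCIe 2.0 lane, ~500MB/s for a saturated USB3 port before protocol overhead), not measured throughput:

```python
# Rough per-lane PCIe bandwidth in MB/s (one direction), using this
# thread's round numbers -- real throughput is lower after overhead.
PCIE_LANE_MBPS = {"1.x": 250, "2.0": 500}

USB3_PORT_MBPS = 500  # ~5 Gb/s raw, quoted here as 500 MB/s "less protocol"

def lanes_needed(pcie_gen: str, target_mbps: int) -> int:
    """How many lanes of a given PCIe generation cover a target bandwidth."""
    per_lane = PCIE_LANE_MBPS[pcie_gen]
    return -(-target_mbps // per_lane)  # ceiling division
```

So one PCIe 2.0 lane can feed a saturated USB3 port, while a PCIe 1.x lane covers only half of it - which is the whole argument above.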

= Al =