
Lucid Hydra 200

September 23, 2009 9:45:50 PM

http://www.anandtech.com/video/showdoc.aspx?i=3646


Very interesting; I'm wondering how they will perform using 2 different cards.
Would a 5870 and GTX 285 multi-GPU setup offer DX11 support?
Wouldn't enabling AA cause stability issues?

Seems interesting. What are your thoughts on this?


September 23, 2009 10:05:59 PM

I've never heard a story of someone using an Nvidia card and an ATI card in the same machine that didn't end in tears. The drivers don't play nice, and from what I've heard most games don't work right.
I'd steer clear of that idea if I were you.
September 23, 2009 10:10:12 PM

endorphines said:
I've never heard a story of someone using an Nvidia card and an ATI card in the same machine that didn't end in tears. The drivers don't play nice, and from what I've heard most games don't work right.
I'd steer clear of that idea if I were you.


If you had bothered to read what this technology is about, you would probably sing a different tune. Come now, folks: take a second before you respond.

It is interesting tech; we will have to watch how it unfolds. Obviously we'll have to wait for Windows 7. PhysX still won't work with this, and I can't personally see it working better than CrossFire or SLI on their own, but who knows.
September 23, 2009 10:21:34 PM

endorphines said:
I've never heard a story of someone using an Nvidia card and an ATI card in the same machine that didn't end in tears. The drivers don't play nice, and from what I've heard most games don't work right.
I'd steer clear of that idea if I were you.

lol, maybe that's because it isn't out yet!!

Though the $72 premium is a lot...
September 23, 2009 10:27:27 PM

It is interesting, but I think most people read this stuff and automatically think "I'll believe it when I see it" or "it's far too good to be true."

Most of them are a bit too good to be true. Brilliant ideas, sure, but not always amazing in practice. I wish them all the luck, because with these two companies in the war they are in, it's not going to be easy.
September 23, 2009 11:38:38 PM

With this technology, I wonder if it could help with CPU scaling.
September 23, 2009 11:42:09 PM

5 months ago I thought this tech was a joke that would never see the light of day; now they have the newest chip on MSI's new board, and that is not small-potatoes company support. I still can't see it working better than pure CrossFire/SLI, but we will soon find out.
September 24, 2009 12:29:16 AM

Very interesting....
September 24, 2009 5:28:25 PM

Even if it doesn't work as well as SLI or CrossFire, the ability to use disparate cards alone is worth the price, IMHO. That way, I wouldn't have to toss an old video card when I get a new one; I can use both and get more mileage out of the older one (assuming this works as advertised).
September 24, 2009 5:46:26 PM

Yes, very cool. Though I imagine there will be real world limitations. Mixing and matching cards will have incompatibilities and weirdnesses, even more than standard SLI/Crossfire. While I really hope this works out, it is important to remember there is a big difference between good technology and magic.
September 24, 2009 8:30:39 PM

Agreed. I'm more worried about incompatibilities/rendering differences than anything else.
September 24, 2009 8:34:41 PM

There are going to be a lot of real-world compatibility issues, I'm sure. But if this works like it is supposed to, provided your cards stay within the same API release, it may work pretty well; and if it is truly dictating load externally, it shouldn't matter what the card is. However, if AA (or whatever) is done differently, it may look really strange when the frames are alternated between the different cards :\

Anyway, the AnandTech forums are mentioning some rumors that Nvidia is going to block this from functioning with their cards; we will have to wait and see.
September 24, 2009 8:39:57 PM

Lucid is indeed a very interesting technology. However, I wonder how they would mix a 5870 with a GTX 260 when one is DX11 and the other is DX10, among other questions...
When it comes out, we will see in-depth reviews which will answer these questions.
September 24, 2009 8:49:11 PM

Maziar said:
Lucid is indeed a very interesting technology. However, I wonder how they would mix a 5870 with a GTX 260 when one is DX11 and the other is DX10, among other questions...
When it comes out, we will see in-depth reviews which will answer these questions.


The only demonstrations I have seen were of a 4890 and a 260. Did I miss one?
September 25, 2009 7:41:53 AM

Right, I just checked AnandTech and it was a 4890 with a GTX 260 :)  Thanks for mentioning it :)  But I think it can be done with a 5870 too, and that would be very nice, since one is DX11 and the other is DX10.
September 25, 2009 11:49:46 AM

^^ Either you have to lock the feature level at the lowest common supported standard (in the above case, DX10), or the chip itself has some way of knowing which card to route each call to. Theoretically, you could do 10.1 in the above situation, provided the rendering is handled only on the supporting card, letting the other act as a buffer/co-processor. Likewise, you could lock AA at supersampling (slow, but commonly supported) to avoid the differences in AA technique.
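
The "lowest common feature level" option boils down to taking the minimum over the cards' supported levels. A purely hypothetical Python sketch (the card names and numbers are illustrative; Lucid has published no details of its actual logic):

```python
# Hypothetical sketch of locking the feature level at the lowest common
# supported standard. Purely illustrative; not Lucid's real implementation.

def common_feature_level(cards):
    """Return the highest DirectX feature level that every card supports."""
    return min(card["dx_level"] for card in cards)

cards = [
    {"name": "HD 5870", "dx_level": 11.0},
    {"name": "GTX 260", "dx_level": 10.0},
]

print(common_feature_level(cards))  # -> 10.0
```

In this mixed pair, everything would run at the DX10 feature set, which is exactly why the routing alternative is the more interesting possibility.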

Also, BTW, since the Hydra chip is installed right on the mobo and probably has low-level access (I'm assuming here; we'll know more in a month or so...), I doubt Nvidia could lock out its functions at the driver level, unless they intentionally nuke their drivers to not work at all if a non-Nvidia GPU is detected...
September 25, 2009 12:28:41 PM

It certainly looks good, a good achievement, but unless people have just upgraded to a different brand of card, I can't see why people would go and buy 2 GPUs from different manufacturers.
September 25, 2009 4:33:03 PM

^^ Let's say I already have a 4890 and buy a GT300. Why throw away a perfectly good 4890?

It's good for upgrading purposes. Never mind that the promised scaling is better than CF/SLI...
September 28, 2009 6:11:47 AM

If you can match Nvidia with ATI, that would be great, but what I really care about most is the supposed scaling between same-brand cards. If I can add two 5850s and they scale at 95% and above, I'll jump on it on release day, assuming the Big Bang board is a good OCer.

The other thing that has me waiting on this is the way the chip will handle the Video card memory. If you have 2x1GB cards would they remain 2x1GB like SLI/Crossfire or would games be able to utilize them as a 2GB card? Can't wait for the reviews. So many questions that need to be answered.
September 28, 2009 7:23:16 AM

daedalus685 said:
5 months ago I thought this tech was a joke that would never see the light of day; now they have the newest chip on MSI's new board, and that is not small-potatoes company support. I still can't see it working better than pure CrossFire/SLI, but we will soon find out.



Even if it doesn't work better, if it works the same or close, it is still great for upgrading. Think about it: if you get a 5870 now, and in a year pick up a 6XXX, you can still use the 5870 and get its performance as if it were in CrossFire (or better, if it does indeed work).
September 28, 2009 8:00:39 AM

Maziar said:
Lucid is indeed a very interesting technology. However, I wonder how they would mix a 5870 with a GTX 260 when one is DX11 and the other is DX10, among other questions...
When it comes out, we will see in-depth reviews which will answer these questions.


You never know; maybe it will be clever enough to throw the DX11 commands at the 5870 and not the GTX 260? All is speculation until we get the thing into the hands of a few reviewers, of course, but if it can dynamically load-balance two cards of different abilities, then it's not a massive step, as far as I can see, to dynamically distribute feature sets either.

Mactronix
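
Dynamically distributing feature sets, as suggested above, would amount to routing each command to a card that can actually execute it. A purely speculative Python sketch (the dispatch rule and card list are assumptions; nothing here reflects Lucid's real implementation):

```python
# Speculative sketch of routing draw calls by required feature level,
# as floated in the post above. Not Lucid's actual method.

def route_call(call, cards):
    """Send a draw call to the first card that supports its feature level."""
    for card in cards:
        if card["dx_level"] >= call["requires"]:
            return card["name"]
    raise RuntimeError("no card supports this call")

cards = [
    {"name": "GTX 260", "dx_level": 10.0},
    {"name": "HD 5870", "dx_level": 11.0},
]

print(route_call({"requires": 11.0}, cards))  # -> HD 5870
print(route_call({"requires": 10.0}, cards))  # -> GTX 260
```

Listing the weaker card first means DX10 work can still land on it, while DX11-only commands fall through to the 5870.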
September 28, 2009 8:09:53 AM

It's very interesting, but I have serious doubts about quality in games. In older, simpler games using cards with similar architecture, sure, this could work even better than SLI or CrossFire. But imagine new, complex games in DX10-11: you can't so easily break scenes apart to render on different cards (add to that physics, etc.). Also, how about AA and geometry? Even different generations of cards from the same manufacturer may produce different results (tearing, holes, etc.); with different manufacturers the problems would be exponential...

Also, the main investor is Intel. It's all fine for financing while Lara The Bee ;)  is a year or two away, but what about support after that? I doubt Intel will happily share rendering with ATI and Nvidia; they'll want to cut opponents out, IMO.
September 28, 2009 12:29:29 PM

M3d said:
The other thing that has me waiting on this is the way the chip will handle the Video card memory. If you have 2x1GB cards would they remain 2x1GB like SLI/Crossfire or would games be able to utilize them as a 2GB card? Can't wait for the reviews. So many questions that need to be answered.


Good catch, I didn't think of that myself. I would assume memory would be handled the same way as in CF/SLI, but it's worth looking into.

I think we've explained all the things we need to look at, but this is promising to say the least...
September 28, 2009 6:58:49 PM

gamerk316- I'll post this link on this thread since I don't see it.
http://vimeo.com/6700209

If you look at 8:20, they have a GTX 260 1.7GB and a GTS 250 1GB.

So they are already doing it. Too bad he didn't give any details.
I don't know whether the lack of details is troubling, or whether they have something so great, and are holding it tight for now, that they will just knock everyone's socks off.

September 28, 2009 9:25:49 PM

M3d said:
gamerk316- I'll post this link on this thread since I don't see it.
http://vimeo.com/6700209

If you look at 8:20, they have a GTX 260 1.7GB and a GTS 250 1GB.

So they are already doing it. Too bad he didn't give any details.
I don't know whether the lack of details is troubling, or whether they have something so great, and are holding it tight for now, that they will just knock everyone's socks off.



They aren't allowed to give details until the company making it says so. Same thing with the new graphics cards: the reviewers had the cards and reviewed them, but they weren't allowed to post the reviews until a certain time.
September 28, 2009 9:30:13 PM

Harrisson said:
Its very interesting, but I have serious doubts about quality in the games. In older, simpler games using cards with similar architecture - sure, this could work even better than SLI or crossfire. But imagine new, complex games in DX10-11, you cant so easily break apart scenes to render on diff cards (add to that physics, etc). Also how about AA and geometry? Even different generation of cards from the same manufacturer may produce different results (tearing, holes, etc), talk about different manufacturers and problems would be exponential...

Also main investor is Intel, its all fine for financing while Lara The Bee ;)  is a year or two away, what about support after that? I doubt intel will happily share rendering with ATI and nvidia, they'll want to cut opponents out IMO.


The thing is, the chip is deciding what does what: it balances the load across the cards depending on their speeds and architectures, then collects the rendering output from each card and combines it.

So it is more or less a question of how the chip will handle it, not whether the cards will mess it up (that is, if it works correctly).
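
If the chip really does dictate load externally, the balancing step described above might, in its most naive form, split work in proportion to each card's speed. A hypothetical Python sketch (the relative "speed" scores and the proportional split are assumptions, not the Hydra chip's published heuristics):

```python
# Naive sketch of proportional load balancing between mismatched cards.
# Speed scores are made up; the real Hydra heuristics are not public.

def split_work(num_tasks, cards):
    """Divide num_tasks among cards in proportion to each card's speed."""
    total_speed = sum(speed for _, speed in cards)
    shares = {}
    assigned = 0
    for name, speed in cards[:-1]:
        share = round(num_tasks * speed / total_speed)
        shares[name] = share
        assigned += share
    # The last card takes the remainder so every task is assigned exactly once.
    last_name, _ = cards[-1]
    shares[last_name] = num_tasks - assigned
    return shares

cards = [("HD 4890", 3), ("GTX 260", 2)]
print(split_work(100, cards))  # -> {'HD 4890': 60, 'GTX 260': 40}
```

Even this toy version shows why mismatched cards are plausible in principle: the split adapts to whatever ratio the cards' speeds happen to have.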