Lucid Hydra 200

endorphines

Distinguished
Mar 11, 2008
68
0
18,640
I've never heard a story of someone using an Nvidia card and an ATI card in the same machine that didn't end in tears. The drivers don't play nice, and from what I've heard most games don't work right.
I'd steer clear of that idea if I were you.
 

daedalus685

Distinguished
Nov 11, 2008
1,558
1
19,810


If you had bothered to read what this technology is about, you would probably be singing a different tune... Come now, folks.. take a second before you respond.

It is interesting tech.. will have to watch how it unfolds. Obviously we'll have to wait for Windows 7. PhysX still won't work with this.. and I can't personally see it working better than Crossfire or SLI on their own.. but who knows.
 

Kari

Splendid

lol maybe that's because it isn't out yet!!

though a $72 premium is a lot...
 

jennyh

Splendid
It is interesting, but I think most people read this stuff and automatically think 'I'll believe it when I see it', or 'it's far too good to be true'.

Most of these things are a bit too good to be true. Brilliant ideas, sure, but not always amazing in practice. I wish them all the luck, because with these two companies at war with each other, it's not going to be easy.
 

daedalus685

Distinguished
Nov 11, 2008
1,558
1
19,810
Five months ago I thought this tech was a joke that would never see the light of day.. now they have the newest chip on MSI's new board.. that is not small-potatoes company support there. I still can't see it working better than pure Crossfire/SLI... but we will soon find out.
 

bollwerk

Distinguished
Oct 31, 2007
38
10
18,535
Even if it doesn't work as well as SLI or Crossfire, the ability to use disparate cards alone is worth the price, IMHO. That way, I wouldn't have to toss an old video card when I get a new one. I could use both and get more mileage out of the older one (assuming this works as advertised).
 
Yes, very cool. Though I imagine there will be real-world limitations. Mixing and matching cards will bring incompatibilities and oddities, even more than standard SLI/Crossfire. While I really hope this works out, it is important to remember there is a big difference between good technology and magic.
 

daedalus685

Distinguished
Nov 11, 2008
1,558
1
19,810
There are going to be a lot of real-world compatibility issues, I'm sure.. but if this works like it is supposed to.. provided your cards stay within the same API release.. it may work pretty well.. and if it is truly dictating load externally it shouldn't matter what the card is. However, if AA (or whatever) is done differently it may look really strange when the frames are alternated between the different cards :\
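To make "dictating load externally" concrete: if the chip is really handing out work from outside the cards, only each card's throughput matters, not whose logo is on it. A minimal sketch of that idea in Python (pure speculation on my part; Lucid hasn't published how the Hydra actually balances, and every name and weight below is invented):

```python
# Hypothetical sketch of external load balancing: split one frame's draw
# calls between two cards in proportion to their relative speed.
# Lucid has not published its actual algorithm; these weights are made up.

def split_draw_calls(draw_calls, perf_a, perf_b):
    """Assign draw calls to card A and card B in proportion to relative speed."""
    share_a = perf_a / (perf_a + perf_b)      # fraction of the work for card A
    cutoff = int(len(draw_calls) * share_a)
    return draw_calls[:cutoff], draw_calls[cutoff:]

# Example: a GTX 260 and an HD 4890 with invented performance weights.
frame = [f"draw_{i}" for i in range(10)]
for_gtx260, for_hd4890 = split_draw_calls(frame, perf_a=1.0, perf_b=1.2)
print(for_gtx260)   # first 4 calls go to the slower card
print(for_hd4890)   # remaining 6 go to the faster one
```

Of course, any scheme like this still renders parts of the output on cards that filter and antialias differently, which is exactly where the "looks really strange" worry comes in.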

Anyway.. the AnandTech forums are mentioning some rumors that Nvidia is going to block the function of this with their cards.. will have to wait and see.
 
Lucid is indeed a very interesting technology, however I wonder how they would mix a 5870 with a GTX 260 when one is DX11 and the other one is DX10, among other questions...
When it comes out we will see in-depth reviews which will answer these questions.
 

daedalus685

Distinguished
Nov 11, 2008
1,558
1
19,810


The only demonstrations I have seen were of a 4890 and a 260. Did I miss one?
 
Right, I just checked AnandTech and it was a 4890 with a GTX 260 :) Thanks for mentioning it :) But I think it can be done with a 5870 too, and that would be very nice, since one is DX11 and the other one is DX10.
 
^^ Either you have to lock the feature level at the lowest common supported standard (in the above case, DX10), or the chip itself has some way of knowing which card to route each call to. Theoretically, you could do 10.1 in the above situation, provided the rendering is handled only on the card that supports it, letting the other act as a buffer/co-processor. Likewise, you could lock AA at supersampling (slow, but commonly supported) to avoid the differences in AA technique.
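Just to make those two options concrete, here's a toy sketch in Python (an assumption about the logic only; this is not Lucid's or Microsoft's actual code, and the card names are just the examples from this thread):

```python
# Toy sketch of the two options above; not Lucid's actual logic.
DX_LEVELS = {"DX9": 9.0, "DX10": 10.0, "DX10.1": 10.1, "DX11": 11.0}

def lowest_common_level(level_a, level_b):
    """Option 1: drive both cards at the lower of their two feature levels."""
    return min(level_a, level_b, key=DX_LEVELS.get)

def capable_cards(call_level, cards):
    """Option 2: route a call only to the cards that support its level."""
    return [name for name, lvl in cards if DX_LEVELS[lvl] >= DX_LEVELS[call_level]]

cards = [("HD 5870", "DX11"), ("GTX 260", "DX10")]
print(lowest_common_level("DX11", "DX10"))  # DX10 -- the 5870 is held back
print(capable_cards("DX11", cards))         # ['HD 5870'] -- only it takes DX11 work
```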

Also, BTW, as the Hydra chip is installed right on the mobo and probably has low-level access (I'm assuming here, we'll know more in a month or so...), I doubt NVIDIA could lock out its functions at the driver level, unless they intentionally nuke their drivers to not work at all if a non-NVIDIA GPU is detected...
 

marsay001

Distinguished
Apr 21, 2009
184
0
18,710
It certainly looks good, a good achievement, but unless people have just upgraded to a different brand of card etc.. I can't see why people would go and buy two GPUs from different manufacturers.
 

M3d

Distinguished
Jul 11, 2006
243
0
18,680
If you can match Nvidia with ATI that would be great, but what I really care about most is just the supposed scaling between same-brand cards. If I can add two 5850s and they are able to scale at 95% and above, I'll jump on it on release day, assuming the Big Bang board is a good OCer.

The other thing that has me waiting on this is the way the chip will handle the video card memory. If you have 2x1GB cards, would they remain 2x1GB like SLI/Crossfire, or would games be able to utilize them as a 2GB card? Can't wait for the reviews. So many questions that need to be answered.
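For what it's worth, here's the back-of-envelope version of that question. The first number is how SLI/Crossfire behave today; whether Hydra can actually pool memory is exactly the part nobody knows yet:

```python
# Back-of-envelope: 2x1GB under alternate-frame rendering vs. a true pool.
# Under AFR each card holds its own full copy of textures and geometry,
# so memory does not add up; pooling is purely hypothetical for Hydra.

cards_mb = [1024, 1024]             # two 1GB cards

afr_working_set = min(cards_mb)     # SLI/Crossfire today: ~1GB usable
pooled_working_set = sum(cards_mb)  # the hoped-for "one 2GB card" behaviour

print(afr_working_set)    # 1024
print(pooled_working_set) # 2048
```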
 

darkvine

Distinguished
Jun 18, 2009
363
0
18,810



Even if it doesn't work better, if it works the same or close it is still great for upgrading. Think about it: if you get a 5870 now... and in a year pick up a 6XXX, you can still use the 5870 and get its performance as if it were in Crossfire (or better, if it does indeed work).
 


You never know, maybe it will be clever enough to throw the DX11 commands at the 5870 and not the GTX 260? All is speculation until we get the thing in the hands of a few reviewers of course, but if it can dynamically load balance two cards of different abilities, then it's not a massive step as far as I can see to dynamically distribute feature sets either.

Mactronix
 

Harrisson

Distinguished
Jan 3, 2007
506
0
18,990
It's very interesting, but I have serious doubts about quality in games. In older, simpler games using cards with similar architecture - sure, this could work even better than SLI or Crossfire. But in new, complex DX10-11 games you can't so easily break scenes apart to render on different cards (add to that physics, etc.). Also, how about AA and geometry? Even different generations of cards from the same manufacturer may produce different results (tearing, holes, etc.); with different manufacturers the problems would be exponential...

Also, the main investor is Intel; it's all fine for financing while Lara The Bee ;) is a year or two away, but what about support after that? I doubt Intel will happily share rendering with ATI and Nvidia; they'll want to cut opponents out, IMO.
 


Good catch, I didn't think of that myself. I would assume memory would be handled the same way as CF/SLI, but it's worth looking into.

I think we've covered all the things we need to look at, but this is promising to say the least...
 

M3d

Distinguished
Jul 11, 2006
243
0
18,680
gamerk316 - I'll post this link in this thread since I don't see it.
http://vimeo.com/6700209

If you look at 8:20, they have a GTX 260 1.7GB and a GTS 250 1GB.

So they are already doing it. Too bad he didn't give any details.
I don't know whether the lack of details is troubling, or whether they have something so great that they're holding on tight to it for now and will just knock everyone's socks off.

 

darkvine

Distinguished
Jun 18, 2009
363
0
18,810



They aren't allowed to give details until the company making it says so. Same thing with the new graphics cards: the reviewers had the cards and reviewed them, but they weren't allowed to post the reviews until a certain time.