Solved

What CPU will not bottleneck quadfire?

February 26, 2011 12:22:59 AM

If I had 4x 4890s,
2x 4870X2s,
or 2x GTX 295s,

what CPU would I need to avoid bottlenecking these?
February 26, 2011 1:50:08 AM

With a great watercooler and a great Intel Sandy Bridge i7 overclocked very, very far, you could. I would say an Intel Core i7 2600K overclocked to around 4.5 GHz.
February 26, 2011 2:26:24 AM

Any i7 overclocked past 3.6 GHz would work.

Best solution

February 26, 2011 2:28:10 AM

christop said:
Any i7 overclocked past 3.6 GHz would work.

That is A LOT of GPU power; you will need more like 4 GHz, now that I think about it. I would go with 4.2 GHz to be safe. 3.6 GHz will not do it.
February 26, 2011 3:46:26 AM

My 2500K is at 5 GHz. Will that work or not?
February 26, 2011 3:54:15 AM

A 2500K motherboard would need an NF200 to get full x16 bandwidth.
February 26, 2011 3:58:13 AM

It certainly would be fine, though the Sabertooth P67 wouldn't be.

You'd be best off with a WS Evolution / UD7 to avoid the x16 PCI-E lane issue with socket 1155.

Also, a massive ew @ those setups: heat-producing, old-generation cards without DX11, yuck. A waste of money.

GTX 570 SLI / 6950 CF is a much, much better option.
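
For reference, a back-of-the-envelope sketch of the PCIe 2.0 numbers behind that lane debate (a rough Python calculation, nothing more; 500 MB/s per lane per direction is the standard PCIe 2.0 rate after 8b/10b encoding):

    # Rough PCIe 2.0 bandwidth math: 5 GT/s per lane with 8b/10b encoding
    # works out to 500 MB/s per lane, per direction.
    MB_PER_LANE = 500

    def slot_bandwidth_gb(lanes):
        """Usable one-direction bandwidth of a PCIe 2.0 slot, in GB/s."""
        return lanes * MB_PER_LANE / 1000.0

    for lanes in (16, 8):
        print("x%-2d slot: %.1f GB/s per direction" % (lanes, slot_bandwidth_gb(lanes)))
    # x16 slot: 8.0 GB/s per direction
    # x8  slot: 4.0 GB/s per direction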
February 26, 2011 4:00:38 AM

No, I am not going to change my current computer. All I am saying is that I have a 2500K lying around but no mobo, lol. What luck.
February 26, 2011 4:02:01 AM

I could save a lot of money and just get another GTX 580. I am still annoyed that Nvidia, in the year 2010, hasn't made it so cards within a series can work together, like a GTX 580 and 570 in SLI.
February 26, 2011 4:05:07 AM

I do not know how you managed 5 GHz on an i5 2500K, but yeah, that would work really well. Though correct me if I am wrong, 1155 boards only run x16 or x8/x8 for SLI/CF. That would mean you would not take full advantage of all that power.
February 26, 2011 4:13:25 AM

At x8 you lose what, 2 FPS? lol, it doesn't matter at all. What I want to know is why Nvidia is still being stupid and not allowing cards in a series to SLI together.
February 26, 2011 4:23:27 AM

Ummm... wrong. Anything stronger than an HD 5870 loses performance at x8. The further above an HD 5870 your cards are, the more performance you lose. So in reality, you might end up losing 10 FPS.
February 26, 2011 4:24:24 AM

But for decoding, I don't think x8 is going to make a major difference.
February 26, 2011 4:24:41 AM

Using a board with an NF200 chip adds latency, you know, so while it's actually a slight detriment on LGA 1366 boards, it's a benefit on 1156 and 1155 boards. So, for a Sandy Bridge chip, a board with the NF200 would allow two 4870X2s or two GTX 295s to communicate with each other over x16/x16 lanes rather than just x8/x8. For four 4890s, though, you would need a board with four x8 slots with an empty slot between them, which also means a bigger case with room for at least eight expansion slots rather than seven so the last card has an exhaust.
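
A rough way to picture that (a sketch under the usual description of the NF200 as a 32-lane PCIe switch with a x16 uplink to the CPU; it reuses the PCIe 2.0 rate from the sketch above, and is not board-exact):

    # Toy model of an NF200 board on socket 1155. Assumption for illustration:
    # the NF200 is a PCIe switch with one x16 uplink and two x16 downlinks.
    MB_PER_LANE = 500                    # PCIe 2.0, one direction

    cpu_uplink_lanes = 16                # all a 1155 CPU exposes, switch or not
    gpu_link_lanes = 16                  # each card behind the switch gets x16

    cpu_to_gpus = cpu_uplink_lanes * MB_PER_LANE   # shared by both cards
    gpu_to_gpu = gpu_link_lanes * MB_PER_LANE      # peer traffic stays in the switch

    print("CPU <-> GPUs (shared uplink): %d MB/s" % cpu_to_gpus)
    print("GPU <-> GPU via the switch:   %d MB/s" % gpu_to_gpu)
    # The switch cannot add CPU bandwidth (still 16 lanes upstream), but
    # card-to-card CrossFire/SLI traffic no longer squeezes through x8 links,
    # at the cost of the extra hop's latency mentioned above.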
February 26, 2011 6:02:38 AM

Quadfire bottlenecks itself, as scaling is terrible when you CrossFire four GPUs. Don't bother; stick with two cards (and note that 2x dual-GPU cards counts as quadfire too). Just get two high-end cards.
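
To put numbers on that, a quick illustrative sketch; the per-card scaling factors below are hypothetical placeholders chosen only to show the diminishing-returns shape, not measured results:

    # Illustrative multi-GPU scaling estimate. The efficiency figures are
    # HYPOTHETICAL placeholders, not benchmarks.
    BASE_FPS = 60.0                        # assumed single-card frame rate
    EXTRA_CARD_GAIN = [0.8, 0.5, 0.2]      # assumed gain from the 2nd, 3rd, 4th GPU

    fps = BASE_FPS
    for n, gain in enumerate(EXTRA_CARD_GAIN, start=2):
        fps += BASE_FPS * gain
        print("%d GPUs: ~%.0f FPS (%.0f%% of perfect scaling)"
              % (n, fps, 100.0 * fps / (n * BASE_FPS)))
    # The 4th card adds a small fraction of what the 2nd did, which is the point.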
February 26, 2011 1:56:34 PM

eBay your dated dual-GPU cards if you want money; dump them on the classifieds here if you don't.

Then buy a new dual-GPU card (GTX 590 / 6990) when it comes out. The x8 link can limit your high-end cards if you CrossFire/SLI them, but you'll be getting over 60 FPS in anything, so it won't matter. It should matter with decoding, though, which should be done with Radeons, by the way.
March 28, 2011 3:41:50 AM

Best answer selected by cia24.