Would an i3 2120 bottleneck 2x 6870?

BF3 is GPU intensive. The 2120 is a good CPU; it challenges AMD CPUs with more cores in gaming, and I highly recommend it. The difference between the i3/i5/i7 in games is not much, but the bigger chips have other benefits (overclocking, hyper-threading). The better choice is to go with a single 6950: less power consumption, less heat, no micro-stuttering etc. No bottlenecks! A good PSU (Antec, XFX, Corsair, Seasonic, Enermax, CM Silent Pro, NZXT or Silverstone) will do fine.
 
Yep, it's all about balance... it's absolutely a waste of time pairing two high-end GPUs with a budget CPU. You may get away with it in a lot of games, but more often than not you won't see the real performance gains that you would with a quad-core system.
I dunno what benchmarks you're referring to, but I have seen many dual-core systems crumple on BFBC2 with any card over a GTX 250.
A mate with an E6700 saw his system choke very badly when he upped the ante with a 4890. He went from a playable 45 fps to constantly bouncing between 4 and 50 fps, showing him and us that BFBC2 is a very CPU-bound game.
Now you make a system with an even worse CPU-to-GPU power ratio, and I can almost guarantee that you will be less than happy with the end result.

Forget the second card and buy an i5 quad core instead. It will give your current card more leg room and will allow better performance when you do eventually SLI it.
Better to build a balanced system that plays all games than a GPU-heavy system that will only play a select few.



 
Guest


Honestly I haven't found any data either way.

Being almost in the same boat with a 2120 myself, I figured that if I am upgrading one component I would get more performance out of another GPU than out of an i5; I would deal with the bottleneck until I could upgrade my CPU. But I have a 550 Ti, which is a big difference: SLI'd, that wouldn't bottleneck.
 

phyco126

I'm not convinced that bottlenecking is such a horribly bad thing. I slapped a GTX 560 Ti 448-core into my E8400 system. Would I see higher FPS if I upped the CPU to a 2500K? Absolutely. But did I see a major improvement over my 8800 GT? Hell yes. Crysis, max settings on a 1920 x 1200 display: a flat 30 FPS, compared to the 1-2 FPS my 8800 GT got at the same settings. With a newer processor my FPS would be higher still, but who cares? I saw a performance increase with a card that I can eventually transfer to a new build (as money becomes available).

Now if CrossFire got no better frames than, say, a single 6870 due to the CPU, then I would say it's a waste. If they will still see an increase in frames, just not as high as with a higher-end CPU, then who cares? It's their money, they get the benefits, and they are happy.
 

DXRick

I thought the GPU(s) determine things like the resolution and AA settings you can play at. How would it tax the CPU to switch from 1600 x 1050 with no AA to 1080p with 4x AA?

In general, I agree that it's silly to do this with a budget CPU, especially if one needs to turn off or down other quality or effect settings.

(I built a system with an i7-950 and one 6870 before SB came out but chose a mobo and PSU that can handle another 6870.)
 

phyco126



From my understanding, the GPU still has to send the information through the CPU. If the information can't be processed fast enough, the GPU gets bottlenecked; basically, it can only work as fast as the CPU lets it. That's my basic understanding of it.
 
Guest


Ah, I may have read you wrong there, what with you saying the GPU sends information to the CPU. I thought the CPU sends info to the GPU, which handles the rendering of graphics and tessellation.
(OK, maybe it tells the CPU what it has completed.)
If the CPU isn't sending data fast enough, then the GPU is taking a coffee break.
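
To put some rough numbers on that coffee break (purely made-up timings, just a sketch of the idea, not measurements from any system mentioned in this thread):

# Toy model of a CPU-limited frame. The millisecond figures below are invented
# purely for illustration; the point is that the frame can't finish faster than
# the slower side, so extra GPU power alone doesn't raise FPS.

def fps(cpu_ms, gpu_ms):
    # Assuming the CPU and GPU work on successive frames in parallel, the
    # frame time is roughly set by whichever side takes longer.
    return 1000.0 / max(cpu_ms, gpu_ms)

print(fps(cpu_ms=22.0, gpu_ms=16.0))  # ~45 fps, CPU-bound
print(fps(cpu_ms=22.0, gpu_ms=8.0))   # still ~45 fps: the extra GPU power takes a coffee break
print(fps(cpu_ms=11.0, gpu_ms=8.0))   # ~91 fps once the CPU can keep up

Real engines overlap the work less cleanly than that, but it is roughly why a CPU-bound game stops scaling when you only add graphics power.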
 

phyco126



Perhaps. I make no claim that I am right, lol. That is just what several others have said to me in the past. They could be wrong, or we could both be right; I would imagine information gets sent back and forth.
 
Guest


Please excuse me for sounding critical of your assessment.

I simply wanted an excuse to say, "the GPU is taking a coffee break."

that is all . . . :)
 

Quaddro

Why worry so much about bottlenecks?

As long as it can deliver 60 fps to your screen, you can't feel the difference.

At high resolutions you need more graphics card power than processor power.

Personally, for gaming, I would rather use a $500 graphics card with a $100 processor than a $100 graphics card with a $500 processor.
 

DXRick

Actually, the program (on the CPU) performs various functions each frame, like animations, AI, physics, and determining what information gets sent to the GPU. The information sent to the GPU includes the geometry and textures to be rendered that frame, plus any additional data needed by the shader programs (which run on the GPU) for things like lighting, shadows, and particle effects.

The GPU then builds the frame buffer (the next image to display) from all of that information. Thus the CPU's job does not change based on the GPU settings of resolution and AA.
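
Put another way, here is a rough sketch of a typical frame loop (nothing like real DirectX code; every class and function name in it is invented just to show which side each job lives on):

# Rough sketch of the per-frame CPU/GPU split. All names here are made up for
# illustration; this is not an engine or DirectX API.

class Game:
    # CPU-side work: its cost does not depend on resolution or AA.
    def update_ai(self):         print("CPU: run AI")
    def update_physics(self):    print("CPU: step physics")
    def update_animations(self): print("CPU: advance animations")
    def build_draw_calls(self):  return ["terrain", "soldiers", "particles"]

class Gpu:
    # GPU-side work: resolution and AA only matter here.
    def render(self, draw_calls, resolution, msaa):
        print(f"GPU: render {draw_calls} at {resolution} with {msaa}x AA into the frame buffer")

def run_frame(game, gpu):
    game.update_ai()
    game.update_physics()
    game.update_animations()
    draw_calls = game.build_draw_calls()          # CPU decides what gets sent
    gpu.render(draw_calls, (1920, 1080), msaa=4)  # GPU builds the frame at whatever settings

run_frame(Game(), Gpu())

Turning up the resolution or AA only changes the arguments in that last render call, which is why it lands on the GPU rather than the CPU.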

I studied DirectX 9 a few years ago.