
Would an i3-2120 bottleneck two 6870s?

February 26, 2012 1:40:26 AM

Just wondering if my CPU would bottleneck two 6870s.

I can buy a new PSU and another 6870, or buy a 6950 2GB.
http://lanoc.org/review/pc-hardware/5096-i3-vs-i5-vs-i7... Here's an article that compares the i3, i5, and i7 with two 580s in BFBC2; there's a couple of frames' difference but nothing big.
Plus I mainly play BF3.

February 26, 2012 1:46:02 AM

Yeah, that i3 will bottleneck 2x 6870s for sure.
February 26, 2012 1:51:03 AM

BF3 is GPU-intensive, and the 2120 is a good CPU; it holds its own against AMD's multi-core CPUs in gaming, so I highly recommend it. The difference between the i3, i5, and i7 isn't much, though the bigger chips have other benefits (overclocking, Hyper-Threading). The better choice is to go with a single 6950: less power consumption, less heat, no micro-stuttering, and no bottlenecks. A good PSU (Antec, XFX, Corsair, Seasonic, Enermax, CM Silent Pro, NZXT, or SilverStone) will do fine.
February 26, 2012 2:04:12 AM

Get a 6950 and save crossfire for a future upgrade.
February 26, 2012 2:10:03 AM

IMO it's hard to say if that would bottleneck it. It would be close though.

I wish Tom's Hardware would do a bottleneck test using SLI/CF configs on semi-GPU-bound games.
February 26, 2012 4:06:55 AM

OC the 6870 for now.
February 26, 2012 4:24:25 PM

It makes no sense to spend $340 on GPUs while your CPU only cost $130; that would not be a balanced system.
February 26, 2012 5:03:03 PM

Yep, it's all about balance. It's absolutely a waste pairing two high-end GPUs with a budget CPU. You may get away with it in a lot of games, but more often than not you won't see the real performance gains you would with a quad-core system.
I don't know what benchmarks you're referring to, but I have seen many dual-core systems crumple in BFBC2 with any card over a GTX 250.
A mate with an E6700 saw his system choke very badly when he upped the ante with a 4890: he went from a playable 45 fps to constantly bouncing between 4 and 50 fps, showing him and us that BFBC2 is a very CPU-bound game.
Now you're building a system with an even worse CPU-to-GPU power ratio, and I can almost guarantee you'll be less than happy with the end result.

Forget the second card and buy an i5 quad-core instead. It will give your current card more leg room and will allow better performance when you do eventually pair it with a second card.
Better to build a balanced system that plays all games than a GPU-heavy system that will only play a select few.



February 26, 2012 6:23:52 PM

Does anybody have any data showing these would be bottlenecked?
Anonymous
February 26, 2012 6:41:13 PM

andY biersack said:
Just wondering if my CPU would bottleneck two 6870s.

I can buy a new PSU and another 6870, or buy a 6950 2GB.
http://lanoc.org/review/pc-hardware/5096-i3-vs-i5-vs-i7... Here's an article that compares the i3, i5, and i7 with two 580s in BFBC2; there's a couple of frames' difference but nothing big.
Plus I mainly play BF3.


Honestly, I haven't found any data either way.

Being in almost the same boat with a 2120 myself, I figured if I'm upgrading one component, I'd get more performance out of another GPU than out of an i5; I'd deal with the bottleneck until I could upgrade my CPU. But I have a 550 Ti (a big difference), which wouldn't bottleneck in SLI.
February 26, 2012 6:45:33 PM

I'm not convinced bottlenecking is such a horribly bad thing. I slapped a GTX 560 Ti 448-core in my E8400 system. Would I see higher FPS if I upped the CPU to a 2500K? Absolutely. But did I see a major improvement over my 8800 GT? Hell yes. Crysis, max settings on a 1920 x 1200 display: a flat 30 FPS, compared to the 1-2 FPS my 8800 GT got at the same settings. With a newer processor, my FPS would be higher yet. But who cares? I saw a performance increase with a card that I can eventually transfer to a new build (as money becomes available).

Now, if CrossFire got no better frames than, say, a single 6870 because of the CPU, then I would say it's a waste. If they will still see an increase in frames, just not as high as with a higher-end CPU, then who cares? Their money, they get the benefits, and they're happy.
February 26, 2012 6:49:59 PM

I thought the GPU(s) determine things like the resolution and AA settings you can play at. How would switching from 1600 x 1050 with no AA to 1080p with 4x AA tax the CPU?

In general, I agree that it's silly to do this with a budget CPU, especially if one needs to turn off or down other quality or effect settings.

(I built a system with an i7-950 and one 6870 before SB came out but chose a mobo and PSU that can handle another 6870.)
February 26, 2012 6:52:57 PM

DXRick said:
I thought the GPU(s) determine things like the resolution and AA settings you can play at. How would switching from 1600 x 1050 with no AA to 1080p with 4x AA tax the CPU?


From my understanding, the GPU still has to send the information through the CPU. If the information can't be processed fast enough, the GPU gets bottlenecked; basically, it can only work as fast as the CPU. That's my basic understanding of it.
Anonymous
February 26, 2012 7:17:11 PM

phyco126 said:
From my understanding, the GPU still has to send the information through the CPU. If the information can't be processed fast enough, the GPU gets bottlenecked; basically, it can only work as fast as the CPU. That's my basic understanding of it.


Ah, I may have read you wrong; you're saying the GPU sends information to the CPU. I thought the CPU sends info to the GPU, which processes the rendering of graphics and tessellation.
(OK, maybe it tells the CPU what it's completed.)
If the CPU isn't sending data fast enough, then the GPU is taking a coffee break.
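To picture the "coffee break": here's a toy model (my own made-up frame costs, not benchmarks) where CPU and GPU work fully overlap, so each frame takes as long as the slower of the two. Adding a second card halves the GPU time, but the frame rate caps out at whatever the CPU can feed:

```python
# Toy model of a CPU/GPU-bound frame pipeline (illustrative numbers only).
# Assumes CPU and GPU work fully overlap, so frame time = max(cpu_ms, gpu_ms).

def effective_fps(cpu_ms, gpu_ms, num_gpus=1):
    """Frames per second, assuming GPU work scales perfectly across cards."""
    frame_ms = max(cpu_ms, gpu_ms / num_gpus)
    return 1000.0 / frame_ms

# Hypothetical per-frame costs:
cpu_ms = 12.0   # game logic, draw-call submission
gpu_ms = 20.0   # rendering on one card

print(effective_fps(cpu_ms, gpu_ms, 1))  # one card: GPU-bound, 50 fps
print(effective_fps(cpu_ms, gpu_ms, 2))  # two cards: CPU-bound, ~83 fps, not 100
```

Note the second card still helps (50 → ~83 fps); the bottleneck just keeps it from doubling.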
February 26, 2012 10:49:16 PM

Anonymous said:
Ah, I may have read you wrong; you're saying the GPU sends information to the CPU. I thought the CPU sends info to the GPU, which processes the rendering of graphics and tessellation.
(OK, maybe it tells the CPU what it's completed.)
If the CPU isn't sending data fast enough, then the GPU is taking a coffee break.


Perhaps. I make no claim that I'm right, lol. That's just what several others have said to me in the past. They could be wrong, or we could both be right. I would imagine information gets sent back and forth.
Anonymous
February 26, 2012 11:41:19 PM

phyco126 said:
Perhaps. I make no claim that I'm right, lol. That's just what several others have said to me in the past. They could be wrong, or we could both be right. I would imagine information gets sent back and forth.


Please excuse me for sounding critical of your assessment.

I simply wanted an excuse to say, "the GPU is taking a coffee break."

That is all . . . :)
February 26, 2012 11:41:45 PM

Why worry so much about a bottleneck?

As long as it can deliver 60 fps to your screen, you can't feel the difference.

At high resolutions, you'll need more graphics card power than processor power.

Personally, for gaming, I'd rather use a $500 graphics card with a $100 processor than a $100 graphics card with a $500 processor.
February 27, 2012 1:14:00 AM

Anonymous said:
Please excuse me for sounding critical of your assessment.

I simply wanted an excuse to say, "the GPU is taking a coffee break."

That is all . . . :)


Haha, it is all good. I did not feel like you were being critical.
February 27, 2012 2:03:43 AM

Actually, the program (on the CPU) performs various functions each frame: animations, AI, physics, and determining what information gets sent to the GPU. The information sent to the GPU includes the geometry and textures to be rendered that frame. It also includes additional information needed by the shader program (which runs on the GPU) for things like lighting, shadows, and particle effects.

The GPU actually builds the frame buffer (the next screen shot to display) from all of that information. Thus the CPU's job does not change based on the GPU settings of resolution and AA.

I studied DirectX 9 a few years ago.
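A rough sketch of that per-frame split (hypothetical names and costs, not any real engine's API): the CPU-side work stays the same regardless of resolution and AA, while the GPU-side cost scales with the pixel and sample count:

```python
# Minimal sketch of the CPU/GPU split described above. All names and cost
# formulas are made up for illustration; no real engine API is implied.

def cpu_frame(world, dt):
    """CPU side: simulation plus deciding what to draw. Note that nothing
    here depends on resolution or AA; those are applied later on the GPU."""
    world["time"] += dt                           # animations
    world["enemy_x"] += world["enemy_v"] * dt     # AI/physics step
    # Gather the visible objects to submit to the GPU.
    return [obj for obj in world["objects"] if obj["visible"]]

def gpu_frame(draw_list, resolution, aa_samples):
    """GPU side: stand-in for rasterization into the frame buffer. The
    workload scales with pixels * AA samples, unlike the CPU step above."""
    pixels = resolution[0] * resolution[1] * aa_samples
    return {"drawn": len(draw_list), "shaded_samples": pixels}

world = {"time": 0.0, "enemy_x": 0.0, "enemy_v": 2.0,
         "objects": [{"visible": True}, {"visible": False}]}
frame = gpu_frame(cpu_frame(world, 1 / 60), (1920, 1080), 4)
print(frame["drawn"], frame["shaded_samples"])  # 1 object, 1920*1080*4 samples
```

So turning AA up from 1x to 4x quadruples `shaded_samples` without changing the CPU step at all, which is why resolution and AA changes mostly load the GPU.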
Anonymous
February 27, 2012 2:11:05 AM

^ thank you.