Will my CPU bottleneck a GTX 670?

May 11, 2012 6:30:35 AM

Hi guys, so I've been jumping around on what I'm going to do to upgrade my GPU this year, and with the 670 coming out it seems like the perfect fit for price/performance for me. However, I want to make sure my CPU, an AMD Phenom II X6, won't bottleneck it. I know, I know, it's a pretty weak CPU (I was a noob when I first put my rig together a year ago), but I've overclocked it by 500 MHz to make it a bit more powerful.

Do any of you think it will be a bottleneck in terms of performance from a 670? Or should it get the job done?


May 11, 2012 7:05:31 AM

No, it shouldn't bottleneck any GPU out right now. That CPU is not weak; it's powerful.
May 11, 2012 7:28:21 AM

It'll bottleneck the GTX 670 slightly. It won't give the same performance as the better Intel solutions, but every game should be playable if you do get a 670. Really...

It'll be like, what, a 10 fps difference when you're going to be getting 60+ in every game. Completely playable. You should worry more about actually finding a 670 in stock, though. :-p
May 11, 2012 7:33:43 AM

What resolution? At lower resolutions your CPU will be a bottleneck, but at higher resolutions (1080p and up) you will find great performance.
May 11, 2012 7:34:04 AM

Though I don't see why you would buy a 670; why not just add another 6870?
May 11, 2012 4:18:17 PM

I asked that same question in the Anandtech forums and was told not to worry about it (I have a Phenom II X4 980). It will bottleneck a bit in CPU-intensive games but still be much, much better than the OC'd GTX 460 1GB I currently have.

Turning loose of the $400 for it - that's a different story. :na: 
May 11, 2012 4:28:36 PM

Why are people so worried about bottlenecks?

As long as it can deliver more than 60 fps, there's no need to worry... you can't see the difference between 1573 fps and 60 fps.

Well, 40 fps will actually give you a similar result.
May 11, 2012 4:40:37 PM

I remember once reading that the human eye can't identify more than 27 FPS.
May 11, 2012 4:46:53 PM

In fact, yes, the X6 can bottleneck a GTX 670, but not massively. If you want to decrease the chances of a bottleneck, keep your monitor resolution higher; then you will be good with no bottleneck. Also, power your GTX 670 with a decent PSU brand and wattage!
May 11, 2012 5:01:26 PM

rajaawad23 said:
I remember once reading that the human eye can't identify more than 27 FPS.


I always see those claims and I chuckle. For me, I can't play CoD4 MP with anything less than 90FPS. I usually try to stay above 120FPS but when it dips to 60FPS, I end up unable to react fast enough.

Anyhow, the 1090T will definitely bottleneck that GPU. Just look at this:

http://media.bestofmicro.com/M/1/310537/original/f1%202011%201680.png

This is a game where the limitation is not on the GPU, but the CPU. This was tested with a GTX 580, which is slightly slower than a GTX 670.

In this test, Phenom 1100T gets destroyed as it can't generate as many frames as i5-2500K, even with everything else being equal.

I realize that some people will mention Crysis 2 or Battlefield 3. However, Chris Angelini himself states that:

Quote:
Crysis 2, our first-person shooter, is indicative of the most visually-engaging GPU-bound titles currently compelling gamers to spend lots of cash on multi-card CrossFire and SLI setups. This is the sort of app AMD would most like you to associate with FX.


Basically, the only winning chance AMD has is to compare the processor lineup using GPU-bound games like Crysis 2 and Battlefield 3.

So, in short: yes, that CPU will bottleneck a GTX 670, hard.
May 11, 2012 5:14:25 PM

Thanks, Quaddro, for the correction. So now, what is the point of aiming above 60 FPS?
May 11, 2012 5:43:44 PM

Over 120 fps I can't play, LOL :D 

For shooters, 100 fps is kind of a LOT, because your eye can't tell the difference between 40 and 120...
May 11, 2012 6:27:33 PM

eddieroolz said:
I always see those claims and I chuckle. For me, I can't play CoD4 MP with anything less than 90FPS. I usually try to stay above 120FPS but when it dips to 60FPS, I end up unable to react fast enough.

Anyhow, the 1090T will definitely bottleneck that GPU. Just look at this:

http://media.bestofmicro.com/M/1/310537/original/f1%202011%201680.png

This is a game where the limitation is not on the GPU, but the CPU. This was tested with a GTX 580, which is slightly slower than a GTX 670.

In this test, Phenom 1100T gets destroyed as it can't generate as many frames as i5-2500K, even with everything else being equal.

I realize that some people will mention Crysis 2 or Battlefield 3. However, Chris Angelini himself states that:

Quote:
Crysis 2, our first-person shooter, is indicative of the most visually-engaging GPU-bound titles currently compelling gamers to spend lots of cash on multi-card CrossFire and SLI setups. This is the sort of app AMD would most like you to associate with FX.


Basically, the only winning chance AMD has is to compare the processor lineup using GPU-bound games like Crysis 2 and Battlefield 3.

So, in short: yes, that CPU will bottleneck a GTX 670, hard.



The 1090T is plenty fast enough; if you choose to upgrade, you are wasting your money. If you had a dual core, then yes.

I disagree with this claim; 60 FPS (ish) is good enough at those high resolutions.

Yes, the Intels are "faster," but once you hit 60 it's really just gravy and bragging rights. Any Q9550 (Intel), AMD 1090T/1100T, or i7-920 is good enough. Tom's recently did an article on whether you even need to upgrade to Ivy Bridge from Sandy, and the answer is: wait till the next gen. I'm holding off till there is a reason to upgrade, i.e. a cheap 6-core Intel.

I also noticed yours is clocked at 3.8 GHz. Good enough!

If you want to fork out $600 for a new motherboard/RAM/CPU, it's your call. ;)
May 11, 2012 6:33:24 PM

Most screens only refresh at 60 Hz or 75 Hz, so any more frames are wasted; on the Nvidia 600-series cards I think they have technology to stop them rendering frames they don't need. The CPU might bottleneck, but the Phenom II X6 wasn't and isn't a weak CPU by any means.
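To illustrate the idea of that kind of frame cap, here's a rough Python sketch (my own illustration, not NVIDIA's actual driver logic): if a frame finishes faster than the monitor's ~16.7 ms refresh interval, the extra time is just slept away, because a faster frame would never be shown anyway.

import time

REFRESH_HZ = 60
FRAME_BUDGET = 1.0 / REFRESH_HZ          # ~16.7 ms per displayed frame

def run_capped(render_frame, seconds=5.0):
    """Render in a loop, but never faster than the monitor can display."""
    end = time.perf_counter() + seconds
    while time.perf_counter() < end:
        start = time.perf_counter()
        render_frame()                          # the actual CPU/GPU work for one frame
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_BUDGET:
            time.sleep(FRAME_BUDGET - elapsed)  # skip frames the screen would drop anyway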
May 11, 2012 6:43:07 PM

eddieroolz said:
I always see those claims and I chuckle. For me, I can't play CoD4 MP with anything less than 90FPS. I usually try to stay above 120FPS but when it dips to 60FPS, I end up unable to react fast enough.



I love how people say their reaction time is 0.011 seconds or better and 0.016 seconds is just way too slow (that's 90 fps and 60 fps).
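For reference, the frame-time math here is just 1000 / fps. A quick throwaway Python check (mine, not from any benchmark in this thread):

# frame time in milliseconds for a few common frame rates
for fps in (24, 40, 60, 90, 120):
    print(f"{fps:>3} fps -> {1000 / fps:.1f} ms per frame")
# 24 -> 41.7 ms, 40 -> 25.0 ms, 60 -> 16.7 ms, 90 -> 11.1 ms, 120 -> 8.3 ms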

Let's see some 11 ms numbers: http://www.humanbenchmark.com/tests/reactiontime/index....
May 11, 2012 6:45:32 PM

It will certainly bottleneck in quite a few settings, but don't worry about that and just buy it.
May 11, 2012 6:49:07 PM

Another thread with people blindly throwing up links from the internet that "prove" the human eye can't see more than X FPS.

The eye can see more than 24 FPS. That's why 60 fps looks smoother than 24 fps: less jittery camera movement at 60 fps than at 24 fps. There's a very noticeable difference between 60 fps and 120 fps if you have a 120 Hz LCD computer monitor (not a TV).


If we couldn't see more than 24 FPS, why was everyone complaining about how weird The Hobbit at 48 FPS looked?
May 11, 2012 7:45:46 PM

You can "feel" low fps; the game just doesn't feel smooth.
May 11, 2012 7:48:40 PM

I agree, the eye definitely can see more than 24 fps. Also, the F1 2011 benchmarks look more clock-speed bound than core bound: the Phenom 980 produced more frames yet had fewer cores. It was also run at 1680x1050, and I doubt the guy asking the question would buy a 670 just to run at that res. If the game in question were Battlefield 3 multiplayer, the Phenom X6 would do better. The Phenom X6 is overclocked to 3.8 GHz too. Basically, that CPU won't really bottleneck a 670; it'll still provide good performance for the next few years.

C'mon, even Core 2s can still cut it overclocked.
May 11, 2012 8:18:38 PM

eddieroolz said:
For me, I can't play CoD4 MP with anything less than 90FPS. I usually try to stay above 120FPS but when it dips to 60FPS, I end up unable to react fast enough.


such an eagle eye...:D 

Robi_g said:
Most screens only refresh at 60 Hz or 75 Hz, so any more frames are wasted; on the Nvidia 600-series cards I think they have technology to stop them rendering frames they don't need.


I remember when I was playing Fable III, the game's vsync setting even limited it to 30 fps.

proxy711 said:
The eye can see more than 24 FPS. That's why 60 fps looks smoother than 24 fps: less jittery camera movement at 60 fps than at 24 fps.


If it can maintain a framerate of 24 fps (one frame every ~41.7 ms) in all scenes, you'll barely notice the difference.

But graphics cards work in terms of average frame rate: a card that delivers 24 fps on average sometimes delivers less than 20 and other times more than 24.

24 fps means the graphics card delivers a frame roughly every 41.7 ms.

When you move the camera, you change the whole scene: different load, different lighting, and different object polygons, which puts a big load on the memory, processor, GPU, and graphics memory.

This change means the graphics card can't maintain a stable framerate, and frame times will stretch well past that ~41.7 ms budget; that's why you'll see some lagging.
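To see why the average hides this, here's a tiny Python sketch with made-up frame times (hypothetical numbers, just to illustrate the point): the average looks fine, but the slow frames during a camera swing are what you actually feel.

# one second of hypothetical frame times (ms): 30 smooth frames, then 5 heavy ones
frame_times_ms = [20] * 30 + [80] * 5

avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
worst_fps = 1000 / max(frame_times_ms)
print(f"average: {avg_fps:.0f} fps, worst frame: {worst_fps:.1f} fps")
# prints "average: 35 fps, worst frame: 12.5 fps" -- those 80 ms frames are the stutter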

rajaawad23 said:
Thanks, Quaddro, for the correction. So now, what is the point of aiming above 60 FPS?


So, 40 fps is the safe point for the average frame rate.

proxy711 said:
There's a very noticeable difference between 60 fps and 120 fps if you have a 120 Hz LCD computer monitor (not a TV).


Are you sure?
Because I think this is just a marketing gimmick to sell expensive 120 Hz monitors rather than cheap 60 Hz ones.
_________________________________

Okay, back to the topic.

There's no need to worry about a bottleneck; just go for it.
If you're not satisfied with your processor, or you run into some "lagging" (well, with a GTX 670 and 2012 games you won't have a laggy experience), upgrade your processor.
That's easy.

Actually, the slowest part of your rig is the mechanical hard drive, and that makes a really huge bottleneck.
You'd have to build a hard drive as fast as RAM to make a computer bottleneck-free (and that's almost impossible right now :D ).
May 11, 2012 8:40:24 PM

@Quaddro, that is true. You are spending all that money on a GPU; if you are not satisfied, you can grab yourself a new CPU, which is pretty cheap right now. And it is impossible to have a bottleneck-free computer, as one thing will always bottleneck another.
May 11, 2012 8:53:22 PM

I can only see the difference when in slow motion
May 11, 2012 9:27:10 PM

Simple answer: no.

Compared to Intel you may not see the exact same frame rates, but no, I don't see any real major bottlenecking with your CPU and a 670.
May 11, 2012 9:28:34 PM

My first question is... why aren't you going for 6870 crossfire?

In very fast motion games (such as CSS) I can tell the difference between 90FPS and 300FPS.

This argument is so dry my eyes are crusting. Sick of hearing the "eyes" and "refresh rate" argument.

I will agree, though, that any game should be playable on just about any settings with a Thuban (for now).

You'll be fine. Don't bother changing platforms until you need GTX 670 SLI to be playable on the settings you prefer, or you start playing a very CPU-bound game a 3.8 GHz Thuban can't keep up with.
May 12, 2012 1:58:24 PM

noob2222 said:
I love how people say their reaction time is 0.011 seconds or better and 0.016 seconds is just way too slow (that's 90 fps and 60 fps).


I didn't claim my reaction time was 0.011 seconds, but there's a noticeable difference between playing on my laptop monitor (1366x768), which gets 120 FPS consistently, and my 1080p monitor, which occasionally dips to 40 FPS. On the latter, I do noticeably worse.

Do keep in mind that not all people are the same... you may not notice a difference, but the eye can definitely see more than 24 FPS, and I take every advantage of that. So do a lot of other people, hence the reason for 120 Hz monitors and TVs.
May 12, 2012 2:12:37 PM

It's fine. The more you worry about it, the more you'll think it's there. Game on.
May 14, 2012 12:06:39 PM

Quaddro said:

When you move the camera, you change the whole scene: different load, different lighting, and different object polygons, which puts a big load on the memory, processor, GPU, and graphics memory.

This change means the graphics card can't maintain a stable framerate, and frame times will stretch well past that ~41.7 ms budget; that's why you'll see some lagging.
I'm not talking about fast 180-degree turns being laggy. I mean slow movements being smoother at higher FPS than 24. You can test this pretty easily: set an fps limiter to 24 fps in a game you have no problems running, like HL2, walk around and look around, then up the limiter to 60 fps. It looks much better. See the quote below for another example.
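If you'd rather not fire up HL2, here's a rough little pygame toy of my own (nothing from this thread, just an illustration of the same A/B test): press space to toggle between a 24 fps cap and a 60 fps cap and watch how much smoother the moving square looks at 60.

import pygame

pygame.init()
screen = pygame.display.set_mode((640, 360))
clock = pygame.time.Clock()
cap = 24          # start capped at 24 fps; space toggles 24 <-> 60
x = 0.0

running = True
while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False
        elif event.type == pygame.KEYDOWN and event.key == pygame.K_SPACE:
            cap = 60 if cap == 24 else 24
    dt = clock.tick(cap) / 1000.0        # enforce the current cap, get seconds elapsed
    x = (x + 240 * dt) % 640             # same speed in px/sec under either cap
    screen.fill((0, 0, 0))
    pygame.draw.rect(screen, (255, 255, 255), (int(x), 160, 40, 40))
    pygame.display.set_caption(f"cap: {cap} fps")
    pygame.display.flip()
pygame.quit()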


Quaddro said:

Are you sure?
Because I think this is just a marketing gimmick to sell expensive 120 Hz monitors rather than cheap 60 Hz ones.
Here's a quote from Anandtech's review of an Asus 120 Hz monitor: http://www.anandtech.com/show/3842/asus-vg236h-review-o...

"Though the 120Hz refresh frequency does make games playable in 3D, there’s another important benefit of using a faster refresh rate - everything looks smoother, and you can now drive up to 120 FPS without tearing. The ASUS VG236H was my first exposure to 120Hz refresh displays that aren’t CRTs, and the difference is about as subtle as a dump truck driving through your living room. I spent the first half hour seriously just dragging windows back and forth across the desktop - from a 120Hz display to a 60Hz, stunned at how smooth and different 120Hz was. Yeah, it’s that different."

Not a marketing gimmick, and another example of how "the eye can't see more than 24 FPS" is complete bull.
May 22, 2012 7:07:59 AM

Wow guys... I just realized I had responses to this. Tom's never sent me emails about responses until earlier today, when it told me I had a bunch.

It seems the majority of you are saying that I should just go for it. I think I will, and I'll upgrade my CPU in January when I have more money (from what I understand, Haswell or the next-gen Intels will only be quad core), so I'll probably just upgrade to an i7-2600K or 3770K.

The only issue now is waiting for Newegg to get that sweet Gigabyte 670 with the three fans in stock... can't beat that for $400.

As for the whole eye/FPS argument, I haven't seen much of it before, but I'm pretty sure I can tell 25 fps from 60, or even 60 from 90. Even if I can't see a difference, I can definitely FEEL it.

Thanks for your help!
May 22, 2012 7:09:31 AM

Raidur said:
My first question is... why aren't you going for 6870 crossfire?

In very fast motion games (such as CSS) I can tell the difference between 90FPS and 300FPS.

This argument is so dry my eyes are crusting. Sick of hearing the "eyes" and "refresh rate" argument.

I will agree, though, that any game should be playable on just about any settings with a Thuban (for now).

You'll be fine. Don't bother changing platforms until you need GTX 670 SLI to be playable on the settings you prefer, or you start playing a very CPU-bound game a 3.8 GHz Thuban can't keep up with.


I'm thinking about doing CrossFire 6870s, but my room can get a bit warm during the summer and two GPUs might generate too much heat. Also, I think I only have enough PCIe 6-pin connectors for one GPU, but I might be mistaken.