
Would I achieve 120Hz with this CrossFire setup?

Last response in Graphics & Displays
June 17, 2013 3:48:18 AM

This is the card http://www.gigabyte.com.au/products/product-page.aspx?p...

I'm running at 1920x1080. My CPU is an i7-3770K at 4.4GHz with an H100i cooler.

Basically what I want to do is hit 120Hz in BF3. Currently I average around 80 fps, and I'd like to buy a 120Hz monitor and run at a smooth ~120 fps.

Are there problems with crossfire?


June 17, 2013 4:01:09 AM

You would get over 120fps easily as an average, but then you have to consider the issues with crossfire:
http://www.anandtech.com/show/6857/amd-stuttering-issue...

These haven't been ironed out, otherwise we'd see news stories everywhere talking about how AMD fixed their microstutter issues with crossfire. They may even create a worldwide holiday when that happens.

You will still see benefits from getting the 120Hz monitor even with one 7970 since your framerates are above 60 on average. You won't have to turn on v-sync since you'll no longer experience any tearing because your monitor can accommodate all 80 frames per second coming at it. In general, you'll get a smoother experience with the 120Hz monitor.

At any rate, when AMD fixes their crossfire issues (hopefully with software instead of hardware for the current gen GPUs), the 120Hz monitor will really shine for you with a second GPU. I'd be patient though. Wait for the 'AMD: We Fixed It' headlines to flood the wire.
June 17, 2013 4:24:34 AM

Get the 120Hz monitor first. Reduce settings to High in BF3 and see how the gameplay feels.

Crossfire will indeed give you more power, but as the previous post stated, some small issues might occur. Currently SLI (Nvidia's multi-GPU) is more stable than Crossfire.
June 17, 2013 4:38:04 AM

centaurius said:
Get the 120Hz monitor first. Reduce settings to High in BF3 and see how the gameplay feels.

Crossfire will indeed give you more power, but as the previous post stated, some small issues might occur. Currently SLI (Nvidia's multi-GPU) is more stable than Crossfire.


Not only is SLI more stable... It's just plain solid.
June 17, 2013 5:00:32 AM

You wouldn't be able to hold a constant 120 fps.
It seems that CPUs bottleneck 7900-series Crossfire in Battlefield.
June 17, 2013 5:15:42 AM

Wasn't AMD aiming for a July launch with their driver fixes? It's been rather quiet on that front for a little while now, I wonder if their ETA still holds. The prototype driver looked promising at least...
June 17, 2013 5:32:31 AM

Djentleman said:
You wouldn't be able to hold a constant 120 fps.
It seems that CPUs bottleneck 7900-series Crossfire in Battlefield.


This review shows an i7-920 @ 3.8GHz and two 7970s (non-GHz editions) in Crossfire averaging above 120fps at various resolutions:
http://www.techpowerup.com/reviews/ASUS/HD_7970_CrossFi...

1080p would be somewhere between 1680x1050 and 1920x1200 in this review with regard to performance. Why do you think the i7-3770K and Z77 would be a bottleneck?
June 17, 2013 5:49:20 AM

ubercake said:
...
This review shows an i7-920 @ 3.8GHz and two 7970s (non-GHz editions) in Crossfire averaging above 120fps at various resolutions:
http://www.techpowerup.com/reviews/ASUS/HD_7970_CrossFi...

1080p would be somewhere between 1680x1050 and 1920x1200 in this review with regard to performance. Why do you think the i7-3770K and Z77 would be a bottleneck?


Are those multiplayer benchmarks? Because singleplayer isn't CPU intensive at all.
I know in single player my OC 7950s get that much, but once I hit multiplayer... it's a different story. :/
June 17, 2013 6:02:32 AM

Djentleman said:
...
Are those multiplayer benchmarks? Because singleplayer isn't CPU intensive at all.
I know in single player my OC 7950s get that much, but once I hit multiplayer... it's a different story. :/


You definitely have a point there. Those are single-player benchmarks. Multiplayer engages all six cores on my 3930K. I'm consistently right around 120 fps with my 780s with details and AA at full tilt, with transparency AA also on in the control panel.

Have you seen any benchmarks at all for multi-player?
June 17, 2013 6:07:19 AM

Multiplayer could be rather difficult to benchmark consistently...
and I certainly haven't seen any. :)
June 17, 2013 6:07:43 AM

ubercake said:
...
You definitely have a point there. Those are single-player benchmarks. Multiplayer engages all six cores on my 3930K. I'm consistently right around 120 fps with my 780s with details and AA at full tilt, with transparency AA also on in the control panel.

Have you seen any benchmarks at all for multi-player?


I just know my setup bottlenecks, and I've seen a few other accounts of it at 1080p. My GPU utilization hops around 50-90% with 90%+ CPU usage.
What's your GPU utilization?
You do have a $1,000 CPU.
Resolution has a big impact on this. To tax the GPUs you need something higher than 1080p for 7900 Crossfire.
June 17, 2013 6:28:15 AM

Djentleman said:
...
I just know my setup bottlenecks, and I've seen a few other accounts of it at 1080p. My GPU utilization hops around 50-90% with 90%+ CPU usage.
What's your GPU utilization?
You do have a $1,000 CPU.
Resolution has a big impact on this. To tax the GPUs you need something higher than 1080p for 7900 Crossfire.


I'll have to check on the GPU utilization. I'll create a log file and see what's going on there. Bad news is I can't do it for another 12 hours or so.

Also, just an FYI... my CPU is the $500 (at microcenter) edition.
June 17, 2013 6:38:21 AM

ubercake said:
...
I'll have to check on the GPU utilization. I'll create a log file and see what's going on there. Bad news is I can't do it for another 12 hours or so.

Also, just an FYI... my CPU is the $500 (at Microcenter) edition.


Try it at 1080p with BF3 multiplayer; it's where I notice it.

Lol, my mind saw 3960X. Sorry :p
June 17, 2013 7:20:28 AM

ubercake said:
You will still see benefits from getting the 120Hz monitor even with one 7970 since your framerates are above 60 on average. You won't have to turn on v-sync since you'll no longer experience any tearing because your monitor can accommodate all 80 frames per second coming at it. In general, you'll get a smoother experience with the 120Hz monitor.


We've had this conversation before. 120Hz does not stop tearing below 120 fps. I have a 120Hz monitor and I get tearing at fps below 120. Without v-sync, tearing can happen at any time. It may be true that tearing happens less often and is less noticeable at fps below 120, however.

There is nothing to stop the video card from writing to the frame buffer at the same time the monitor is refreshing the image. This causes tearing.
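That argument can be sketched as a toy simulation (my own illustrative Python, assuming a fixed refresh interval and a perfectly regular frame time; real timing is noisier): the GPU swaps a finished frame into the front buffer on its own schedule, the monitor scans that buffer out on its own schedule, and any swap that lands mid-scanout produces a tear, whether fps is above or below the refresh rate.

```python
# Toy model of unsynchronized frame swaps vs. monitor scanout.
# Assumptions: the monitor scans out at a fixed interval and the GPU
# swaps the front buffer the instant each frame is done (no v-sync).

def find_tears(fps, hz, seconds=1.0):
    """Return the screen position (0.0 = top, 1.0 = bottom) of every
    tear produced in `seconds` of simulated time."""
    refresh_time = 1.0 / hz
    tears = []
    for k in range(1, int(seconds * fps) + 1):
        t = k / fps  # moment this frame finishes and is swapped in
        # How far into the current scanout did the swap land?
        phase = (t % refresh_time) / refresh_time
        if 1e-9 < phase < 1.0 - 1e-9:  # mid-scanout swap = visible tear
            tears.append(phase)
    return tears

# Capping fps below the refresh rate does not stop the swaps from
# landing mid-scanout; nearly every frame still tears:
print(len(find_tears(fps=59, hz=60)))  # → 58
```

With v-sync, the swap would instead be deferred to the blanking interval between scanouts, which is exactly the check this model omits.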
June 17, 2013 7:55:59 AM

bystander said:
...
We've had this conversation before. 120Hz does not stop tearing below 120 fps. I have a 120Hz monitor and I get tearing at fps below 120. Without v-sync, tearing can happen at any time. It may be true that tearing happens less often and is less noticeable at fps below 120, however.

There is nothing to stop the video card from writing to the frame buffer at the same time the monitor is refreshing the image. This causes tearing.


I just don't see the tearing on the 120Hz monitor. I don't even notice it. It must happen far less often, because once I go to a 60Hz monitor it becomes easily apparent. And yes, we've had this conversation, but that doesn't mean you're right about it.

It's also contrary to the adaptive v-sync solution implemented by Nvidia. Why would this feature only turn on v-sync when the framerate exceeds your monitor's Hz (or the framerate you manually set as the max) if tearing occurs below the monitor's Hz? I've seen stuttering when fps hit the 20s in Crysis 3 when I had my 680s, prior to the driver release that improved performance, but that's because the frame rates are so low you can visually see the lack of smoothness. That's not tearing, though.

I've never seen anything that says you experience tearing below the monitor's refresh rate other than from you and one other commenter on this site.

We both had two 680s when we had this conversation in the past. I'm just not seeing tearing or anything out of sync below my monitor's refresh rate, and I didn't then. I switched to a 60Hz monitor for a spell and couldn't even hang with the tearing, and I refuse to run with v-sync. The 120Hz monitor allows me to do so.

If you're seeing tearing with framerates below the monitor's refresh rate, it seems like something else might be going on.

Can you cite some article or technical document that explains this? Then I'll shut up about it. Something reputable that might explain why tearing is less noticeable or occurs far less often when framerates are lower than the monitor's refresh rate? I'm really not trying to be a dick about it; I'd just like to know where this information comes from.
June 17, 2013 8:10:58 AM

ubercake said:
...
Can you cite some article or technical document that explains this? Then I'll shut up about it. Something reputable that might explain why tearing is less noticeable or occurs far less often when framerates are lower than the monitor's refresh rate? I'm really not trying to be a dick about it; I'd just like to know where this information comes from.


Adaptive v-sync turns off v-sync below your refresh rate because v-sync causes stuttering when your FPS can't evenly divide into the refresh rate. It is there to help things remain smooth.

Here is an article that talks about v-sync and even mentions that tearing happens any time v-sync is not on. It also has videos, and all tests are done with FPS between 30 and 60.
http://www.pcper.com/reviews/Graphics-Cards/Frame-Ratin...
(It is mentioned in the first paragraph.)

This says the only way to prevent it is to use v-sync, though I guess it doesn't specifically address the myth that lower FPS fixes it: http://en.wikipedia.org/wiki/Screen_tearing

Here is a forum-written article on it: http://hardforum.com/showthread.php?t=928593

It seems to be difficult to find articles on this specific topic, so you'll have to live with those. That last one does a good job explaining it in detail.

Test it yourself: install MSI Afterburner or Precision X. Set an FPS limiter to 59, and set your monitor's refresh rate to 60. Turn off v-sync and play a game. The tearing is much more noticeable when your FPS matches your refresh rate, or comes close to it. Load a first-person game and move around.
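The adaptive v-sync policy described above, plus the divide-down stutter it avoids, can be sketched like this (an illustrative model of the behavior as described in this thread, not Nvidia's actual driver code):

```python
# Adaptive v-sync, as described above: sync only when the frame rate
# reaches the refresh rate, so fast frames can't tear and slow frames
# avoid v-sync's divide-down stutter.

def adaptive_vsync_on(fps, refresh_hz):
    """True if v-sync should be engaged for the current frame rate."""
    return fps >= refresh_hz

def vsync_effective_rate(fps, refresh_hz):
    """Displayed frame rate with plain double-buffered v-sync forced on:
    the refresh rate divides down to the next step the GPU can sustain."""
    divisor = 1
    while refresh_hz / divisor > fps:
        divisor += 1
    return refresh_hz / divisor

print(adaptive_vsync_on(80, 120))    # → False (v-sync off, may tear)
print(vsync_effective_rate(45, 60))  # → 30.0 (the stutter being avoided)
```

So a game holding 45 fps on a 60Hz panel displays at 30 fps under forced v-sync; adaptive v-sync leaves it at 45 fps and accepts the (less noticeable) tearing instead.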
June 17, 2013 8:59:12 AM

bystander said:


...
Adaptive v-sync turns off v-sync below your refresh rate because v-sync causes stuttering when your FPS can't evenly divide into the refresh rate. It is there to help things remain smooth.

Here is an article that talks about v-sync and even mentions that tearing happens any time v-sync is not on. It also has videos, and all tests are done with FPS between 30 and 60.
http://www.pcper.com/reviews/Graphics-Cards/Frame-Ratin...
(It is mentioned in the first paragraph.)

This says the only way to prevent it is to use v-sync, though I guess it doesn't specifically address the myth that lower FPS fixes it: http://en.wikipedia.org/wiki/Screen_tearing

Here is a forum-written article on it: http://hardforum.com/showthread.php?t=928593

It seems to be difficult to find articles on this specific topic, so you'll have to live with those. That last one does a good job explaining it in detail.

Test it yourself: install MSI Afterburner or Precision X. Set an FPS limiter to 59, and set your monitor's refresh rate to 60. Turn off v-sync and play a game. The tearing is much more noticeable when your FPS matches your refresh rate, or comes close to it. Load a first-person game and move around.


The first link doesn't address tearing at all. It just talks about how limiting the frames using v-sync won't necessarily eliminate stutter in the video. It also lets them show the difference in smoothness between 60fps and 30fps, whereas 60fps is much smoother (now take this to 120fps/120Hz and it's even better). This is especially noticeable when they halve the speed of the video.

The Wikipedia article contains sparse bits of information at best. I know what you're talking about, and it seems quite possible, but it also seems like modern hardware and software mitigate tearing when the fps are lower than the Hz even with no v-sync.

Tearing happens when fps>Hz because the screen receives and draws part of the next frame along with the frame it's currently drawing: the info arrives in the frame buffer faster than it can be displayed, so parts of multiple frames end up on screen in a single refresh cycle. But when fps<Hz, it seems the monitor would already have a frame available in the buffer on the next refresh cycle whether or not it received a new frame in that time.

This is why noticeably laggy video occurs at low frame rates. The monitor redraws the same frame from the buffer for 3+ refresh cycles before presenting the next, which can represent a perceivably disproportionate amount of movement in that time; like a flip book where you stay on the same page for a relatively disproportionate amount of time (or the same image is repeated three times) compared to the other pages. That is definitely not tearing, though.

I've seen that forum article you listed, but like the pcper link, it deals more with how v-sync works and doesn't mention anything about how you can get tearing when fps<Hz.

I asked Tom's to do an article on this. I hope they do at some point. I would like to see some video of tearing when the framerate<Hz.

I definitely respect your opinion when it comes to everything, bystander, but I need to see more info before I'm sold on how frequently tearing occurs when a video card's fps is less than a monitor's refresh rate. I'll try limiting the framerate as you've suggested, but that would be a sort of v-sync itself and introduce the issues that go along with it (like the judder described in the Wikipedia article?). Additionally, with all the details and AA cranked in Crysis 3, I spend all of my time in that game with fps<Hz and no tearing.
June 17, 2013 10:13:53 AM

I hope they do an article on the subject. This myth keeps spreading, with lots of confused people who get tearing at FPS below their refresh rate and wonder why. It would be nice to have a professional article to link. FCAT would make it extremely easy to point out with its overlays.

For now, I gave you a test that should make it apparent. You might have to try a few different games, as there are some methods used to avoid tearing with pseudo forms of v-sync, and in some cases v-sync is forced on (e.g. Skyrim).

Just use an FPS limiter set to 59 while using a 60Hz refresh rate. MSI Afterburner and EVGA Precision X both have this feature. Play older games, as it is usually easier to see at lower detail levels. You should easily see the tearing moving up and down the screen.

And this is not based on something subjective or an opinion. It is a fact, based on how a frame is rendered and how it is delivered to the monitor.

The video card is an independent piece of hardware that creates frames and moves them into a frame buffer to be displayed. It has no knowledge of the monitor. Without v-sync on, it simply writes its info to the front buffer when ready. The monitor simply refreshes the image from the front buffer on a regular interval. It has no knowledge of the video card. There is nothing in place to stop the front buffer from being written to while it is refreshing.

V-sync forces the video card to check whether the monitor is in the middle of updating its image, and if so, to wait until it is not before writing to the buffer. Knowing those rules, explain to me how it could possibly not have tearing at any FPS rate?

One interesting thing is that there are rare cases, like the Unigine benchmarks, which create a pseudo v-sync. The video card renders as fast as it wants, but the frame is only flipped when the monitor is in vertical retrace or blanking, which prevents tearing and doesn't stop frames from generating while using a triple-buffering system. Crysis 3 doesn't do this; I've seen several videos of it with plenty of tearing, even at its low FPS.

I do believe you just aren't bothered by mild tearing, but give my test a go.
June 17, 2013 10:26:28 AM

I found a professional article that mentions it while describing Adaptive V-sync, from what should be a trusted source:
http://www.hardocp.com/article/2012/04/16/nvidia_adapti...
Quote:
VSync turned off sounds like a good thing, because your framerate is able to go as high as physically possible from your video card. However, there is a major drawback to allowing framerate to exceed the refresh rate of your display. The consequence is called "tearing," and it is a very real visual anomaly that you will notice more as you play your games as the framerate exceeds the refresh rate. Tearing is described as a frame literally breaking in half, or sometimes even in three parts, and part of the frame lagging behind the other part of the same frame. The result is a visually distorted image that can bother gamers. Note that tearing can technically occur if the framerate doesn't exceed the refresh rate, but it is much less likely to be noticed.


Though one interesting thing is how often he talks about not getting tearing below your refresh rate, yet makes it clear that you do. Clearly they recognize it but find it isn't noticeable enough to keep making note of. You may find that the closer your FPS is to your refresh rate, the more obvious it is.
June 17, 2013 12:17:01 PM

Again, though, no definitive answer. The article alludes to what you said saying it can "...technically occur if the framerate doesn't exceed the refresh rate...", but nobody goes into describing how it occurs in this situation. It makes sense that the video card can overwrite part of a frame and the monitor picks it up on its next refresh, but what makes it less likely to be noticed if the torn frame isn't refreshed faster? That would make you think tearing at fps<Hz would be more noticeable.

At the end of that article, he goes on to say "If all you need is 60 FPS in a game for it to be playable, then why not just go ahead and cap the game there so it doesn't exceed your refresh rate. Then, if the game has to fall below that, allow the game to perform at its real-time actual framerate, and Adaptive VSync allows that. It really is the best of all worlds,..."

This tells me the author either doesn't care about tearing, tearing is not perceptibly noticeable when fps<Hz (but how could that be when the frame hangs out on your screen much longer at a lower framerate?), or he doesn't know how to explain it away when fps<Hz after saying it still can occur in this situation.
June 17, 2013 12:27:13 PM

I decided to do the test I asked you to try. I set my monitor's refresh rate to 60Hz (normally 120Hz, but it is easier to test with 60Hz). I opened up EVGA Precision X and set an FPS limiter to 59 (below the refresh rate, but close, so that the tears don't move terribly fast). I opened up Dragon Age: Origins and turned off v-sync. I then loaded a game and turned the camera. There was a tear that moved from the top of the screen down to the bottom about once every second or two.

This isn't in my head, or anyone else's who has come here talking about it. Just test it yourself.
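The rate the tear line crawls in this test also falls out of simple arithmetic (back-of-the-envelope, assuming the limiter holds a perfectly steady 59 fps): each frame takes 1/59 s against a 1/60 s refresh, so the swap point slips 1/3540 s per frame, and after 59 frames, i.e. one second, it has wrapped the whole screen.

```python
# Why the tear sweeps the screen about once per second at 59 fps / 60 Hz
# (idealized: assumes a perfectly steady frame limiter).

fps, hz = 59, 60
frame_time = 1.0 / fps        # ~16.95 ms per frame
refresh_time = 1.0 / hz       # ~16.67 ms per refresh

slip_per_frame = frame_time - refresh_time        # ~0.28 ms/frame drift
frames_per_sweep = refresh_time / slip_per_frame  # frames until the drift
                                                  # wraps one full refresh
sweep_seconds = frames_per_sweep * frame_time     # wall time for one sweep

print(round(sweep_seconds, 3))  # → 1.0
```

That one-second sweep matches the observation above; a limiter further from the refresh rate would make the tear line race instead of crawl.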
June 17, 2013 12:33:33 PM

bystander said:
I decided to do the test I asked you to try. I set my monitor's refresh rate to 60Hz (normally 120Hz, but it is easier to test with 60Hz). I opened up EVGA Precision X and set an FPS limiter to 59 (below the refresh rate, but close, so that the tears don't move terribly fast). I opened up Dragon Age: Origins and turned off v-sync. I then loaded a game and turned the camera. There was a tear that moved from the top of the screen down to the bottom about once every second or two.

This isn't in my head, or anyone else's who has come here talking about it. Just test it yourself.


I'll check it out. Maybe I've never seen it because I just never focus on it?

June 18, 2013 4:12:45 AM

I'm the OP.

I thought it would be quite easy to hit 120 fps with two 7970s. As I said, I'm hitting about 80-90 fps average, but after looking at some of the replies, it looks like they need to fix Crossfire, heh.

Thanks for the replies, everyone.
June 18, 2013 5:21:22 AM

Yeah, I'd wait and see. I had the microstutter issues with Crossfire three years ago, and AMD only acknowledged it within the last few months, even after I posted all over their support forums with people calling me crazy. Lo and behold, the problem went away when I traded my Crossfire setup for SLI. The good thing is AMD has acknowledged the issue, which should force them to work on it.

And heck, 80-90 fps is pretty awesome. I still think you'd see benefits from the 120Hz monitor if you're hitting those framerates.
June 18, 2013 2:23:17 PM

lukeoes said:
I'm the OP.

I thought it would be quite easy to hit 120 fps with two 7970s. As I said, I'm hitting about 80-90 fps average, but after looking at some of the replies, it looks like they need to fix Crossfire, heh.

Thanks for the replies, everyone.


What game is this in?
As I said, BF3 is stupidly CPU dependent, and with two graphics cards the CPU has even more work to do.
June 19, 2013 3:13:50 AM

Yes, BF3 and pretty much other games as well; I can't go max settings in Far Cry 3 and Crysis 3.

My CPU is an i7-3770K @ 4.5GHz.

This is just my opinion, but if I bought a 120Hz monitor I'd like to fully utilize it; I don't see the point in getting one if I only get 80-90 fps average at full graphics.
June 20, 2013 8:04:40 AM

lukeoes said:
Yes, BF3 and pretty much other games as well; I can't go max settings in Far Cry 3 and Crysis 3.

My CPU is an i7-3770K @ 4.5GHz.

This is just my opinion, but if I bought a 120Hz monitor I'd like to fully utilize it; I don't see the point in getting one if I only get 80-90 fps average at full graphics.


The funny thing is, with Crysis 3 I get worse framerates if I turn down the settings. So make sure to turn on AA and everything, so that you put as much of the workload as possible on the GPUs.

In BF3 I use RadeonPro to force a ridiculous amount of AA in the game so it alleviates the CPU bottleneck as much as possible.