Hello!
Something has been on my mind ever since I got a new graphics card and started playing Dying Light. The card I bought is a GTX 750 2 GB, and I was very impressed by how it ran the game. I'm playing on the Best Performance preset at 1280x720. Most of the time I get 40-60 fps, 80-100 inside buildings, and it rarely drops under 30 — only when the game is loading something from the HDD in a wide-open area. I even tried surround at three times that resolution, and it didn't drop any frames at all. Then I ran GPU-Z and realized the game wasn't even using the card's full potential: it was running at only 75-80%. The bottleneck was the CPU, which was pegged at 100%. And here is where I start scratching my head.
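In case anyone wants to try the same check, the reasoning I applied to the GPU-Z readings boils down to a simple heuristic: a pegged CPU while the GPU still has headroom suggests a CPU bottleneck, and the reverse suggests a GPU one. Here's a tiny sketch of that logic — the thresholds are just my own rough guesses, not anything official:

```python
def likely_bottleneck(cpu_util, gpu_util):
    """Guess which component limits frame rate from utilization percentages.

    The thresholds (95% "pegged", 90% "has headroom") are illustrative
    guesses for this sketch, not figures from any tool's documentation.
    """
    if cpu_util >= 95 and gpu_util < 90:
        return "CPU"
    if gpu_util >= 95 and cpu_util < 90:
        return "GPU"
    return "unclear"

# My readings: CPU pinned at 100%, GPU hovering around 75-80%
print(likely_bottleneck(100, 78))  # -> CPU
```

Obviously real monitoring tools sample these values over time rather than looking at a single snapshot, but this is the gist of the conclusion I drew.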
The minimum system requirement for the CPU is listed as an Intel Core i5-2500 @ 3.3 GHz / AMD FX-8320 @ 3.5 GHz. The one I have is a bottom-of-the-barrel Celeron G1840 at 2.8 GHz. I understand that system requirements are a rough guideline, but isn't that worlds apart? My Celeron cost around $40, and the i5 costs around $200. How come I can play the game with a processor that's about as far under the minimum as you can go?
I talked about this with a friend of mine, and he told me the minimum requirement should guarantee that the game is playable, and that it only counts as playable if it reaches 60 fps and never drops below 30. First of all, isn't that a little harsh? I've played many games that were choppy a lot of the time (Oblivion comes to mind). If I can play the game, is it not playable? Secondly, since Dying Light only slows down for me when the HDD becomes audible, doesn't that mean the rest of the hardware is already providing enough power?
I know this is a very trivial problem that does not require immediate remedy, but I just wanted to share what's on my mind. Sorry for the long post ^^;