FX-6350 good for Titanfall?

apcs13

Honorable
Oct 2, 2013
960
0
11,360
Hey everyone, so a little while ago the minimum requirements for Titanfall were released, and looking at them I figured I could easily breeze through the game. However, taking another look, I have one concern that may or may not be legit, and deciding which one it is, is where you guys come in:

Link to minimum specs: http://www.pcgamer.com/2014/02/04/titanfall-minimum-system-requirements-revealed/

These are the minimum specifications; I'm not sure if they will ever release recommended specs, but I'm assuming this is for running at low resolution and low detail settings. I have a GTX 770, so I figure no matter what I'll be fine to max or nearly max the detail level at 1080p, but I am concerned about my CPU...

The CPU listed in the minimum requirements is the AMD Athlon II X2 270 or the Intel Core 2 Duo (E6600), and my CPU is apparently only a little better than these old CPUs in single-threaded tasks, which was not only kind of a big shock but also a concern.

I know that Titanfall runs on a heavily modified Source engine, and especially since the requirements mention dual-core CPUs, will I be unable to run the game at higher settings because I'm CPU-limited? I'm just worried because, although it's a six-core CPU I have sitting at 4.4 GHz, I understand its single-threaded performance to be not exactly great, so I'm having my doubts.

What do you guys think, is this a really dumb question or do I have a legitimate concern? Thanks!

Also, I doubt it really matters, but I run other Source engine games, old and new, without issue. HL2, which is probably optimized for two cores at most, I can max out at the frame cap of 300 FPS or whatever, and L4D2 also runs crazy high with 16x AA and the works, just in case that somehow helps.
 
Solution

apcs13

Honorable
Oct 2, 2013
960
0
11,360


Bah, bad wording! :/ I didn't mean that my CPU was only a little better overall; I know it's quite a bit better. But I think on the website CPU-World they listed the Core 2 Duo at 1.0 as the baseline for single-core performance, and then put the FX-6350 at 1.08, which I believe is a pretty small performance gain. Is the website wrong, or is the single-core performance really that bad?

Link to said webpage here: http://www.cpu-world.com/Compare/450/AMD_FX-Series_FX-6350_vs_Intel_Core_2_Duo_E6600.html
 

xero99

Honorable
Aug 25, 2013
330
0
10,810


That website isn't 100% reliable, and it also compares at STOCK clocks, so you'll see a bigger gap with your overclocked CPU. Anyway, Titanfall isn't going to be capped to one core ;)
 
Solution

apcs13

Honorable
Oct 2, 2013
960
0
11,360


Okay, great to hear, thanks!
 

I3lood Eagle

Honorable
Oct 1, 2013
249
0
10,710
The 6350 is a great midrange gaming chip, the 8350 is their high range, and the 9590 is their extreme range. From what I've been seeing, Titanfall isn't as intensive as previously thought, because they're afraid that people won't be able to play it, so they dumbed the graphics down a little (not too much, but enough for a big eye-candy fanboy to notice). After optimizations within the first 1-3 months, a game usually sees some very pimp-hand-strong gains as far as framerate and fluidity go. I'll tell you, I don't think you're going to have the slightest bottleneck, to be honest, and that 770 will carry you for a few years on ultra settings at 1080p. I just recently picked up the 4GB Windforce OC model due to all the potential I've seen in it.

And let me tell you this also: whenever you think your rig is going to fall short of a game you really want to play, wait and see how your rig actually performs before you go out and spend hundreds. Really, most of the time people get ultra picky when they play on ultra and try to hold 60 FPS, which oftentimes will lead you to SLI'ing or CFX'ing your GPU. Try it out first and see for yourself; you really have no idea what it's capable of until you do. You might have golden chips, you might have great support, or you might even just be blessed enough to have somewhere around the exact platform they're aiming at optimizing. That's what I had with Black Ops 2: my system ran it with everything on ultra at exactly 60 FPS, with no dips, no cuts, no lost frames, etc.
 

apcs13

Honorable
Oct 2, 2013
960
0
11,360


Yeah, I wasn't really worried about the graphics or physics part because of the 770; I was just worried about, well, I don't even know what! Just a lack of CPU horsepower leading to lower FPS, I guess. Thanks for the reassurance!
 

apcs13

Honorable
Oct 2, 2013
960
0
11,360


Ah, I know you, you're the guy with the new 770! Yeah, the 770 is pretty powerful; I wasn't worried about that at all, really, more about the misleading results I found on the CPU-World website. I think in a few years or so I'm going to do a pretty big teardown instead of a small upgrade in the short term: probably get an SLI-compatible motherboard and some fast Intel CPU, and keep the 770, which is going to be good for a while.
 

AimOnly Him

Honorable
Jan 24, 2014
123
0
10,710


It depends on what resolution you'll be running at... I assumed you'll run it at 1920x1080. If, for example, you were running at 3840x2160, you'd hit a CPU bottleneck, so you'd need something higher than the FX-6350 if you want to play at 3840x2160 (e.g., an FX-9590 or i7 4770K)...

You can do an experiment to discover how the resolution you use influences CPU usage: play any 1080p video and then a 4K video, and you'll see the difference in CPU usage...
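If you want to put a number on that experiment rather than eyeballing Task Manager, the idea boils down to comparing CPU time against wall-clock time while a workload runs. Here's a rough, self-contained Python sketch of that measurement; the `busy` loop is just a hypothetical stand-in for video-decoding work (actual playback CPU usage depends on your player and codec):

```python
import time

def cpu_utilization(workload, *args):
    """Run a workload and return this process's CPU utilization
    as a percentage of the wall-clock time it took."""
    t0_wall = time.perf_counter()   # wall-clock start
    t0_cpu = time.process_time()    # CPU-time start (this process only)
    workload(*args)
    wall = time.perf_counter() - t0_wall
    cpu = time.process_time() - t0_cpu
    return 100.0 * cpu / wall if wall > 0 else 0.0

def busy(n):
    # CPU-bound stand-in for decoding work; more pixels = bigger n
    sum(i * i for i in range(n))

print(f"busy loop: {cpu_utilization(busy, 2_000_000):.0f}% CPU")
print(f"sleeping:  {cpu_utilization(time.sleep, 0.5):.0f}% CPU")
```

A CPU-bound workload will report near 100%, while an idle one reports near 0%; comparing the same player on a 1080p file versus a 4K file works the same way, just with the player process instead of the toy loop.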