Crysis 2 system requirements

February 21, 2011 4:45:39 AM

read it and weep........... highly recommended...


* Minimum: 2GHz Core 2 Duo / A64 X2 CPU, 2GB RAM, 8800GT / HD3850, 512MB Video Memory, DX9.0c, Shader Model 3.0, Windows XP, 20fps @ 1024 x 768

* Recommended: 2.66GHz Core 2 Duo / A64 X2 CPU, 3GB RAM, GTX280 / HD4870, 1GB Video Memory, DX9.0, Shader Model 3.0/4.0, Windows XP, 30fps @ 1680 x 1050

* Highly Recommended: 3GHz Core i7, 4GB RAM, GTX560Ti / HD4870 X2, 1.8GB Video Memory, DX11, Shader Model 3.0/4.0, Windows 7, 30fps @ 1920 x 1200
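
If you want to sanity-check a box against those tiers, here's a rough Python sketch - the tier numbers are just the ones quoted above, and the example machine at the bottom is made up for illustration:

    # Rough sketch: check a machine against the quoted Crysis 2 tiers.
    # Tier numbers are copied from the list above; the example machine is hypothetical.
    TIERS = {
        "minimum":            {"cpu_ghz": 2.0,  "ram_gb": 2, "vram_gb": 0.5},
        "recommended":        {"cpu_ghz": 2.66, "ram_gb": 3, "vram_gb": 1.0},
        "highly_recommended": {"cpu_ghz": 3.0,  "ram_gb": 4, "vram_gb": 1.8},
    }

    def best_tier(cpu_ghz, ram_gb, vram_gb):
        """Return the highest tier the machine meets, or None."""
        met = None
        for name, req in TIERS.items():  # ordered lowest to highest
            if (cpu_ghz >= req["cpu_ghz"] and ram_gb >= req["ram_gb"]
                    and vram_gb >= req["vram_gb"]):
                met = name
        return met

    print(best_tier(cpu_ghz=2.6, ram_gb=2, vram_gb=1.0))  # -> "minimum"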
February 21, 2011 7:40:02 AM

Scary vRAM system requirement. ATI 6900 owners rejoice.
February 21, 2011 12:16:57 PM

Doubt it, PC Guru. Looking at the benchmarks based on the leaked version of the game: http://www.techspot.com/review/367-crysis2-beta-perform...

We see the GTX 580 coming out well ahead of the ATI 6xxx cards; this will only be amplified at higher AA settings, where the ATI cards flatly fall down. More vRAM isn't a solution to inferior technology. Furthermore, the 3GB 580s are already out, and they're not even that expensive (£440 here).

Back on topic, that vRAM requirement looks over-the-top, to say the least. Given that this was a January beta build, there'll certainly be further performance tweaking. The only open question is DX11 performance, as the leaked version was a DX9 beta. So we'll probably see a small performance drop there, a bit less than the DX9-vs-DX10 gap (since we've seen DX11 offering a performance boost over DX10).

But we'll see when the time comes. I've always found that a lack of vRAM is easy to tweak around - small sacrifices and tweaks can save a lot.
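
For what it's worth, here's the back-of-envelope maths for the framebuffer side of vRAM. Textures dominate real usage, so treat this Python sketch as a floor rather than the game's actual requirement; the buffer counts are assumptions:

    # Rough framebuffer cost only - textures and render targets dominate
    # real vRAM use, so this is a floor, not the actual requirement.
    def framebuffer_mb(width, height, msaa=1, bytes_per_pixel=4, buffers=3):
        pixels = width * height
        colour = pixels * bytes_per_pixel * buffers  # double/triple buffering
        depth = pixels * 4                           # 24-bit depth + 8-bit stencil
        return (colour + depth) * msaa / (1024 * 1024)

    print(round(framebuffer_mb(1920, 1200)))          # ~35 MB, no AA
    print(round(framebuffer_mb(1920, 1200, msaa=4)))  # ~141 MB at 4xAA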
February 22, 2011 4:56:48 AM

Inferior technology? LOL!!! You mean there's no substitute for having a game written for your hardware...
February 22, 2011 9:04:11 AM

No; I mean inferior technology. Games are designed around the hardware best suited to the job; for high-end shader- and tessellation-intensive games this has always been Nvidia. There is no excuse for AA to cause the massive performance drop it does on ATI cards; at 0xAA they can get ahead, but as soon as you ramp up AA (which I consider absolutely vital to a good-looking game) the performance drop is in another league. There are some games where ATI's approach pays off, an example being Just Cause 2; however, as soon as you go up to 4xAA (at 1920x1200) the 6970 loses its advantage over the 570 and they're neck and neck, with the 480 pulling ahead of both.

Furthermore, the lack of PhysX does ATI no favours - I can't think of anything vital that ATI has to its name right now. Eyefinity isn't bad if you want a multi-display system, and Nvidia should catch up there. Other than that, nothing stands out as interesting.

Historically I haven't seen any top contributions from ATI/AMD either; they make great-value upper-mainstream/lower-enthusiast cards, but beyond that they rarely get the advantage.
February 22, 2011 9:54:00 AM

What are the chances of me maxing this out at 2560x1440 with my 2 gtx480s ?
Guess I can just wait a week til the demo is out and find out for myself. Slim though?
February 22, 2011 10:07:24 AM

I understand your point, Wampbit, but if there were a night-and-day difference, people wouldn't jump the fence whenever an ATI card outperforms the best Nvidia card. There's a difference in some cases, but the average user - which is 95% of gamers out there - can't see it well enough, or ATI wouldn't sell another video card and would be out of business.

Most people aren't video card enthusiasts. In the HTPC arena, ATI sets the standard: the 5570/5770 outperforms the best Nvidia card, even the 580, in decoding. Anything above those two ATI cards is a waste of money there. Nvidia has a lot of work to do in that area...

February 22, 2011 2:48:18 PM

fergie said:
What are the chances of me maxing this out at 2560x1440 with my 2 gtx480s ?
Guess I can just wait a week til the demo is out and find out for myself. Slim though?


I'm running it with max settings on two GTX 285s at 62.5 fps..... you should be fine.
February 22, 2011 5:36:58 PM

gtx285fanboy said:
I'm running it with max settings on two GTX 285s at 62.5 fps..... you should be fine.


Aye aye captain!


February 22, 2011 6:56:47 PM

Re: the chap with 480s: you won't have a problem. You'd even pull it off (albeit a tad close) with one 480, and certainly with one 580 - especially with the DX11 performance improvements and, from what we've seen, a far better engine :).

Re: Englandr
I've redrafted three responses to you, but have now concluded that no real response is needed. I agree with you, especially regarding the average-user point; as I stated, ATI is great for the mainstream and lower-enthusiast segment. That is where I'd place most gamers, even those that are quite serious about their hardware. When it comes to non-gaming uses, each has its strong points. CUDA is very useful for finite-element calculations, and the various video editing software built around it seems interesting. For anyone that wants value: go ATI. For anyone more interested in their games than in the technology: go ATI! For anyone that doesn't spend three hours staring at shader effects commenting on their technical aspects (moi): go ATI! Nvidia only becomes interesting when you look at the 580, or overclocking 570s, that general area. Below that, I agree with ATI ruling: they provide everything but the very best at the same standard and a lower price. (And now I ended up writing a response anyway.)
February 23, 2011 1:31:32 AM

Lol, apparently we are dancing in circles. I think we're pretty much on the same page for the most part...
February 23, 2011 4:50:45 PM

So my Core i5 OC'ed to 3.84GHz with 8GB of 1600MHz RAM and an HD 6970 should be able to handle this game with no issue? NICE!
February 24, 2011 12:46:42 PM

Question for everyone: I'm very happy with the way Warhead runs on my current card (a 4890 OC); I average around 38-50fps at 1920 x 1080 with gamer settings/enthusiast shaders.

Could I reasonably expect the same performance with Crysis 2? If so, that will be good enough for me. Thanks in advance.
February 27, 2011 6:17:51 AM

Could I get it to high settings on a 24" display at 1920x1080 with:

i5 650 @ 3.2GHz
GTX 460 1GB
4GB RAM (only 3.6GB usable) :(
February 27, 2011 11:51:15 AM

Sparky: you should be able to run at gamer plus a bit. Crysis 2 looks to run better than Warhead.

Somerandom: Hmm - that's tougher. Probably, maybe with a few sacrifices.
March 1, 2011 11:23:54 AM

Okay, I have a rig with the following specs:
- Intel E2140 proc. 1.6 GHz but overclocked to 2.6 GHz. This is Core Duo not Core 2 Duo.
- 2GB DDR2 800 memory modules
- Slightly overclocked Radeon 4870 with 1GB memory
- Windows XP SP2
- Screen resolution 1600*900

I guess the weakest piece is the processor. With the overclocking it is well above 2.0GHz, so that is okay. But this is a first-series Core Duo (or say Core 1 Duo), not a Core 2 Duo. Will this cause any problems? Will it run at 1600x900? (I don't care about maximum quality settings, medium is okay for me, but I don't really want to lower the resolution.)

Thanks.
March 2, 2011 4:26:45 AM

aambrozai said:
Okay, I have a rig with the following specs:
- Intel E2140 proc. 1.6 GHz but overclocked to 2.6 GHz. This is Core Duo not Core 2 Duo.
- 2GB DDR2 800 memory modules
- Slightly overclocked Radeon 4870 with 1GB memory
- Windows XP SP2
- Screen resolution 1600*900

I guess the weakest piece is the processor. With the overclocking it is well above 2.0GHz, so that is okay. But this is a first-series Core Duo (or say Core 1 Duo), not a Core 2 Duo. Will this cause any problems? Will it run at 1600x900? (I don't care about maximum quality settings, medium is okay for me, but I don't really want to lower the resolution.)

Thanks.


Indeed, the E2140 is the weakest point. Overclocked to 2.6GHz, it's roughly the equivalent of a 2.2GHz Core 2 Duo. The E2140 can be overclocked further than that, but let's not go there for the moment.
The difference in cache SRAM between the E21x0 and a Core 2 Duo is more than double, maybe quadruple, and your best bet is that Crysis 2 will squeeze every bit of L2 cache available in the CPU.

You'll get more fps than average at the minimum settings, around 20-35fps, perfectly playable.
But crank the settings up to medium and it will be much less playable. My prediction is that it will be around 15-24 fps.

Resolution usually impacts graphics card performance, not the CPU, because the processor is normally handling the AI, physics, world entities, etc.
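
A toy way to picture it (the fps numbers below are invented, only the shape matters): the frame rate you actually see is roughly the slower of the CPU side and the GPU side, and only the GPU side scales with pixel count.

    # Toy bottleneck model - invented numbers, only the shape matters.
    def effective_fps(cpu_fps, gpu_fps_at_ref, ref_pixels, pixels):
        # CPU work (AI, physics, entities) barely changes with resolution;
        # GPU fps falls roughly in proportion to the pixel count.
        gpu_fps = gpu_fps_at_ref * ref_pixels / pixels
        return min(cpu_fps, gpu_fps)

    ref = 1024 * 768
    for w, h in [(1024, 768), (1600, 900), (1920, 1200)]:
        fps = effective_fps(cpu_fps=30, gpu_fps_at_ref=60, ref_pixels=ref, pixels=w * h)
        print((w, h), round(fps))  # CPU-bound at low res, GPU-bound at 1920x1200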
March 2, 2011 4:53:10 PM

Yes, it can be clocked up to 3.3(!) GHz (not bad for a processor whose stock speed is 1.6GHz :-)). But for some reason the original Crysis and Crysis Wars did not like that (sound coming before the video, white screens and other bizarre errors).

So. My motherboard does not support Intel i3/i5/i7 processors, but it does support E7xxx and E8xxx processors. If you were me, would you just upgrade the E2140 to an E7xxx or E8xxx processor (and of course replace the memory as well, since it needs to be faster), or would you replace the entire motherboard too? My money is limited, plus I hate installing Windows - you can replace the processor and memory without reinstalling Windows XP, but when you replace the mobo as well you need to reinstall it. And what about AMD processors? They seem to be roughly half the price?! (I might be comparing the wrong Intel and AMD procs.) What AMD processor would I need (what would be equivalent to an Intel E7xxx or E8xxx)?

Thanks a lot
March 2, 2011 5:31:48 PM

Wampbit said:
No; I mean inferior technology. Games are designed around the hardware best suited to the job; for high-end shader- and tessellation-intensive games this has always been Nvidia. There is no excuse for AA to cause the massive performance drop it does on ATI cards; at 0xAA they can get ahead, but as soon as you ramp up AA (which I consider absolutely vital to a good-looking game) the performance drop is in another league. There are some games where ATI's approach pays off, an example being Just Cause 2; however, as soon as you go up to 4xAA (at 1920x1200) the 6970 loses its advantage over the 570 and they're neck and neck, with the 480 pulling ahead of both.

Furthermore, the lack of PhysX does ATI no favours - I can't think of anything vital that ATI has to its name right now. Eyefinity isn't bad if you want a multi-display system, and Nvidia should catch up there. Other than that, nothing stands out as interesting.

Historically I haven't seen any top contributions from ATI/AMD either; they make great-value upper-mainstream/lower-enthusiast cards, but beyond that they rarely get the advantage.



Um, where do you get your facts from? Because everything in here is strictly opinion. Games being designed for Nvidia architecture... you know, the ones that say "NVIDIA: the way it's meant to be played", as well as the ones endorsing ATI..... Dragon Age will be better suited to those cards because of how it was developed. Surprisingly, Nvidia and ATI both give money to developers to support their products... Do you really think those endorsements tell you which hardware is genuinely best suited? It's the same thing as product placement.

Look at the benchmarks out now with SLI/CrossFire scaling..... inferior cards? Two 6950s in CrossFire outperform a GTX 580 (same price point), just as two 560s outperform a 580 (all around $500). Sure, Nvidia has the single-card/single-GPU crown at the moment, but it won't always be the case, and it hasn't always been: from the Radeon 9500 until Nvidia launched the 8800 series, ATI was the clear leader in graphics for about two years.
March 2, 2011 5:33:54 PM

Sorry, didn't mean to hijack your post :(

It will be playable on most systems..... it hurts to say this as an avid PC gamer, but this game wasn't optimized for the PC; it was made as a console port. Before you worry about system specs, wait for RAGE and Battlefield 3 to come out - both made for PC before console.
March 4, 2011 1:45:19 AM

aambrozai said:
Yes, it can be clocked up to 3.3(!) GHz (not bad for a processor whose stock speed is 1.6GHz :-)). But for some reason the original Crysis and Crysis Wars did not like that (sound coming before the video, white screens and other bizarre errors).

So. My motherboard does not support Intel i3/i5/i7 processors, but it does support E7xxx and E8xxx processors. If you were me, would you just upgrade the E2140 to an E7xxx or E8xxx processor (and of course replace the memory as well, since it needs to be faster), or would you replace the entire motherboard too? My money is limited, plus I hate installing Windows - you can replace the processor and memory without reinstalling Windows XP, but when you replace the mobo as well you need to reinstall it. And what about AMD processors? They seem to be roughly half the price?! (I might be comparing the wrong Intel and AMD procs.) What AMD processor would I need (what would be equivalent to an Intel E7xxx or E8xxx)?

Thanks a lot


If money is a limiting factor, then go for the E7500/E7600 (~$100-130). The money you save can go towards more RAM; another 2GB would be great (~$30). These Wolfdales have a high multiplier (11/11.5x) and a low bus frequency (1066MHz), ideal for overclocking to 3.6GHz and beyond.

Or you can grab an E8400 (~$130-140) if you don't feel like extreme overclocking. The retail price is just a tad higher than the E7600, but it has double the L2 cache (6MB as opposed to the E7xxx's 3MB). Of course, you can still push it above 3.xGHz, but because it has a lower multiplier (9x) and a higher FSB (1333MHz), your motherboard must be a very good one to tame it.
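
For reference, the core clock is just the FSB base clock times the multiplier (those 1066/1333MHz FSB figures are quad-pumped 266/333MHz buses), so the numbers above check out roughly like this:

    # core clock = FSB base clock x multiplier
    # (1066MT/s and 1333MT/s FSBs are quad-pumped 266MHz and 333MHz buses)
    def core_clock_ghz(fsb_mhz, multiplier):
        return fsb_mhz * multiplier / 1000.0

    print(core_clock_ghz(266, 11))   # E7500 stock, ~2.93GHz
    print(core_clock_ghz(333, 9))    # E8400 stock, ~3.0GHz
    print(core_clock_ghz(333, 11))   # E7500 with the FSB pushed to 333MHz, ~3.66GHz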

In my opinion, scrapping your Core 2 build altogether and migrating to an AMD system would cost you more money than upgrading the CPU/RAM at the moment (plus the hassle of reinstalling everything). Later, if you have enough dough to go hexa-core (~$180-200), with DDR3 (~$40-60) and everything else (~$80-150), then go ahead. Don't worry about the Core i3/i5/i7 series. With a decent Wolfdale processor and a bit of overclocking, your system can still hold its own against the newer ones in gaming for months, maybe even a year.

I myself use a 2.93GHz Wolfdale (E6500) clocked to 4.2GHz, and all the newer games like Bulletstorm, Dead Space 2 and even the Crysis 2 demo run butter-smooth in full HD (well, the latter is somewhat limited by my graphics card, but very playable on the Advanced profile).
March 4, 2011 12:25:20 PM

Quote:
If you have to OC a CPU to 4GHz to play games, then something tells me you've got a good GPU but you bought it for your ego, not to power your screen resolution. The higher the resolution, the more of the work falls on the GPU. Your CPU might bottleneck at a low resolution, but not at a high resolution. Most bottlenecks occur not because the CPU is too slow but because the GPU can run away from it due to the light workload at a low resolution. Any Core 2 or AMD X2 CPU and up clocked at 3GHz will happily run most games. Paying $130 for another Core 2 makes no sense; you might as well get an AMD X4 or a tri-core. An upgrade from one dual-core to another dual-core is a waste.


I must admit I had a hard time digesting some of the things you wrote there...

Let me tell you a bit about why I overclock the CPU that far.
I've gone through lots of upgrading. The first pairing I had was an E2160 OC'ed to 3GHz and a 3870; it was good and balanced, but at that time I was using a 1440x900 19" monitor. Then I bought a new 1920x1080 23-incher. The GPU struggled to produce a playable fps in COD4 at the native resolution.

I upgraded to a 5850, which is much better at running COD4, but then I installed Battlefield: Bad Company 2 and the CPU couldn't keep up. Stutters and frame skips happened here and there. The fact that my GPU doesn't support hardware PhysX only put an even harder load on my CPU. It's at 3GHz, by the way.

I found out that the cache and processor architecture of a dual-core Pentium is different from a Core 2 Duo, so I got the cheapest Wolfdale I could find: the E6500 at 2.93GHz. It turns out the doubled L2 cache provided enough fps juice, though still not a smooth 60fps, and since I couldn't dish out more money for a quad-core at the time I stayed with the dual and overclocked it to the maximum it could take. And the fps is smooth now. So I've been through some horrible bottlenecks, and it was all to deliver playable fps to my monitor. I don't see anything wrong with that.

As for the upgrade recommendation, aambrozai said that he's on a budget, so I believe any quad-core with a minimum price of $163 (http://ark.intel.com/ProductCollection.aspx?familyId=28...) is not an option. A bigger-cache dual-core would help him achieve higher fps in Crysis 2 and balance the already powerful GPU with the CPU. In my opinion, it makes even less sense to buy an entirely new system based on an AMD X4 or X3, and it wouldn't help the budget situation.

I agree that any Core 2 @ 3GHz can run games, but at what detail level? Remember that the E2140 is not a Core 2 Duo, and it has a mere 1MB of L2 cache. Provided you have a powerful GPU, it's best to avoid a bottleneck by clocking the CPU as fast as you can, so the GPU can maximize pixel output without waiting.
March 5, 2011 3:45:29 AM

Quote:
So a better CPU will provide much more performance in games than a better GPU will? You will only notice the difference from a CPU change if you run low resolutions like 1280 and lower. At 1080p with a CPU overclocked to 3GHz and above, the gap closes to virtually none.

But maybe you could've saved some time and money and looked around, because the 5870 and the 5850 had some serious driver problems with COD4. It had problems from its release. With Catalyst 10.2 people were still complaining about issues. So it's not the CPU, it's not the cache, it's just the drivers. As the drivers improved, so did your performance. I remember people running the 5870 with COD4 on an i7 at 3.5GHz complaining about issues that turned out to be drivers.
If you do a quick Google you will see it: AMD GPUs and COD4. DRIVERS.


No, the better way to put it is that a better CPU can help maximize the minimum and maximum fps you can get from a graphics card, while a low-end CPU paired with a powerful GPU will only cap the fps because of CPU scaling; and when things like driver issues happen, it's always better to have an extra fps or two to offset the lag.
Indeed, drivers were one of the contributing factors to the COD4 sluggishness, but I had already bought the card, and it was kind of impractical to swap it for, let's say, a GTX 460 at the time.

Same thing with aambrozai there. Are you suggesting he upgrade his GPU instead of the CPU to get better fps in Crysis 2?
If I look at the http://www.tomshardware.co.uk/charts/desktop-cpu-charts... CPU chart,
and narrow it down to comparing the 2.93GHz E6500 with the 3GHz E8400: http://www.tomshardware.co.uk/charts/desktop-cpu-charts...[4428]=on&prod[4434]=on
I see a gap here, a BIG one. That's 23fps, or 22.5% more fps! Same dual-core Wolfdale architecture, only a different amount of L2 cache. The ~70MHz clock difference is negligible in this situation.
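
(Quick arithmetic behind that 22.5%; the absolute fps values are back-derived from the 23fps gap above, so treat them as approximate:)

    # Back-derived from the 23fps / 22.5% figures above, so approximate.
    e6500_fps = 102   # 2.93GHz Wolfdale, 2MB L2
    e8400_fps = 125   # 3.0GHz Wolfdale, 6MB L2
    gain = e8400_fps - e6500_fps
    print(gain, round(100.0 * gain / e6500_fps, 1))  # 23 fps, ~22.5%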

I predict the same thing will happen if aambrozai pairs his E2140 with a high-end card, let's say a GTX 560 Ti. The GPU will be held back by the lack of CPU L2 cache, even if he overclocks it past 3GHz, and he'd lose more than 25% of the performance in that scenario.
From the budget standpoint, upgrading to a capable GPU requires at least $200, while upgrading the CPU only takes $100-140 max.

P.S.
There ARE some games out there that totally ignore CPU performance, such as Metro 2033 - no matter how slow a CPU you use, you'll still get the maximum fps out of your graphics card. But as far as I know, CryEngine is definitely not one of them.
March 5, 2011 4:51:29 AM

swifty_morgan said:
read it and weep........... highly recommended...


* Minimum: 2GHz Core 2 Duo / A64 X2 CPU, 2GB RAM, 8800GT / HD3850, 512MB Video Memory, DX9.0c, Shader Model 3.0, Windows XP, 20fps @ 1024 x 768

* Recommended: 2.66GHz Core 2 Duo / A64 X2 CPU, 3GB RAM, GTX280 / HD4870, 1GB Video Memory, DX9.0, Shader Model 3.0/4.0, Windows XP, 30fps @ 1680 x 1050

* Highly Recommended: 3GHz Core i7, 4GB RAM, GTX560Ti / HD4870 X2, 1.8GB Video Memory, DX11, Shader Model 3.0/4.0, Windows 7, 30fps @ 1920 x 1200


How will my Core i7 930, 6GB RAM, and two HD 5870s in CF handle Crysis 2?
March 5, 2011 5:33:01 AM

Quote:
OK, fair enough.
One question: at what resolution were those CPU tests run?
What is his native resolution?
My main point is not that his CPU isn't on the slow side, but that $140 to go from one dual-core to another dual-core is not worth the money. For $140 he can buy himself a tri-core AMD CPU already clocked past 3GHz plus a mobo, and it will do much better, as he'd have a third core to run background applications. Now that's a step up performance-wise, plus an extra core.

Now that's just an example, before another CPU-vs-CPU war flares up.

If your GPU is pulling out from under your CPU, try to put more of a workload on it. How do you do that? Get a monitor with a higher resolution and enable higher settings that stress the GPU, not the CPU. It's burning rubber at a huge pace and you want to slow it down. That's why they always say:

Buy a GPU to power your monitor, not your ego.


Most of today's PC games are finally starting to use quad-core processors. I remember about four years ago when people argued that dual-cores were the better option because "games won't use quad-core processors for another ten years". Obviously they were way wrong.

I would imagine that within a few years games will use six- and eight-core processors. The Core i7 has Hyper-Threading, so this will help fill the performance gap when games use more than four threads. I should have waited and bought Sandy Bridge and the second generation of DX11 graphics cards, but I was too impatient and built a Nehalem rig with the HD 5xxx series.
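
If you're curious how many threads your own CPU actually exposes to games, here's a quick check (psutil is a third-party package, so that part is optional):

    import os

    print("logical CPUs:", os.cpu_count())   # includes Hyper-Threading

    try:
        import psutil                        # third-party, optional
        print("physical cores:", psutil.cpu_count(logical=False))
    except ImportError:
        pass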
March 5, 2011 7:09:19 AM

Quote:
My friend, if you unlock the true powers of Nehalem then you will love it. Remember, Sandy Bridge users of the i5/i7 series have limitations which you don't have. That's huge f-ing bandwidth: you've got triple-channel RAM. Now, a PC is only as fast as its slowest part. You can run RAID configs with a PCIe add-in controller, 3x 580s and a quality sound card without problems; you've got the bandwidth to do it and a very good IMC. That's what the 1156 and 1155 users don't have. People seem to make the mistake of dropping their perfectly good rigs and jumping on a new platform bandwagon because of some cherry-picked chips' performance in a few benchmarks. You've already got the CPU. There's no game you can't play; all you need to do is build on your GPU setup and balance the rest of your system. 3x 580s with your CPU will beat 3x 580s in a Sandy Bridge setup. As long as consoles stay on two cores, gaming on PCs will do the same.


I thought that the SB 2600K is 35-40% faster than even the 6-core Nehalem.

What about the 8-core Sandy Bridge?

How will my two HD 5870s (1GB) handle Crysis 2? I can run the first Crysis on very high/1920x1080 and get 30-50 fps.
March 6, 2011 4:14:03 PM

ambam said:
How will my Core i7 930, 6GB RAM, and two HD 5870s in CF handle Crysis 2?



I don't think it will....... LOL
March 7, 2011 1:42:39 AM

I've seen tons of gameplay footage from Crysis 2, and I would have to say, conclusively, that the original Crysis has FAR better graphics than Crysis 2. Even on the PC, Crysis 2 looks like a high-end Xbox 360 game. The original Crysis was never released for the consoles because the hardware couldn't run it.
August 16, 2012 6:44:01 PM

This topic has been closed by Mousemonkey