
Very High -- Crysis 1280x1024

Last response: in Graphics & Displays
June 19, 2009 11:29:25 PM

So I was just wondering: since these days people don't like to benchmark at 1280x1024 (I can't use a bigger monitor; it hurts my eyes), how much GPU power do you need to max out Crysis Warhead at 1280x1024? Since I'm always looking to upgrade my PC (not that I have the money), I was wondering what I'd have to get to achieve this. Also, I'm not too interested in the eye candy, but hey, if I have a free 20 FPS, why not?


June 20, 2009 1:03:46 AM

A 4850 would do it fairly easily. Perhaps not with full AA, but definitely with all settings on Very High.
June 20, 2009 1:16:14 AM

Hmm, that's good. I was considering a 4850 or a GTS 250, so I like what I hear. Speaking of Crysis, I won't get the current one, but does anyone know when Crysis 2 is expected?
June 20, 2009 1:19:00 AM

Here's a review of a 4850 running Crysis at your resolution; I think you would need more than that.

http://www.tweaktown.com/reviews/1471/sapphire_radeon_h...

If you can't find your exact resolution in a review, you might find a 16XX x XXX resolution that handles close to the same number of pixels as yours.
June 20, 2009 1:25:08 AM

Vista for me, but I don't think my budget can reach out and pick up a 260. Maybe if Daddy steps in..... :D  Ahh, teenage years...
June 20, 2009 5:18:49 AM

Go for the GTS 250 or the 9800GTX+.
Both are the same...
June 20, 2009 7:47:07 AM

It's just not going to happen. I just read a review of a 4890 where they had to back the AA off to 4x and only had it on Gamer settings, so there is a whole level of settings to go yet.

Mactronix
June 20, 2009 11:55:09 AM

Seeing as the OP said eye candy isn't needed, the 4850 will probably do fine. At full/max it was already around 22 FPS; I'm sure moving some sliders down will get him above 30 FPS. Not to mention that link was 6 months old, and I'm sure patches and driver updates have given it a boost since then. If the 9800GTX/GTS 250 runs it faster, he should look for one of those.

BTW, 1280x1024 = 1,310,720 pixels, 1600x1200 = 1,920,000, 1680x1050 = 1,764,000, and finally 1440x900 = 1,296,000. In terms of number of pixels, 1440x900 is much closer than either of the 16xx resolutions.
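The pixel-count comparison above is easy to verify. A minimal sketch in Python, with the resolution list hard-coded from the post:

```python
# The pixel counts quoted above, computed directly. The goal is to find which
# common benchmark resolution is closest to 1280x1024 in total pixels.
resolutions = {
    "1280x1024": 1280 * 1024,  # 1,310,720
    "1600x1200": 1600 * 1200,  # 1,920,000
    "1680x1050": 1680 * 1050,  # 1,764,000
    "1440x900":  1440 * 900,   # 1,296,000
}

target = resolutions["1280x1024"]
others = {name: px for name, px in resolutions.items() if name != "1280x1024"}
closest = min(others, key=lambda name: abs(others[name] - target))
print(closest)  # 1440x900 (only ~1% fewer pixels than 1280x1024)
```

Since 1440x900 renders only about 1% fewer pixels than 1280x1024, its benchmark numbers are a reasonable stand-in.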
June 20, 2009 12:55:15 PM

Isn't he going to need a fast CPU as well, playing at such a low resolution?
June 20, 2009 3:53:44 PM

Well, if it matters, I have an Athlon 64 X2 4800+. It's not too slow, I guess; it could be better, but the BIOS locked out overclocking, so I'm stuck with it. And yeah, I'm thinking the GTS 250 might be better, because I remember seeing that Nvidia does better with Crysis.
June 20, 2009 4:30:22 PM

I thought at low resolutions the PC needs more CPU power.
June 20, 2009 5:02:14 PM

invisik said:
I thought at low resolutions the PC needs more CPU power.

I think the CPU usage stays pretty much the same at all resolutions; it gets a little higher at higher resolutions, though. I'm not 100% sure.
June 20, 2009 5:56:30 PM

I don't think you can make any general rule about CPU utilization and resolution. It depends on what bottleneck you create or relieve, and on the game itself.
June 20, 2009 10:52:05 PM

By running at a lower resolution, your GPU doesn't work as hard, correct? If so, it's able to push a lot more frames, giving you more FPS. This puts a strain on the CPU to keep up. If the CPU is too slow, then the game isn't going to get any faster. You'll still be able to play; it's just that the bottleneck has moved from the GPU to the CPU. Max is right.
June 20, 2009 10:59:03 PM

You won't max out Crysis at that resolution with a 4850.

I have played at 1440x900 with a 4850 and got about 35 FPS on High.

A 4870 1GB might do it, and so might a GTX 260, but with anything less you will have to lose quality or put up with lousy FPS.
June 20, 2009 11:35:43 PM

At lower resolutions, a higher CPU clock speed is needed or your video card will be bottlenecked.

Just FYI, they had a QX9770 at 4.0GHz, and the GTX 280 they were testing was still bottlenecked at 1280x1024.

How do you know if it's bottlenecked? If you stop seeing gains as you lower your resolution, your CPU is bottlenecking your GPU.
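The plateau test described above can be sketched with a toy model in which frame rate is capped by whichever of CPU and GPU is slower. Both constants are invented for illustration, not measurements of any real hardware:

```python
# Toy model of a CPU/GPU bottleneck: the achievable frame rate is the
# minimum of what the CPU can prepare and what the GPU can render.
CPU_FPS_CAP = 60.0        # frames/sec the CPU can prepare (hypothetical)
GPU_PIXEL_BUDGET = 80e6   # pixels/sec the GPU can render (hypothetical)

def estimated_fps(width: int, height: int) -> float:
    gpu_fps = GPU_PIXEL_BUDGET / (width * height)
    return min(CPU_FPS_CAP, gpu_fps)

for w, h in [(1920, 1200), (1680, 1050), (1280, 1024), (1024, 768)]:
    print(f"{w}x{h}: {estimated_fps(w, h):.1f} FPS")
# In this model, FPS stops rising once the GPU-side rate passes 60 frames/sec:
# that plateau at lower resolutions is the CPU bottleneck the post describes.
```

Real bottlenecks depend on the game and scene, but the shape of the curve (gains that flatten out as resolution drops) is the signal the post is pointing at.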
June 21, 2009 12:33:15 AM

If you want to play Crysis on Very High, then go Nvidia and at least get a GTS 250. A GTX 260 (192 SP) should do it pretty well. Crysis doesn't really like ATI GPUs. Anyway, the next Crytek engine will be designed with a more multi-platform concept, so it should be less demanding and more efficient. It's impossible to say just how current cards will perform on it.
June 21, 2009 10:57:51 AM

Oh, we are on this old chestnut again, are we :) 
I'm not going to go into it all again; strangestranger has certain views on it which are technically valid as far as I can see, and we have talked this over more than enough.
However, I agree with what invisik, 4745454b and eklipz330 posted.
It's not that the CPU would try to run faster to keep up (well, an i7 would if Turbo mode was on).
It has a set speed, and when the GPU is capable of drawing and rendering frames faster than that, the restriction in the machine as far as FPS goes is the CPU.
That isn't even a bad thing per se; as long as you have a playable frame rate when you hit this restriction, the extra speed the GPU has found at the lower resolution can be translated into higher quality settings.

Mactronix
June 21, 2009 11:19:21 AM

You need at least a 4870 or a 9800 GTX+... I am not kidding.

If you want to play at 1920x1080, you need at least something really strong like a 4870X2 or two GTX 280s in SLI to really max it out... and even then, forget AA; it's already a lost battle for the little it can give you.
June 21, 2009 11:21:24 AM

megamanx00 said:
If you want to play Crysis on Very High, then go Nvidia and at least get a GTS 250. A GTX 260 (192 SP) should do it pretty well. Crysis doesn't really like ATI GPUs. Anyway, the next Crytek engine will be designed with a more multi-platform concept, so it should be less demanding and more efficient. It's impossible to say just how current cards will perform on it.


True statement; we don't know how DX11 cards will perform. If you can, buy the cheapest solution until you can get a glimpse of the new era coming this autumn.
June 21, 2009 11:43:23 PM

Well, see, when I say max it out, I can give up certain things. It doesn't have to be absolutely everything on Very High, but I want things like textures to at least be maxed.
June 22, 2009 6:43:02 AM

uncfan_2563 said:
So I was just wondering since these days people don't like to benchmark 1280x1024 (i can't use a monitor any bigger, hurts my eyes)


Monitor size___usage
19" __________no use, is only suitable for the garbage.
22" __________for office use, nothing more.
26" __________you can play some games.
30" __________perfect for games.


Edit:
These are facts.

June 22, 2009 11:28:49 AM

Tom's still tests at 1280x1024 in articles such as the SBMs. In Crysis (not Warhead), the GTX 260 and HD 4870 are not up to the task of DX10 with everything on Very High.

OC'ed GTX 260 isn't enough:
http://www.tomshardware.com/reviews/core-2-overclock,23...

The HD 4870 is a no-go; the HD 4870 X2 is capable given enough CPU:
http://www.tomshardware.com/reviews/amd-cpu-overclock,2...

Of course, not everyone agrees on exactly what counts as playable performance. Most of the game may be fine if settings get dumbed down for the worst areas.
June 22, 2009 1:01:31 PM

successful_troll said:
Monitor size___usage
19" __________no use, is only suitable for the garbage.
22" __________for office use, nothing more.
26" __________you can play some games.
30" __________perfect for games.


Edit:
These are facts.

Umm, I think it's still very playable at 19". I can't use that big of a monitor because 1. I'd rather upgrade my PC before getting a bigger display, and 2. I do a lot of programming, so when I look at a bigger monitor for an extended period of time, my eyes tear and burn.
June 23, 2009 7:36:03 AM

uncfan_2563 said:
Umm, I think it's still very playable at 19". I can't use that big of a monitor because 1. I'd rather upgrade my PC before getting a bigger display, and 2. I do a lot of programming, so when I look at a bigger monitor for an extended period of time, my eyes tear and burn.

If you do programming, a big monitor is essential so you can see more code per page.
You are just making up excuses, so go buy a decent monitor and stop being so cheap.
June 23, 2009 9:11:34 AM

uncfan_2563 said:
Umm, I think it's still very playable at 19". I can't use that big of a monitor because 1. I'd rather upgrade my PC before getting a bigger display, and 2. I do a lot of programming, so when I look at a bigger monitor for an extended period of time, my eyes tear and burn.


I partly agree with troll, because 30-inch monitors suck at gaming; otherwise, 19s suck even more, and 23 or 24 inch are the best. Why?

30-inch panels have bad contrast, bad response, and bad refresh rates. For example, the Samsung P2370 has an LED backlight, which 30-inch panels don't, and 50k dynamic contrast. I have seen that monitor, and also Apple's 30-inch Cinema Display; I played the same games on both, and guess what? The 30-inch really sucks compared to the Samsung P2370 or my T240.
June 23, 2009 11:54:06 AM

rescawen said:
30 inch monitors suck at gaming

I do not agree with that.
I have a Samsung T260 (26 inch) and I can say that it is small for gaming.

June 23, 2009 4:24:18 PM

Hey! Read my sig if you have to: DON'T FEED THE TROLLS!

I'm a little confused by Uncfan. In his first post he says he's not that interested in "the eye candy", then goes on to say he wants "things like Textures to atleast be maxed." So how maxed is maxed? Is there a list, or do you just want things to look good? Again, if you just want it to look good, the 4850/GTX 260 is good enough to make most games look good. Crysis and a few others might need a bit of tweaking, but other than that...
June 23, 2009 4:42:45 PM

I'm 15 and don't have a job; I'm not cheap at all. The whole programming thing isn't an excuse either; I sit really close to my monitor, so big monitors kill my eyes. And seeing a lot more code isn't at all essential. Seriously, it doesn't matter. Why would you need to?
June 23, 2009 4:43:33 PM

And by eye candy I meant effects and filters (HDR, anisotropic filtering, AA, shadows, etc...).
June 23, 2009 5:58:38 PM

I used to code back when I was in college. I hated having to do it at the school: 17" CRTs, and only one of them. At home I was using a 21" and a 17" CRT. This was great, as I could have many windows open at the same time, thus increasing the speed of my programming. (Imagine having to look up a bit of code online and not having to minimize or maximize every time you wanted to see it; I could look at the example code and my own at the same time, side by side.) For coding, you don't need a fast GPU, just a lot of desktop real estate. Gaming, however, chokes if you try to push too many pixels. Having a single good monitor is what you want. A good-sized monitor with a fast response time and good specs will look a lot better than a larger monitor with poor specs.

Quote:
And by eye candy I meant effects and filters (HDR, anisotropic filtering, AA, shadows, etc...).


So you want to turn on the details, but you don't need HDR, AA/AF, etc.? What you want will depend on the game engine. Some games look fine with no AA, while in others you'll need at least 2x. I always try to turn on 2x AA, even if I have to lower textures or turn shadows down. Again, if you get the best GPU you can, you'll have the best shot at making your games look their best. Coding is nothing, so don't worry too much about that (other than getting a second monitor...).
June 23, 2009 6:22:38 PM

4745454b said:
Hey! Read my sig if you have to, DON'T FEED THE TROLLS!

The voice of REASON: listen to the trolls!!!

Because trolls always speak the truth!!!

June 24, 2009 11:57:26 PM

Wth is up with this troll thing?? I think I missed that O.o

Edit: I just looked at HD 4870 prices on Newegg, and if I spend another 10-15 dollars I can get one. Should I get that over the GTS 250?
July 5, 2009 7:57:41 PM

Personally I would go with the 250, but just make sure you get the 1GB version. I can get around 60-75 FPS with everything on Enthusiast and 4x AA playing Crysis Warhead in SLI. With just a single 250 I only get 35-45 frames on average.
July 5, 2009 8:07:03 PM

You don't need the 1GB version, since the GTS 250 won't use it... unless you have two of them in SLI at a much larger resolution.

Just get a 9800GTX+/GTS 250 or a 4850; you can play with everything on Very High and AA off, or everything except post-processing on Very High with 4x AA.
July 6, 2009 2:31:47 AM

But I think a 4870 is a better choice, because even if Crysis loves Nvidia, the 4870 performs better in other games.
March 4, 2010 11:32:29 AM

Would an HD 5850 max out Crysis fully at 1280x1024? I also have a tri-core clocked at 3.5GHz.
January 26, 2012 7:32:32 AM

I'm new to benchmarking generally and not much of a gamer. I bought Crysis purely as a benchmarking tool. I played it with a 34 FPS average and thought it was as good as anything on my old Xbox 360, so why people say it is unplayable at that speed I don't know.
An old Custom PC mag had the best cards at the time (mid 2008) struggling along at about 25-30 FPS, admittedly at higher resolutions.

I play it on a 17" monitor at 1280x1024 on Very High settings with 2x AA. My card is a GTX 260 216, and I benched it with the Phenom II CPU as a dual, triple and quad core at 2.7GHz, 3.2GHz and 3.5GHz. It barely made any difference, although it did seem a little jerky a couple of times on the dual core at 3.2GHz and 2.7GHz. The minimum frame rate dropped to 18 FPS at 2.7GHz, but max and average were similar across the other CPU settings.

I will post results on my site. I might get a slightly bigger monitor in the future; I've played games on my 32" HD TV, but that is overkill when I'm at my desk. I think a 21"-23" would be optimum for my needs.
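For anyone reproducing this kind of run, the min/avg/max summary used above can be computed from per-frame render times. A minimal sketch, with frame times invented for illustration rather than taken from any real benchmark:

```python
# Summarize a benchmark run as min / average / max FPS from per-frame
# render times in milliseconds (the values below are made up).
frame_times_ms = [28.0, 30.5, 25.0, 55.5, 29.0, 31.5, 27.5]

fps = [1000.0 / t for t in frame_times_ms]
print(f"min {min(fps):.0f} / avg {sum(fps) / len(fps):.0f} / max {max(fps):.0f} FPS")
```

Note that averaging instantaneous FPS overweights fast frames; many benchmark tools instead report frame count divided by total elapsed time.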
January 26, 2012 7:50:59 AM

You shouldn't necro threads...