As far as I can tell there are only three titles that have a native 64-bit client:
Far Cry 2
Crysis
Half Life 2
So everyone has a 64-bit processor, but the lack of 64-bit games is daunting. Yes, I realize that many gamers have Vista 32-bit installed because they didn't even know they had a 64-bit option. And I realize that many people don't understand the RAM limits of a 32-bit OS. But still, why so few native 64-bit games?
Do most current games just not need more than 2GB of RAM?
A valid question. I do have Vista 64 and it performs just fine; those games play fine on it. I'd like to know why they don't have more games that are 64-bit capable too. I do use 4 GB of memory, but I don't think that's required; it just makes Vista run more smoothly, that's all.
Have you noticed any performance difference between the 32-bit and 64-bit clients for Crysis and HL2?
If you don't notice much difference (if any), then it may not be in the developers' best interest to build both 32-bit and 64-bit clients. Many people still have 32-bit OS (like me). Resources could be better spent on squashing bugs and bringing games to market as quickly as possible w/o compromising quality.
I haven't done a direct comparison between 32-bit Crysis and 64-bit Crysis under DX10, so I don't really know. I am surprised I was able to maintain a pretty consistent 50-60 fps in Crysis using the 64-bit client with DX10 and 4870 CF, with graphics at HIGH, 1920 x 1080, and 4x AA, and no stutters at all. But maybe this would happen on 32-bit Vista too? I just don't know.
I'm not sure how resolving bugs or the speed at which a game comes to market is relevant to 64-bit development. By definition, bringing a game to market faster increases the likelihood of bugs -- quality assurance (testing) is a slow, methodical process, so you can't have both. You're asking for the impossible.
But I think you're missing an important factor: coding for a 64-bit environment is NOT a massively daunting task. It takes a little work, but not much. We're NOT talking about a completely different code base; it's the same code base. Depending on how the application is written, it's often just a matter of compiling once for 64-bit and again for 32-bit. I've coded and deployed 64-bit applications using VS 2008; it's not terribly difficult.
Also, if you develop for a 64-bit environment, options open up to the developer, such as no 2GB RAM limit -- you can preload huge chunks of a map, 3D matrices, textures, etc. into RAM, and the end result is that moving from area to area in a 3D world doesn't require reading from the hard disk: no lag, delay, or stutters, because the necessary data is already preloaded into RAM.
Lots of possibilities open up for the game developers to help keep the game flow smooth.
The reason why is pretty simple, I think: 32-bit programs run on 64-bit OSes just fine because, as long as you program to the correct APIs, the processor couldn't care less whether the binaries are x86 or x64, and the OS handles the necessary address redirection. So why bother? By writing purely for 64 bits, you'd exclude the bulk of the marketplace from using/buying your product.
The difference isn't all that massive from a programmer's standpoint - you'd dump the code from your development tools into a 64-bit compiler, switch it on, and go home for the night while the thing does its work. BUT the issue then becomes that you have double the QA work, and now have two sets of binaries to maintain and support. Companies may decide it's worthwhile sometime in the (hopefully near) future. Certainly enthusiasts appear to be going 64-bit faster than a lot of companies/analysts may have predicted. But for the time being (this generation), I doubt enough of a mass-market user base exists yet for it to be deemed worthwhile to take on the expense of maintaining and supporting two versions.
I don't know...as you said, there doesn't appear to be much reason to go 64-bit at the moment. However, it is possible that titles in the next couple of years will become more heavily dependent on 64 bit architecture.
Considering there was no additional cost to go with 64 bit, i chose that for my build last year to 'futureproof' it. (Just have to be careful and make sure all your hardware is 64 bit compatible).
Think you're missing some key points of software development; it doesn't work the way you describe. The reason to bother is that 64-bit lets a developer use more than 2GB -- 2GB is the user address space limit for an application process on a 32-bit OS.
But like I said, 90% of the market has 64bit processing power, but they install a 32bit OS.
I'm pretty sure game developers could utilize more than 2GB if the game's specification permitted it - they wouldn't need to load/unload chunks of data from the hard disk; they could just load it all into the application's address space (which can be much larger on an x64 OS).
I can't think of any game where, if the door to more RAM were open, it wouldn't be used; every developer would use as much RAM as permitted.
...and the reason not to bother is your 64 bit app won't run on a 32 bit OS. By making your app 64 bit, you are excluding the majority of the marketplace - You figure 95% of all Windows users are still running 32 bits. And flat out - You can run 32 bit programs on a 64 bit OS, but the reverse is NOT true. So your 64 bit app applies to less than 5% of your market. Nobody in their right mind is going to make a large effort to support such a small segment of the marketplace.
Now, you can write a 32-bit game/app to be Large Address Aware and it will be able to use more than 2GB of RAM. Developers often decide not to because 32-bit Windows defaults to 2GB of app space and 2GB of OS space for the addresses anyhow. So writing the extra functionality into the game doesn't buy much for 32-bit systems unless the user knows about the /3GB switch. That drops the space allocated to the OS to 1GB and allocates the other 3GB to applications. The problem with that is that if the OS needs more than 1GB of address space, then when it runs out, your computer will crash. Meaning, installing a graphics setup that uses more than around 750 MB of video memory will cause blue screens.
So how is that solved? Write a 32-bit app that's Large Address Aware. It'll run on everyone's computer and get a little extra headroom on a 64-bit OS. Problem solved: you get to keep your current tools, you don't have to support two versions of the same software, *AND* you get to include a "64-bit mode" in your game.
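For reference, opting a 32-bit build into large addresses is just a linker setting (the flags below are the real MSVC and MinGW switches; `game` is a placeholder project name):

```
REM MSVC: set the IMAGE_FILE_LARGE_ADDRESS_AWARE bit on a 32-bit EXE
link /LARGEADDRESSAWARE game.obj

REM MinGW equivalent
gcc -m32 -Wl,--large-address-aware -o game.exe game.o
```

A binary flagged this way still gets only 2GB on default 32-bit Windows, up to 3GB with the /3GB switch, and close to 4GB when run on a 64-bit OS.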
Hmm, interesting stuff. I hadn't really known about/considered the 2GB/2GB split before, but it makes sense. I was running Civ4 the other day and my RAM usage got up over 80%, and the game started to slow down and stutter. Since my OS usually takes 25-30%, I'm guessing the game was pushing its 2GB and causing the stuttering. I'd like to see more native 64-bit apps, but it's slow in coming. Large Address Aware seems the more reasonable way to go for now. But really, if anything, game developers need to work on getting games to scale across multiple cores more than anything.
I did some benchmarks in the past on my system (see below). You can search the internet for other results and benchmarks, but just to give you guys an idea: my results for Crysis 32-bit vs 64-bit (in Vista) were very similar. I got an extra 2-3 fps MAX with 64-bit. However, here's where I noticed real improvements.
When playing multiplayer, the maps would load almost entirely into memory (I had 2.8 GB loaded in my RAM), and although the average FPS was pretty much the same, there was much less stuttering. It played a lot smoother. Also, I noticed my PC handling physics slightly better (you know, those physics map demos that the community created). That's what I noticed while doing the benchmarks between 32-bit and 64-bit.
The conclusion I came to: FPS will be very similar, BUT there's much less stuttering and fewer loading pauses when using 64-bit.
Alex - That dovetails nicely with my own experience. I had a pretty powerful system to start with, and making the switch to 64 bits didn't make much of a difference (if any) in peak performance. *However*, the minimum performance has very noticeably increased. Leave a zone/area and come back? It loads near instantly! Hanging out with 25 of your guildmates, then the fight begins? Frame rates don't drop!
Toward that end, I've come to feel that many reviewers are doing their readers something of a disservice by not including minimum frame rates/performance numbers in their articles. If card A delivers 105 frames and B 'only' delivers 102, how does that impact the end user's gaming experience in the least? Price being equal, readers are likely to buy A... but almost no human being could tell the difference, especially given that you're stuck with a 60Hz refresh rate on the monitor anyhow.
But when you really beat the tar out of the things, what if B's minimum was 30 and A's was 15? 30 frames is plenty playable. 15? Clearly and noticeably less so. Except people who follow the review didn't get that info, and are buying the wrong card.