Multitasking Is What Is Really Affected
We assembled a simple multitasking test in order to further explore the importance of system memory size. Up to this point it has been fairly obvious that, when using only one application at a time, 512 MB of system memory can indeed be enough. This is especially true for applications that don't need much memory; as we've shown, however, it is absolutely not true when the application works with large amounts of data (e.g. high-quality textures).
Quake 4 has shown the greatest benefit from additional system memory so far, or at least it took the biggest performance hit from having too little. While 512 MB clearly was not enough at Ultra Quality, we wanted to see how much system memory is needed to run a memory-hungry application alongside the looping game.
If you have ever been to a LAN party, you have probably screamed in panic more than once during that really important game (against that 13-year-old you told everyone you could beat in Counter-Strike without even using your mouse), because all of a sudden five people decided to start downloading your uncompressed, home-recorded movie from last Christmas, and your game went from a smooth win to a stuttering loss you'll never forget. As previously mentioned, hard drive access is your worst enemy, so we decided to transfer a large amount of data via FTP (8.11 GB over a Gigabit LAN connection, to really make it sweat) while looping Quake 4 at the same time.
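To put the 8.11 GB payload in perspective, here is a quick back-of-envelope sketch of the theoretical transfer-time floor at wire speed. This is our own illustrative arithmetic, not part of the original test: it ignores protocol overhead, disk access, and CPU contention, all of which make the real transfer slower.

```python
# Theoretical lower bound on transfer time for the test payload.
# 8.11 GB and the Gigabit line rate come from the test setup; the
# rest is idealized (no TCP/FTP overhead, no disk bottleneck).

PAYLOAD_GIB = 8.11                    # size of the transferred file set
LINE_RATE_BITS_PER_S = 1_000_000_000  # Gigabit Ethernet line rate

payload_bytes = PAYLOAD_GIB * 1024**3
line_rate_bytes_per_s = LINE_RATE_BITS_PER_S / 8

min_seconds = payload_bytes / line_rate_bytes_per_s
print(f"theoretical minimum: {min_seconds:.0f} s")  # roughly 70 s
```

So even under perfect conditions the transfer keeps the network (and, without a TCP offload engine, the CPU) busy for over a minute per run, which is plenty of time to disturb a running game.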
Quake 4 With Intensive FTP Transfer
Since the game restarts before every new demo run, the green columns are all non-cached results. See the Doom 3 benchmarking section for more details on that.
We got some interesting results. First, the graph above shows two distinct kinds of performance drops during the file transfer: one due to insufficient system memory, and one due to higher CPU load. The file transfer alone consumes around 40% of CPU time on Gigabit Ethernet solutions without a TCP offload engine. This includes all of today's network controllers integrated on motherboards: they are fast, but they keep the processor busy (yes, a dual-core chip helps here).
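The roughly 40% CPU figure is easy to spot-check on your own machine while a transfer runs. Below is a minimal Linux-only sketch of our own making (not a tool from the original test): it samples the aggregate CPU counters in /proc/stat twice and reports the busy fraction over the interval.

```python
import time

def cpu_busy_percent(interval=1.0):
    """Overall CPU utilization over `interval` seconds, from /proc/stat (Linux)."""
    def snapshot():
        with open("/proc/stat") as f:
            # First line: "cpu  user nice system idle iowait irq softirq ..."
            fields = [int(x) for x in f.readline().split()[1:]]
        idle = fields[3] + fields[4]  # idle + iowait count as not busy
        return sum(fields), idle

    total1, idle1 = snapshot()
    time.sleep(interval)
    total2, idle2 = snapshot()

    delta_total = max(total2 - total1, 1)
    delta_busy = delta_total - (idle2 - idle1)
    return 100.0 * delta_busy / delta_total
```

Run it once with the network idle and once mid-transfer; the difference between the two readings approximates the transfer's CPU cost.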
Unsurprisingly, using 512 MB of system memory was even worse when transferring files. In fact, the transfer time would have been much longer if transfer speeds had not picked up considerably during the extremely long waits between demo runs. But the chart also shows that even 1 GB was not enough in this scenario.
With 1.5 GB (2x 256 MB + 2x 512 MB) or 2 GB of system memory, the FPS result was lower due to the higher CPU load, but amazingly the game felt just as smooth during the file transfer as before or after it. At the same time, the file transfer quickly reached its maximum speed and did not slow down. The same was true when running the game at Low Quality settings with only 512 MB of system memory.