SoDNighthawk

Splendid
Nov 6, 2003
3,542
0
22,780
I have a question. Let me ask you this and let's see if anyone can answer it or has an idea why this seems to be a real problem.
Computers have come a long way in CPUs, motherboards and graphics cards, as well as the jump to Windows XP that lets us use massive amounts of installed memory.

Ok, now that we know this is true, the question I have is this.
For example, take an old game called Battlezone, originally written for the Windows 95 operating system. The system requirements, as described on the original retail box I have in my hand, are: a Pentium 120 MHz processor (Pentium 166 MHz recommended) for a minimum of 640x480 graphics resolution with textures on, 16 MB of RAM, and DirectX 5 or better.
This game, old as it is, works perfectly in Windows XP Home or Pro. My old IBM 500 MHz, 512 MB Win 98 system with a Voodoo 4 graphics card ran it great, in fact perfectly, with high frame rates.
The KEY thing here is that the game would use 100% of the CPU time. Just remember that for a second while I show you what I am running the game on now.

I am running that old BZ game on an ASUS-based home-built computer. The motherboard is an A7V8X-X and the CPU is an AMD Athlon XP 2800+, with 1 GB of DDR memory installed. The graphics card is an XFX Ti 4200 8X AGP card.

Ok, all well and good for what I pay for parts to stuff into a box.

The thing is, the old Battlezone game, even on this newer, more advanced computer, uses 100% of the CPU.

Can anyone tell me where all the resources are going in our computer systems?

Seems to me that after we log our computers onto a game server that is probably running at T1 speeds or better, and 10 other users log their home computers on, probably all gaming computers, how come games still use 100% of the CPU time for the game?

Seems to me that with ten 1-gig computer systems hooked into a high-end server, there is enough computing power to plot the star chart for half of the night sky every 20 minutes.

So (1) why do games use so much of the CPU time? And (2) is the government raping our computers for CPU time when we log into games online?

I mean, games from some manufacturers like UBI.SOFT can run at really poor frame rates, while an old game like Battlezone runs right at 100 FPS, and a game like Day of Defeat can show off, with net_graph 3, the packet loss, flux and choke on the connection as well as the FPS.

I mean, if I show 100 FPS, zero loss and zero choke on the game server, then I am seriously starting to think that there are some outside forces stealing computer resources from home computer users.

Put that in your pipe and smoke it. Anyone who thinks they have a better explanation for the jive we are seeing in computer hardware and connectivity, post a reply message to me here in this lobby; I will be checking back to see who replies.
 

Terracide

Distinguished
Sep 7, 2003
88
0
18,630
Unless the game is "capped" (programmed not to produce frame rates higher than, e.g., 60 fps), the game will render all the fps it can (read: as many frames as the CPU + GPU can handle).
It's that simple...
No conspiracy there.
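For anyone who wants to picture the difference, here is a minimal sketch (generic illustration only - the function names are made up and this is not code from Battlezone or any real engine) of an uncapped game loop versus a capped one:

#include <chrono>
#include <thread>

// Stand-ins for a game's per-frame work (hypothetical, for illustration).
void update_world() { /* game logic would go here */ }
void render_frame() { /* drawing would go here */ }

// Uncapped: the loop never waits, so no matter how fast the CPU is it
// always finds another frame to draw - the core sits at 100%.
void run_uncapped(bool& running) {
    while (running) {
        update_world();
        render_frame();
    }
}

// Capped at ~60 fps: after each frame the loop sleeps off the leftover
// time, so a faster CPU simply idles longer instead of drawing more frames.
void run_capped(bool& running) {
    using clock = std::chrono::steady_clock;
    const auto frame_budget = std::chrono::microseconds(16667); // ~1/60 s
    while (running) {
        const auto start = clock::now();
        update_world();
        render_frame();
        const auto elapsed = clock::now() - start;
        if (elapsed < frame_budget)
            std::this_thread::sleep_for(frame_budget - elapsed);
    }
}

A game written the first way will eat a whole core on an Athlon XP 2800+ just as happily as it did on a Pentium 166 - it is simply drawing far more frames per second.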

Don't pretend - BE!
 

ChipDeath

Splendid
May 16, 2002
4,307
0
22,790
(2) Is the government raping our computers for CPU time when we log into games online.
**ROFL** Let's get Mulder & Scully to look into it :eek:
sorry. I couldn't resist.... :lol:

On the subject of CPU usage, I know XP/2000's DOS emulation stuff consumes masses of CPU cycles - I don't actually know why, but if you simply have a command prompt open and doing nothing on 2000/XP, it has a dramatic effect on system performance. That's probably why the old games use so many cycles, as they're running in some old emulation mode.

As far as online lag goes - a chain is only as strong as its weakest link, so even if you have a 2Mb DSL connection, if there's someone else in the game with a 56K modem you're going to get some lag. Until the whole world is wired into one big gigabit network you'll always have lag problems.

Put that in your pipe and smoke it.
No way man - that stuff seems to give you paranoia, big-time.. :wink:

---
The preceding text is assembled from information stored in an unreliable organic storage medium. As such it may be inaccurate, incomplete, or completely wrong :wink:
 

SoDNighthawk

Splendid
Nov 6, 2003
3,542
0
22,780
You are correct about games having a cap. In fact, if you are the server admin for a game like Day of Defeat you can set the FPS for the game server. In the config folder for DoD the frames per second default to 64 fps, but you can adjust that to 100 fps, and if the server is set to run that fast you will get 100 fps, provided your graphics card will go that fast. In my case, using an Nvidia Ti 4200 8X AGP card with 128 MB of DDR memory and a supporting motherboard that also supplies an 8X AGP bus, I have no problem running most games at 100 FPS. In fact, in game you can see your FPS in the lower right corner for DoD, and it flickers between 99 and 100 FPS.
I suspect that the card is actually providing faster frame rates than is shown.
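To tie that cap back to the CPU question: a cap like the 64 or 100 fps setting described above just fixes the time budget each frame is allowed to take, along the lines of the capped loop sketched earlier in the thread (again a hypothetical illustration, not actual DoD or GoldSrc engine code):

#include <chrono>

// Hypothetical: convert a configured fps cap into a per-frame time budget.
// 64 fps -> roughly 15.6 ms per frame, 100 fps -> 10 ms per frame. Any time
// left over once the frame is finished can be slept away instead of burned.
std::chrono::microseconds frame_budget_for(int fps_cap) {
    return std::chrono::microseconds(1000000 / fps_cap);
}

Whether you actually hold 100 fps then comes down to whether the card and CPU can finish every frame inside that 10 ms budget, which fits what you are seeing on the Ti 4200.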

All this still does not explain where all the system resources are going..........