Archived from groups: alt.games.unreal.tournament
Hi all, I recently found this newsgroup and was quite pleased to see
it exists.
I admin a UT server (I have for years now, so assume a reasonable level of
skill on my part; I'm an electrical engineer, I've designed a number of
Intel-based computers, and I've programmed in ten or so languages, including
for Windows, so I'm no dummy). Anyway, we are running on Red Hat Linux 9,
and I continue to struggle with why our UT server sends such huge amounts of
data compared to other Linux UT servers. In particular, the server is set up
with a tick rate of 30-35 (the server provider does not care what tick rate
we run), and the data rate from the server gets large, approaching 10k
bytes/sec when there is a lot of action in a 12-player game (an open map, by
the way) and the client netspeed is set to 10000. I think this data rate is
swamping the clients and perhaps the server. Interestingly, the packets/sec
rate tracks the tick rate: if I set the tick rate to 35, my client machine
receives 35 packets/sec. When I play on Windows-based UT servers, I never
see data rates this large.
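For what it's worth, here is the back-of-the-envelope math I have been using
(a rough sketch, assuming netspeed acts as a per-client bytes/sec cap that
the server splits across one packet per tick; the ipdrv keys I am tuning are
NetServerMaxTickRate and MaxClientRate under [IpDrv.TcpNetDriver], but
double-check those names against your own UnrealTournament.ini):

    # Rough per-client bandwidth budget, assuming netspeed is a bytes/sec
    # cap and the server sends one packet per tick.
    netspeed = 10000        # client netspeed (assumed bytes/sec cap)
    tickrate = 35           # server ticks, i.e. packets, per second
    per_packet = netspeed / tickrate
    print("budget per packet: %.0f bytes" % per_packet)                 # ~286
    print("ceiling per second: %.0f bytes" % (per_packet * tickrate))   # ~10000

If that picture is right, seeing roughly 10k bytes/sec at netspeed 10000
just means the server is saturating the client's cap, not that 10k is some
inherently correct number.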
I set up a Linux UT server on my local machine (running Mandrake 9.2) with
the same ipdrv settings, and there I can raise the tick rate to 50 and never
see the packets/sec rate exceed 25 until the tick rate reaches 50 or higher.
If I set the tick rate to 35, I still receive only 25 packets/sec, which is
really odd. To make the comparison fair, I then connected to that local
machine over the internet: I put the Linux box on my DSL connection and
connected to it from my Comcast connection, to simulate a true "internet
connection." The strange thing is that, with all the ipdrv settings
identical on both servers, my local Linux UT server never sends clients
anywhere near the amount of data that the clan server described above does.
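In case anyone wants to reproduce the packets/sec and bytes/sec numbers I am
quoting, this is the sort of thing I run on the Linux client while connected
(a minimal sketch, not a polished tool: it assumes the game traffic is UDP
coming from server port 7777, it is Linux-only because of the AF_PACKET raw
socket, and it needs root):

    #!/usr/bin/env python3
    # Count packets/sec and bytes/sec arriving from the UT server port.
    import socket, struct, time

    UT_PORT = 7777   # assumption: default UT game port; change if yours differs

    s = socket.socket(socket.AF_PACKET, socket.SOCK_RAW, socket.ntohs(0x0003))
    packets, total_bytes = 0, 0
    window_start = time.time()
    while True:
        frame = s.recv(65535)
        if len(frame) < 34 or frame[12:14] != b'\x08\x00':
            continue                        # not a (long enough) IPv4 frame
        ihl = (frame[14] & 0x0F) * 4        # IPv4 header length in bytes
        if frame[23] != 17 or len(frame) < 14 + ihl + 8:
            continue                        # not UDP, or truncated
        src_port = struct.unpack('!H', frame[14 + ihl:14 + ihl + 2])[0]
        if src_port != UT_PORT:
            continue                        # only count server -> client traffic
        packets += 1
        total_bytes += len(frame)
        now = time.time()
        if now - window_start >= 1.0:       # print a one-second summary
            print("%d packets/sec, %d bytes/sec" % (packets, total_bytes))
            packets, total_bytes = 0, 0
            window_start = now

That way the numbers come straight off the wire rather than from the
in-game network stats display.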
If anyone has any insights, thoughts, or comments, they would be greatly
appreciated.
Here is the IP of our team server if anyone cares to have a look:
unreal://69.13.248.161
--
Best regards,
The_Rifleman