Latency

Archived from groups: comp.dcom.lans.ethernet

Opinions please!!

I've decided to use the delta times between TCP SYN Ethernet frames
to determine network latency. I've collected this data over time. This
now presents a philosophical question; i.e., what exactly is network
latency? One could argue that a network's baseline latency is the
fastest time seen. Others might think that the average of these
discovered delta times would be a network's latency. I'm wondering
what folks think about that.
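The two candidate definitions are easy to compute from the collected
samples. A minimal sketch (the delta values below are made up for
illustration; in practice they'd be the measured SYN deltas in seconds):

```python
# Two candidate definitions of "network latency" from a list of
# measured TCP SYN delta times (seconds). Sample values are made up.
deltas = [0.012, 0.015, 0.011, 0.034, 0.013, 0.012, 0.090, 0.014]

baseline = min(deltas)               # fastest time seen
average = sum(deltas) / len(deltas)  # mean of all samples

print(f"baseline latency: {baseline * 1000:.1f} ms")  # 11.0 ms
print(f"average latency:  {average * 1000:.1f} ms")   # 25.1 ms
```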

Jim
 

Jim,

There are a couple of ways to look at a system.
Some would simply look at delay as a function of frequency and that would
tell the whole story.
If the signals are bunched into a pass band, then the delay at that
frequency, the group delay, would be a central measure. Note that this by
itself says little more about the system. So, if the communication channel
were also band-limited, then there could also be something to be said for
the rate of change of the signals in that band.
Now, in systems where we have a guaranteed signalling bandwidth - like
Ethernet perhaps - we might also talk about bandwidth and latency. This is
because (as above) the bandwidth measure alone doesn't tell us anything
about delay / latency. So, as you know, latency becomes an important
factor if you are dealing with real-time systems, etc.

So far I've lumped delay and latency together - as in linear systems theory.
However, latency generally contains more than simply the delay of a linear
system. It includes the delays that are caused by processing. As such,
latency may be a variable as you've considered.
In that case I believe the engineering measure for latency would come in two
flavors depending on your application requirements:

- the worst-case (largest or smallest) latency is important in things like
interrupt-driven real-time systems. So, you have to pick the largest number
that you see in order to guarantee performance in a design or to predict
what can happen in doing an analysis. The shortest latency may also be used
in a worst-case analysis if your purpose is to rely on latency for some
system characteristic. I should think this would be an unusual case but
mention it for completeness.
- the average latency may be of some value if your analysis is statistical.

So, it appears you have to answer your own question.
I would guess that most often, and for most practical purposes, the network
latency is the *largest delay* that can be measured or the largest delay
with 95% (pick your number) probability of occurrence over a given time,
etc. The latter might be useful if you want to know how bad things can be
"most of the time", allowing for an occasional event to be even worse.

Fred

"Jim Moore" <g9gu-7wqw@spamex.com> wrote in message
news:292af2dc.0410280542.dfa001@posting.google.com...
> Opinions please!!
>
> I've deceided to use the delta times between TCP SYN Ethernet frames
> to determin network latency. I've collected this data over time. This
> now presents a philosopical question; i.e., what exactly is network
> latency? One could argue that a network's baseline latency is the
> fastest time seen. Others might think that the average of these
> discovered delta times would be a network's latency. I'm wondering
> what folks think about that.
>
> Jim