Wireless Range vs Coverage

somwkd
Aug 30, 2017
OK, so... a simple question, but going back to basics on it doesn't make a whole lot of sense to me. I'm going to skip a lot of variables (transmit power, dBi, SNR, interference, etc.) and keep it as simple as possible.

What is the relationship between wireless coverage (square footage) and range?

For instance, a Google WiFi AP is rated for 1,500 square feet of coverage. We'll use 1,600 for simple math. Since most APs use omnidirectional antennas, the AP sits at the center of that coverage area. That works out to roughly a 40x40 ft box, i.e. about 20 ft from the AP to the edge (or, as a circle of equal area, a radius of about 22.6 ft).
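For reference, the area-to-radius arithmetic above works out like this (a quick Python sketch, using the rounded 1,600 sq ft figure):

```python
import math

coverage_sqft = 1600                                  # rated coverage, rounded for simple math
square_side = math.sqrt(coverage_sqft)                # 40 ft x 40 ft box
radius_to_wall = square_side / 2                      # 20 ft from the AP to the nearest edge
circle_radius = math.sqrt(coverage_sqft / math.pi)    # ~22.6 ft for a circle of equal area

print(f"{square_side:.0f} ft x {square_side:.0f} ft box, "
      f"{radius_to_wall:.0f} ft to the edge, "
      f"or a circle of radius {circle_radius:.1f} ft")
```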

So how is it that the actual range from a client to the AP, while still having a usable signal, can be 50 feet or more? How can wireless N be rated at a maximum range of 230 ft?

I know there are a LOT of variables with wireless. There is just an obvious disconnect somewhere in my brain.
 
Solution
That is the problem: there is a massive number of complex factors that affect it.

The main issue is what "usable signal" actually means. That is part of what causes all the confusion. If you take the simple case, where all you do is measure the signal strength you receive in dB, then it is mostly a math equation to determine distance. Even then, things like the humidity of the air affect how quickly the signal is absorbed. You can, to a point, estimate how many dB of signal you will have at a given distance.
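To illustrate that "math equation", here is a rough sketch using the standard free-space path loss formula. The 20 dBm transmit power and the 2.4 GHz channel are assumptions for the example, not figures from the post, and real walls, furniture, and interference will subtract much more:

```python
import math

def free_space_path_loss_db(distance_m, freq_mhz):
    # FSPL(dB) = 20*log10(d_km) + 20*log10(f_MHz) + 32.44
    return 20 * math.log10(distance_m / 1000) + 20 * math.log10(freq_mhz) + 32.44

tx_power_dbm = 20  # assumed AP transmit power (typical value, not from the post)
for feet in (10, 20, 50, 100, 230):
    meters = feet * 0.3048
    rssi = tx_power_dbm - free_space_path_loss_db(meters, 2437)  # 2.4 GHz, channel 6
    print(f"{feet:4d} ft: ~{rssi:6.1f} dBm in free space")
```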

The thing with WiFi that confuses matters is that people try to measure by transfer speed. You now add in the complexity of how the data is encoded. Is a signal with exactly the same signal strength really "better" because it uses a data encoding that can put more data into the stream? Can you really compare the range of a signal that uses the maximum encoding with four overlapping spatial streams to one that uses a single stream and one of the simpler encodings? If you try to use 802.11ac at the 1733 Mbps rate you are lucky to get 10 feet; how do you compare that to using the 11 Mbps 802.11b encoding?
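As a sketch of where that 1733 Mbps figure comes from: it is purely a product of the encoding parameters (4 spatial streams, an 80 MHz channel, 256-QAM with 5/6 coding, short guard interval), which only hold up at very high signal quality, i.e. very short range:

```python
# Rough sketch of the 1733 Mbps 802.11ac link rate
# (4 spatial streams, 80 MHz channel, 256-QAM 5/6, short guard interval)
spatial_streams  = 4
data_subcarriers = 234       # 80 MHz VHT channel
bits_per_symbol  = 8         # 256-QAM
coding_rate      = 5 / 6
symbol_time_us   = 3.6       # 3.2 us symbol + 0.4 us short guard interval

bits_per_ofdm_symbol = spatial_streams * data_subcarriers * bits_per_symbol * coding_rate
rate_mbps = bits_per_ofdm_symbol / symbol_time_us
print(f"{rate_mbps:.0f} Mbps")   # ~1733 Mbps, only reachable at very high SNR / short range
```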
