While wifi is mighty convenient, it’s not very efficient.
Only one wireless station can be transmitting at any given time on a given channel. In effect, access is SERIALIZED. So let's say you have two wireless clients, each downloading files at the same time, and the ISP is providing 10 Mbps. The effective throughput for each wireless client will be ~5 Mbps (HALF), because one client is always waiting on the other to complete its transmission. And the more concurrent wireless clients you have, the worse it gets, quickly. Wifi is also HALF DUPLEX (i.e., a station can't send and receive at the same time). It's very much like the old hub technology in its behavior, and anyone who ever worked w/ hubs years ago knows they didn't scale well either. It took the invention and widespread adoption of full duplex switching to eliminate those inefficiencies.
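The airtime-sharing arithmetic above can be sketched in a few lines of Python. This is an idealized back-of-the-envelope model, not a Wi-Fi simulation: it just divides the link evenly among saturating clients, and real wifi does worse than this because contention, backoff, and retransmissions eat additional airtime. The function name is mine, chosen for illustration.

```python
def effective_throughput_mbps(link_mbps, n_clients):
    """Idealized per-client throughput on a serialized (shared-airtime) medium.

    Assumes every client is saturating the link (e.g., all doing large
    downloads) and ignores contention/backoff overhead, so the real number
    is strictly worse than this.
    """
    return link_mbps / n_clients

# Two clients on a 10 Mbps uplink: each effectively sees half.
print(effective_throughput_mbps(10, 2))   # 5.0

# Watch it degrade as concurrency grows.
for n in (1, 2, 4, 10, 100):
    print(f"{n:>3} clients -> {effective_throughput_mbps(10, n):.2f} Mbps each")
```

Even this optimistic model shows why "100's of users" on one shared medium falls apart: at 100 saturating clients, each gets 0.1 Mbps before any protocol overhead is counted.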
Now of course I presented an extreme example: large downloads. Obviously things like web browsing, email, and other light Internet activities will be better tolerated, up to a point, esp. if you use other technologies to improve efficiency (e.g., a local caching proxy).
But the idea of 100's of users w/ today's wifi technology sends a chill up my spine. I don't care how "powerful" the base station is, because it's not a matter of signal strength. It's a fundamental limitation of the technology wrt concurrency. That's what will ultimately do you in.