The people posting here obviously don't run their own Wifi networks, and have never set one up either.
Wireless 802.11g is a great thing to have for your laptop, music streaming device, and PDA. It sucks for everything else.
There's no such thing as "packet collision" on an ethernet network, unless you are using hubs instead of switches - and who in blue blazes would be dumb enough to buy a 100Mb hub??? And in any case, "collision" in the hub sense isn't what goes wrong on a radio link.
The big problem with wireless is "packet loss" (not collision!), and loss can be as high as 75% on a dodgy Wifi connection. Even with very little loss (5% - which is massive when compared to the 0% of wired ethernet), a wireless network is HIGHLY unlikely to achieve more than 20% of its rated throughput. Not 80%, not 50% - you will be LUCKY to average 20%.
That's roughly 10Mb/s - and in actual fact, don't be surprised if your Wifi averages 1Mb/s when transferring files - a tenth of the 20% figure I just mentioned!
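The arithmetic above can be sketched in a few lines. This is just a back-of-envelope model using the post's own rough numbers (20% protocol efficiency, 5% or 75% loss) - not measured figures for any particular router:

```python
def effective_throughput(rated_mbps, efficiency=0.20, loss=0.05):
    """Estimate usable throughput after protocol overhead and loss.

    efficiency: fraction of the rated link speed left after 802.11
                overhead (the ~20% rule of thumb from the post).
    loss:       fraction of frames lost; lost frames must be
                retransmitted, so goodput scales by (1 - loss).
    """
    return rated_mbps * efficiency * (1.0 - loss)

# A "54Mb/s" link with light (5%) loss lands near 10Mb/s:
print(effective_throughput(54))             # ~10.3 Mb/s
# A dodgy link with 75% loss is far worse:
print(effective_throughput(54, loss=0.75))  # ~2.7 Mb/s
```

The multiplicative loss term is a simplification (real retransmission backoff hurts even more), but it shows why "54Mb/s" and "what your file transfer actually gets" are such different numbers.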
Remember, that "54Mb/s wireless router" means "54Mb/s SHARED BETWEEN ALL CLIENTS!"
So, if you have 4 Wifi devices connected, then, at most, they'll each handshake with the router at 13.5Mb/s - but 8Mb/s or less is more likely. Then, once again, expect less than 20% of that rate for file transfers. Don't be surprised if you get 1/10th of THAT rate either!
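The shared-medium math works out like this (again a sketch using the post's assumed 20% efficiency figure - the router and client counts are made up for illustration):

```python
RATED_MBPS = 54  # what the box says

def per_client_share(clients, rated=RATED_MBPS):
    """Best-case handshake rate: one radio, split among all clients."""
    return rated / clients

def realistic_transfer_rate(clients, rated=RATED_MBPS, efficiency=0.20):
    """Apply the ~20% rule of thumb on top of the per-client share."""
    return per_client_share(clients, rated) * efficiency

print(per_client_share(4))         # 13.5 Mb/s ceiling per client
print(realistic_transfer_rate(4))  # ~2.7 Mb/s for actual transfers
```

So four busy clients on one "54Mb/s" router are each looking at a couple of Mb/s in practice, before you even account for loss.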
Wifi is freaking great where you can't install wires, or for devices which would be useless with an ethernet cable attached (laptop, PDA) - but it is utterly useless for speed, reliability, and robustness. Ask anyone who actually runs Wifi networks.
Gaming over a Wifi connection would be insane, because of the lossy nature of the link.