While many consumers have used 802.11n wireless routers for around two years now, it was widely assumed that the 802.11n standard had already been finalized. That's not the case, as verified by Bob Heile, the chairman of the IEEE 802.15 working group on Personal Area Networks. In a recent email, Heile confirmed that the IEEE 802.11n draft standard was finally submitted to the Standards Review Committee (RevCom).
"On other fronts, 802.11 was granted unconditional approval to forward 11n to RevCom," Heile wrote. "After a bit of a rocky period on getting acceptable coexistence language included in the draft, I was pleased to support this approval. Congratulations to Bruce for his patience and perseverance in getting this done. This was an extremely complex project."
According to PC Magazine, the road to this point has been a long one for the 802.11n standard, with its evolution dating back to 2004. An early draft version (1.0) was approved in January 2006, which eventually kick-started the first wave of routers implementing the draft-n standard. But after the draft 802.11n standard failed to pass in May 2006, the Wi-Fi Alliance agreed in June 2007 to certify draft-n (or rather pre-N) products based on Draft 2.0.
However, based on Heile's recent email, the 802.11n saga is expected to come to a full close by September 11, rather than the previously predicted January 2010 publication date.
I used to work at a small ISP, and I got so tired of people calling and complaining that they'd bought a 300 Mbps router and our service was still limiting them to [insert plan speed here]. They just didn't know any better than what the salesman told them.
Also, 99% of the "rangebooster" and "super-duper range N" stuff is crap. Almost all of it has internal antennas with boosted power levels. A WRT54GL with a high-gain antenna set and DD-WRT can be had for (in most cases) half the money, and it KILLS all this N crap the Best Buy/Fry's kid wants you to buy.
I don't have much to add to the compatibility conversation, but basically all N products have G and B modes. If you happen to find a half-assed v1 N product, I suspect you could kill the N mode altogether and be all set. ...it's better than nothing :)
It's certainly taking an awfully long time to finalize the N standard...
Every wireless device and standard has a limit on how much power it is allowed to emit; Wireless B and G notebooks, for example, can emit around 1 watt.
I thought that because Wireless N uses three antennas, each able to transmit 1 watt, it might be an issue that those multiple antennas give off more watts than is healthy for a person to receive on his lap.
Or are there other reasons why Wireless N isn't a standard yet?
The benefit is that N uses three (or more) antennas, oriented at different angles (preferably along the X, Y, and Z axes).
These antennas can pick up signals bouncing off walls that a regular B or G router (and even Super G routers with more than one antenna) has difficulty with.
My Netgear Super G router is from the generation just before Wireless N and has 6 or 8 antennas (I forget exactly how many), but all the antennas inside the router are aimed from the center outward, 360 degrees around the Y (vertical) axis. In other words, the antennas are mounted in a 2D plane.
Laptops with Wireless N often have two antennas in the screen and one on the side of the laptop, spreading them across all three dimensions.
So receiving the signal through the best-positioned antenna is probably more important to Wireless N than its raw top speed.
Speeds of 300 Mbit/s (roughly 37 MB/s in theory, considerably less in practice) are actually good for routers with multiple computers connected, where the router not only serves internet access but computers also share data with each other, as in ad hoc networks or environments.
That way, e.g., two computers could share files with each other at around 15 MB/s.
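The arithmetic behind that estimate can be sketched quickly (idealized numbers for illustration; real Wi-Fi throughput falls well short of the headline link rate because of protocol overhead and signal conditions):

```python
# Back-of-the-envelope numbers for the file-sharing scenario above.
# These are idealized figures, not measured 802.11n throughput.
link_mbps = 300                  # nominal 802.11n link rate in megabits/s
raw_mb_per_s = link_mbps / 8     # 37.5 MB/s theoretical maximum
per_client = raw_mb_per_s / 2    # ~18.75 MB/s per client, before overhead

print(raw_mb_per_s, per_client)
```

With typical overhead eating a chunk of that, the "around 15 MB/s each" figure is plausible as a best case.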
I had to reduce it; I still fear being hacked despite all the possible security measures I enabled in the router settings.
Beamforming is where you make the system behave like a directional antenna by exploiting the delays between the same signal arriving at different antennae. It also allows you to transmit in such a way that a beam is directed toward the receiving system.
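The delay-compensation idea can be sketched with a toy phased-array example (all parameters here are illustrative assumptions, not anything from the 802.11n spec): each antenna's signal is phase-shifted so that a wavefront from a chosen direction adds up coherently.

```python
import cmath
import math

def steering_weights(n_antennas, spacing_wavelengths, angle_deg):
    """Phase-only weights for a uniform linear array that align the
    signals arriving from angle_deg (0 = broadside). Hypothetical
    half-wavelength spacing is typical in textbook examples."""
    theta = math.radians(angle_deg)
    weights = []
    for k in range(n_antennas):
        # Extra path length seen by antenna k, in wavelengths
        delay = k * spacing_wavelengths * math.sin(theta)
        # The conjugate phase cancels that arrival delay
        weights.append(cmath.exp(-2j * math.pi * delay))
    return weights

# A wavefront arriving from 30 degrees hits each antenna with a phase offset
arrival = [cmath.exp(2j * math.pi * k * 0.5 * math.sin(math.radians(30)))
           for k in range(3)]
w = steering_weights(3, 0.5, 30)

# After weighting, all three contributions add coherently: gain is ~3
gain = abs(sum(wk * sk for wk, sk in zip(w, arrival)))
```

The same weights applied on transmit concentrate the radiated energy toward that direction, which is the "beam directed at the receiver" part.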
In MIMO (multiple input, multiple output) technology, the idea is that if the antennae are spaced a specific distance apart, then the channels (the connections between each antenna at the source and receiver) are independent, meaning interference on each path will be different. This lets you get a lower bit error rate on a noisier connection compared to using a single antenna at each end, so you can transmit faster with the transmission power divided amongst the multiple transmit antennae.
Using the same transmission power as a single antenna but spread over a number of antennae (say, 3 at the base station and 2 on the client), it is possible to mitigate noise and what is called channel fading (caused by the signal being reflected off different surfaces in the room).
With 3 antennae at the base station and 2 antennae on the client you have 6 different paths, so it is possible to exploit the independence of these paths by encoding the data in certain ways.
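The path-counting above can be illustrated with a toy channel model (random complex gains as a stand-in for independent fading; this is a sketch of the general MIMO idea, not the actual 802.11n encoding):

```python
import random

# Toy MIMO channel: 3 transmit antennas, 2 receive antennas.
# Each transmit/receive pair gets its own complex gain, modelling
# the 6 independently-fading paths described above.
random.seed(42)
n_tx, n_rx = 3, 2
H = [[complex(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(n_tx)]
     for _ in range(n_rx)]
n_paths = n_rx * n_tx  # 2 x 3 = 6 paths

# Received signal y = H x: each receive antenna hears a mix of all
# three transmitted symbols, which the receiver must untangle.
x = [1 + 0j, -1 + 0j, 1 + 0j]  # example transmitted symbols
y = [sum(H[r][t] * x[t] for t in range(n_tx)) for r in range(n_rx)]
```

Because the 6 gains fade independently, it is unlikely that all paths are bad at once, which is where the diversity benefit comes from.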
I would go into more depth, but the encoding scheme for data sent over the antennae is complex and involves matrix math. Search the internet for "MIMO radio communications systems".