I stumbled across this interesting article on SmallNetBuilder about the performance of commercial 802.11g devices. To summarize, they used an RF channel emulator between two 802.11g stations to observe the performance of competing devices. The results are, in my opinion, shocking: the best device sustains the worst device's throughput at over 10 dB more path loss.

Wireless LAN Comparison

As engineers we often believe that better engineering results in more sales and a larger market share…but this isn't necessarily true. Linksys will sell many, many products on brand-name recognition alone, even if the 11g chip they use performs poorly. This brings me to the question: does the average consumer care enough about engineering to make a difference? That is, a poorly performing device will still sell as long as it performs well enough, while a well-engineered one means little if you can't market it. I think this situation may be unique to wireless LAN, because consumers are more patient with WLAN performance. For cell phone chipmakers, if your product drops calls at a high rate, I think you'll see a consumer reaction. What are all of your thoughts?

As a side note, check out the throughput curve in the figure above. Notice that, at best, you get 23 Mbps of throughput even though the PHY supports a 54 Mbps rate, and this is a point-to-point link using near-field antennas. Even under ideal conditions, you won't get 50% of the PHY rate as throughput. Clearly, there is much work to be done in improving the 802.11 MAC (even though a lot of work was done for the 802.11n standard). You can also get a hint about each device's link adaptation from its throughput curve. I suspect the curves with dips at low path loss use an auto-rate fallback method rather than SNR-based feedback, while the others use SNR-based feedback. I cannot confirm this because I don't know how reliable the testing is, but it seems plausible given observations we've made testing these algorithms in Hydra.
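To see why the MAC eats so much of the PHY rate, here's a rough per-frame airtime budget for 802.11g. This is a back-of-envelope sketch, assuming short slot times, no RTS/CTS protection, a single 1500-byte frame per channel access, and an approximate 36 bytes of MAC/LLC overhead; it ignores the 802.11g signal extension and any TCP overhead, so it's an upper bound:

```python
import math

# Rough 802.11g (ERP-OFDM) per-frame timing budget. All durations in
# microseconds; PHY constants are from the 802.11 OFDM PHY.
SLOT, SIFS = 9, 10
DIFS = SIFS + 2 * SLOT              # 28 us
AVG_BACKOFF = (15 / 2) * SLOT       # CWmin = 15 -> mean backoff 67.5 us
PREAMBLE = 20                        # OFDM preamble + PLCP header
SYMBOL = 4                           # OFDM symbol duration

def ofdm_airtime(nbytes, rate_mbps):
    """Airtime of one PPDU: preamble plus whole OFDM data symbols.

    16 service bits and 6 tail bits are prepended/appended before the
    payload is padded out to an integer number of symbols."""
    bits = 16 + 8 * nbytes + 6
    bits_per_symbol = rate_mbps * SYMBOL
    return PREAMBLE + SYMBOL * math.ceil(bits / bits_per_symbol)

payload = 1500                       # one max-size Ethernet payload
mac_overhead = 36                    # MAC header + FCS + LLC/SNAP (approx.)
data = ofdm_airtime(payload + mac_overhead, 54)   # data frame at 54 Mbps
ack = ofdm_airtime(14, 24)                        # ACK at 24 Mbps basic rate

per_frame = DIFS + AVG_BACKOFF + data + SIFS + ack
throughput = payload * 8 / per_frame              # Mbps (bits per us)
print(f"{per_frame:.1f} us per frame -> {throughput:.1f} Mbps")
```

Under these assumptions you get roughly 31 Mbps of theoretical best-case MAC throughput at a 54 Mbps PHY rate, before counting retries, protection, or TCP overhead, which is how a measured 23 Mbps becomes believable.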
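For readers unfamiliar with auto-rate fallback: it adapts the rate purely from transmit success/failure counts, with no SNR measurement. Here's a minimal sketch of an ARF-style controller; this is my own illustrative version, not any vendor's actual algorithm, and the up/down thresholds (10 successes, 2 failures) are assumed values:

```python
RATES = [6, 9, 12, 18, 24, 36, 48, 54]  # 802.11g OFDM rates, Mbps

class AutoRateFallback:
    """ARF-style rate control sketch: drop the rate after consecutive
    failures, probe the next rate up after consecutive successes."""

    def __init__(self, up_after=10, down_after=2):
        self.idx = len(RATES) - 1        # start optimistically at 54 Mbps
        self.succ = self.fail = 0
        self.up_after, self.down_after = up_after, down_after

    @property
    def rate(self):
        return RATES[self.idx]

    def report(self, ok):
        """Feed in the outcome of each transmission attempt."""
        if ok:
            self.succ += 1
            self.fail = 0
            if self.succ >= self.up_after and self.idx < len(RATES) - 1:
                self.idx += 1            # probe the next higher rate
                self.succ = 0
        else:
            self.fail += 1
            self.succ = 0
            if self.fail >= self.down_after and self.idx > 0:
                self.idx -= 1            # fall back to a more robust rate
                self.fail = 0
```

Because a controller like this only reacts after failures accumulate, and keeps probing rates that may not be sustainable, it can oscillate even when the channel is good, which is one way dips could appear at low path loss in a throughput curve.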