Archive for the 'WSIL News & Views' Category

Machine Learning: The Solution to Link Adaptation Woes


In this post I want to discuss the deficiencies of practical link adaptation in current wireless systems and how algorithms that exploit machine learning will be the solution to all of these problems. First, a little background on link adaptation.

The quality of the medium through which communication occurs, i.e., the wireless channel, is constantly changing. Consider the following figure, courtesy of the Laboratory for Information Technology at Aachen University, which shows the channel quality of a cellular network throughout a rural town in Germany (green = good, red = bad).

[Figure: city path loss map]

Reality is actually much worse than this, since the channel also fluctuates due to small-scale fading.

Digital communication parameters used to construct wireless waveforms (e.g., QAM order, FEC rate, the number of spatial streams) trade off reliability (e.g., the probability of a dropped call in a voice network) against data rate as a function of wireless channel quality. Consequently, modern wireless networks get the most "bang for their buck" in today's spectrum-limited wireless market through link adaptation: the process of selecting digital communication parameters based on real-time channel quality measurements. Link adaptation allows reliability and data rate to be jointly optimized and tailored to each application.

Most published research makes link adaptation seem like a straightforward problem.

  1. Measure the wireless channel.
  2. Extract a link quality metric from the wireless channel.
  3. Map the link quality metric to digital communication parameters using a look-up-table or rate/reliability formulas (see the sketch below).
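To make step 3 concrete, here is a minimal sketch of a look-up-table mapping in Python. The thresholds and MCS entries are purely illustrative (hypothetical, not drawn from any standard or from our measurements), and a scalar SNR metric is assumed:

```python
import numpy as np

# Hypothetical look-up table: thresholds and MCS entries are illustrative,
# not from any real standard or measurement campaign.
SNR_THRESHOLDS_DB = [5.0, 11.0, 17.0, 23.0]
MCS_TABLE = [
    ("BPSK", 1 / 2),    # most reliable, lowest rate
    ("QPSK", 1 / 2),
    ("16-QAM", 3 / 4),
    ("64-QAM", 3 / 4),
    ("64-QAM", 5 / 6),  # highest rate, least reliable
]

def select_mcs(snr_db):
    """Step 3 of the recipe: map a scalar link quality metric to
    digital communication parameters via the look-up table."""
    return MCS_TABLE[int(np.searchsorted(SNR_THRESHOLDS_DB, snr_db))]

print(select_mcs(14.2))  # -> ('16-QAM', 0.75)
```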

Look-up-tables/formulas are created offline through analysis, simulations, or measurements. When I attempted to do this on our IEEE 802.11n prototype, Hydra, I found out that link adaptation in practice can be very difficult. Here are a few of the lessons I learned.

  • System nonlinearities make look-up-tables created through analysis or simulations inaccurate.
  • Non-Gaussian noise makes look-up-tables created through analysis or simulations inaccurate.
  • Channel estimation error must be accounted for.
  • Look-up-tables based on actual over-the-air measurements result in the most accurate adaptation.
  • Look-up-tables based on actual over-the-air measurements for one device may result in inaccurate adaptation if placed on a different wireless device.

After my initial testing I was concerned that it might be difficult to use our prototype, even with significant amplifier backoff, to design and evaluate practical link adaptation algorithms. Apparently, however, commercial systems suffer from the same issues. For example, the Roofnet project found that the SNR link quality metric did not consistently reflect the expected reliability of the associated digital communication parameters, even when interference and collisions were not an issue.

Another large problem we discovered is that good link quality metrics for MIMO-OFDM systems just weren't available. It turns out that analyzing the error rate of practical MIMO-OFDM links is very difficult. Consequently, finding a good single-dimensional link quality metric (single-dimensionality being required for look-up-tables) that models the spatial and frequency-selective effects of the channel was also very difficult.
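To give a flavor of what a multi-dimensional metric can look like, here is one plausible construction in Python: the vector of ordered per-stream post-processing SNRs of a linear zero-forcing receiver. This illustrates the general idea, not necessarily the exact metric from our papers:

```python
import numpy as np

rng = np.random.default_rng(1)
# Random 2x2 MIMO channel (one subcarrier) and a noise variance.
H = (rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))) / np.sqrt(2)
noise_var = 0.1

# Zero-forcing post-processing SNR of stream k: 1 / (noise_var * [(H^H H)^-1]_kk)
G = np.linalg.inv(H.conj().T @ H)
snrs = 1.0 / (noise_var * np.real(np.diag(G)))
ordered_snr_db = np.sort(10 * np.log10(snrs))[::-1]
print(ordered_snr_db)  # a vector, not a scalar -- hence no simple look-up-table
```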

So what do we do? The linear, time-invariant models we have used to create link adaptation rate/reliability expressions, or to run link simulations (which may then produce look-up-tables), do not reflect the actual system. One approach is to make the analysis model more complex, incorporating nonlinear and non-Gaussian noise effects. This seemed like a difficult undertaking: analysis was already hard with a simplistic linear, time-invariant system and additive Gaussian noise, and a more complex system model would only mean more design time for engineers. Moreover, even if we were able to find a good link quality metric, it would likely be multi-dimensional, and look-up-tables aren't easily created in that case. All of this led us (Professor Heath and me) to machine learning.

Machine learning algorithms allow systems (in our case, the link adaptation algorithm) to learn behavior solely from data observations. Hence, as long as we can define an accurate link quality metric and pose the problem correctly, machine learning should be able to discover the relationship between the link quality metric and the reliability of the link.

First, we created the multi-dimensional ordered SNR link quality metric based on our new expression for the packet error rate of MIMO-OFDM systems. Then, with the help of Professor Caramanis, we validated the efficacy of classification algorithms that exploit this new link quality metric for link adaptation. However, all of this work was done using system models. To compensate for unique hardware nonidealities, we needed an online algorithm that tunes link adaptation to each transmit/receive device pair. Consequently, Professor Heath and I designed online classifiers that harness training information on-the-fly. These algorithms constantly observe packets transmitted over channels and improve the link adaptation classifier in real time based on those observations.
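As a rough illustration of the idea (a minimal sketch, not our actual classifier): an online learner can treat each transmitted packet as a labeled training example, with feature = link quality metric and label = packet success, and then pick the fastest MCS predicted to meet the reliability constraint:

```python
import numpy as np

class OnlineLinkAdapter:
    """Minimal sketch of online learning for link adaptation, in the spirit
    of the approach above but not our actual algorithm. Features are a
    multi-dimensional link quality metric (e.g., ordered SNRs); labels are
    per-packet success/failure at a given MCS."""

    def __init__(self, num_mcs, k=5):
        self.num_mcs = num_mcs
        self.k = k
        self.features = [[] for _ in range(num_mcs)]  # past metrics per MCS
        self.outcomes = [[] for _ in range(num_mcs)]  # 1 = success, 0 = loss

    def observe(self, mcs, feature, success):
        """Harvest a training example on-the-fly from a transmitted packet."""
        self.features[mcs].append(np.asarray(feature, dtype=float))
        self.outcomes[mcs].append(1.0 if success else 0.0)

    def predict_per(self, mcs, feature):
        """Estimate packet error rate via the k nearest past observations."""
        if not self.features[mcs]:
            return 1.0  # no data yet: assume the MCS is unreliable
        dists = np.linalg.norm(
            np.stack(self.features[mcs]) - np.asarray(feature, dtype=float), axis=1)
        nearest = np.argsort(dists)[: self.k]
        return 1.0 - np.asarray(self.outcomes[mcs])[nearest].mean()

    def select(self, feature, rates, per_target=0.1):
        """Pick the highest-rate MCS predicted to meet the PER constraint;
        fall back to the most robust MCS (index 0) if none qualifies."""
        best, best_rate = 0, -np.inf
        for m in range(self.num_mcs):
            if self.predict_per(m, feature) <= per_target and rates[m] > best_rate:
                best, best_rate = m, rates[m]
        return best
```

Every packet outcome fed to observe() refines the predictions, so a classifier like this tunes itself to the particular transmit/receive hardware pair without any offline calibration.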

For example, the following figure shows throughput and packet error rate for offline and online machine learning in a static wireless channel on our wireless prototype, with a packet error rate reliability constraint of 10%.

[Figure: throughput and packet error rate, offline vs. online learning]

The offline algorithm is not able to tune itself to the unique hardware characteristics of the transmit/receive pair, resulting in lost rate/reliability. The online algorithm, however, discovers the correct digital communication parameters in real time. There have been other rate adaptation algorithms, notably auto-rate fallback (ARF), that adapt online. Unfortunately, they don't take advantage of explicit link quality metrics and so cannot adapt well in dynamic channel conditions (see the following figure).

The best part of our online learning algorithms for link adaptation is their simplicity. The algorithms are installed and we're done. No complex analysis of the reliability curves. No calibration algorithms to determine amplifier backoff. Additionally, our recent results with support vector machines show that online learning can be implemented with low memory/processing complexity.
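To see why the memory footprint can stay small, here is a hypothetical sketch (using scikit-learn's SGDClassifier, not the algorithm from our paper) of a linear SVM trained one packet at a time; only the weight vector persists between updates, no matter how many packets have been observed:

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

clf = SGDClassifier(loss="hinge")  # hinge loss <=> a linear SVM
rng = np.random.default_rng(0)

for _ in range(1000):
    feature = rng.normal(size=(1, 4))  # stand-in for a link quality metric
    label = [int(feature.sum() > 0)]   # stand-in for packet success/failure
    clf.partial_fit(feature, label, classes=[0, 1])  # constant-memory update

# Predict whether a new packet would succeed with the current parameters.
print(clf.predict(rng.normal(size=(1, 4))))
```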

For More Information:

R. C. Daniels and R. W. Heath, Jr., “Online Adaptive Modulation and Coding with Support Vector Machines,” to appear in Proceedings of the IEEE European Wireless Conference, Lucca, Italy, April 2010.

R. C. Daniels and R. W. Heath, Jr., "An Online Learning Framework for Link Adaptation in Wireless Networks," Proceedings of the Information Theory and Applications Workshop, San Diego, CA, February 2009.

R. C. Daniels, C. M. Caramanis, and R. W. Heath, Jr., "Adaptation in Convolutionally-Coded MIMO-OFDM Wireless Systems through Supervised Learning and SNR Ordering," IEEE Transactions on Vehicular Technology, January 2010.

WSIL Team Wins Contest


San Francisco was the site of the 2008 WiNTECH Workshop, which held a contest among teams presenting demonstrations of their research prototypes. The theme of the contest was "The Next Big Thing in Wireless": teams built real wireless systems and gave live demonstrations of their capabilities.

[Photo: from left, Ketan Mandke, Robert Daniels, contest judge Dennis McCain, Steven Peters, and Prof. Robert Heath, Jr.]

The winning team consisted of Robert C. Daniels, Ketan Mandke, Steven W. Peters, Prof. Scott M. Nettles, and Prof. Robert W. Heath, Jr., and their winning presentation was called "Machine Learning for Physical Layer Link Adaptation in Multiple-Antenna Wireless Networks". Daniels and Peters are graduate students in the WSIL, which is directed by Prof. Heath. The demo used simple machine learning techniques to perform adaptation in wireless devices built on a custom IEEE 802.11n physical layer (PHY). Because this PHY uses coded MIMO-OFDM, adaptation is a difficult prospect. The team successfully demonstrated that the devices were learning the channel with no pre-existing knowledge and could easily adapt to changing conditions. They used the Hydra prototype, which is in continuous development, as the foundation for the demo.

The winning students received a $2500 cash prize kindly donated by the sponsors: ViaSat, Nokia Siemens Networks, The Center for Multimedia Communication, and BBN Technologies.

The work was sponsored in part by NSF and DARPA ITMANET.

60 GHz Wireless Communications


It has been some time since my last post. For a while (before many of the current members were even in the WSIL) I was one of the few posting, so I'm glad to see that the WindoWSIL has really taken off in terms of group participation and readership. Recently, my 60 GHz Wireless Communications (60G) tutorial paper appeared in IEEE Vehicular Technology Magazine, so I thought I would write an informal companion post for the interested reader.

What makes 60G so interesting for research?
In my opinion there are two primary reasons that 60 GHz has received so much attention recently.

  1. There is 7 GHz of unlicensed (free) bandwidth available in the United States alone! [Figure: 60 GHz bandwidth regulation] Additionally, there is around 5 GHz of unlicensed bandwidth common to Europe, most of Asia, Australia, and North America. This amount of bandwidth is unparalleled at traditional frequencies such as 900 MHz, 1800 MHz, 2450 MHz, and 5000 MHz. For comparison, 62 MHz of licensed spectrum around the 700 MHz carrier frequency was recently auctioned in the United States, netting around $20 billion. Obviously bandwidth is precious, and when 7 GHz is available, people pay attention.
  2. The 60G carrier frequency is fundamentally range-limited and presents significant challenges in antenna design, analog circuits, digital processing, and higher (cross) layer design. For this reason the 7 GHz of bandwidth is still relatively unused even though it was opened up over 5 years ago. If operating at 60G were just a matter of ramping up the carrier frequency of current systems, there might not be so much buzz (or money) in 60G research. As communication engineers, we should be licking our chops, because it gives us some unsolved problems to sink our teeth into. Next, I will elaborate on some of the limitations and design challenges that make 60G so interesting.

60G Challenges
The most obvious limitation of 60G to an engineer is the carrier frequency itself. Physics (via the Friis free-space propagation formula) dictates that we lose 20 dB in received power every time we multiply the carrier frequency by 10. That means if I have two systems, one operating at 6 GHz and one operating at 60 GHz, I would have to transmit 20 dB more power in the 60 GHz system to achieve the same performance in a vacuum. Of course we don't live in a vacuum, and that brings us to two more limitations at 60 GHz: reduced propagation through materials and increased Doppler effects. The ability of electromagnetic waves to penetrate solid objects is a function of the carrier frequency; measurements have shown that propagation through objects is reduced at 60 GHz, limiting the ability to communicate without line of sight. The Doppler frequency, which results from mobility of objects around communication devices, is proportional to the carrier frequency. Since the carrier frequency is significantly larger than in traditional systems, we will experience much larger Doppler frequencies, which may complicate feedback, channel estimation, and higher layer design. Finally, and perhaps the limitation that encouraged national entities to remove licensing restrictions, 60 GHz wireless transmissions experience oxygen absorption effects. It turns out that oxygen resonates at 60 GHz, providing 10-20 dB of signal attenuation per kilometer (see the atmospheric absorption figure at right, sourced from Marcus Spectrum Solutions).
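A quick sanity check of those two scaling claims, using the standard free-space and Doppler formulas (a minimal sketch; nothing here is specific to any particular system):

```python
import math

C = 3e8  # speed of light, m/s

def fspl_db(distance_m, freq_hz):
    """Friis free-space path loss in dB: 20*log10(4*pi*d*f/c)."""
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / C)

def max_doppler_hz(speed_mps, freq_hz):
    """Maximum Doppler shift: f_d = v * f_c / c."""
    return speed_mps * freq_hz / C

# Multiplying the carrier by 10 costs exactly 20 dB in free space...
print(fspl_db(10, 60e9) - fspl_db(10, 6e9))  # -> 20.0 dB
# ...and multiplies the Doppler spread by 10 as well.
print(max_doppler_hz(1.0, 6e9), max_doppler_hz(1.0, 60e9))  # -> 20 Hz, 200 Hz
```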

From a design perspective, the fundamental limitations of the wireless channel are not the only difficulties. 60G antenna design is unique since the antenna dimensions shrink with the wavelength; antenna fabrication will therefore require more expensive milling technology, especially for planar antenna arrays. RF circuits at 60 GHz may also have to discard simplifying lumped-element assumptions as wavelengths approach device dimensions. Analog circuit design is complicated for many other reasons, including reduced amplifier gain, reduced transistor power output, and amplification of frequency instability (phase noise). In general this leads to increased cost, whether in engineering design or in the implemented technology.

Current Activity
After the laundry list of limitations and design obstacles presented above, you might be thinking to yourself "there's no point in trying" or "it will never work." Thankfully, that's not the consensus of the wireless market. We have seen significant activity in industry recently to try and bring 60 GHz to the consumer in the near future.

IEEE 802.16d, the precursor to WiMax, actually considered using frequencies up to 66 GHz. Unfortunately, it quickly became apparent that the above challenges were too much for mobile broadband devices that need to communicate over kilometer distances. As a consequence, all the wireless devices to appear in the near future are targeted at indoor, short-range communications.

IEEE 802.15.3c is currently constructing a standard for operating wireless personal area networks at 60 GHz. The 3c working group has formally accepted a proposal, an important first step towards actual standardization. 15.3c devices should be seen as a replacement for UWB, which will eventually replace Bluetooth.

While 802.15.3c is addressing a host of applications for high-throughput communications, WirelessHD has a single focus. WirelessHD is a consortium of companies that have banded together to create 60G technology for high-definition wireless video streaming. Here is a nice overview of their OFDM-based standard as well as a video summary of a CES 2008 presentation. In the past, wireless video streaming was subject to low-performance lossy compression or high-cost lossless compression. Because of the bandwidth available at 60 GHz, the wireless industry is hoping to provide low-cost, high-performance uncompressed video streaming. With 60 GHz it is only a matter of time before all the peripheral devices on a personal computer are cable-less and all the wires in your home entertainment center are gone (except for power cables, of course).

For completeness I also want to mention that ECMA has thrown its hat into the ring and is in the process of building a 60 GHz standard. Since this is the same standardization body that brought us the WiMedia Multiband OFDM standard for UWB, I can only expect they see 60 GHz as an extension for higher rates. In that sense IEEE 802.15.3c and the ECMA standard will be directly competing; whether the two can coexist is unknown at this point.

Design Ideas
If you'd like to know more about how to design the physical layer in view of these challenges, or for a look at developing technologies that may change the landscape of 60G, please refer to the aforementioned IEEE reference or its preprint here. I think there is copious room for physical-layer and higher-layer design suggestions that take advantage of 60 GHz while accounting for its challenges. Alongside the challenges, 60G also offers unique advantages, such as relaxed size constraints on antenna arrays and improved security of communication signals. I'm more than willing to hear your ideas, so please contact me with any additional questions.

IEEE 802.16j (WiMAX multihop relay specification) Draft 2 passes Letter Ballot


IEEE P802.16j/D2 passed letter ballot today. After comments on the draft made by negative voters are resolved, the draft will move to Sponsor Ballot and then on to the Standards Board Review Committee (RevCom). I expect it will be approved by the Standards Board by October.

WiMAX Deploying


There’s a really nice article* in this month’s IEEE Spectrum Magazine about Sprint’s Xohm, which will provide the first WiMAX service in the United States. I’m usually not too excited about this kind of stuff, but I will definitely look into it when it is offered here in Austin. This year they plan on deploying in Chicago, Baltimore, and Washington, D.C., followed by New York City.

Perks:

  • 2-4 Mbps, which is 4-8 times faster than 3G.
  • Equipment available will include laptop adapters, mobile phones, and home modems, meaning one service can give you broadband access to your home, give you a WiFi-like hotspot anywhere in the city, and be your mobile phone.
  • Prices are expected to be competitive with DSL.
  • No contracts.

Now we just have to hope WiMAX equipment isn’t too expensive.


*the article isn’t on IEEE Xplore quite yet.

How Important are Engineers for the Success of a Product?


I stumbled across this interesting article on SmallNetBuilder about the performance of commercial 802.11g devices. To summarize, they used an RF channel emulator between two 802.11g stations to observe the performance of competing devices. The results are, in my opinion, shocking: the best device sustains the same throughput as the worst device with over 10 dB more path loss.

[Figure: wireless LAN throughput comparison]

As engineers we often believe that better engineering results in more sales and a larger market share... but this isn't necessarily true. Linksys will sell many, many products due to their brand-name recognition, even if the 11g chip they use performs poorly. This brings me to the question: does the average consumer care enough about engineering to make a difference? That is, as long as a device performs well enough, superior engineering doesn't count for much if you can't market it. I think this situation may be unique to wireless LAN, because consumers are more patient with the performance of WLAN devices. For cell phone chipmakers, if your product drops calls at a high rate, I think you'll see a consumer reaction. What are your thoughts?

As a side note, check out the throughput curve in the figure above. Notice that, at best, you get 23 Mbps of throughput when the PHY rate supports 54 Mbps. This is a point-to-point link with the antennas in close proximity; even then, you won't get 50% of the PHY throughput. Clearly, there is much work to be done in improving the 802.11 MAC (even though a lot of work was done for the 802.11n standard). You can also get a hint about the link adaptation used from the throughput curve. I suspect the curves that dip at low path loss do not use SNR-based feedback and perhaps rely on an auto-rate fallback method, while the others use SNR-based feedback. I cannot confirm this because I don't know how reliable the testing is, but it seems plausible given some of the observations we've made testing these algorithms on Hydra.
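To see why even a clean link tops out well below the PHY rate, here is a rough back-of-envelope of the per-frame 802.11g overhead. The timing values are approximate and contention backoff is ignored, so the real gap is even larger:

```python
# Rough 802.11g overhead for one 1500-byte frame at the 54 Mbps PHY rate.
# Timings are approximate; contention backoff and retries are ignored.
preamble_us = 20                    # OFDM PLCP preamble + header
data_us = 1500 * 8 / 54             # payload airtime at 54 Mbps (~222 us)
sifs_us, difs_us = 10, 28           # interframe spacings
ack_us = preamble_us + 14 * 8 / 24  # 14-byte ACK at the 24 Mbps basic rate

total_us = difs_us + preamble_us + data_us + sifs_us + ack_us
print(1500 * 8 / total_us)  # ~39 Mbps effective before any backoff;
                            # add contention and real links land near 23
```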

WSIL End of Semester Report


I know this is Bob’s thing, but he’s busy doing real engineering work at the moment.

This semester has gone by extremely quickly. Unbelievably, I’ve been in WSIL for six months already. Amazing.

I’ll first present the latest news. Sumohana Channappayya successfully defended his dissertation and will be heading to San Diego to join the real world as a newly anointed Ph.D. Congratulations, Sumo!

Along similar lines, Caleb Lo and Chan-Byoung Chae both successfully passed quals yesterday. These two, along with Kaibin Huang, are next in line to graduate.

Further, we have published, submitted, or had accepted numerous papers on topics ranging from prototyping to information theory. In particular, we published the following seven journal papers this semester:

  • R. Chen, J. G. Andrews, R. W. Heath, Jr., and A. Ghosh, "Uplink Power Control in Multi-Cell Spatial Multiplexing Wireless Systems," IEEE Trans. on Wireless, vol. 6, no. 7, pp. 2700-2711, July 2007. [IEEE Xplore]
  • A. Forenza, D. J. Love, and R. W. Heath, Jr., "Simplified Spatial Correlation Models for Clustered MIMO Channels with Different Array Configurations," IEEE Trans. on Veh. Tech., vol. 56, no. 4, part 2, pp. 1924-1934, July 2007. [IEEE Xplore]
  • K. Huang, R. W. Heath, Jr., and J. G. Andrews, "Space Division Multiple Access with a Sum Feedback Rate Constraint," IEEE Trans. on Signal Processing, pp. 3879-3891, July 2007. [IEEE Xplore]
  • B. Mondal and R. W. Heath, Jr., "Quantization on the Grassmann Manifold," IEEE Trans. on Signal Processing, vol. 55, no. 8, pp. 4208-4216, Aug. 2007. [IEEE Xplore]
  • V. Raghavan, R. W. Heath, Jr., and A. Sayeed, "Systematic Codebook Designs for Quantized Beamforming in Correlated MIMO Channels," IEEE Journal on Sel. Areas in Comm., Special Issue on Optimization of MIMO Transceivers for Realistic Communication Networks: Challenges and Opportunities, vol. 25, no. 7, pp. 1298-1310, Sept. 2007. [IEEE Xplore]
  • M. R. McKay, I. B. Collings, A. Forenza, and R. W. Heath, Jr., "Multiplexing/Beamforming Switching for Coded MIMO in Spatially Correlated Channels Based on Closed-Form BER Approximations," IEEE Trans. on Veh. Tech., vol. 56, no. 5, part 1, pp. 2555-2567, Sept. 2007. [IEEE Xplore]
  • D. Gesbert, M. Kountouris, R. W. Heath, Jr., C. B. Chae, and T. Salzer, "Shifting the MIMO Paradigm: From Single User to Multiuser Communications," IEEE Signal Processing Magazine, vol. 24, no. 5, pp. 36-46, Oct. 2007. [IEEE Xplore]

Also, first year grad student Alvin Leung taught us all a ping pong lesson in the first annual WSIL ping pong tournament. His skills will surely diminish as grad school takes over his life.

There really was so much more to this semester. In many ways, Fall 07 will be remembered as a semester where a lot of work was put into projects that future semesters will get to claim as finished, much as this semester claimed the journal papers above. We submitted far more journal papers than we published. We welcomed 6 new members and said goodbye to only 1. And we put hundreds of hours into projects whose payoffs are still months away. In that way, you might say it was a blue-collar semester.

If anyone else has something to add (feel free to toot your own horn), please comment.

4G spectrum agreed at ITU-R


This is perhaps the first big move towards the 4G mobile wireless system that has been under discussion for the last 5 years or so. Today at the World Radiocommunication Conference (WRC '07), the ITU-R agreed on the spectrum for 4G:

  • 450−470 MHz band
  • 698−862 MHz band in Region 2 and nine countries of Region 3
  • 790−862 MHz band in Regions 1 and 3
  • 2.3−2.4 GHz band
  • 3.4−3.6 GHz band (no global allocation, but accepted by many countries)

Now the question is what the radio specification will be for 4G (=IMT-Advanced).

http://www.itu.int/newsroom/press_releases/2007/36.html

http://www.itu.int/newsroom/wrc/2007/itur_web_flash/20071019.html

Google Goes Mobile


Last week Google announced Android, an open source platform for mobile software development. Now Google has released the software development kit. To top things off, the Android Developer Challenge is offering a total of $10 million in awards for good mobile software designs. Anybody feel like moving up a few layers and trying this out?

Writing good proofs


I was doing a little late-night reading tonight, and it is amazing how poorly some proofs are written by very intelligent people in our field. I’ve even seen poorly written proofs in the midst of a well-written paper. I understand that the primary concern with proofs is correctness, but shouldn’t proofs also be readable? My question is, how essential are nice, clean, understandable proofs in a journal draft? If the goal of our paper is understanding, then clearly they should be just as readable as the text. Maybe that’s why most authors place them in the appendix…so their hideous features don’t destroy the rest of the paper. Of course, some will tell you that if the math in your publication is too understandable, then your peers won’t respect it as much. The point being that, if your proof is straightforward, the result must have come to you easily and is therefore not worthy of publication. You might think I’m joking, but I’ve heard this from many academics over the years.
This really makes me appreciate my first undergraduate class in algebra and number theory. The professor in this class was a bit "picky" about the structure of proofs and the logic used to draw conclusions. At the time it made assignments dreadfully tedious, but in the end it made me appreciate a well-written proof. For those of you who didn't have the opportunity to take undergraduate classes focused on proof-writing, how did you pick up your skills? If you'd like to learn more about proof-writing, check out this link (courtesy of Professor Cusick at Cal State Fresno).