Author Archive

Machine Learning: The Solution to Link Adaptation Woes

WSIL News & Views, Wireless Research

In this post I want to discuss the deficiencies of practical link adaptation in current wireless systems and how algorithms that exploit machine learning will be the solution to all of these problems. First, a little background on link adaptation.

The quality of the medium through which communication occurs, i.e., the wireless channel, is constantly changing. Consider the following figure, courtesy of the Laboratory for Information Technology at Aachen University, which shows the channel quality of a cellular network throughout a rural town in Germany (green=good, red=bad). [Figure: City Path Loss] Reality is actually much worse than this, since the channel also fluctuates due to small-scale fading.

Digital communication parameters used to construct wireless waveforms (e.g., QAM order, FEC rate, the number of spatial streams) trade off reliability (e.g., the probability of a dropped call in a voice network) against data rate, with the terms of that tradeoff set by the wireless channel quality. Consequently, modern wireless networks get the most “bang for their buck” in today’s spectrum-limited wireless market through link adaptation: the process of selecting digital communication parameters based on real-time channel quality measurements. Link adaptation allows for joint optimization of reliability and data rate tailored to each application.

Most published research makes link adaptation seem like a straightforward problem.

  1. Measure the wireless channel.
  2. Extract a link quality metric from the wireless channel.
  3. Map the link quality metric to digital communication parameters using a look-up-table or rate/reliability formulas (a minimal sketch of this step appears below).
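
To make step 3 concrete, here is a minimal sketch of table-driven selection, assuming a scalar post-processing SNR metric and a handful of made-up MCS entries and thresholds; it illustrates the idea only and is not Hydra’s table or the table of any standard.

    // Minimal sketch of look-up-table link adaptation (illustration only).
    // The MCS entries and SNR thresholds below are hypothetical.
    #include <cstdio>

    struct Mcs {
        const char* name;
        double rate_mbps;    // nominal PHY data rate
        double min_snr_db;   // lowest SNR at which the PER target is met
    };

    const Mcs kTable[] = {
        {"BPSK 1/2",    6.5,  5.0},
        {"QPSK 1/2",   13.0,  8.0},
        {"16-QAM 1/2", 26.0, 14.0},
        {"64-QAM 3/4", 58.5, 22.0},
    };
    const int kNumMcs = sizeof(kTable) / sizeof(kTable[0]);

    // Return the highest-rate entry whose SNR threshold is satisfied;
    // fall back to the most robust entry if none is.
    const Mcs& select_mcs(double snr_db) {
        int best = 0;
        for (int i = 0; i < kNumMcs; ++i)
            if (snr_db >= kTable[i].min_snr_db) best = i;
        return kTable[best];
    }

    int main() {
        double snr_db = 16.3;  // measured link quality metric
        const Mcs& m = select_mcs(snr_db);
        std::printf("SNR %.1f dB -> %s (%.1f Mbps)\n", snr_db, m.name, m.rate_mbps);
        return 0;
    }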

Look-up-tables/formulas are created offline through analysis, simulations, or measurements. When I attempted to do this on our IEEE 802.11n prototype, Hydra, I found out that link adaptation in practice can be very difficult. Here are a few of the lessons I learned.

  • System nonlinearities make look-up-tables created through analysis or simulations inaccurate.
  • Non-Gaussian noise makes look-up-tables created through analysis or simulations inaccurate.
  • Channel estimation error must be accounted for.
  • Look-up-tables based on actual over-the-air measurements result in the most accurate adaptation.
  • Look-up-tables based on actual over-the-air measurements for one device may result in inaccurate adaptation if placed on a different wireless device.

After my initial testing I was concerned that it might be difficult to use our prototype, even with significant amplifier backoff, to design and evaluate practical link adaptation algorithms. Apparently, however, commercial systems suffer from the same issues. For example, the Roofnet project found that the SNR link quality metric did not consistently reflect the expected reliability of the associated digital communication parameters, even when interference and collisions were not an issue.

Another large problem we discovered is that good link quality metrics for MIMO-OFDM systems just weren’t available. It turns out that analyzing the error rate of practical MIMO-OFDM links is very difficult. Consequently, finding a good, single-dimensional link quality metric (a requirement for look-up-tables) that modeled the spatial and frequency-selective effects of the channel was also very difficult.

So what do we do? The linear, time-invariant models we have used to create link adaptation rate/reliability expressions or to run link simulations (which may then result in look-up-tables) do not reflect the actual system. One approach is to make our analysis model more complex to include nonlinear and non-Gaussian noise effects. This seemed like a difficult undertaking. Analysis was already difficult with our simplistic linear, time-invariant system and additive Gaussian noise. Using a more complex system model would only lead to more design time for engineers. Moreover, even if we are able to find a good link quality metric, it will likely be multi-dimensional, and look-up-tables aren’t easily created in this case. All of this led us (Professor Heath and me) to machine learning.

Machine learning algorithms allow systems (in our case the link adaptation algorithm) to learn behavior solely from data observations. Hence, as long as we were able to define an accurate link quality metric and pose the problem correctly, machine learning should be able to discover the relationship between the link quality metric and the reliability of the link. First, we created the multi-dimensional ordered SNR link quality metric based on our new expression of packet error rate in MIMO-OFDM systems. Then, with the help of Professor Caramanis, we validated the efficacy of classification algorithms that exploited this new link quality metric for link adaptation. However, all this work was done using system models. To compensate for unique hardware nonidealities, we needed an online algorithm that tuned link adaptation to each transmit/receive device pair. Consequently, Professor Heath and I designed online classifiers that harness training information on-the-fly. These algorithms constantly observe packets transmitted over channels and improve the link adaptation classifier in real time based on these observations.
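
To give a flavor of the approach, here is a heavily simplified sketch of online learning for link adaptation. It is an illustration rather than the algorithm from the papers below: the link quality metric is assumed to be a small, fixed-dimension vector of ordered post-processing SNRs (in dB), every packet is assumed to report an ACK/NACK, and a plain k-nearest-neighbor vote stands in for the actual classifiers.

    // Simplified online link adaptation via k-nearest-neighbor PER estimates.
    // Illustration only; the real algorithms use more refined classifiers.
    #include <algorithm>
    #include <cmath>
    #include <cstddef>
    #include <utility>
    #include <vector>

    struct Observation {
        std::vector<double> feature;  // e.g., ordered post-processing SNRs (dB)
        bool success;                 // was the packet ACKed?
    };

    class OnlineLinkAdapter {
    public:
        OnlineLinkAdapter(const std::vector<double>& rates, double per_target, int k)
            : rates_(rates), obs_(rates.size()), per_target_(per_target), k_(k) {}

        // Record the outcome of one packet transmitted with MCS index `mcs`.
        void record(int mcs, const std::vector<double>& feature, bool success) {
            obs_[mcs].push_back(Observation{feature, success});
        }

        // Pick the highest-rate MCS whose estimated PER meets the constraint;
        // default to the most robust MCS (index 0) if none qualifies.
        int select(const std::vector<double>& feature) const {
            int best = 0;
            double best_rate = -1.0;
            for (std::size_t m = 0; m < rates_.size(); ++m) {
                if (per_estimate(m, feature) <= per_target_ && rates_[m] > best_rate) {
                    best = static_cast<int>(m);
                    best_rate = rates_[m];
                }
            }
            return best;
        }

    private:
        // Estimate PER for one MCS from the k nearest past observations.
        double per_estimate(std::size_t mcs, const std::vector<double>& x) const {
            const std::vector<Observation>& data = obs_[mcs];
            if (data.empty()) return 0.0;  // optimistic until observations arrive
            std::vector<std::pair<double, bool>> neighbors;  // (distance, success)
            for (const Observation& o : data) {
                double sq = 0.0;
                for (std::size_t i = 0; i < x.size(); ++i)
                    sq += (o.feature[i] - x[i]) * (o.feature[i] - x[i]);
                neighbors.push_back(std::make_pair(std::sqrt(sq), o.success));
            }
            std::sort(neighbors.begin(), neighbors.end());
            std::size_t k = std::min<std::size_t>(k_, neighbors.size());
            std::size_t failures = 0;
            for (std::size_t i = 0; i < k; ++i)
                if (!neighbors[i].second) ++failures;
            return static_cast<double>(failures) / static_cast<double>(k);
        }

        std::vector<double> rates_;                  // data rate per MCS (Mbps)
        std::vector<std::vector<Observation>> obs_;  // per-MCS packet history
        double per_target_;                          // reliability constraint
        int k_;                                      // neighbors per estimate
    };

In a real deployment, record() would be called after every transmission, so the selection rule steadily tunes itself to the particular transmit/receive hardware pair.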

For example, see the following figure, which shows a plot of throughput and packet error rate for offline and online machine learning in a static wireless channel with our wireless prototype and a packet error rate reliability constraint of 10%. The offline algorithm is not able to tune itself to the unique hardware characteristics of the transmit/receive pair, resulting in lost rate/reliability. The online algorithm, however, discovers the correct digital communication parameters in real time. There have been other rate adaptation algorithms, notably auto-rate fallback (ARF), that adapt online. Unfortunately, they don’t take advantage of explicit link quality metrics and so cannot adapt well in dynamic channel conditions (see the following figure).

The best part of our online learning algorithms for link adaptation is simplicity. The algorithms are installed and we’re done. No complex analysis of the reliability curves. No calibration algorithms to determine amplifier backoff. Additionally, our recent results with support vector machines show that online learning can be implemented with low memory/processing complexity.

For More Information:

R. C. Daniels and R. W. Heath, Jr., “Online Adaptive Modulation and Coding with Support Vector Machines,” to appear in Proceedings of the IEEE European Wireless Conference, Lucca, Italy, April 2010.

R. C. Daniels and R. W. Heath, Jr., “An Online Learning Framework for Link Adaptation in Wireless Networks,” Proceedings of the Information Theory and Applications Workshop, San Diego, CA, February 2009.

R. C. Daniels, C. M. Caramanis, and R. W. Heath, Jr., “Adaptation in Convolutionally-Coded MIMO-OFDM Wireless Systems through Supervised Learning and SNR Ordering,” IEEE Transactions on Vehicular Technology, January 2010.

60 GHz Wireless Communications

WSIL News & Views, WSIL Publications

It has been some time since my last post. For a while (before many of the current members were even in the WSIL) I was one of the few posting, so I’m glad to see that the WindoWSIL has really taken off in terms of group participation and readership. Recently, my 60 GHz Wireless Communications (60G) tutorial paper appeared in IEEE Vehicular Technology Magazine, so I thought I would write an informal companion post for the interested reader.

What makes 60G so interesting for research?
In my opinion there are two primary reasons that 60 GHz has received so much attention recently.

  1. There is 7 GHz of unlicensed (free) bandwidth available in the United States alone! [Figure: 60 GHz Bandwidth Regulation] Additionally, there is around 5 GHz of unlicensed bandwidth common to Europe, most of Asia, Australia, and North America. This amount of bandwidth is unparalleled at traditional frequencies such as 900 MHz, 1800 MHz, 2450 MHz, and 5000 MHz. 62 MHz of licensed spectrum around the 700 MHz carrier frequency was recently auctioned in the United States, netting a total of around 20 billion dollars. Obviously bandwidth is precious, and when 7 GHz is available, people pay attention.
  2. The 60G carrier frequency is fundamentally range-limited and presents significant challenges in antenna design, analog circuits, digital processing, and higher (cross) layer design. For this reason the 7 GHz of bandwidth is still relatively unused even though it was opened up over 5 years ago. If operating at 60G were just a matter of ramping up the carrier frequency of current systems, there might not be so much buzz (or money) in 60G research. As communication engineers, we should be licking our chops, because it gives us some unsolved problems we can sink our teeth into. Next, I will elaborate on some of the limitations and design challenges that make 60G so interesting.

60G Challenges
The most obvious limitation of 60G to an engineer is the carrier frequency itself. Physics (via the Friis free-space propagation formula) dictates that we lose 20 dB in received power every time we multiply the carrier frequency by 10. That means if I have two systems, one operating at 6 GHz and one operating at 60 GHz, I would have to transmit 20 dB more power in the 60 GHz system to achieve the same performance in a vacuum. Of course we don’t live in a vacuum, and that brings us to two more limitations at 60 GHz: reduced propagation through materials and increased Doppler effects. The ability of electromagnetic waves to penetrate solid objects is a function of the carrier frequency. Measurements have shown that propagation through objects is reduced at 60 GHz, limiting the ability of non-line-of-sight communication. Doppler frequency, which results from mobility of objects around communication devices, is proportional to the carrier frequency. Since the carrier frequency is significantly larger than in traditional systems, we will experience much larger Doppler frequencies, which may complicate feedback, channel estimation, and higher-layer design. Finally, and perhaps the limitation that encouraged national entities to remove licensing restrictions, 60 GHz wireless transmissions experience oxygen absorption effects. It turns out that oxygen resonates at 60 GHz, adding 10-20 dB of signal attenuation per kilometer (see the atmospheric absorption figure from Marcus Spectrum Solutions).
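
To see where that 20 dB figure comes from, here is a quick sanity check using the free-space path loss formula, FSPL(dB) = 20·log10(4πdf/c); holding distance and antenna gains fixed, multiplying the carrier frequency by 10 adds exactly 20 dB.

    // Free-space path loss at 6 GHz versus 60 GHz for the same distance.
    #include <cmath>
    #include <cstdio>

    double fspl_db(double distance_m, double freq_hz) {
        const double kSpeedOfLight = 299792458.0;  // m/s
        const double kPi = 3.14159265358979323846;
        return 20.0 * std::log10(4.0 * kPi * distance_m * freq_hz / kSpeedOfLight);
    }

    int main() {
        double d = 10.0;  // meters
        std::printf("FSPL at  6 GHz over %.0f m: %.1f dB\n", d, fspl_db(d, 6e9));   // ~68 dB
        std::printf("FSPL at 60 GHz over %.0f m: %.1f dB\n", d, fspl_db(d, 60e9));  // ~88 dB
        return 0;
    }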

From a design perspective, the fundamental limitations of the wireless channel are not the only difficulties. 60G antenna design is unique since the antenna dimensions are reduced. Therefore, antenna fabrication will require more expensive milling technology, especially for planar antenna arrays. RF circuits at 60 GHz may also have to discard simplifying lumped-element assumptions as wavelengths approach device dimensions. Analog circuit design is complicated for many other reasons, including reduced amplifier gain, reduced transistor output power, and amplified frequency instability (phase noise). In general, this leads to increased cost in engineering design or in the technology implemented.

Current Activity
After the laundry list of limitations and design obstacles presented above, you might be thinking to yourself “there’s no point in trying” or “it will never work.” Thankfully, that’s not the consensus of the wireless market. We have seen significant activity in industry recently to try and bring 60 GHz to the consumer in the near future. IEEE 802.16d, the precursor to WiMax, actually considered using frequencies up to 66 GHz. Unfortunately, it quickly became apparent that the above challenges were too much for mobile broadband devices that need to communicate over kilometer distances. As a consequence, all the wireless devices to appear in the near future are targeted for indoor, short-range communications.

IEEE 802.15.3c is currently constructing a standard for operating wireless personal area networks at 60 GHz. The 3c working group has formally accepted a proposal, an important first step towards actual standardization. 15.3c devices should be seen as a replacement for UWB, which will eventually replace Bluetooth. While 802.15.3c is addressing a host of applications for high-throughput communications, WirelessHD has a single focus. WirelessHD is a consortium of companies that have banded together to create 60G technology for high-definition wireless video streaming. Here is a nice overview of their OFDM-based standard as well as a video summary of a CES 2008 presentation. In the past, wireless video streaming was subject to low-performance lossy compression or high-cost lossless compression. Because of the bandwidth available at 60 GHz, the wireless industry is hoping to provide low-cost, high-performance uncompressed video streaming. With 60 GHz it is only a matter of time before all peripheral devices on a personal computer are cable-less and all the wires in your home entertainment center are gone (except for power cables, of course).

For completeness, I also want to mention that ECMA has thrown its hat into the ring and is in the process of building a 60 GHz standard. Since this is the same standardization body that brought us the WiMedia Multiband OFDM standard for UWB, I can only expect they see 60 GHz as an extension for higher rates. In that sense IEEE 802.15.3c and the ECMA standard will be directly competing; whether the two can coexist is unknown at this point.

Design Ideas
If you’d like to know more about how to design the physical layer in view of these challenges, or would like a look at developing technologies that may change the landscape of 60G, please refer to the aforementioned IEEE reference or its preprint here. I think there is ample room for physical-layer or higher-layer design ideas that take advantage of 60 GHz while accounting for its challenges. Alongside the challenges, 60G also offers unique advantages, such as relaxed size constraints on antenna arrays and improved security of communication signals. I’m more than willing to hear your ideas, so please contact me with any additional questions.

How Important are Engineers for the Success of a Product?

Miscellaneous, WSIL News & Views

I stumbled across this interesting article on SmallNetBuilder about the performance of commercial 802.11g devices. To summarize, they used an RF channel emulator between two 802.11g stations to observe the performance of competing devices. The results are, in my opinion, shocking: the best device sustains the same performance as the worst device with over 10 dB more path loss.

[Figure: Wireless LAN Comparison]

As engineers we often believe that better engineering results in more sales and a larger market share…but this isn’t necessarily true. Linksys will sell many, many products due to their brand name recognition, even if the 11g chip they use performs poorly. This brings me to the question: does the average consumer care enough about engineering to make a difference? That is, a poorly performing device can still sell as long as it performs well enough, and even a well-engineered device doesn’t mean much if you can’t market it. I think this situation may be unique to wireless LAN, because consumers are more patient with the performance of WLAN devices. For cell phone chipmakers, if your product drops calls at a high rate, I think you’ll see a consumer reaction. What are all of your thoughts?

As a side note, check out the throughput curve from the figure above. Notice that, at best, you get 23 Mbps of throughput when the PHY rate supports 54 Mbps. This is a point-to-point link using near-field antennas, and even then you don’t reach 50% of the PHY rate. Clearly, there is much work to be done in improving the 802.11 MAC (even though a lot of work was done for the 802.11n standard). You can also get a hint about the link adaptation used from the throughput curve. I suspect the curves with dips at low path loss do not use SNR-based feedback and instead rely on something like an auto-rate fallback method, while the others use SNR-based feedback. I cannot confirm this because I don’t know how reliable the testing is, but it seems plausible given some of the observations we’ve made while testing these algorithms in Hydra.

Google Goes Mobile

Miscellaneous, WSIL News & Views

Last week Google announced Android, an open source platform for mobile software development. Now Google has released the software development kit. To top things off, the Android Design Challenge is offering a total of $10 million in awards for good mobile software designs. Anybody feel like moving up a few layers and trying this out?

Writing good proofs

WSIL News & Views

I was doing a little late-night reading tonight, and it is amazing how poorly some proofs are written by very intelligent people in our field. I’ve even seen poorly written proofs in the midst of a well-written paper. I understand that the primary concern with proofs is correctness, but shouldn’t proofs also be readable? My question is, how essential are nice, clean, understandable proofs in a journal draft? If the goal of our paper is understanding, then clearly they should be just as readable as the text. Maybe that’s why most authors place them in the appendix…so their hideous features don’t destroy the rest of the paper. Of course, some will tell you that if the math in your publication is too understandable, then your peers won’t respect it as much. The point being that, if your proof is straightforward, the result must have come to you easily and is therefore not worthy of publication. You might think I’m joking, but I’ve heard this from many academics over the years.
This really makes me appreciate my first undergraduate class in algebra and number theory. The professor in this class was a bit “picky” about the structure of proofs and the logic used to draw conclusions. At the time it made assignments dreadfully tedious, but in the end it made me appreciate a well-written proof. For those of you who didn’t have the opportunity to take undergraduate classes focused on proof-writing, how did you pick up your skills? If you’d like to learn more about proof-writing skills, check out this link (courtesy of Professor Cusick at Cal State Fresno).

Spring 2007 Semester Recap

WSIL News & Views, Alumni

It’s hard to believe, but another semester is in the books. I’d just like to update the status of the WSIL.

    • In February, Runhua Chen successfully defended his thesis titled “Multiuser MIMO Communication Systems with Cooperative Transmission”. A special congratulations is in order for Runhua, and we wish him the best of luck as he transitions into his new life in the real world.
    • Postdoctoral scholar Dr. Seijoon Shim will be returning to Korea after spending over a year at UT. Dr. Shim will always be remembered fondly in Austin, TX for his killer vocals of ’80s-era metal rock songs. Good luck to Dr. Shim in his new position…don’t forget to visit us at the WSIL from time to time, Dr. Shim.
    • During this semester, three new members have joined the WSIL. In January 2007, Takao Inoue returned to UT from Japan to provide the WSIL with a jack-of-all-trades PhD student. Masters students Sanmi Koyejo and Steve Peters also joined the group to pursue their PhDs. Good luck to all of these students in their future at UT.

Is MIMO Dead?

WSIL News & Views

As Lee Corso of College Gameday on ESPN would say, “Not so fast, my friend!” I know I’ve heard that “MIMO is dead” many times since 2004, when I joined the WSIL at the University of Texas. This claim is usually a result of two opinions: (1) we know everything we need to know about MIMO, and (2) MIMO only works well in simulations, not the real world. In my opinion, to automatically dismiss wireless research because it focuses on MIMO techniques is a bit shallow. While it is true that the wireless community’s conceptual understanding of MIMO communication theory is very advanced, much of this theoretical understanding is insufficient to make MIMO work in the real world. Practical feedback methods, multi-user MIMO, and MAC design in MIMO networks are just a few of the research topics that will likely, in part, enable future wireless implementations.

Not convinced? At the January IEEE meeting in London, a plan for 802.16m was formally announced. 802.16m, a sequel to WiMax, will deliver a maximum of 1 Gbps. One of the proposed components of such a system is a large number of transmit and receive antennas. Many are skeptical that such deliverables are possible, since this is more than ten times the throughput of current WiMax systems. If you look at technology throughout history, there’s always been this battle between the “visionaries” and the “realists”. Twenty years ago, who would have believed you could make telephone calls, watch color television and on-demand videos, listen to any music you desire, and connect to a world-wide network of computing devices, all on an electronic device smaller than your wallet?

We may never deliver 1 Gbps wireless traffic in mobile systems by using the currently allocated bandwidth with MIMO techniques, but we’re not fully realizing MIMO’s potential either. To say that “MIMO is Dead” is just plain silly.

IT++ for Mac OS X and Linux

Reference

Many of you may be aware of the IT++ C++ library of information technology related functions.


IT++ includes vector and matrix functionality much like Matlab, as well as many communications-related functions (coding, interleaving, modulation, etc.). Of course you’re still programming in C++, so it’s definitely more painful than Matlab, but if you have computationally intensive functions that you’d like to use for simulation purposes, this package may be useful to you. The Hydra multihop prototype makes use of this library for its vector/matrix structures.
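
For a flavor of the library, here is a small end-to-end example in the spirit of the tutorials that ship with IT++: random bits, 16-QAM modulation, an AWGN channel, demodulation, and a bit error count. I am writing it from memory, so double-check the class and method names against the IT++ documentation for your version.

    // Simple IT++ example: 16-QAM over AWGN with a bit error rate count.
    #include <itpp/itcomm.h>
    #include <iostream>

    using namespace itpp;

    int main() {
        int num_bits = 100000;                 // multiple of 4 bits per 16-QAM symbol
        double noise_variance = 0.05;

        bvec tx_bits = randb(num_bits);        // random information bits
        QAM qam(16);                           // 16-QAM modulator/demodulator
        cvec tx_symbols = qam.modulate_bits(tx_bits);

        AWGN_Channel channel(noise_variance);  // complex AWGN channel
        cvec rx_symbols = channel(tx_symbols);

        bvec rx_bits = qam.demodulate_bits(rx_symbols);

        BERC berc;                             // bit error rate counter
        berc.count(tx_bits, rx_bits);
        std::cout << "BER: " << berc.get_errorrate() << std::endl;
        return 0;
    }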


The installation of IT++ for use with Linux distributions is a fairly straightforward affair:

  1. Using the link from the IT++ site, download the latest IT++ source.
  2. Follow these instructions for making the source.

If you are using the Gentoo Linux distribution, the package manager (emerge) will do steps 1 and 2 for you. When compiling C++ programs that use the IT++ libraries, you must pass the appropriate flags for proper linking (an example is given below).
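
On my machines, the least painful way to get the flags right has been the itpp-config helper script that installs alongside the library (pkg-config with an itpp package is an alternative on some distributions); the exact flags depend on your install, so treat this as a starting point:

    g++ my_simulation.cpp -o my_simulation `itpp-config --cflags` `itpp-config --libs`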

Recently, I had the desire to install IT++ on my Intel Mac OS X laptop. Unfortunately, there is little documentation on how to get this working. It took me a while to tweak the settings, but I finally got it to work. Here are some hints:

  • Use fink to install FFTW and ATLAS.
  • Using the X11 terminal, keep the default prefix (/usr/local) when executing the configure script. Other locations will make header inclusions messy.
  • When you use fink to install the FFTW and ATLAS dependencies (which optimize matrix/vector and FFT computations), you must add this to the library and include paths. For example, you might execute:
    export LDFLAGS="-L/sw/lib"
    export CPPFLAGS="-I/sw/include"
    ./configure
  • Before executing the configure script, you need to point to doxygen, dvips, and latex (I’m not really sure if you need all of this). The problem is, the installation looks for these programs in the root directory, and if you’re using fink (again, I recommend this), these programs are installed in the /sw/ directory. I created links for all three of these programs. For example,
    sudo ln -s /sw/bin/dvips /usr/bin/dvips

    creates a symbolic link in /usr/bin to the fink-installed dvips program.

Are you afflicted with “The Knack”?

WSIL News & Views

You might want to check this out to make sure. I think some of us may be suffering from this. A special thanks is in order for Eric Blossom of GNU Radio fame for raising awareness.


A lucrative part-time job for wireless grad students?

WSIL News & Views

For those of you who aren’t aware, there’s a technique called Bluesnarfing where security flaws in Bluetooth networks are exploited to take control of mobile phones. Hackers have been able to take advantage of it and make a decent buck, as displayed in this YouTube video. Pretty scary stuff. Anybody know the particulars of this security flaw?