Since 7signal provides a vendor-agnostic Wi-Fi Performance Management system, several institutions have used our solution to measure end-user experience in vendor bake-offs as they contemplate an upgrade to their WLAN. Evaluating WLAN vendors is not easy – there are many factors to consider beyond WLAN performance tests, such as access control, WLAN/LAN integration, and overall ease of deployment. It’s also important to try out the latest-and-greatest from leading vendors and compare end-user WLAN performance test results; after all, improving end-user Wi-Fi experience is why you are upgrading in the first place. We’ve found six key elements to consider when doing Wi-Fi benchmarking.
The best organizations perform their WLAN benchmarking in a real environment, such as a wing of their hospital or a floor of their enterprise, and they use several trial APs configured the way they would be deployed in production. I recently heard of a WLAN performance bake-off by an IT group responsible for Wi-Fi at an indoor arena – they trialed access points from two vendors by putting them on a desk in their office and running speedtest.net throughput tests over 80 MHz 802.11ac channels. Are you serious? Measuring desktop performance in a stand-alone environment produces unrealistic, largely useless data.
If you put APs in a real environment, you gain the benefit of testing with real clients. In hospitals this is particularly important, because there are many Wi-Fi devices like IV pumps and nurse-call badges that your vendor may not have completed WLAN performance testing against. Even in an office environment, there is a great variety of laptops, tablets and phones, and you want to understand how each candidate’s WLAN gear behaves with that variety before you commit.
So much affects the end-user Wi-Fi experience besides throughput – factors such as attach time, authentication and IP address assignment can make the Wi-Fi seem slow or unusable, even if there is adequate capacity to move bits. So, you’ll want to measure these items. The graph below shows a WLAN performance test bake-off that 7signal participated in where the attach time was markedly different between the two vendors.
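The idea of breaking attach time into its phases can be sketched as below. The phase functions are hypothetical stand-ins – a real test client would drive the OS Wi-Fi stack through 802.11 association, 802.1X/PSK authentication, and the DHCP exchange – but the timing structure is the point:

```python
import time

def timed_ms(phase_fn):
    """Run one attach phase and return its duration in milliseconds."""
    start = time.monotonic()
    phase_fn()
    return (time.monotonic() - start) * 1000

# Placeholder phases (sleeps simulate work); swap in real OS calls to measure.
def associate():     time.sleep(0.01)  # 802.11 association exchange
def authenticate():  time.sleep(0.02)  # 802.1X / 4-way handshake
def dhcp():          time.sleep(0.03)  # DHCP discover/offer/request/ack

results = {name: timed_ms(fn)
           for name, fn in [("associate", associate),
                            ("authenticate", authenticate),
                            ("dhcp", dhcp)]}
results["total_attach_ms"] = sum(results.values())
print(results)
```

Capturing each phase separately matters in a bake-off: two vendors with similar throughput can differ dramatically in, say, the authentication step alone.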
Despite what I said above, throughput is still an important WLAN performance test metric to measure; however, too often only TCP or HTTP tests are used to benchmark throughput through the AP. These WLAN performance tests are important and provide an indicator of the raw processing capability of the AP. But voice tests are just as important, since voice over Wi-Fi has become a key service of the WLAN system. VoIP uses UDP packets, and the key measurements are latency, packet loss and jitter. These elements are summarized nicely in a voice MOS (Mean Opinion Score) test, which provides a numeric value from 1 to 5 corresponding to perceived voice quality. A WLAN performance bake-off should look at MOS scores – a higher MOS indicates that the AP (and associated backend) can process and move a large number of smaller packets with minimal delay and jitter; a lower MOS score may indicate that the access point is challenged.
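The way latency, jitter and loss roll up into a single score can be illustrated with the widely cited simplified E-model (in the spirit of ITU-T G.107). This is a generic approximation for illustration, not 7signal’s exact scoring method:

```python
def mos_estimate(latency_ms: float, jitter_ms: float, loss_pct: float) -> float:
    """Estimate a VoIP MOS (1.0-5.0) via a simplified E-model R-factor."""
    # Effective latency: jitter is commonly weighted double, plus ~10 ms
    # of codec/processing delay.
    eff = latency_ms + 2 * jitter_ms + 10
    # Delay impairment: gentle below 160 ms, steep above it.
    if eff < 160:
        r = 93.2 - eff / 40
    else:
        r = 93.2 - (eff - 120) / 10
    # Loss impairment: roughly 2.5 R-points per percent of packet loss.
    r -= 2.5 * loss_pct
    r = max(0.0, min(100.0, r))
    # Standard R-factor to MOS mapping.
    return 1 + 0.035 * r + 7.1e-6 * r * (r - 60) * (100 - r)

print(mos_estimate(20, 2, 0.0))    # clean LAN-like path: MOS in the 4.x range
print(mos_estimate(150, 40, 3.0))  # congested Wi-Fi: noticeably degraded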
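```

Note how a few percent of loss or a few tens of milliseconds of jitter moves the score far more than raw throughput would suggest – which is exactly why a bake-off should not stop at TCP tests.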
One advantage of using a system like 7signal’s is that you can run WLAN performance tests over an extended period of time, instead of the simple “spot check” you might perform with another tool. You want to see the data during busy periods, because that is when things break down and people complain. The 7signal active test is like adding the 31st or the 51st user to the system – now you really see the capability of the WLAN and how it truly handles Wi-Fi congestion. You also want to run WLAN performance tests during quiet periods to see the raw capability of your WLAN to move data.
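The busy-versus-quiet comparison can be sketched as below, bucketing periodic throughput samples by hour of day. The sample data and the 08:00–18:00 busy window are illustrative assumptions:

```python
from collections import defaultdict
from statistics import median

samples = [  # (hour_of_day, measured throughput in Mbps) -- illustrative data
    (3, 180), (3, 175), (4, 182),            # quiet overnight samples
    (10, 95), (10, 60), (14, 70), (14, 55),  # busy working-hours samples
]

by_hour = defaultdict(list)
for hour, mbps in samples:
    by_hour[hour].append(mbps)

BUSY_HOURS = range(8, 18)  # assumption: 08:00-18:00 is the busy window
busy = [m for h, vals in by_hour.items() if h in BUSY_HOURS for m in vals]
quiet = [m for h, vals in by_hour.items() if h not in BUSY_HOURS for m in vals]

degradation = 1 - median(busy) / median(quiet)
print(f"busy-hour median {median(busy)} Mbps, "
      f"quiet-hour median {median(quiet)} Mbps, "
      f"degradation {degradation:.0%}")
```

A spot check at 3 a.m. would report the quiet-hour number and hide the degradation entirely; only sampling across the whole day exposes it.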
By the way, you really don’t want your internet connection in the test path. Your internet connection will be a natural choke point, and throughput values across it can vary quite a bit in a real environment. If you are trying to gauge the true throughput capability of your WLAN, it’s better to use a WLAN performance test endpoint, like 7signal’s Sonar, which can run as a VM or appliance in your data center.
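A minimal version of the idea – measuring round-trip time and jitter against a local endpoint instead of the internet – can be sketched with a UDP echo on loopback standing in for a data-center test responder. The payload size roughly mimics a G.711 20 ms voice frame; this is an assumption for illustration, not Sonar’s protocol:

```python
import socket
import threading
import time
from statistics import mean

PACKETS = 20

def echo_server(sock):
    """Echo each received datagram back to its sender."""
    for _ in range(PACKETS):
        data, addr = sock.recvfrom(2048)
        sock.sendto(data, addr)

server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
server.bind(("127.0.0.1", 0))          # local endpoint, internet never involved
port = server.getsockname()[1]
threading.Thread(target=echo_server, args=(server,), daemon=True).start()

client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
client.settimeout(2)
rtts = []
for i in range(PACKETS):
    payload = i.to_bytes(4, "big") + bytes(160)  # ~G.711 20 ms frame size
    t0 = time.monotonic()
    client.sendto(payload, ("127.0.0.1", port))
    client.recvfrom(2048)
    rtts.append((time.monotonic() - t0) * 1000)

# Jitter as mean absolute difference of consecutive RTTs (RFC 3550 spirit).
jitter = mean(abs(a - b) for a, b in zip(rtts, rtts[1:]))
print(f"avg RTT {mean(rtts):.2f} ms, jitter {jitter:.3f} ms")
```

With the responder inside your own network, every millisecond measured is attributable to the WLAN and LAN under test rather than to an ISP choke point.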
Every WLAN vendor has some form of automatic channel assignment and power control. These algorithms remove the task of manually assigning values and, in theory, should make managing the WLAN easier. Even though these algorithms have improved over time, if you are depending on them, you will want to verify their operation in your environment. We have seen WLAN performance test bake-offs where automation produced channel flapping, power fluctuations and other undesirable effects. You need a system that can identify these issues, and you want to understand whether you can place proper limits on these algorithms to gain the operational advantage you desire.
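Detecting channel flapping from monitoring data can be sketched as below. The observation format and the changes-per-window threshold are assumptions; any system that logs per-AP channel over time could feed it:

```python
from collections import defaultdict

observations = [  # (ap_name, unix_timestamp, channel) -- illustrative data
    ("ap-1", 0, 36), ("ap-1", 600, 36), ("ap-1", 1200, 149),
    ("ap-1", 1800, 36), ("ap-1", 2400, 149), ("ap-1", 3000, 36),
    ("ap-2", 0, 44), ("ap-2", 1800, 44), ("ap-2", 3600, 44),
]

def channel_changes(obs):
    """Count channel transitions per AP over the observation window."""
    by_ap = defaultdict(list)
    for ap, ts, ch in sorted(obs, key=lambda o: (o[0], o[1])):
        by_ap[ap].append(ch)
    return {ap: sum(a != b for a, b in zip(chans, chans[1:]))
            for ap, chans in by_ap.items()}

FLAP_THRESHOLD = 2  # assumption: more than 2 changes per window is flapping
changes = channel_changes(observations)
for ap, n in sorted(changes.items()):
    status = "FLAPPING" if n > FLAP_THRESHOLD else "stable"
    print(f"{ap}: {n} channel changes -> {status}")
```

Each channel change forces clients to roam or rescan, so an AP that flips channels several times an hour degrades the very experience the automation was supposed to protect.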
If you take these six key elements of WLAN performance test benchmarking into consideration when you do your bake-off, you will have the right end-user Wi-Fi experience data to make an informed decision. Then, as you deploy your network, you can compare ongoing results against your benchmarks to make sure there are no surprises.