Why is this man smiling?
It's been vewy, vewy quiet around ol' SmallNetBuilder since my last post. Since then, I've been focused on getting the Broadband Forum's TR-398 test suite up and running on a new octoScope test platform. This work will soon be available as a turnkey product from octoScope.
TR-398 was unveiled in February at Mobile World Congress 2019 as the industry's first Wi-Fi performance test standard. While it's not as comprehensive as some might want, it's a good start toward providing a set of benchmarks that can be used to compare product performance.
What Is The Broadband Forum?
The Broadband Forum (BBF) is a non-profit corporation organized to create guidelines for broadband network system development and deployment. They are perhaps best known for their TR-069 CPE WAN Management Protocol that is used by service providers to remotely manage broadband modems.
What Is TR-398?
Quoting from the TR-398 Wi-Fi In-Premises Performance Testing standard:
The primary goal of TR-398 is to provide a set of test cases and framework to verify the performance between Access Point (AP) (e.g., a CPE with Wi-Fi) and one or more Station (STA) (e.g., Personal Computer [PC], integrated testing equipment, etc.).
This is the first time an industry group has attempted to establish Wi-Fi performance test benchmarks. You might think the Wi-Fi Alliance had already done this long ago. But the WFA's focus is on functional / interoperability certification, which it strongly encourages its members to perform so that they can display a Wi-Fi Certified logo.
You need to be a WFA member to access their test plans. Membership ain't cheap, so SmallNetBuilder is not a WFA member.
The BBF has no certification program for TR-398, so there are no logos to earn or display. You don't need to be a BBF member to access their standards (aka Technical Reports / TRs), as evidenced by the links above.
The TR-398 suite is organized into eleven tests covering five performance aspects:

- RF capability
  - Receive sensitivity
- Baseline performance
  - Maximum connection
  - Maximum throughput
  - Airtime fairness
- Coverage
  - Range vs. rate
  - Spatial consistency
- Multiple STA performance
  - Multiple STA performance
  - Multiple association/disassociation stability
  - Downlink MU-MIMO performance
- Stability / Robustness
  - Long term stability
  - AP coexistence
In general, the TR-398 suite uses two-stream devices as the test stations (STA). This makes sense, since that's the configuration of most mobile Wi-Fi devices today. It also mandates that the router/AP device under test (DUT) be set to channel 6 @ 20 MHz channel bandwidth in 2.4 GHz and channel 36 @ 80 MHz channel bandwidth in 5 GHz. 2.4 GHz band tests are run with DUT and STA configured for 802.11n; 5 GHz band tests use 802.11ac.
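Stated as data, the mandated DUT settings boil down to the following. This is just a sketch for reference; the dictionary layout and helper names are mine, not anything from the spec or from octoScope's software.

```python
# Per-band DUT settings mandated by TR-398, as summarized above.
# (Structure and names are illustrative only.)
TR398_DUT_CONFIG = {
    "2.4 GHz": {"channel": 6,  "bandwidth_mhz": 20, "mode": "802.11n"},
    "5 GHz":   {"channel": 36, "bandwidth_mhz": 80, "mode": "802.11ac"},
}

def describe_band(band):
    """Render one band's mandated settings as a short string."""
    cfg = TR398_DUT_CONFIG[band]
    return (f"{band}: channel {cfg['channel']} @ "
            f"{cfg['bandwidth_mhz']} MHz, {cfg['mode']}")
```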
Since you can download and read the standard, I'm not going to go into the details of each test. Instead, I'll focus on some of the tests that address areas that the SmallNetBuilder Wi-Fi benchmarks have not tested and look at some test results.
The TR-398 suite uses a new testbed configuration shown below. Although originally designed to support TR-398, this configuration can also support roaming, band steering and other tests not included in the spec.
octoScope TR-398 testbed
The testbed uses octoScope's Pal partner devices, which can function as a station (STA), virtual stations (vSTAs), an access point (AP), a traffic generator, a load generator, a sniffer and an expert monitor. The Pal-24 supports up to four-stream 802.11b/g/n operation in 2.4 GHz and the Pal-5 supports up to four-stream 802.11a/n/ac operation in 5 GHz. There will also be a version of this testbed that uses octoScope's Pal-6 smartBox subsystem, which supports 802.11a/b/g/n/ac/ax testing.
My implementation of the TR-398 suite uses octoScope's octoBox software platform, automated via Python scripts. Test results are captured in CSV files and analyzed via Python scripts to test against limits specified in TR-398 and yield pass/fail results.
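As a rough illustration of that last analysis step, here's a minimal sketch of checking per-test throughput rows from a CSV against a TR-398 limit. The column names and function are hypothetical, not octoScope's actual schema:

```python
import csv
import io

def check_limits(csv_text, limit_mbps):
    """Map each test row to a pass/fail flag against a throughput limit."""
    rows = csv.DictReader(io.StringIO(csv_text))
    return {row["test"]: float(row["throughput_mbps"]) >= limit_mbps
            for row in rows}

# Example: check two (made-up) 2.4 GHz results against the
# Maximum Connection limit of 99% of 64 Mbps offered load.
results = check_limits(
    "test,throughput_mbps\n"
    "2.4 GHz Dn,82.5\n"
    "2.4 GHz Up,80.5\n",
    limit_mbps=64 * 0.99,
)
```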
octoBox software screenshot
Traffic generation is done via a customized version of iperf3 that is controlled by the octoBox software and can be run in a multipoint configuration.
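octoScope's customized multipoint iperf3 isn't public, but for reference, an equivalent single UDP stream with stock iperf3 uses flags like these. This is a sketch only; the server address and rates are placeholders, and the helper function is mine:

```python
def iperf3_udp_cmd(server, mbps, seconds):
    """Build an argv list for a stock iperf3 UDP client stream."""
    return ["iperf3",
            "-c", server,        # connect to this iperf3 server
            "-u",                # UDP instead of TCP
            "-b", f"{mbps}M",    # target bitrate
            "-t", str(seconds),  # stream duration in seconds
            "--json"]            # machine-readable results

# e.g. one 2.4 GHz Maximum Connection stream: 2 Mbps UDP for 120 s
cmd = iperf3_udp_cmd("192.168.1.2", 2, 120)
```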
Now let's look at a few of the tests and test results from three products:
- Linksys LAPAC1200 - two-stream AC1200 Wi-Fi 5 access point
- NETGEAR R7800 Nighthawk X4S - four-stream AC2600 Wi-Fi 5 router
- NETGEAR RAX80 Nighthawk AX8 - four-stream AX6000 Wi-Fi 6 router
Given the nature of these tests, I didn't expect to see much difference between the two NETGEAR routers. Draft 11ax doesn't really bring anything to the party over a four-stream 11ac router when used with 11ac STAs. But it will be interesting to see if these tests reveal any advantages of the more expensive four-stream routers over the two-stream Linksys.
Instead of the usual rate vs. range tests, I'm going to look at some of the TR-398 benchmarks that test things I haven't tested in the past.
I've been on a quest to come up with test methods that can show how much of a Wi-Fi load a router, AP or Wi-Fi mesh system can handle. Since top-of-line routers now sell for $600, it would really be nice to show that those products deliver measurable value over less expensive alternatives. While that goal remains a work in progress, TR-398 has taken some steps in that direction.
The Maximum Connection test (6.2.1) places a 32-device (STA) load on the router/AP under test (DUT), with each two-stream device running a 2 Mbps UDP stream for the 2.4 GHz test and an 8 Mbps stream for 5 GHz, each for two minutes. Each STA must have less than 1% packet loss, and total throughput for the 32 STAs must be at least 99% of 64 Mbps for 2.4 GHz and at least 99% of 256 Mbps for 5 GHz. The test is run under best-case path-loss conditions, i.e. high signal level, in both downlink and uplink directions.
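The pass criteria are simple enough to sketch in a few lines of Python. This is my own reading of the 6.2.1 limits, not octoScope's or the BBF's code; the function and parameter names are mine:

```python
def max_connection_pass(per_sta_loss_pct, total_mbps, per_sta_mbps=2, n_sta=32):
    """Pass/fail per my reading of TR-398 6.2.1: every STA under 1% packet
    loss, and aggregate throughput at least 99% of the offered load
    (32 x 2 Mbps = 64 Mbps in 2.4 GHz; 32 x 8 Mbps = 256 Mbps in 5 GHz)."""
    offered_mbps = n_sta * per_sta_mbps
    loss_ok = all(loss < 1.0 for loss in per_sta_loss_pct)
    tput_ok = total_mbps >= offered_mbps * 0.99
    return loss_ok and tput_ok

# e.g. the 2.4 GHz downlink result measured for the R7800 below:
# 80.2 Mbps total with no loss clears the 63.36 Mbps bar
passed = max_connection_pass([0.0] * 32, 80.2)
```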
I use the octoScope Pal's vSTA (virtual STA) capability for this test to produce the 32 STAs. This is easier than cramming 32 real devices into a box and much easier than controlling each device's association/disassociation and reading back results from 32 iperf3 endpoints.
Maximum Connection Test
I've been using a similar approach in my load-testing quest, experimenting with different traffic rates and ramping the number of connections to find the break point. Based on those results, I'd say BBF has placed the bar relatively low for this test, at least for current-generation 11ac four-stream routers.
Table 1 shows the results from this test. All three products pass all tests. So as it's currently designed, this test doesn't look like it will be useful to assess router/AP capacity.
| Throughput | Linksys LAPAC1200 | NETGEAR R7800 | NETGEAR RAX80 |
|---|---|---|---|
| 2.4 GHz Dn | Pass [82.5 Mbps] | Pass [80.2 Mbps] | Pass [82.2 Mbps] |
| 2.4 GHz Up | Pass [80.5 Mbps] | Pass [82.2 Mbps] | Pass [80.5 Mbps] |
| 5 GHz Dn | Pass [271.7 Mbps] | Pass [272.8 Mbps] | Pass [272.1 Mbps] |
| 5 GHz Up | Pass [272 Mbps] | Pass [270.8 Mbps] | Pass [270.4 Mbps] |

| Packet Loss (%) | Linksys LAPAC1200 | NETGEAR R7800 | NETGEAR RAX80 |
|---|---|---|---|
| 2.4 GHz Dn | Pass [No loss] | Pass [No loss] | Pass [No loss] |
| 2.4 GHz Up | Pass [No loss] | Pass [No loss] | Pass [No loss] |
| 5 GHz Dn | Pass [No loss] | Pass [No loss] | Pass [No loss] |
| 5 GHz Up | Pass [No loss] | Pass [No loss] | Pass [No loss] |