Working out which wireless network is best is, as we’ve covered here extensively before, not a simple thing to do. There are a number of different methodologies for comparing cell networks on speed and coverage, and even within the same broad technique, tiny methodological differences can have a startling impact on the results.
As you’d expect, each carrier tends to promote whichever results flatter it most. T-Mobile has long touted crowd-sourced speed tests from OpenSignal and Ookla, two firms that aggregate millions of tests run by users across the country. The crowd-sourced results are consistently the kindest to T-Mobile, so it’s no surprise that the Un-carrier focuses on them rather than certain others.
With the mid-year reports already in from all the big-name companies, T-Mobile’s Chief Technology Officer Neville Ray has taken the time to write a blog post explaining the differences between the tests, and why his pet results are in fact the best. Unfortunately, Ray’s arguments don’t quite add up.
Things start out well, with an explanation of the three main types of reports that are commonly seen:
- Crowdsourcing: Crowdsourced data measures customers’ real-world network experiences based on millions, and sometimes billions, of data points collected anywhere and everywhere people go, indoors and out. The network performance data is gathered via both consumer-initiated tests and tests running in the background of apps, even when people aren’t actually using their phones. While some allege consumer-initiated tests could introduce user bias, tests are gathered only when consumers are actually using their phones.
- Surveys: These are great for measuring wireless customers’ perceptions, but not actual network performance. And perceptions may be influenced by marketing and other external factors beyond performance.
- Drive testing: Drive tests use paid consultants who create software scripts to run a broad range of tests and travel across the country, but these tests are typically limited to one or a handful of phones vs the multitude of makes and models people use. Drive tests run only a few times a year in each area and measure experience primarily on roadways and public spaces and don’t reach wherever people happen to be – at home, at work or on the go.
This is where things go off the rails a little:
So what type of network report data do consumers trust the most?
When picking a burger joint, 88% of consumers surveyed say they’d trust the experience of millions of everyday paying customers who’ve eaten there more than a handful of paid food critics. So, why should wireless networks be any different? They shouldn’t.
The analogy is forced at best (nationwide wireless networks are objectively very different from a fast-food chain), but it also just doesn’t make sense. “Trusting the experience” sounds an awful lot like relying on word-of-mouth perceptions of how good people think the burger chain is, which, in our analogy, means surveys rather than crowd-sourced testing. Those surveys, by the way, do favor Verizon quite heavily.
Ray himself cautions in the blog post that surveys may not be reliable because “perceptions may be influenced by marketing and other external factors beyond performance.” This is entirely correct: designing good surveys is a discipline with its own academic field, and one thing everyone in that field agrees on is that the questions, their presentation, and their order can have a huge impact on the data collected.
Which makes it notable that Ray’s post quotes a survey conducted by T-Mobile itself on people’s attitudes towards burger joints and wireless networks. A T-Mobile representative told BGR that the survey involved asking more than 700 mobile users the following questions:
1. When selecting a burger restaurant, which would you trust more?
- The experiences of 12 million people who’ve paid to eat at the restaurant
- A few paid food critics
2. When selecting a new wireless phone service, which would you trust more?
- A rating based on billions of paying customer experiences
- A rating based on a few paid testers
Obviously, you don’t have to be a pollster to realize that those questions are worded in a leading way, which renders the survey data virtually meaningless. It’s good for providing a soundbite within the context of the blog post, but calling it “powerful evidence that consumers believe in the power of crowdsourced data” is misleading, at best.
Ray also said that “because all these reports use different approaches and have varying results, the whole thing has become a confusing mess for consumers. ‘Most reliable this.’ ‘Best that.’ ‘Fastest this.’ ‘1% of that.’ Consumers simply want to know if a network works where they live, work and play.” He’s entirely right, but it’s a little much coming from T-Mobile. Earlier this year, the National Advertising Division, the voluntary ad industry watchdog, recommended that T-Mobile discontinue a series of ads which (surprise, surprise!) claimed that T-Mobile had the “best unlimited network,” based on crowd-sourced data that the NAD said didn’t give the whole picture.
None of this is meant to discredit crowd-sourced network testing, or T-Mobile’s network for that matter. Sticking up for the results that show your own company’s network in the best possible light is only natural, and you can bet that Verizon puts out a celebratory press release every time RootMetrics or JD Power shows its network to be the best. It’s even useful for T-Mobile to explain the different kinds of network testing and what each one shows; just remember who’s doing the explaining here.