For over a decade, Verizon has traded on the unparalleled coverage and speed that its network provides. Thanks to dense tower infrastructure and a wealth of low-band spectrum, Verizon’s coverage was famously reliable. But in recent years, other carriers — and most notably T-Mobile — have been challenging Verizon for first place, and in some cases beating Verizon out.

There are two main types of network testing that carriers point to when claiming their networks are “best.” Road-testing involves driving to thousands of predetermined locations with phones from every network and running speed tests side by side, while crowd-sourced testing aggregates millions of speed tests from real-world users and averages the results.

If you look back over the last two years of results, there’s a clear trend: crowd-sourced testing shows T-Mobile to be the fastest network, while road-testing consistently puts Verizon in first place (and AT&T ahead of T-Mobile, for what it’s worth). So that’s why a new crowd-sourced study, which shows Verizon to be in first place and T-Mobile languishing in third, is such an outlier.

The data comes from Wirefly’s Speed Test, an HTML5-based online speed test. The blog post announcing the results cites “thousands” of unique mobile devices conducting thousands of speed tests during Q4 2017 and Q1 2018, which is substantially fewer than the millions of tests that Ookla and OpenSignal’s crowd-sourced reports normally draw on.

The results from this test are so different from the crowd-sourced results from Ookla and OpenSignal that there’s clearly something significantly different in the methodology or analysis. The most obvious culprit is sample size: OpenSignal and Ookla’s methodology works only because they have so many data points that any outliers become statistically insignificant. That’s not the case with fewer data points, yet the results don’t look random, either. The average speeds and rankings closely mirror the nationwide road-tested results from RootMetrics, which makes it less likely that this is simply a “bad” study.
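The sample-size point can be illustrated with a quick simulation (the numbers below are hypothetical, not drawn from any carrier study): a handful of unusually fast readings can noticeably swing the average of a few thousand tests, while the same share of outliers barely moves an average built from a million.

```python
import random

def mean_speed(n_tests, seed):
    """Simulate n_tests download-speed readings (Mbps), where roughly
    1 in 50 is an extreme outlier, and return their average."""
    rng = random.Random(seed)
    speeds = []
    for _ in range(n_tests):
        if rng.random() < 0.02:
            speeds.append(500.0)              # rare outlier: a 500 Mbps reading
        else:
            speeds.append(rng.gauss(30.0, 8.0))  # typical reading near 30 Mbps
    return sum(speeds) / len(speeds)

# Small samples scatter noticeably from run to run; large samples all
# land close to the true expected value (0.98*30 + 0.02*500 = 39.4 Mbps).
small = [mean_speed(2_000, seed) for seed in range(5)]
large = [mean_speed(1_000_000, seed) for seed in range(2)]
print("small-sample averages:", [round(m, 1) for m in small])
print("large-sample averages:", [round(m, 1) for m in large])
```

With a few thousand tests the averages can differ by several Mbps between runs, which is enough to reshuffle a close carrier ranking; with a million tests the run-to-run spread all but vanishes.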

Without access to all the underlying data sets (which are proprietary and not public), there’s no real way to say which studies are good and which are bad. But more than anything, it goes to show that network testing is as much art as science, and there’s no ultimate right or wrong answer — or best network.
