I belong to that rare breed of human that enjoys commercials. As a social scientist with an interest in the impact of advertising on consumer behavior, I often find myself, possibly to the chagrin of my wife (though she has not complained), assessing commercials out loud. Are they informative? Are they persuasive, or do they simply attempt to draw attention to the good being advertised? Might they unintentionally lead to brand confusion? Most importantly, are they funny?
Thus, having also spent some time among wireless regulators, I cannot help but comment on the recent spate of wireless attack ads perpetrated by three of the U.S. nationwide mobile wireless providers. The initial culprit this time around was Verizon Wireless, which determined that balls were a good way to represent relative mobile wireless performance among the nationwide competitors. Shortly thereafter, Sprint aired a commercial using bigger balls, while T-Mobile brought in Steve Harvey to demand that Verizon #Ballagize.
There are myriad takeaways from these commercials. First, at least on the face of it, the nationwide mobile wireless providers appear to be fiercely competitive with one another. It would be interesting to compare advertising-to-sales ratios for this industry with those of other U.S. industries, though at the time of writing this blog post I did not have access to such data (Ad Age appears to be a convenient source). Moreover, the content of the commercials suggests that although price continues to be an important factor (Sprint did not veer away from its “half-off” theme in its ball commercial), quality competition that allows competitors to differentiate their product (and, in doing so, justify higher prices) remains paramount.
Unfortunately, as a consumer, it is difficult for me to properly assess what these commercials say about wireless quality. There are several issues at play here.
- The relative comparisons are vague: When Sprint says that it delivers faster download speeds than the other nationwide providers, what does that mean? When I zoom in on the aforementioned Sprint commercial at the 10-second mark, the bottom of the screen shows, “Claim based on Sprint’s analysis of average LTE download speeds using Nielsen NMP data (Oct. thru Dec. 2015). NMP data captures real consumer usage and performance for downloads of all file sizes greater than 150kb. Actual speeds may vary by location and device capability.” As a consumer who spends most of his time in East Lansing, MI, I am not particularly well informed by a nationwide average (the sketch after this list illustrates the problem). Moreover, I know nothing about the statistical validity of the data (though here I am willing to give Nielsen the benefit of the doubt). Finally, when Sprint states that it delivers faster download speeds, I would like to know how much faster they are, in absolute terms, than those of the next-fastest competitor.
- The small print is too small: Verizon took flak from its competitors for using outdated data in its commercial. This is a valid criticism. Verizon’s small print (at the 13-second mark of its commercial) states that the RootMetrics data are from the first half of 2015. But unless I am actually analyzing these commercials, as I am here, and viewing them side by side, it is difficult for me to make the comparison.
- The mobile wireless providers constantly question one another’s credibility, which makes me less willing to believe that any of them are credible. Ricky Gervais explains this much better than I do: Ricky Gervais on speed, coverage, and network comparisons.
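To make the first bullet concrete, here is a minimal sketch of how a carrier can win on the nationwide average cited in the fine print while losing in a particular market. Every number, market, and weight below is invented purely for illustration.

```python
# Hypothetical average LTE download speeds (Mbps) in three markets, and the
# share of observations coming from each market. All values are made up.
markets = ["Market 1", "Market 2", "East Lansing"]
weights = [0.6, 0.3, 0.1]              # share of observations per market

carrier_a = [25.0, 20.0, 8.0]          # fast in the big markets, slow locally
carrier_b = [22.0, 19.0, 16.0]         # slower nationally, faster locally

avg_a = sum(w * s for w, s in zip(weights, carrier_a))
avg_b = sum(w * s for w, s in zip(weights, carrier_b))

print(f"Carrier A nationwide average: {avg_a:.1f} Mbps")   # 21.8 Mbps
print(f"Carrier B nationwide average: {avg_b:.1f} Mbps")   # 20.5 Mbps
print(f"East Lansing: A = {carrier_a[2]:.0f} Mbps, B = {carrier_b[2]:.0f} Mbps")
```

In this made-up example, Carrier A can truthfully advertise the faster nationwide average, yet a consumer in East Lansing would be twice as well served by Carrier B; without local, absolute numbers, the headline claim tells me very little.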
Alas, how is a consumer supposed to assess wireless providers? An obvious source is Consumer Reports, but my sense, without paying for a subscription, is that its ratings largely depend on expert reviews rather than data analysis (someone correct me if I am wrong). Another source, if one is not in the habit of paying for information about rival firms, is the FCC. The FCC’s Wireless Telecommunications Bureau publishes an “Annual Report and Analysis of Competitive Market Conditions with Respect to Mobile Wireless.” The most recent, the Eighteenth Report, contains a lengthy section on industry metrics with a focus on coverage (see Section III) as well as a section on service quality (see Section VI.C). The latter focuses on nationwide average speeds according to FCC Speed Test data as well as on data from the private sources Ookla, RootMetrics (yes, the one mentioned in those commercials), and CalSPEED (for California only). If you are interested, be sure to check out the Appendix, which has a wealth of additional data. For those who don’t want to read through a massive PDF file, there is also a set of Quick Facts containing some of the aforementioned data.
However, what I think is lacking is speed data at a granular level. When analyzing transactions or assessing competition, the FCC does so at a level far more granular than the state, and rightly so: consumers do not generally make purchasing decisions across an entire state, let alone the nation as a whole. This is because service where consumers are likely to spend the majority of their time is the major concern when deciding on wireless quality. In a previous blog post I mentioned that the FCC releases granular fixed broadband data, but unfortunately, as far as I am aware, this is still not the case for wireless, particularly with regard to individual carrier speed data.
The FCC Speed Test App provides the FCC with such data. The Android version, which I have on my phone, provides nifty statistics about download and upload speed as well as latency and packet loss, with the option to parse the data according to mobile or WiFi. My mobile-only data for the past month showed an average download speed above 30 Mbps. Go Verizon! My WiFi average was more than double that. Go SpartanNet! Yet my observations do not allow me to compare data across providers in East Lansing, and my current contract happens to expire in a couple of weeks. The problem is that in a place like East Lansing, and particularly in more rural areas of the United States, not enough people have downloaded the FCC Speed Test App, and I doubt that the FCC would be willing to report firm-level data at a level deemed to lack statistical validity.
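For the curious, here is a minimal sketch of the kind of aggregation I have in mind when I parse the data by connection type. The file name and column names are my own assumptions for illustration; they are not the app’s actual export format.

```python
import csv
from collections import defaultdict
from statistics import mean

# Hypothetical per-test export; the file name and the columns
# (connection_type, download_mbps) are assumptions made for this sketch,
# not the FCC Speed Test App's actual format.
speeds_by_type = defaultdict(list)
with open("speedtest_results.csv", newline="") as f:
    for row in csv.DictReader(f):
        speeds_by_type[row["connection_type"]].append(float(row["download_mbps"]))

for conn_type, speeds in speeds_by_type.items():
    print(f"{conn_type}: {len(speeds)} tests, "
          f"average download {mean(speeds):.1f} Mbps")
```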
For all I know, the entire East Lansing sample consists of my roughly twice-daily automatic tests, which, aggregated over a quarter of a year, amount to fewer than 200 observations for Verizon Wireless. Whether this is a statistically adequate sample depends on the dispersion of the speed observations for a non-parametric measure such as the median speed, and on the assumed distribution for mean speeds. I encourage people to try this app out. The more people who download it, the more likely it is that the FCC will have enough data to be comfortable reporting it at a level that makes it reliable as a decision-making tool. Perhaps then the FCC will also redesign the app to report competitor speeds for the relevant geographic area.
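To give a flavor of the sample-size question, here is a minimal sketch of a bootstrap confidence interval for the median download speed based on roughly 200 observations. The data are simulated; the lognormal shape and its parameters are my assumptions, not anything derived from FCC or Nielsen data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate ~200 download-speed observations (Mbps). The lognormal shape and
# its parameters are assumptions chosen only to mimic right-skewed speed data.
speeds = rng.lognormal(mean=np.log(30), sigma=0.5, size=200)

# Bootstrap the median: resample with replacement and see how much the
# median moves across resamples. The wider the interval, the less
# informative a sample this small is as a decision-making tool.
boot_medians = np.array([
    np.median(rng.choice(speeds, size=speeds.size, replace=True))
    for _ in range(5000)
])
low, high = np.percentile(boot_medians, [2.5, 97.5])
print(f"Sample median: {np.median(speeds):.1f} Mbps, "
      f"95% bootstrap CI: [{low:.1f}, {high:.1f}] Mbps")
```

The width of the resulting interval is one rough way of judging whether a sample of this size is informative enough to be reported at the firm level.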
Alex, from my perspective, this commercial really works. It is understandable to non-regulators and non-geeks. And it generated counter-ads from the other carriers. Hard to ask for more. See: https://www.youtube.com/watch?v=WkN12ItMdSM&ebc=ANyPxKqRVWj—tDzntTnr1mVAolOfF28HWNhyM0GPnWdPzsFDeGRlkt9BVFlHSThB7LmUO22xxC8-RsTRAcrcVtQcn0JvLaDw
It is a great idea to use crowdsourcing to measure download speeds more accurately. It is in the interest of the major online app markets to promote such an app: it does not conflict with their own business interests, and their reach is wider than that of any particular wireless carrier.
On the subject of the ads, I too believe that they affect the viewing experience. This was very clear to me when I was streaming one of the recent presidential debates to the living room TV. The reception was flawless, and you had the impression that you were watching cable, until it was time for the commercial breaks. All you would see was a slide stating that ads were blocked from this stream. This kept me wondering about the content of those ads, and whether all viewers were gathering the same impressions from the debate itself. The same applies to recorded content viewed elsewhere, and at different times, on streamed media.
Bill, I couldn’t agree more. The ad was so effective that I wrote a lengthy, un#ballogetic blog post in response.
Another reader who preferred to remain anonymous allowed me to share the following, highly insightful comment:
“Thanks, Aleks. My 2 cents: all speed test data and apps are inherently worthless as a decision-making tool for consumers, even if they paid attention to, or were even aware of the existence of, the FCC. Even granular, localized data with cross-carrier comparisons in a given local market is worthless, because each individual consumer’s experience will depend on where they live and work, their travel/driving habits, how and when they use their device, how and when competing users are using their devices and the impact of unforeseen events on usage (the congestion issue), etc. And even if the data gives you a snapshot at a particular time, speeds can change rapidly over time due to network upgrades and other factors such as those noted in the previous sentence. The consumer’s experience of network speed is, to me, a classically Hayekian process – it is information that inherently depends on the particular time and place. And collecting and processing the granular information required to make the speed tests more useful would be prohibitively expensive (the Hayekian critique of central planning), and in any event, the tests depend on voluntary participation that is not likely to be forthcoming apart from a few outliers like you.
“As far as I know, carriers still give new customers a trial period (perhaps up to 2 weeks), during which they can try the service and, if found wanting, return the device and unsubscribe without penalty. That is how the market handles this problem. The only purpose of the advertisements is to entice consumers to give the service a try.”
I can’t help but agree to some extent with the first point. In spite of the very rapid download speeds that Verizon Wireless has been providing me, my phone is practically useless in my driveway, especially when I need to use Google Maps, which is precisely what I want to use it for there. Nevertheless, I suspect that, on average, such data is more informative to consumers than the wireless ads (which, I fully acknowledge, are there to attract customers), though getting such data to consumers consistently is potentially cost-prohibitive.
I was particularly intrigued by the point regarding the trial period, so much so that I called a local store of each of the four nationwide providers to see what their offers were. I learned that each provider offers a 14-day trial period, but that AT&T, Sprint, and Verizon Wireless charge a $35 restocking fee for equipment, whereas T-Mobile charges a $50 restocking fee. As an economist, I prefer a market solution when one is available, and this is indeed a market solution. But if crowdsourcing could cost-effectively reduce consumer search costs, I am all for that as well.