Work Begun on James H. Quello Archive

We have just begun work on a digital archive of James H. Quello’s speeches, articles, and statements, dating from 21 January 1974, the date of his Senate Confirmation Hearing. My thanks to the MSU Library for helping the Quello Center with this project; from today we will begin searching for funding to support this archiving effort.

James H. Quello

The core material will be Commissioner Quello’s written speeches, articles and statements, but we will be adding biographical materials, photos, and video material. This should be a valuable source for anyone seriously interested in the history of regulation and policy in the communication sector in the USA.

Our thanks to the MSU Library and to Sarah Roberts with the MSU Archives & Historical Collections.



The Un#ballogetic World of Wireless Ads

I belong to that rare breed of human that enjoys commercials. As a social scientist with an interest in the impact of advertising on consumer behavior, I often find myself, possibly to the chagrin of my wife (though she has not complained), assessing commercials out loud. Are they informative? Are they persuasive, or do they simply attempt to draw attention to the good being advertised? Might they unintentionally lead to brand confusion? Most importantly, are they funny?

Thus, having also spent some time among wireless regulators, I cannot help but comment on the recent spate of wireless attack ads perpetrated by three of the U.S. nationwide mobile wireless providers. The initial culprit this time around was Verizon Wireless, which determined that balls were a good way to represent relative mobile wireless performance among the nationwide competitors. Shortly thereafter, Sprint aired a commercial using bigger balls, while T-Mobile brought in Steve Harvey to demand that Verizon #Ballagize.

There are myriad takeaways from these commercials. First, at least on the face of it, the nationwide mobile wireless providers appear to be fiercely competitive with one another. It would be interesting to look at advertising-to-sales ratios for this industry relative to those of other industries in the U.S., though at the time of writing I did not have access to such data (Ad Age appears to be a convenient source). Moreover, the content of the commercials suggests that although price continues to be an important factor (Sprint did not veer away from its “half-off” theme in its ball commercial), quality competition that allows competitors to differentiate their product (and in doing so, justify higher prices) remains paramount.

Unfortunately, as a consumer, it is difficult for me to properly assess what these commercials say about wireless quality.  There are a number of points at play here.

  1. The relative comparisons are vague: When Sprint says that it delivers faster download speeds than the other nationwide providers, what does that mean?  When I zoom into the aforementioned Sprint commercial at the 10 second mark, the bottom of the screen shows, “Claim based on Sprint’s analysis of average LTE download speeds using Nielsen NMP data (Oct. thru Dec. 2015).  NMP data captures real consumer usage and performance for downloads of all file sizes greater than 150kb.  Actual speeds may vary by location and device capability.”  As a consumer who spends most of his time in East Lansing, MI, I am not particularly well informed by a nationwide average (see the sketch after this list).  Moreover, I know nothing about the statistical validity of the data (though here I am willing to give Nielsen the benefit of the doubt).  Finally, when Sprint states that it delivers faster download speeds, I would like to know how much faster they are (in absolute terms) relative to the next fastest competitor.
  2. The small print is too small: Verizon took flak from its competitors for using outdated data in its commercial.  This is a valid claim.  Verizon’s small print (13 second mark in its commercial) states that RootMetrics data is based on the 1st half of 2015.  But unless I am actually analyzing these commercials as I am here, and viewing them side by side, it is difficult for me to make the comparison.
  3. The mobile wireless providers constantly question one another’s credibility, and this is likely to make me less willing to believe that they are indeed credible. Ricky Gervais explains this much better than I do: Ricky Gervais on speed, coverage, and network comparisons.
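To make the first point concrete, here is a minimal sketch, in Python, of how a single nationwide average can mask exactly the local variation a consumer cares about. The cities, speeds, and record format are made up for illustration; this is not Nielsen's actual data:

```python
from collections import defaultdict

# Hypothetical speed-test records: (location, download speed in Mbps).
records = [
    ("East Lansing, MI", 8.2), ("East Lansing, MI", 9.1),
    ("New York, NY", 42.0), ("New York, NY", 38.5),
    ("Rural KS", 3.1), ("Rural KS", 2.8),
]

# The headline figure an ad might cite: one number for the whole country.
nationwide = sum(speed for _, speed in records) / len(records)
print(f"Nationwide average: {nationwide:.1f} Mbps")  # ~17.3 Mbps

# The figure a consumer in a given city actually cares about.
by_city = defaultdict(list)
for city, speed in records:
    by_city[city].append(speed)
for city, speeds in by_city.items():
    print(f"{city}: {sum(speeds)/len(speeds):.1f} Mbps")
```

In this toy sample the nationwide average (about 17 Mbps) describes no actual market: New Yorkers see roughly 40 Mbps and East Lansing residents under 10.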

Alas, how is a consumer supposed to assess wireless providers?  An obvious source is Consumer Reports, but my sense, without paying for a subscription, is that its ratings largely depend on expert reviews and not necessarily data analysis (someone correct me if I am wrong).  Another source, if one is not in the habit of paying for information about rival firms, is the FCC.  The FCC’s Wireless Telecommunications Bureau publishes an “Annual Report and Analysis of Competitive Market Conditions with Respect to Mobile Wireless.”  The most recent, the Eighteenth Report, contains a lengthy section on industry metrics with a focus on coverage (see Section III) as well as a section on service quality (see Section VI.C).  The latter section focuses on nationwide average speed according to FCC Speed Test data as well as on data from private sources Ookla, RootMetrics (yes, the one mentioned in those commercials), and CalSPEED (for California only).  If you are interested, be sure to check out the Appendix, which has a wealth of additional data.  For those who don’t want to read through a massive pdf file, there is also a set of Quick Facts containing some of the aforementioned data.

However, what I think is lacking is speed data at a granular level.  When analyzing transactions or assessing competition, the FCC does so at a level that is far more granular than the state, and rightly so, as consumers do not generally make purchasing decisions across an entire state, let alone the nation as a whole.  Rather, service where consumers are likely to be present for the majority of their time is a major concern when deciding on wireless quality.  In a previous blog post I mentioned that the FCC releases granular fixed broadband data, but unfortunately, as far as I am aware, this is still not the case for wireless, particularly with regard to individual carrier speed data.

The FCC Speed Test App provides the FCC with such data.  The Android version, which I have on my phone, provides nifty statistics about download and upload speed as well as latency and packet loss, with the option to parse the data according to mobile or WiFi.  My mobile-only data for the past month showed a download speed above 30 Mbps.  Go Verizon!  My WiFi average was more than double that.  Go SpartenNet!  Yet, my observations do not allow me to compare data across providers in East Lansing, and my current contract happens to expire in a couple of weeks.  The problem is that in a place like East Lansing, and particularly in more rural areas of the United States, not enough people have downloaded the FCC Speed Test App, and I doubt that the FCC would be willing to report firm-level data at a level deemed not to have statistical validity.

For all I know, the entire East Lansing sample consists of my roughly twice-daily automatic tests, which, aggregated over a quarter of a year, amount to fewer than 200 observations for Verizon Wireless.  Whether this is a statistically adequate sample depends on the dispersion in speed observations for a non-parametric measure such as the median speed, and on the assumed distribution for mean speeds.  I encourage people to try this app out.  The more people who download it, the more likely it is that the FCC will have sufficient data to be comfortable reporting it at a level that makes it reliable as a decision-making tool.  Perhaps then the FCC will also redesign the app to report competitor speeds for the relevant geographic area.
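For readers curious what such an assessment might look like, here is a minimal sketch that bootstraps a 95% confidence interval around the median of a 200-observation sample. The speeds are simulated (a log-normal shape is assumed), not actual FCC Speed Test data:

```python
import random
import statistics

random.seed(0)

# Simulated download speeds (Mbps) for ~200 automatic tests.
# The log-normal distribution is an assumption for illustration only;
# its median is roughly e^3.4, i.e., about 30 Mbps.
sample = [random.lognormvariate(3.4, 0.4) for _ in range(200)]

# Nonparametric bootstrap: resample with replacement, collect medians.
boot_medians = []
for _ in range(5000):
    resample = random.choices(sample, k=len(sample))
    boot_medians.append(statistics.median(resample))

boot_medians.sort()
lo, hi = boot_medians[int(0.025 * 5000)], boot_medians[int(0.975 * 5000)]
print(f"Median: {statistics.median(sample):.1f} Mbps, "
      f"95% CI: [{lo:.1f}, {hi:.1f}] Mbps")
```

The wider that interval, the less useful a reported median would be as a decision-making tool, which is presumably why the FCC hesitates to publish thinly-sampled, firm-level figures.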



What is ‘Special Access’ and Why is It So Important? by Aleks Yankelevich

Dr Aleks Yankelevich gave a one-hour Quello Center brown-bag presentation entitled “Regulating the Intranet: What is Special Access and Why is it Important?” (yes, Intranet, not Internet) on January 26th, 2016. His talk clarified the concept of special access and how it is regulated by the Federal Communications Commission, and ended with some ideas for research on this relatively under-researched area.

Aleksandr Yankelevich – Regulating the Internet – What is Special Access And Why Is It Important from Quello Center on Vimeo.

Special access lines are dedicated high-capacity connections used by businesses and institutions to transmit their voice and data traffic. These connections are used by businesses to facilitate intranet communication, by wireless providers to funnel cell phone traffic between towers, and by banks to connect to their ATMs. When the costs of special access services increase, these costs are passed on by businesses to consumers. Because many parts of the United States face limited competition in the provision of special access, these services are highly regulated. In this brown-bag seminar, Aleks will discuss the significance of the special access market, why regulation of the intranet is relatively under-studied, and briefly explain a number of FCC related proceedings with respect to special access as well as his ongoing and potential research on the topic.



Delivering Pizza Without Offering Pizza Delivery

Having appreciated my colleague Aleks Yankelevich’s creative use of a “food” metaphor to explain an important aspect of economic analysis, I thought it fitting, on the day of oral arguments in the legal challenge to the FCC’s Open Internet Order, to consider another effective use of such a metaphor: Supreme Court Justice Antonin Scalia’s dissent in the Brand X case. Whereas the majority opinion in that case deferred to an earlier FCC ruling that Internet access was an “information” rather than a “telecommunications” service, Scalia–joined by two liberal justices, Ruth Bader Ginsburg and David Souter–argued that the majority’s view was akin to accepting a claim by the owner of a pizzeria that it delivered pizza, but didn’t “offer pizza delivery service.”

Below are some excerpts from Scalia’s dissent that I find most significant in terms of how the DC Circuit (and perhaps later, the Supreme Court) should and will rule in the latest challenge to the FCC’s Open Internet Order, which is the first in which the Commission has treated Internet access as a Title II “telecommunications” service rather than an “information” service.

The first sentence of the FCC ruling under review reads as follows: “Cable modem service provides high-speed access to the Internet, as well as many applications or functions that can be used with that access, over cable system facilities”…Does this mean that cable companies “offer” high-speed access to the Internet?  Surprisingly not, if the Commission and the Court are to be believed.

It happens that cable-modem service is popular precisely because of the high-speed access it provides, and that, once connected with the Internet, cable-modem subscribers often use Internet applications and functions from providers other than the cable company. Nevertheless, for purposes of classifying what the cable company does, the Commission (with the Court’s approval) puts all the emphasis on the rest of the package (the additional “applications or functions”). It does so by claiming that the cable company does not “offe[r]” its customers high-speed Internet access because it offers that access only in conjunction with particular applications and functions, rather than “separate[ly],” as a “stand-alone offering…”

There are instances in which it is ridiculous to deny that one part of a joint offering is being offered merely because it is not offered on a “stand-alone” basis…If, for example, I call up a pizzeria and ask whether they offer delivery, both common sense and common “usage”…would prevent them from answering: “No, we do not offer delivery–but if you order a pizza from us, we’ll bake it for you and then bring it to your house.” The logical response to this would be something on the order of, “so, you do offer delivery.” But our pizza-man may continue to deny the obvious and explain, paraphrasing the FCC and the Court: “No, even though we bring the pizza to your house, we are not actually “offering” you delivery, because the delivery that we provide to our end users is ‘part and parcel’ of our pizzeria-pizza-at-home service and is ‘integral to its other capabilities.’”… Any reasonable customer would conclude at that point that his interlocutor was either crazy or following some too-clever-by-half legal advice.

In short, for the inputs of a finished service to qualify as the objects of an “offer” (as that term is reasonably understood), it is perhaps a sufficient, but surely not a necessary, condition that the seller offer separately “each discrete input that is necessary to providing . . . a finished service…”

Shifting his analogy from pizza to puppies, Justice Scalia adds:

The pet store may have a policy of selling puppies only with leashes, but any customer will say that it does offer puppies because a leashed puppy is still a puppy, even though it is not offered on a “stand-alone” basis.

Despite the Court’s mighty labors to prove otherwise, …the telecommunications component of cable-modem service retains such ample independent identity that it must be regarded as being on offer–especially when seen from the perspective of the consumer or the end user, which the Court purports to find determinative.

Since the majority opinion in Brand X was based primarily on the doctrine of “administrative deference” derived from the 1984 Supreme Court case Chevron U.S.A., Inc. v. Natural Resources Defense Council, Inc., one would hope and expect that the DC Circuit Court judges hearing today’s oral arguments would remember what Justice Thomas wrote in that majority opinion: “If a statute is ambiguous, and if the implementing agency’s construction is reasonable, Chevron requires a federal court to accept the agency’s construction of the statute, even if the agency’s reading differs from what the court believes is the best statutory interpretation.”

When the majority’s Chevron-based deference is coupled with Justice Scalia’s simple but clear and commonsensical analogies to pizza and puppies, it’s hard for me to imagine a strong legal basis for the Circuit Court (or the Supreme Court, if it ends up ruling on the case) to rule against the FCC’s Title II-based Open Internet Order. Perhaps today’s oral arguments will provide some additional clues as to whether I’m right or wrong about that. (Update: downloadable audio of the oral arguments is here (wireline) and here (wireless, First Amendment, forbearance). h/t @haroldfeld, whose initial response to today’s arguments is here.)



Aleks Yankelevich’s First Blog Post (Chipotle, Market Definition, and Digital Inequality)

Growing up, my parents, brother, and I usually avoided restaurants. For my parents, this was initially out of necessity; as Soviet refugees, they did not have the financial means to eat out. However, even having achieved a modicum of success, my parents are not generally in the habit of frequenting restaurants, having perhaps out of a lifetime habit, developed a taste for home cooking. Restaurants are exclusively for special occasions.

Thus, having never eaten at a Chipotle Mexican Grill, they were sufficiently impressed by the restaurant’s façade to wish to eat there, but only when a grand occasion merited such an extravagant excursion. Their two sons were informed as such. Naturally, my brother and I (perhaps spoiled as we are) jumped at the chance to poke fun at our parents for placing Chipotle on a pedestal. This is, after all, a restaurant chain that has been the victim of some serious defecation humor, not Eleven Madison Park.

For a number of months, my parents were subjected to text messages and Facebook or Instagram posts with visuals of me or my brother outside various Chipotle restaurants, posing next to Chipotle ads, and in one instance, wearing a Chipotle t-shirt (I have no idea how that shirt found its way into my wardrobe). My parents responded, saying things like (and I could not make this up), “I wish someone would take us to that dream place.”

However, recently, my mother sent a group text directing the family to a news report about dozens of confirmed E. coli cases related to Chipotle (even the FDA got involved) and asking for alternative dining suggestions. The text responses, in order, were as follows:

Me: California Tortilla
My Wife: Taco Bell
My Brother: Sushi
My Mother: Eating In (with picture of latest home cooked meal)
My Brother’s Girlfriend: Bacon

How does a reasonable individual interpret this chain of responses? As an economist with some regulatory and antitrust experience, I found the answer obvious. I sent the following group text (modified for concision): “Has anyone noticed that this text conversation has turned into the classic antitrust debate about appropriate market definition, with each subsequent family member suggesting a broader market?”

Surprisingly, no one else had noticed, but I was asked to unpack my statement a little bit (my mom sent a text that read: “English please.”).

The U.S. Department of Justice and the Federal Trade Commission’s Horizontal Merger Guidelines stipulate that market definition serves two roles in identifying potential competitive concerns. First, market definition helps specify the line of commerce (product) and section of the country (geography) in which a competitive concern arises. Second, market definition allows the Agencies to identify market participants and measure market shares and concentration.

As the Agencies point out, market definition focuses solely on demand substitution factors, i.e., on customers’ ability and willingness to substitute away from one product to another in response to a price increase or a corresponding non-price change (in the case of Chipotle, an E. coli outbreak might qualify as a reduction in quality). Customers generally face a range of potential substitutes, some closer than others. Defining a market broadly to include relatively distant substitutes can lead to misleading market shares. As such, the Agencies may seek to define markets narrowly enough to capture the relative competitive significance of substitute products. For precision in this regard, I refer the reader to Section 4.1.1 of the Guidelines, which lays out the hypothetical monopolist test.
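For a flavor of how the hypothetical monopolist test works in practice, here is a minimal sketch of critical loss analysis, one common way of implementing it. The SSNIP, margin, and diversion figures are made up for illustration:

```python
def critical_loss(ssnip: float, margin: float) -> float:
    """Fraction of unit sales a hypothetical monopolist can lose before
    a small but significant non-transitory increase in price (SSNIP)
    becomes unprofitable: t / (t + m), where t is the SSNIP and m is
    the gross margin, both expressed as shares of price."""
    return ssnip / (ssnip + margin)

# Made-up numbers: a 5% SSNIP on burritos with a 60% gross margin.
ssnip, margin = 0.05, 0.60
cl = critical_loss(ssnip, margin)
print(f"Critical loss: {cl:.1%}")  # ~7.7%

# If more than ~7.7% of customers would defect to, say, Taco Bell or
# eating in, "fast-casual Mexican" is too narrow a candidate market and
# the analyst broadens it -- exactly the dynamic in the texts above.
actual_loss = 0.15  # assumed diversion, for illustration only
print("Broaden the market" if actual_loss > cl else "Candidate market holds")
```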

As for the group texts above, the reader can now infer how market definition was broadened by each subsequent family member. To reiterate:

Me: California Tortilla (Mexican food in a similar quality dining establishment to Chipotle.)
My Wife: Taco Bell (Mexican . . . inspired . . . dining out, generally.)
My Brother: Sushi (Dining out, generally.)
My Mother: Eating In (Dining, generally.)
My Brother’s Girlfriend: Bacon (Eating.)

Why is market definition relevant to the Quello Center at Michigan State University? As the Center’s website suggests, the Center seeks to stimulate and inform debate on media, communication and information policy for our digital age. One area where market definition plays a role in this regard is the Quello Center’s broad interest in research about digital inequality.

Digital inequality represents a social inequality with regard to access to or use of the Internet, or more broadly, information and communication technologies (ICTs). Digital inequalities can arise as a result of individualistic factors (income, age and other demographics) or contextual ones (competition where a particular consumer is most likely to rely on ICTs). Market definition is most readily observed in the latter.

For instance, consider the market for fixed broadband Internet. An immediate question that arises is the appropriate geographic market definition. If we rule out individuals’ ability to procure fixed broadband Internet at local hotspots (e.g., libraries, coffee shops) from the relevant market definition, then the relevant geographic market appears to be the home. This is unfortunately a major burden for researchers attempting to assess the state of fixed broadband competition and its potential impact on digital inequality because most market level data in use is at a much more aggregated level than the home. The problem is that when an aggregated market, say a zip code, contains multiple competitors, it is unclear how many of these competitors actually compete in the same home.

Thus far, most studies of fixed broadband competition have been hampered by the issue of geographic market definition. For instance, Xiao and Orazem (2011) extend Bresnahan and Reiss’s (1991, 1994) classic studies of entry and competition to the market for fixed broadband, albeit at the zip code level. Wallsten and Mallahan (2010) use tract-level FCC Form 477 data to test the effects of competition on speeds, penetration, and prices. However, whereas there are approximately 42,000 zip codes and 73,000 census tracts in the United States, there are approximately 124 million households, which implies a large amount of aggregation that can lead researchers to conclude that competition is stronger than it actually is.
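A toy example makes the aggregation problem concrete. The ISP names and coverage pattern below are invented; the point is that zip-code-level data can report three competitors in an area where no single household has more than one choice:

```python
# Three ISPs each cover a different third of the same zip code.
# Zip-level data shows three "competitors"; no household has a choice.
households = {
    "home_A": {"ISP_1"},
    "home_B": {"ISP_2"},
    "home_C": {"ISP_3"},
}

# Count of distinct providers anywhere in the zip code.
zip_level_count = len(set().union(*households.values()))

# Count of providers actually available at each home.
per_home_counts = [len(isps) for isps in households.values()]

print(f"Providers in zip code: {zip_level_count}")  # 3
print(f"Average providers per home: "
      f"{sum(per_home_counts) / len(per_home_counts):.1f}")  # 1.0
```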

Another question that arises is whether fixed broadband is too narrow a product market and whether the appropriate market definition is simply broadband, which would include fixed as well as mobile broadband. Thus far, because of data limitations, most studies of wireline-wireless substitution have focused mainly on voice rather than on Internet use (e.g., Macher, Mayo, Ukhaneva, and Woroch, 2015; Thacker and Wilson, 2015) and so do not assess whether mobile has become a medium that can mitigate digital inequality. Prieger (2013) has made some headway on this issue by showing evidence that as late as 2010, mobile and fixed broadband were generally not complementary, and that mobile-only broadband subscription was slightly more prevalent in rural areas. However, because of data limitations, Prieger does not estimate a demand system to determine whether fixed and mobile broadband are substitutes or complements, as the voice substitution papers above do.

Luckily, NTIA’s State Broadband Initiative (SBI) and, more recently, the FCC have enhanced researchers’ ability to assess competition at a fairly granular level by providing fixed broadband coverage and speed data at the level of the census block. Similarly, new data on Internet usage from the U.S. Census should allow researchers to better tackle the wireline-wireless substitution issue as well. The FCC has also hopped on the speed test bandwagon by collaborating with SamKnows to measure both fixed and mobile broadband quality. In the former case, the FCC periodically releases the raw data, and I am optimistic that at some point mobile broadband quality data will be released as well (readers, please correct me if I am glossing over some already publicly available granular data on mobile broadband speed and other characteristics).

The Quello Center staff seeks to combine such data, along with other sources, to study broadband competition and its impact on digital inequality. We welcome your feedback and are presently on the lookout for potential collaborators interested in these issues.


Testing the Limits of Net Neutrality Rules

In the past few weeks we’ve seen both a wireless and wireline carrier launch new “zero rating” video streaming services that test the boundaries of the FCC’s net neutrality policy: T-Mobile’s Binge On and Comcast’s Stream TV.

According to published reports, FCC chairman Tom Wheeler has praised Binge On as “highly innovative” and “highly competitive,” while also noting that the Commission will continue to monitor the service under its “general conduct” rule. According to Ars Technica, an FCC spokesperson declined comment on Comcast’s Stream TV, which does not count against the company’s data caps.

The FCC’s reported response to the two services is not too surprising. While they share some similarities, they are also different in key respects.

In a blog post, Public Knowledge senior staff attorney John Bergmayer argues that Stream TV is subject to and violates the FCC’s Open Internet order as well as the consent decree Comcast agreed to as part of its NBC Universal acquisition. I’d recommend reading the post in full for anyone wanting a preview of legal arguments to be made in more formal channels by Public Knowledge and others likely to challenge Stream TV before the FCC and the courts.

According to Bergmayer:

Comcast maintains that “Stream TV is a cable streaming service delivered over Comcast’s cable system, not over the Internet.” But Stream TV is being delivered to Comcast broadband customers over their broadband connections, and is accessible on Internet-connected devices (that is, not just through a cable box). From a user’s perspective, it is identical to any other Internet service. Comcast’s argument is that if it offers its service only to Comcast customers and locates the servers that provide Stream TV on its own property, connected to its own network, that this exempts it from the Open Internet rules. This is an absurd position that would permit Comcast to discriminate in favor of any of its own services, and flies in the face of the Open Internet rules…

[I]t does not appear that Stream TV is an IP service like facilities-based VoIP. It is not available standalone; you need a broadband Internet access connection to access it. It is thus readily distinguishable from services like facilities-based VoIP. If Comcast offered Stream TV separately from broadband there would be a better case that it was more like traditional cable TV or a specialized service–but it does not.

Bergmayer also reviews some relevant language from Comcast’s NBC Universal consent decree, including:

“Comcast shall not offer a Specialized Service that is substantially or entirely comprised of Defendants’ affiliated content,” and…”[if] Comcast offers any Specialized Service that makes content from one or more third parties available … [it] shall allow any other comparable Person to be included in a similar Specialized Service on a nondiscriminatory basis.”

In an article in Multichannel News, Jeff Baumgartner previews what may be a core element of Comcast’s legal argument defending Stream TV:

“Stream TV is an in-home IP-cable service delivered over Comcast’s cable network, not over the public Internet,” Comcast said in a statement issued Thursday, the same day it launched Stream TV to its second market – Chicago. “IP-cable is not an ‘over-the-top’ streaming video service. Stream enables customers to enjoy their cable TV service on mobile devices in the home delivered over the managed cable network, without the need for additional equipment, like a traditional set-top-box.”

The FCC does address the idea in rules released in December 2014, which explain that “an entity that delivers cable services via IP is a cable operator to the extent it delivers those services as managed video services over its own facilities and within its footprint…IP-based service provided by a cable operator over its facilities and within its footprint must be regulated as a cable service not only because it is compelled by the statutory definitions; it is also good policy, as it ensures that cable operators will continue to be subject to the pro-competitive, consumer-focused regulations that apply to cable even if they provide their services via IP.”

In his blog post Bergmayer cites language from the Commission’s Open Internet order related to the provision of “Non-Broadband Internet Access Service Data Services.” In my view, a key sentence in that section of the order is “The Commission expressly reserves the authority to take action if a service is, in fact, providing the functional equivalent of broadband Internet access service or is being used to evade the open Internet rules.” On the face of it, I’m inclined to agree with Bergmayer that this appears to be the case with Comcast’s Stream TV, when coupled with its data cap policies and the reality of Comcast’s multifaceted market power in both distribution and content.

And, more generally, I think Bergmayer is correct that “Comcast’s program raises a host of issues under the Open Internet rules, the consent decree, and—most importantly—general principles of competition.”

The fact that Comcast is testing the bounds of the Commission’s new rules is not surprising, given its focus on maximizing shareholder value within a set of interrelated and dynamic markets in which it enjoys substantial market power, but faces significant challenges to its traditional revenue streams and growth prospects. In fact, I view it as helpful that Comcast is moving fairly quickly in this direction, since it is likely to force the FCC and the Courts to revisit yet again the question of how to craft communication policy that serves the public interest in the Internet age.

And, with the Commission having classified broadband access as a Title II service, my hope is that any court review of FCC action responding to Stream TV or similar services will consider substantive policy arguments (e.g., related to competition and the public interest) rather than simply ruling that the Commission cannot impose net neutrality rules absent a Title II classification of broadband access (which seemed to be the central message of the most recent DC Circuit Court ruling).

We are clearly moving into a world where the central element of our once heavily (and often clumsily) siloed communication infrastructure and policy (and arguably our economy and society as a whole) is IP connectivity. Though some believe the FCC has outlived its usefulness in that world, my own preference—at least for now—is that the Commission retain sufficient tools and authority to continue serving as the specialized regulatory agency responsible for setting ground rules that help ensure that the public interest is well served during and after this historic and vitally important transition from yesterday’s communication technology and industry structure to tomorrow’s.



Discussion of The Importance of Public Service #ChangeAgents

Following David A. Bray’s 21 September 2015 Quello Lecture on ‘The Importance of Public Service #ChangeAgents in Exponential Times’, David led a wide ranging discussion, available in this Webcast.

David Bray – The Importance of Public Service #ChangeAgents in Exponential Times – Discussion from Quello Center on Vimeo.

Abstract

Technology is rapidly changing our world: the 7 billion networked devices in 2013 will double to 14 billion in 2015, and grow to anywhere between 50 and 200 billion by 2020. The ability to work and collaborate securely anywhere, anytime, on any device will reshape public service. We must ensure security and privacy are baked in at the code development level, testing from the ground up and automating alerts. Legal code and digital code must work together, enabling more inclusive work across government workers, citizen-led contributions, and public-private partnerships. Altogether, these actions will transform Public Service to truly be “We the (Mobile, Data-Enabled, Collaborative) People” working to improve our world.

Dr. David A. Bray is a 2015 Eisenhower Fellow, Visiting Associate on Cyber Security with the University of Oxford, and Chief Information Officer for the Federal Communications Commission.

He began working for the U.S. government at age 15 on computer simulations at a Department of Energy facility. In later roles he designed new telemedicine interfaces and space-based forest fire forecasting prototypes for the Department of Defense. From 1998-2000 he volunteered as an occasional crew lead with Habitat for Humanity International in the Philippines, Honduras, Romania, and Nepal while also working as a project manager with Yahoo! and a Microsoft partner firm. He then joined the Bioterrorism Preparedness and Response Program at the U.S. Centers for Disease Control and Prevention as IT Chief, leading the program’s technology response to 9/11, the anthrax attacks in 2001, Severe Acute Respiratory Syndrome (SARS) in 2003, and other international public health emergencies. He later completed a PhD in Information Systems at Emory University and two post-doctoral associateships at MIT and Harvard in 2008.

In 2009, Dr. Bray volunteered to deploy to Afghanistan to help “think differently” on military and humanitarian issues and in 2010 became a Senior National Intelligence Service Executive advocating for increased information interoperability, cybersecurity, and protection of civil liberties. In 2012, Dr. Bray became the Executive Director for the bipartisan National Commission for Review of Research and Development Programs of the United States Intelligence Community, later receiving the National Intelligence Exceptional Achievement Medal. He received both the Arthur S. Flemming Award and Roger W. Jones Award for Executive Leadership in 2013. He also was chosen to be an Eisenhower Fellow to meet with leaders in Taiwan and Australia on multisector cyber strategies for the “Internet of Everything” in 2015.

Dr. Bray has served as the Chief Information Officer for the Federal Communications Commission, leading FCC’s IT Transformation since 2013. He was selected to serve as a member of the Council on Foreign Relations and as a Visiting Associate for the Cybersecurity Working Group on Culture at the University of Oxford in 2014. He also has been named one of the “Fed 100” for 2015 and the “Most Social CIO” globally for 2015, tweeting as @fcc_cio.



Delighted to Host David Bray, CIO of the FCC

The Quello Center is very pleased to host a visit to MSU by Dr David Bray, the CIO of the FCC and a recent recipient of an Eisenhower Fellowship. He will be speaking at the College of Communication Arts & Sciences on Monday, 21 September 2015, in Room 191 at 3pm, giving one of this year’s Quello Lectures.

Dr David Bray

David has spoken recently on related topics, such as on how to reshape public service IT for the new digital era. His talk on Monday promises to be of special value to students considering careers in the public service. The title of his talk is ‘The Importance of Public Service #ChangeAgents in Exponential Times’. More information about David and his talk is available on our event site at: http://quello.msu.edu/event/changeagents-and-public-service-in-the-digital-age-by-david-bray/?instance_id=288

Join us at 3pm.



Multiple Methods & Mutual Respect: Key Ingredients For Good Policy Research

I thought I’d write a short follow-up in response to the exchange of comments following my recent post on issues related to impacts of the FCC’s Open Internet order on ISP investment.

I very much appreciate the responses to my post, especially from Hal Singer and Mark Jamison, whose work was the target of my sometimes insufficiently respectful criticism. It helped me understand the substantive issues better and also reminded me that respectful dialog on important and controversial issues may not always be easy, but is certainly worth the effort…and that I’m still somewhat haltingly learning that lesson.

I especially appreciated the content and tone of Mark’s comment, including:

I won’t make the claim that my approach revealed reality and that yours did not. We have too little information for that. And even if we had sufficient data for a proper study, there would still be errors. That said, I would be glad to work with you and/or your colleagues on a study once sufficient data are available.

In my view this pretty well describes the aim of the Quello Center’s investigation of this policy issue: to gather as much useful data as we can and to apply to it a mix of the most useful modes of analysis to understand what’s going on and to refine the models we use to understand and predict policy outcomes. I hope to be part of that process, contributing my best skills and strengths, being humble enough to acknowledge their limits, and learning from others who have different expertise and perspectives.

Mark’s comment reminds me of the story about the blind men trying to describe the elephant, all of them describing it differently based on which part of the massive creature they were feeling with their hands. While I wouldn’t describe all of us focused on this issue as blind, I think it’s fair to say that we (and, as Mark notes, our methods) all suffer from some form of perceptual limitation. Some of us are nearsighted, others farsighted and perhaps others see clearly only with one eye…and occasionally we all may feel compelled to close our eyes to avoid seeing something that makes us very uncomfortable.

Though when it comes to policy research we may never be able to see and agree on “the truth,” my hope is that the Quello Center’s research team can be part of an effort to carefully study this and other policy “elephants” from as many angles as we can, and work together to understand their key dynamics, while at the same time remembering the value of respectful dialog, even when a voice inside our head might be telling us “that guy describing the elephant’s tail must be a fool or a scoundrel.”



A Reminder Why the Quello Center Net Neutrality Impact Study is Important

In the past week or so I’ve seen several articles that remind me how important the Quello Center’s empirically-grounded study of net neutrality impacts is for clarifying what these impacts will be—especially since net neutrality is one of those policy topics where arguments are often driven by ideology and/or competing financial interests.

As far as I can tell, this series of articles began with an August 25 piece written by economist Hal Singer and published by Forbes under the following headline: Does The Tumble In Broadband Investment Spell Doom For The FCC’s Open Internet Order? Per his Forbes bio, Singer is a principal at Economists Incorporated, a senior fellow at the Progressive Policy Institute, and an adjunct professor at Georgetown University’s McDonough School of Business.

Singer’s piece was followed roughly a week later by two op-ed pieces published on the American Enterprise Institute’s web site. The title of the first AEI piece, authored by Mark Jamison, was Title II’s real-world impact on broadband investment. This was followed a day later by Bronwyn Howell’s commentary Title II is hurting investment. How will – and should – the FCC respond?

What struck me about this series of op-ed pieces published by economists and organizations whose theoretical models and policy preferences appear to favor unregulated market structures was that their claims that “Title II is hurting investment” were all empirically anchored in Singer’s references to declines in ISP capital spending during the first half of 2015. As a member of the Quello Center’s research team studying the impacts of net neutrality, I was intrigued, and eager to dig into the CapEx data and understand its significance.

While my digging has only begun, what I found reminded me how much the communication policy community needs the kind of fact-based, impartial and in-depth empirical analysis the Quello Center has embarked upon, and how risky it is to rely on the kind of ideologically-driven analysis that too often dominates public policy debates, especially on contentious issues like net neutrality.

My point here is not to argue that there are clear signs that Title II will increase ISP investment, but rather that claims by Singer and others that there are already signs that it is hurting investment are not only premature, but also based on an incomplete reading of evidence that can be uncovered by careful and unbiased review of publicly available information.

I hope to have more to say on this topic in future posts, but will make a few points here.

The crux of Singer’s argument is based on his observation that capital spending declined fairly dramatically for a number of major ISPs during the first half of 2015, dragging down the entire sector’s spending for that period (though it’s not clear from the article, my sense is that Singer’s reference to “all” wireline ISPs refers to the industry’s larger players and says nothing about investment by smaller companies and the growing ranks of publicly and privately owned FTTH-based competitors). He then briefly reviews and dismisses potential alternative explanations for these declines, concluding that their only other logical cause is ISPs’ response to the FCC’s Open Internet Order (bolding is mine):

AT&T’s capital expenditure (capex) was down 29 percent in the first half of 2015 compared to the first half of 2014. Charter’s capex was down by the same percentage. Cablevision’s and Verizon’s capex were down ten and four percent, respectively. CenturyLink’s capex was down nine percent. (Update: The average decline across all wireline ISPs was 12 percent. Including wireless ISPs Sprint and T-Mobile in the sample reduces the average decline to eight percent.)…

This capital flight is remarkable considering there have been only two occasions in the history of the broadband industry when capex declined relative to the prior year: In 2001, after the dot.com meltdown, and in 2009, after the Great Recession. In every other year save 2015, broadband capex has climbed, as ISPs—like hamsters on a wheel—were forced to upgrade their networks to prevent customers from switching to rivals offering faster connections.

What changed in early 2015 besides the FCC’s Open Internet Order that can explain the ISP capex tumble? GDP grew in both the first and second quarters of 2015. Broadband capital intensity—defined as the ratio of ISP capex to revenues—decreased over the period, ruling out the possibility that falling revenues were to blame. Although cord cutting is on the rise, pay TV revenue is still growing, and the closest substitute to cable TV is broadband video. Absent compelling alternatives, the FCC’s Order is the best explanation for the capex meltdown.

I haven’t had a chance to carefully review the financial statements and related earnings material of all the companies cited by Singer, but did take a quick look at this material for AT&T and Charter since, as he notes, they experienced by far the largest percentage drop in spending.  What I found doesn’t strike me as supporting his conclusion that the decline was network neutrality-driven.  Instead, in both cases it seems to pretty clearly reflect the end of major investment projects by both companies and related industry trends that seem to have nothing to do with the FCC’s Open Internet order.

My perspective on this is based on statements made by company officials during their second quarter 2015 earnings calls, as well as capex-related data in their financial reporting.

During AT&T’s earnings call, a Wall Street analyst asked the following question: “[T]he $18 billion in CapEx this year implies a nice downtick in the U.S. spending, what’s driving that? Are you finding that you just don’t need to spend it or are you sort of pushing that out to next year?” In his response to the question, John Stephens, the company’s CFO, made no mention of network neutrality or FCC policy decisions. Instead he explained where the company was in terms of key wireless and wireline strategic network investment cycles (bolding is mine):

Well, I think a couple of things. And the simplest thing is to say [is that the] network team did a great job in getting the work done and we’ve got 300, nearly 310 million POPs with LTE right now. And we are putting our spectrum to use as opposed to building towers. And so that aspect of it is just a utilization of spectrum we own and capabilities we have that don’t require as much CapEx. Secondly, the 57 million IP broadband and what is now approximately 900,000 business customer locations passed with fiber. Once again, the network guys have done a great job in getting the Project VIP initiatives completed. And when they are done…the additional spend isn’t necessary, because the project has been concluded not for lack of anything, but for success.

Later on in the call, another analyst asked Stephens “[a]s you look out over the technology roadmap, like 5G coming down the pipeline, do you anticipate that we will see another period of elevated investment?”

While Stephens pointed to a potential future of moderated capital spending, he made no reference to network neutrality or FCC policy, focusing instead on the investment implications of the company’s (and the industry’s) evolution to software-defined networks.

I would tell you that’s kind of a longer term perspective. What we are seeing is our move to get this fiber deep into the network and getting LTE out deep into the wireless network and the solutions that we are finding in a software-defined network opportunity, we see a real opportunity to actually strive to bring investments, if you will, lower or more efficient from historical levels. Right now, I will tell you that this year’s investment is going to be in that $18 billion range, which is about 15%. We are certainly – we are not going to give any guidance with regard to next year or the year after. And we will give an update on this year’s guidance, if and when in our analyst conference if we get that opportunity. With that being said, I think there is a real opportunity with some of the activities are going on in software-defined networks on a longer term basis to actually bring that in capital intensity to a more modest level.

Charter’s large drop in capital spending appears to be driven by a similar “investment cycle” dynamic. During its 2Q15 earnings call, CFO Christopher Winfrey noted that Charter’s year-over-year decline in total CapEx “was driven by the completion of All-Digital during the fourth quarter of last year,” referring to the company’s migration of its channel lineup and other content to an all-digital format.

A review of the company’s earnings call and financial statements suggests that a large portion of the “All-Digital” capital spending was focused on deploying digital set-top boxes to Charter customers, resulting in a precipitous decline in the “customer premise equipment” (CPE) category of CapEx. According to Charter’s financial statements, first-half CPE-related CapEx fell by more than half, or $341 million, from $626 million to $285 million. Excluding this sharp falloff in CPE spending driven by the end of Charter’s All-Digital conversion, the remainder of the company’s capital spending was actually up 3% during the first half of 2015. And this included a 7% increase in spending on “line extensions,” which Charter defines as “network costs associated with entering new service areas.” It seems to me that, if Charter was concerned that the Commission’s Open Internet order would weaken its business model, it would be cutting rather than increasing its investment in expanding the geographic scope of its network.

To understand the significance of Charter’s spending decline, I think it’s important to note that its 29% decline in first half total CapEx was driven by a 54% decline in CPE spending, and that the company’s non-CPE investment—including line extensions—actually increased during that period.  I found it odd that, even as he ignored this key dynamic for Charter, Singer seemed to dismiss the significance of Comcast’s CapEx increase during the same period by noting that it was “attributed to customer premises equipment to support [Comcast’s] X1 entertainment operating system and other cloud-based initiatives.”
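A quick back-of-the-envelope check shows how these pieces fit together. The CPE figures are those cited above; the non-CPE baseline is my own implied estimate, solved from the reported percentages, not a number Charter reports directly:

```python
# First-half CPE capex figures cited above (millions of dollars).
cpe_2014, cpe_2015 = 626, 285

# Non-CPE baseline implied by the reported percentages (assumption:
# the level at which +3% non-CPE growth and the CPE drop together
# yield roughly a 29% total decline), in millions of dollars.
non_cpe_2014 = 498
non_cpe_2015 = non_cpe_2014 * 1.03

total_2014 = cpe_2014 + non_cpe_2014
total_2015 = cpe_2015 + non_cpe_2015

print(f"CPE change:     {cpe_2015 / cpe_2014 - 1:+.0%}")          # -54%
print(f"Non-CPE change: {non_cpe_2015 / non_cpe_2014 - 1:+.0%}")  # +3%
print(f"Total change:   {total_2015 / total_2014 - 1:+.0%}")      # ~-29%
```

In other words, a 54% collapse in one large, project-driven spending category is arithmetically sufficient to produce a 29% total decline even while every other category grows.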

I also couldn’t help but notice that, in his oddly brief reference to the nation’s largest ISP, Singer ignored the fact that every category of Comcast’s capital spending increased by double digits during the first half of 2015, including its investment in growth-focused network infrastructure, which expanded 24% from 2014 levels.  Comcast’s total cable CapEx was up 18% for the first half of the year, while at Time Warner Cable, the nation’s second largest cable operator, it increased 16%.

While these increases may have nothing to do with FCC policy, they seem very difficult to reconcile with Singer’s strongly-asserted argument, especially when coupled with the above discussion of company-specific reasons for the large CapEx declines at AT&T and Charter.  As that discussion suggests, the reality behind aggregated industry numbers (especially when viewed through a short-term window of time) is often more complex and situation-specific than our economic models and ideologies would like it to be.  This may make our research harder and messier to do at times, but certainly not less valuable.  It also speaks to the value of longitudinal data collection and analysis, to better understand both short-term trends and those that only become clear over a longer term.  That longitudinal component is central to the approach being taken by the Quello Center’s study of net neutrality impacts.

One last general point before closing out this post. I didn’t see any reference in Singer’s piece or the AEI-published follow-ups to spending by non-incumbent competitive providers, including municipally and privately owned fiber networks that are offering attractive combinations of speed and price in a growing number of markets around the country. While this category of spending may be far more difficult to measure than investments by large publicly-owned ISPs, it may be quite significant in relation to public policy, given its potential impact on available speeds, prices and competitive dynamics.

Expect to see more on this important topic and the Quello Center’s investigation of it in later posts, and please feel free to contribute to the discussion via comments on this and/or future posts.
