Missing the Internet


Monday, February 13th, 2017

A BBC reporter, Rachel Nuwer, wrote a nice piece on what it would mean to people if the Internet stopped working, entitled “What if the Internet Stopped for a Day”. I stressed that there are some empirical cases, such as a power outage in NYC and the pager blackout across the US, that provide concrete evidence of possible outcomes, and I was impressed by how well she embedded these cases and more in a well-developed article. I recommend it.

Bill Dutton




Quello professor signs book contract with Oxford University Press


Wednesday, January 25th, 2017

Professor Bill Dutton, Director of the Quello Center, in the College of Communication Arts and Sciences, has signed a contract with Oxford University Press for a book on his concept of the Fifth Estate. He has been speaking and conducting research over the last decade on the role of the Internet in empowering a Fifth Estate that can hold other ‘estates’ accountable, including the press, as the Fourth Estate.

Bill Dutton in Quello Meeting Room

The book will develop the concept of the Fifth Estate, provide empirical evidence of its rise, and trace its implications across nearly every sector of society. While a growing tide of criticism focuses on the role of social media and the Internet in fueling everything from populism to fake news, the Fifth Estate provides a powerful response to the critics. Bill’s work shows the many strategies individuals of the Fifth Estate use to enable greater accountability and communicative power, creating a more pluralistic structure of social control not only in politics but in nearly every institutional setting of everyday life.


Media and Information Policy Issues


Tuesday, January 24th, 2017

From discussions in courses and within the Quello Center Advisory Board, the Center has been developing a set of key issues tied to media, communication and information policy and practice. We’d welcome your thoughts on issues we’ve missed, or on listed issues that do not merit more sustained research and debate. Your feedback will be posted as comments on this post.

Quello Advisory Board Meeting

I. Innovation-led Policy Issues

New Developments around Robotics and Artificial Intelligence: What are the implications for individual control, privacy, and security? Security is no longer so clearly a cyber issue as cyber security increasingly shapes the physical world of autonomous vehicles, drones, and robots.

Internet of Things (IoT): With tens of billions of things moving online, how can individuals protect their privacy, safety, and well-being as their environments are monitored and controlled through their movement in space? There are likely implications for urban informatics, transportation and environmental systems, household systems, and worn devices (see Wearables below). A possible focus within this set would be on developments in households.

Wearables: What appears to be an incremental step in the IoT space could have major implications across many sectors, from health to privacy and surveillance.

The Future of Content Delivery: Content delivery in the digital age, particularly the broadcasting of film and television: the technology, business models, and social impact of this rapidly developing ecosystem, such as effects on localism, diversity, and quality.

Free (and Open Source) Software: The prominence and future of free as well as open source software continues to evolve. Are rules, licensing, and institutional support, such as around the Free Software Foundation, meeting the needs of this free software community?

Big Data: How can individuals protect their privacy in the age of computational analytics and increasing capture of personal data and mass surveillance? What policies or practices can be developed to guide data collection, analysis, and public awareness?

Encryption: Advances in encryption technologies at a time of increasing threats to the privacy of individual communications, such as email, could lead to a massive uptake of tools to keep private communications private. How can this development be accelerated and spread across all sectors of the Internet community?

Internet2: Just as the development of the Internet within academia has shaped the future of communications, so might the next generation of the Internet – so-called Internet2 – have even greater implications in shaping the future of research and educational networking in the first instance, but public communications in the longer-term. Who is tracking its development and potential implications?

Other Contending Issues: Drones, Cloud computing, …

II. Problem-led Initiatives

Transparency: Many new issues of the digital age, such as concerns over privacy and surveillance, are tied to a lack of transparency. What is being done with your data, by whom, and for what purposes? In commercial and governmental settings, many public concerns could be addressed to a degree through the provision of greater transparency, and the accountability that should follow.

Censorship and Internet Filtering: Internet filtering and censorship was limited to a few states at the turn of the century. But over the last decade, fueled by fear of radical extremist content, and associated fears of self-radicalization, censorship has spread to most nation states. Are we entering a new digital world in which Internet content filtering is the norm? What can be done to mitigate the impact on freedom of expression and freedom of connection?

Psychological Manipulation: Citizens and consumers are increasingly worried about the ways in which they can be manipulated by advertising, (fake) news, social media, and more that lead them to vote, buy, protest, or otherwise act in ways that the purveyors of the new propaganda of the digital age would like. While many worried about propaganda around the mass media, should comparable attention be given to the hacking of psychological processes by the designers of digital media content? Is this a critical focus for consumer protection?

(In)Equities in Access: Inequalities in access to communication and information services might be growing locally and globally, despite the move to digital media and ICTs. The concept of a digital divide may no longer be adequate to capture these developments.

Privacy and Surveillance: The release of documents by Edward Snowden has joined with other events to draw increasing attention to the threats of mass unwarranted surveillance. It has been an enduring issue, but it is increasingly clear that developments heretofore perceived to be impossible are increasingly feasible and being used to monitor individuals. What can be done?

ICT4D or Internet for Development: Policy and technology initiatives in communication to support developing nations and regions, both in emergency responses, such as in relation to infectious diseases, and around more explicit economic development issues.

Digital Preservation: Despite more than a decade of discussion, digital preservation merits more attention, and stronger links with policy developments such as the ‘right to be forgotten’. ‘Our cultural and historical records are at stake.’

III. Enduring Policy Issues Reshaped by Digital Media and Information Developments

Media Concentration and the Plurality of Voices: Trends in the diversity and plurality of ownership, and sources of content, particularly around news. Early work on media concentration needs new frameworks for addressing global trends on the Web, with new media, in print media, automated text generation, and more.

Diversity of Content: In a global Internet context, how can we reasonably quantify or address issues of diversity in local and national media? Does diversity become more important in a digital age in which individuals will go online or on satellite services if the mainstream media in a nation ignore content of interest to their background?

Privacy and Privacy Policy: Efforts to balance security, surveillance, and privacy, post-Snowden and in the wake of concerns over social media and big data. White House work in 2014 on big data and privacy should be considered. Policy and practice in industry versus government could be a focus. Is there a unifying, sector-specific perspective?

Freedom of Expression: New and enduring challenges to expression in the digital age.

IV. Changing Media and Information Policy and Governance

Communication Policy: A rewrite of the 1934 Communications Act, last updated in 1996: this is unlikely to occur in the current political environment, but is nevertheless a critical focus.

Universal Access v Universal Service: With citizens and consumers dropping some traditional services, such as fixed line phones, how can universal service be best translated into the digital age of broadband services?

Network Neutrality: Should there be Internet fast lanes and more? Efforts to ensure the fair treatment of content from multiple providers through regulation have been among the more contentious issues in the USA. To some, the issue has been ‘beaten to death’, but it was brought to life again through the regulatory initiatives of FCC Chairman Wheeler, and more recently with the new Trump Administration, under which the fate of net neutrality is uncertain. Can we research the implications of this policy?

Internet Governance and Policy: Normative and empirical perspectives on governance of the Internet at the global and national levels. A timely issue critical to the future of the Internet and a global information age, given the rise of national Internet policy initiatives.

Acknowledgements: In addition to the Quello Advisory Board, special thanks to some of my students for their stimulating discussion that surfaced many of these issues. Thanks to Jingwei Cheng, Bingzhe Li, and Irem Yildirim, for their contributions to this list.


The Idea of a Cyber Security Mindset


Tuesday, January 24th, 2017

What is a cyber security mindset and why is it important?

Quello’s Professor of Media and Information Policy has just published an article in Internet Policy Review, a journal on Internet regulation, entitled ‘Fostering a Cyber Security Mindset’. It introduces the concept and suggests ways to further research on who has such a mindset and what difference it can make to cyber security. It is available free online at:

Dutton, William (2017), ‘Fostering a Cyber Security Mindset’, Internet Policy Review, 6(1). DOI: 10.14763/2017.1.443


Something to consider before restructuring the FCC . . .


Wednesday, January 18th, 2017

The Chief Economist of the Federal Communications Commission is a temporary position—with a term of a year or so of late—typically bestowed on economists with impressive credentials and experience related to media or telecommunications. Having worked at the FCC long enough to overlap with several chief economists, I noticed an interesting pattern. Many join the FCC full of hope—capable as they are—that they will reform the agency to better integrate “economic thinking” into regular policy decisions, but to quote a former colleague, “leave the agency with their sense of humor intact.”

I have heard many a former FCC economist rail against the lack of economic thinking at the FCC, with some former chief economists going very much on the record to do so (for instance, see here and here). Others (not necessarily affiliated with the FCC) have gone as far as to point out that much of what the FCC does or attempts to do is duplicative of the competition policies of the Department of Justice and Federal Trade Commission. These latter points are not a secret. The FCC publicly says so in every major transaction that it approves.

For example, in a transaction that I have had the pleasure of writing about separately with one of the FCC’s former chief economists and a number of other colleagues—AT&T’s acquisition of former competitor Leap Wireless (see here and here)—the FCC wrote (see ¶ 15):

Our competitive analysis, which forms an important part of the public interest evaluation, is informed by, but not limited to, traditional antitrust principles. The Commission and the Department of Justice (“DOJ”) each have independent authority to examine the competitive impacts of proposed communications mergers and transactions involving transfers of Commission licenses.

This standard language can be found in the “Standard of Review” section in any major FCC transaction order. The difference is that whereas the DOJ reviews telecom mergers pursuant to Section 7 of the Clayton Act, the FCC’s evaluation encompasses the “broad aims of the Communications Act.” From a competition analysis standpoint, a major difference is that if the DOJ wishes to stop a merger, “it must demonstrate to a court that the merger may substantially lessen competition or tend to create a monopoly.” In contrast, parties subject to FCC review have the burden of showing that the transaction, among other things, will enhance existing competition.

Such duplication and the alleged lack of economics at the FCC has led a number of individuals to suggest that the FCC should be restructured and some of its powers curtailed, particularly with respect to matters that are separately within the purview of the antitrust agencies. In particular, recently, a number of individuals in Donald Trump’s FCC transition team have written (read here) that Congress “should consider merging the FCC’s competition and consumer protection functions with those of the Federal Trade Commission, thus combining the FCC’s industry expertise and capabilities with the generic statutory authority of the FTC.”

I do not completely disagree—I would be remiss if I did not admit that the transition team makes a number of highly valid points in its comments on “Modernizing the Communications Act.” However, as Harold Feld, senior VP of Public Knowledge recently pointed out, efforts to restructure the FCC present a relatively “radical” undertaking and my main motivation in writing this post is to highlight Feld’s point by reminding readers of a recent court ruling.

In 2007—well before its acquisition of DIRECTV and its offer of unlimited data to customers who bundle its AT&T and DIRECTV services—AT&T offered mobile wireless customers unlimited data plans. AT&T later phased out these plans except for customers who were “grandfathered”—those customers who signed up for an unlimited plan while it was available and never switched to an alternative option. In October 2011, perhaps worried about the implications of unlimited data in a data hungry world, AT&T reduced speeds for grandfathered customers on legacy plans whose monthly data usage surpassed a certain threshold—a practice that the FTC refers to as data throttling.

The FTC filed a complaint against AT&T under Section 5 of the FTC Act, alleging that customers who had been throttled by AT&T experienced drastically reduced service, but were not adequately informed of AT&T’s throttling program. As part of its complaint, the FTC claimed that AT&T’s actions violated the FTC Act and sought a permanent injunction on throttling and other equitable relief as deemed necessary by the Court.

Now here is where things get interesting: AT&T moved to dismiss on the basis that it is exempt as a “common carrier.” That is, AT&T claimed that the appropriate act that sets out jurisdiction over its actions is the Communications Act, and not the FTC Act. Moreover, AT&T’s position was that an entity with common carrier status cannot be regulated under the section that the FTC brought to this case (§ 45(a)), even when it is providing services other than common carriage services. This led one of my former colleagues to joke that this would mean that if AT&T were to buy General Motors, then it could use false advertising to sell cars and be exempt from FTC scrutiny.

The District Court for the Northern District of California happened to consider this matter after the FCC reclassified mobile data from a non-common carriage service to a common carriage service (in its Open Internet Order), but before the reclassification had gone into effect. The Court concluded that contrary to AT&T’s arguments, “the common carrier exception applies only where the entity has the status of common carrier and is actually engaging in common carrier activity.” Moreover, it denied AT&T’s motion because AT&T’s mobile data service was not regulated as common carrier activity by the FCC when the FTC suit was filed. However, in August 2016, this decision was reversed on appeal by the U.S. Court of Appeals for the Ninth Circuit (see here), which ruled that the common carrier exemption was “status based,” not “activity based,” as the lower court had determined.

Unfortunately, this decision leaves quite a regulatory void. To my knowledge, the FCC does not have a division of Common Carrier Consumer Protection (CCCP), and I doubt that any reasonable individual familiar with FCC practice would interpret the Open Internet Order as an attempted FCC power grab to attempt to duplicate or supplant FTC consumer protection authority. Indeed, the FCC articulated quite the reverse position by recently filing an Amicus Curiae Brief in support of the FTC’s October 2016 Petition to the Ninth Circuit to have the case reheard by the full court.

So what’s my point? Well first, the agencies are not intentionally attempting to step on each other’s toes. By and large, the FCC understands the role of the FTC and the DOJ and vice versa. Were AT&T to acquire General Motors, it is highly probable that given the state of regulation as it stands, employees at the FCC would find it preferable if the FTC continued to oversee General Motors’ advertising practices. A related stipulation applies to the FCC’s competition analysis. Whereas the analysis may be similar to that of the antitrust agencies, it is motivated at least in part by the FCC’s unique mission to establish or maintain universal service, which can lead to different decisions being made in the same case (for instance, whereas the DOJ did not challenge AT&T’s acquisition of Leap Wireless, the FCC imposed a number of conditions to safeguard against loss of service).

Of course, one could argue that confusion stemming from the above case might have been avoided had the FCC never had authority over common carriage in the first place. But if making that argument, one must be cognizant of the fact that although the FTC Act predates the Communications Act of 1934, prior to 1934, it was the Interstate Commerce Act, not the FTC Act, that laid out regulations for common carriers. In other words, legislative attempts to rewrite the Communications Act will necessitate changes in various other pieces of legislation in order to ensure that there are no voids in crucial protections for competition and consumers. Thus, to bolster Harold Feld’s points: those wishing to restructure the FCC need to be fully aware of what the FCC actually does and doesn’t do, they must take heed of all the subtleties underlying the legislation that lays the groundwork for the various agencies, and they should be mindful of the potential for interpretation and reinterpretation under the common law aspects of our legal system.


Ruth Shillair Joining Quello Research Team


Saturday, January 7th, 2017

Ruth Shillair is joining the Quello Center’s research team as a Research Assistant this spring semester to support our work on cybersecurity, which is linked to the Oxford Global Cyber Security Capacity Centre (GCSCC). She is working with Bill Dutton on an analysis that builds on his concept of a cyber security mindset and another analysis that focuses on the outcomes of national cyber security capacity building: Can we see capacity having a positive, independent impact on cyber security?

Ruth Shillair

Ms. Shillair is a doctoral student in the Media and Information Department at MSU. Her research has focused on cyber security, including work on the Online Safety for the Ages (OSA) project with Professors Bob LaRose, Nora Rifkin, Saleem Alhabaash, and Sheila Cotten, which examines generational differences in online safety behaviors, particularly in the area of online banking.

Ruth has been recognized at MSU with one of the Department’s PhD Academic Merit Awards and an ‘outstanding doctoral student research’ award. She also participated in the Oxford Internet Institute’s (OII) Summer Doctoral Program (SDP). As Bill Dutton, Director of the Quello Center, noted: “We are very lucky to have Ruth onboard, as her expertise in cyber security and quantitative analysis is going to help us leap ahead on our cyber security research.”


Is Apple Lost? by A. Michael Noll


Thursday, December 29th, 2016

Is Apple Lost?

A. Michael Noll

December 29, 2016

© 2016 AMN

Has Apple been too successful – and overly arrogant in believing only it knows what is best for its customers? Will Apple become the next Yahoo, slowly sinking into oblivion?

Has innovation for Apple become abandoning things, such as leaving out the audio mini-jack in its iPhones? The original iPod was a great music player with its fabulous click-wheel interface – ingenious. But Apple abandoned the iPod click-wheel, rather than updating this product with solid-state storage. Will Apple soon abandon all its iPods? If so, it would be a great opportunity for Sony to acquire the iPod product line and continue to innovate with new features and storage.

The iTunes program tries to do everything: music player, iPhone synchronizer, and iTunes store access. It is challenging to do all these well in one huge program. The different purposes should be different programs, but with sharing across them.

The iWatch promised much – but what did it deliver? I have yet to see someone using one. And the need to recharge it every day is a big chore. The iWatch seems to be just an extension of the iPhone.

Apple has become a one-product business: the iPhone. It is challenging to survive today as a one-product company. Apple’s complete product line (other than the iMac) would easily fit in a backpack. Apple is not a diverse product company – it has become a niche company.

Amazon, meanwhile, is innovating and expanding, such as with its new voice-activated Echo product. This is clearly the kind of innovative product I would have expected from Apple. Meanwhile, with Apple’s iTV it remains a challenge to discover what it actually does and how to use it.

Has the Apple that was the past innovator become today a copycat, such as the rumors that it too is working on a driverless car? More significantly, is Apple itself driverless and has it lost its way? Apple possibly needs new directions – a return to innovation – or a re-invigoration of the current paths.

Apple should renew its commitment to legacy products, such as the click-wheel iPod, updating them with newer technology and rekindling their original excitement. Give consumers more control over how things are displayed and used, and change the attitude that Apple knows best.

A. Michael Noll


James Quello on Broadcast Station Managers, Engineers and Sales


Sunday, December 18th, 2016

“You may have heard that an engineer is a person who knows a great deal about very little, and who goes along learning more and more about less and less until finally he knows practically everything about nothing. A salesman, on the other hand, is a person who knows very little about many things and keeps learning less and less about more and more, until he knows practically nothing about everything. Of course, a station manager starts out knowing everything about everything, but ends up knowing nothing about anything, because of his association with engineers and salesmen.”

– James H. Quello, 11 October 1974

James H. Quello


Colleagues Toast Completion of WILMA Reports


Saturday, December 17th, 2016

Aleks Yankelevich and Mitch Shapiro toast (with new Quello mugs!) the completion of their two reports, both of which were central to a major Quello Center project on Wireless Innovation in Last Mile Access (WILMA). Aleks led the report on regulatory issues surrounding key spectrum of value to wireless, and Mitch led the report on business strategy case studies of wireless initiatives. Both reports will be released in the coming months when reviews are completed.


Undesirable Incentives in the Incentive Auction (w. Emily Schaal)


Saturday, December 10th, 2016

Following the 2016 U.S. Presidential election, in a letter to FCC Chairman Wheeler, Republicans urged the FCC to avoid “controversial items” during the presidential transition.  Shortly thereafter, the Commission largely scrubbed its Nov. 17 agenda resulting in perhaps the shortest Open Commission Meeting in recent history.  Start at 9:30 here for some stern words from Chairman Wheeler in response.  Viewers are urged to pay particular attention to an important history and civics lesson from the Chairman in response to a question at 17:20 (though this should not indicate our agreement with everything that the Chairman says).

So what is the Commission to do prior to the transition?  According to the Senate Committee on Commerce, Science, and Transportation, the FCC can “focus its energies” on “many consensus and administrative matters.”  Presumably, this includes the FCC’s ongoing incentive auction, now set for its fourth round of bidding, and subject to its own controversies, with dissenting votes on major items released in 2014 (auction rules and policies regarding mobile spectrum) by Republican Commissioners concerned about FCC bidding restrictions and “market manipulation,” along with a statement by a Democratic Commissioner saying that FCC bidding restrictions did not go far enough.

The Incentive Auction

Initially described in the 2010 National Broadband Plan, the Incentive Auction is one of the ways in which the FCC is attempting to meet modern day demands for video and broadband services.  The FCC describes the auction for a broad audience in some detail here and here.  In short, the auction was intended to repurpose up to 126 megahertz of TV band spectrum, primarily in the 600 MHz band, for “flexible use” such as that relied on by mobile wireless providers to offer wireless broadband.  The auction consists of two separate but interdependent auctions—a reverse auction used to determine the price at which broadcasters will voluntarily relinquish their spectrum usage rights and a forward auction used to determine the price companies are willing to pay for the flexible use wireless licenses.


What makes this auction particularly complicated is a “repackaging” process that connects the reverse and forward auctions. The current licenses held by broadcast television stations are not necessarily suitable for the type of contiguous blocks of spectrum that are necessary to set up and expand regional or nationwide mobile wireless networks. As such, repackaging involves reorganizing and assigning channels to the broadcast television stations that remain operational post-auction, in order to clear spectrum for flexible use.

The economics and technical complexities underlying this auction are well described in a recent working paper entitled “Ownership Concentration and Strategic Supply Reduction,” by Ulrich Doraszelski, Katja Seim, Michael Sinkinson, and Peichun Wang (henceforth Doraszelski et al. 2016) now making its way through major economic conferences (Searle, AEA).  As the authors point out with regard to the repackaging process (p. 6):

[It] is visually similar to defragmenting a hard drive on a personal computer.  However, it is far more complex because many pairs of TV stations cannot be located on adjacent channels, even across markets, without causing unacceptable levels of interference.  As a result, the repackaging process is global in nature in that it ties together all local media markets.

With regard to the reverse auction, Doraszelski et al. (2016) note that (p. 7):

[T]he auction uses a descending clock to determine the cost of acquiring a set of licenses that would allow the repacking process to meet the clearing target.  There are many different feasible sets of licenses that could be surrendered to meet a particular clearing target given the complex interference patterns between stations; the reverse auction is intended to identify the low-cost set . . . if any remaining license can no longer be repacked, the price it sees is “frozen” and it is provisionally winning, in that the FCC will accept its bid to surrender its license.

The idea is that the FCC should minimize the total cost of licenses sold on the reverse auction while making sure that its nationwide clearing target is satisfied.  As Doraszelski et al. (2016) note, the incentive auction has various desirable properties.  Of particular note is strategy proofness (see Milgrom and Segal 2015), whereby it is (weakly) optimal for broadcast license owners to truthfully reveal each station’s value as a going concern in the event that TV licenses are separately owned.

Strategic Supply Reduction

However, the authors’ main concern in their working paper is that in spite of strategy proofness, the auction rules do not prevent firms that own multiple broadcast TV licenses from potentially engaging in strategic supply reduction. As Doraszelski et al. (2016) show, this can lead to some fairly controversial consequences in the reverse auction that might compound any issues that could arise (e.g., decreased revenue) due to bidding restrictions in the forward auction. Specifically, the authors find that multi-license holders are able to earn large rents from a supply reduction strategy whereby they strategically withhold some of their licenses from the auction to drive up the closing price for the remaining licenses they own.

The incentive auction aside, strategic supply reduction is a fairly common phenomenon in standard economic models of competition.  Consider for instance a typical model of differentiated product competition (or the Cournot model of homogenous product competition).  In each of these frameworks, firms’ best response strategies lead them to set prices or quantities such that the quantity sold is below the “perfectly competitive” level and prices are above marginal cost—thus, firms individually find it optimal to keep quantity low to make themselves (and consequently, their competitors) better off than under perfect competition.

In the incentive auction, a multi-license holder that withdraws a license from the auction could similarly increase the price for the remaining broadcast TV licenses that it owns (as well as the price of other broadcast TV license owners).  However, in contrast to the aforementioned economic models, in which firms effectively reduce supply by underproducing, a firm engaging in strategic supply reduction is left with a TV station that it might have otherwise sold in the auction.  The firm is OK with this if the gain from raising the closing price for other stations exceeds the loss from continuing to own a TV station instead of selling it into the auction.

Example 1

Consider the following highly stylized example of strategic supply reduction: There are two broadcasters, B1 and B2, in a market where the FCC needs to clear three stations (the reverse auction clearing target) and there are three different license “qualities,” A, B, and C, for which broadcasters have different reservation prices and holdings as follows:

Quality   B1 Quantity   B2 Quantity   Reservation Price
A              1             2              $10
B              1             0               $6
C              2             1               $2

Suppose that the auctioneer does not distinguish between differences in licenses (this is a tremendous simplification relative to the real world).  Consider a reverse descending clock auction in which the auctioneer lowers its price in decrements of $2 starting at $10 (so $10 at time 1, $8 at time 2, and so on until the auction ends), and ceases to lower its price as soon as it realizes that any additional licensee drop outs would not permit it to clear its desired number of stations (as would for instance happen when quality A and B licenses drop out).  Suppose that a broadcaster playing “truthfully” that is indifferent between selling its quality license and dropping out remains in the auction (so that for instance, A quality licenses are not withdrawn until the price falls from $10 to $8).

In a reverse descending clock auction in which broadcasters play “naïve” strategies, each broadcaster would offer all of its licenses and drop some from consideration as the price decreases over time.  There is, however, another “strategic” option, in which B1 withholds a quality C license from the auction (B1 can do so either by overstating its reservation price for this license, say claiming that it is $10, or by not including it in the auction to begin with):

                  Naive                     Strategic
Quality   B1 Offered   B2 Offered   B1 Offered   B1 Withheld   B2 Offered
A         1            2            1            0             2
B         1            0            1            0             0
C         2            1            1            1             1

Licenses auctioned: 7 (naive) versus 6 (strategic)

The results of the naïve bidding versus the strategic bidding auction are quite different.  In the naïve bidding auction, the auctioneer can continue to lower its price down to $4 at which point B1 pulls out its B quality license and the auction is frozen (further drop outs would not permit the desired number of licenses to be cleared).  Each broadcaster earns $4 for each quality C license with B1 earning a profit of 2×($4-$2)=$4.

Suppose instead that broadcaster B1 withheld one quality C license.  Then the auction would stop at $8 (because only three licenses are left as soon as the quality A licenses are withdrawn).  Each broadcaster now earns $8 per license sold, with B1 earning a profit of ($8-$6)+($8-$2)=$8.  Moreover, B2 benefits from B1’s withholding, earning a profit of $6 instead of the $2 it earns in the naïve bidding case.  The astute reader will notice that B1 could have done even better by withholding its B quality license instead!  This is a result of our assumption that the auctioneer treats all cleared licenses equally, which is not true in the actual incentive auction.  Finally, notice that even though B2 owns three licenses in this example, its own strategic withholding could not have helped it more than B1’s strategic withholding did unless it colluded with B1 (this would require B2 to withhold its quality A licenses and B1 to withhold both of its quality C licenses).
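The clock dynamics in Example 1 are easy to verify in code.  The following is a minimal sketch under the example’s own assumptions (uniform treatment of licenses, $2 decrements from $10, a clearing target of three stations); the function and variable names are our own shorthand, not the FCC’s or the paper’s:

```python
# Reservation prices by license quality, from the table in Example 1.
RESERVATION = {"A": 10, "B": 6, "C": 2}

def run_clock_auction(offered, target=3, start=10, step=2):
    """Run the stylized reverse descending clock auction.

    offered: list of (owner, quality) licenses entered into the auction.
    Returns (closing_price, winners): the auctioneer stops lowering the
    price once further drop-outs would leave fewer than `target` licenses.
    """
    price = start
    while True:
        # A license stays active while the clock price covers its reservation.
        active = [(o, q) for o, q in offered if RESERVATION[q] <= price]
        if len(active) <= target:
            return price, active
        price -= step

def profits(price, winners):
    """Per-broadcaster profit: closing price minus reservation, per sale."""
    out = {}
    for owner, quality in winners:
        out[owner] = out.get(owner, 0) + price - RESERVATION[quality]
    return out

naive = [("B1", "A"), ("B1", "B"), ("B1", "C"), ("B1", "C"),
         ("B2", "A"), ("B2", "A"), ("B2", "C")]

strategic = list(naive)
strategic.remove(("B1", "C"))     # B1 withholds one quality C license

p, w = run_clock_auction(naive)
assert p == 4 and profits(p, w) == {"B1": 4, "B2": 2}

p, w = run_clock_auction(strategic)
assert p == 8 and profits(p, w) == {"B1": 8, "B2": 6}

# Withholding the quality B license instead is even better for B1 here:
withhold_b = list(naive)
withhold_b.remove(("B1", "B"))
p, w = run_clock_auction(withhold_b)
assert p == 8 and profits(p, w)["B1"] == 12
```

Under these assumptions the simulation reproduces the numbers in the text: the naïve auction freezes at $4 with B1 earning $4, while withholding a quality C license freezes the clock at $8, doubling B1’s profit and raising B2’s from $2 to $6.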

Evidence of Strategic Supply Reduction

Doraszelski et al. (2016) explain that certain types of geographic markets and broadcast licenses are more suitable for strategic supply reduction.  They write:

First, ideal markets from a supply reduction perspective are [those] in which the FCC intends to acquire a positive number of broadcast licenses and that have relatively steep supply curves around the expected demand level.  This maximizes the impact of withholding a license from the auction on the closing price . . .  Second, suitable groups of licenses consist of sets of relatively low value licenses, some with higher broadcast volume to sell into the auction and some with lower broadcast volume to withhold.

What is perhaps disconcerting is that Doraszelski et al. (2016) found evidence indicating that certain private equity firms spent millions acquiring TV licenses, primarily from failing or insolvent stations in distress, often covering the same market and in most instances on the peripheries of major markets along the U.S. coasts.  Consistent with their model, the authors found that many of the stations acquired had high broadcast volume and low valuations.

Performing a more in-depth analysis that simulates the reverse auction using ownership data on the universe of broadcast TV stations together with FCC data files related to repacking (the rather interesting details of which we encourage our audience to read), Doraszelski et al. (2016) conclude that strategic supply reduction is highly profitable.  In particular, using fairly conservative tractability assumptions, the authors found that simulated total payouts increased from $17 billion under naïve bidding to $20.7 billion with strategic supply reduction, with much of that gain occurring in markets in which private equity firms were active.

Example 2

Suppose that, in our example above, the quality C stations held by broadcaster B1 were initially under the control of two separate entities, call them B3 and B4.  Then, if B1, B2, B3, and B4 were all to participate in the auction, strategic withholding on the part of B1 would no longer benefit it.  However, B1 could make itself better off by purchasing one, or potentially both, of the individual C quality licenses held by B3 and B4.  Consider the scenario in which B1 offers to buy B3’s license.  B3 is willing to sell at $4 or more, the amount it would earn under naïve bidding in the auction, and Bertrand-style competition between B3 and B4 will keep B1 from offering more than that.  With a single C quality license, B1 can proceed to withhold either its B or its C quality license, raise the closing price to $8, and benefit both itself and the other broadcasters who make a sale in the auction.
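The purchase-then-withhold story can be checked under the same stylized clock rules used in Example 1.  A self-contained sketch (owner labels from the text; the reservation prices and closing-price function are our own compact restatement of the example’s assumptions):

```python
# Reservation prices by license quality, as in Example 1.
RES = {"A": 10, "B": 6, "C": 2}

def closing_price(offered, target=3, price=10, step=2):
    """Lower the clock in $2 steps until only `target` licenses remain active."""
    while sum(RES[q] <= price for _, q in offered) > target:
        price -= step
    return price

# Example 2 ownership: B1 holds one A and one B; B2 holds two A and one C;
# the two quality C licenses formerly held by B1 now belong to B3 and B4.
all_in = [("B1", "A"), ("B1", "B"),
          ("B2", "A"), ("B2", "A"), ("B2", "C"),
          ("B3", "C"), ("B4", "C")]
# With all four owners bidding naively, the clock again freezes at $4.
assert closing_price(all_in) == 4

# B1 buys B3's quality C license at $4 (Bertrand-style competition between
# B3 and B4 pins the purchase price at their naive-bidding payoff), then
# withholds a license from the auction (here, its quality B license).
after_purchase = [("B1", "A"), ("B1", "C"),
                  ("B2", "A"), ("B2", "A"), ("B2", "C"),
                  ("B4", "C")]
assert closing_price(after_purchase) == 8
```

The closing price rises from $4 to $8, so the remaining quality C sellers (including B1, with its purchased license) each collect $8 rather than $4, consistent with the text.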

This result, whether anticipated by the FCC ex ante or not, is problematic for several reasons.  First, it raises the prospect that revenues raised in the forward auction will not be sufficient to meet payout requirements in the reverse auction.  Indeed, this has already occurred three times, with the FCC lowering its clearance target from the initial 126 megahertz to 84 megahertz; though we caution that the FCC is currently not permitted to release data regarding the prices at which different broadcasters drop out of the auction, so we cannot verify whether final prices in earlier stages of the reverse auction were affected by strategic supply reduction.  Second, as in standard oligopoly models, strategic supply reduction is beneficial for sellers, but not for buyers or consumers.

Third, strategic supply reduction by private equity firms raises questions about the proper role and regulation of such firms.  The existence of such firms is generally justified by their role in providing liquidity to asset markets.  However, strategic supply reduction seems to contradict this role, particularly if withheld stations are not put to good use (something Doraszelski et al. (2016) do not deliberate on).  Moreover, strategic supply reduction relies on what antitrust agencies often term unilateral effects; that is, supply reduction is individually optimal and does not rely on explicit or tacit collusion.  However, whereas antitrust laws are intended to deal with cases of monopolization and collusion, it does not seem to us that they can easily mitigate strategic supply reduction.

Doraszelski et al. (2016) propose a partial remedy that does not rely on the antitrust laws: require multi-license owners to withdraw licenses in order of broadcast volume, from highest to lowest.  Their simulations show that this leads to a substantial reduction in payouts from strategic bidding (and a glance at Example 1 suggests that it would be effective in preventing strategic supply reduction there as well).  Although this suggestion has unfortunately come too late for the FCC’s Incentive Auction, we hope (as surely do the authors) that it will inform future auctions abroad hoping to learn from the U.S. experience.

This post was written in collaboration with Emily Schaal, a student at The College of William and Mary who is pursuing work in mathematics and economics.  Emily and I previously worked together at the Federal Communications Commission, where she provided invaluable assistance to a team of wireless economists.  

Tags: , , , ,

No Comments