Thursday, December 29th, 2016
Is Apple Lost?
A. Michael Noll
December 29, 2016
© 2016 AMN
Has Apple been too successful – and overly arrogant in believing only it knows what is best for its customers? Will Apple become the next Yahoo, slowly sinking into oblivion?
Has innovation for Apple become abandoning things, such as leaving out the audio mini-jack in its iPhones? The original iPod was a great music player with its fabulous click-wheel interface – ingenious. But Apple abandoned the iPod click-wheel, rather than updating this product with solid-state storage. Will Apple soon abandon all its iPods? If so, it would be a great opportunity for Sony to acquire the iPod product line and continue to innovate with new features and storage.
The iTunes program tries to do everything: music player, iPhone synchronizer, and iTunes store access. It is challenging to do all these well in one huge program. The different purposes should be different programs, but with sharing across them.
The iWatch promised much – but what did it deliver? I have yet to see someone using one. And the need to recharge it every day is a big chore. The iWatch seems to be just an extension of the iPhone.
Apple has become a one-product business: the iPhone. It is challenging to survive today as a one-product company. Apple’s complete product line (other than the iMac) would easily fit in a backpack. Apple is not a diverse product company – it has become a niche company.
Amazon, meanwhile, is innovating and expanding, as with its new voice-activated Echo product. This is clearly the kind of innovative product I would have expected from Apple. Meanwhile, Apple’s iTV remains a puzzle: it is a challenge to discover what it actually does and how to use it.
Has the Apple that was the past innovator become today a copycat, such as the rumors that it too is working on a driverless car? More significantly, is Apple itself driverless and has it lost its way? Apple possibly needs new directions – a return to innovation – or a re-invigoration of the current paths.
Apple should renew a commitment to legacy products, such as the click-wheel iPod, updating them with newer technology and rekindling their original excitement. Give consumers more control over how things are displayed and used; and change the attitude that Apple knows best.
Sunday, December 18th, 2016
“You may have heard that an engineer is a person who knows a great deal about very little, and who goes along learning more and more about less and less until finally he knows practically everything about nothing. A salesman, on the other hand, is a person who knows very little about many things and keeps learning less and less about more and more, until he knows practically nothing about everything. Of course, a station manager starts out knowing everything about everything, but ends up knowing nothing about anything, because of his association with engineers and salesmen.”
– James H. Quello, 11 October 1974
Saturday, December 17th, 2016
Aleks Yankelevich and Mitch Shapiro toast (with new Quello mugs!) the completion of their two reports, both of which were central to a major Quello Center project on Wireless Innovation in Last Mile Access (WILMA). Aleks led the report on regulatory issues surrounding key spectrum of value to wireless, and Mitch led the report on business strategy case studies of wireless initiatives. Both reports will be released in the coming months when reviews are completed.
Saturday, December 10th, 2016
Following the 2016 U.S. Presidential election, in a letter to FCC Chairman Wheeler, Republicans urged the FCC to avoid “controversial items” during the presidential transition. Shortly thereafter, the Commission largely scrubbed its Nov. 17 agenda resulting in perhaps the shortest Open Commission Meeting in recent history. Start at 9:30 here for some stern words from Chairman Wheeler in response. Viewers are urged to pay particular attention to an important history and civics lesson from the Chairman in response to a question at 17:20 (though this should not indicate our agreement with everything that the Chairman says).
So what is the Commission to do prior to the transition? According to the Senate Committee on Commerce, Science, and Transportation, the FCC can “focus its energies” on “many consensus and administrative matters.” Presumably, this includes the FCC’s ongoing incentive auction, now set for its fourth round of bidding, and subject to its own controversies, with dissenting votes on major items released in 2014 (auction rules and policies regarding mobile spectrum) by Republican Commissioners concerned about FCC bidding restrictions and “market manipulation,” along with a statement by a Democratic Commissioner saying that FCC bidding restrictions did not go far enough.
The Incentive Auction
Initially described in the 2010 National Broadband Plan, the Incentive Auction is one of the ways in which the FCC is attempting to meet modern day demands for video and broadband services. The FCC describes the auction for a broad audience in some detail here and here. In short, the auction was intended to repurpose up to 126 megahertz of TV band spectrum, primarily in the 600 MHz band, for “flexible use” such as that relied on by mobile wireless providers to offer wireless broadband. The auction consists of two separate but interdependent auctions—a reverse auction used to determine the price at which broadcasters will voluntarily relinquish their spectrum usage rights and a forward auction used to determine the price companies are willing to pay for the flexible use wireless licenses.
What makes this auction particularly complicated is a “repackaging” process that connects the reverse and forward auctions. The current licenses held by broadcast television stations are not necessarily suitable for the contiguous blocks of spectrum that are necessary to set up and expand regional or nationwide mobile wireless networks. As such, repackaging involves reorganizing and assigning channels to the broadcast television stations that remain operational post-auction in order to clear spectrum for flexible use.
The economics and technical complexities underlying this auction are well described in a recent working paper entitled “Ownership Concentration and Strategic Supply Reduction,” by Ulrich Doraszelski, Katja Seim, Michael Sinkinson, and Peichun Wang (henceforth Doraszelski et al. 2016) now making its way through major economic conferences (Searle, AEA). As the authors point out with regard to the repackaging process (p. 6):
[It] is visually similar to defragmenting a hard drive on a personal computer. However, it is far more complex because many pairs of TV stations cannot be located on adjacent channels, even across markets, without causing unacceptable levels of interference. As a result, the repackaging process is global in nature in that it ties together all local media markets.
With regard to the reverse auction, Doraszelski et al. (2016) note that (p. 7):
[T]he auction uses a descending clock to determine the cost of acquiring a set of licenses that would allow the repacking process to meet the clearing target. There are many different feasible sets of licenses that could be surrendered to meet a particular clearing target given the complex interference patterns between stations; the reverse auction is intended to identify the low-cost set . . . if any remaining license can no longer be repacked, the price it sees is “frozen” and it is provisionally winning, in that the FCC will accept its bid to surrender its license.
The idea is that the FCC should minimize the total cost of licenses sold on the reverse auction while making sure that its nationwide clearing target is satisfied. As Doraszelski et al. (2016) note, the incentive auction has various desirable properties. Of particular note is strategy proofness (see Milgrom and Segal 2015), whereby it is (weakly) optimal for broadcast license owners to truthfully reveal each station’s value as a going concern in the event that TV licenses are separately owned.
Strategic Supply Reduction
However, the authors’ main concern in their working paper is that, in spite of strategy proofness, the auction rules do not prevent firms that own multiple broadcast TV licenses from potentially engaging in strategic supply reduction. As Doraszelski et al. (2016) show, this can lead to some fairly controversial consequences in the reverse auction that might compound any issues that could arise (e.g., decreased revenue) due to bidding restrictions in the forward auction. Specifically, the authors find that multi-license holders are able to earn large rents from a supply reduction strategy in which they strategically withhold some of their licenses from the auction to drive up the closing price for the remaining licenses they own.
The incentive auction aside, strategic supply reduction is a fairly common phenomenon in standard economic models of competition. Consider for instance a typical model of differentiated product competition (or the Cournot model of homogenous product competition). In each of these frameworks, firms’ best response strategies lead them to set prices or quantities such that the quantity sold is below the “perfectly competitive” level and prices are above marginal cost—thus, firms individually find it optimal to keep quantity low to make themselves (and consequently, their competitors) better off than under perfect competition.
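To make this concrete, here is a minimal numerical sketch of a symmetric Cournot duopoly (the demand and cost parameters are our own illustrative choices, not figures from the paper): each firm’s best response leads to a total quantity below the perfectly competitive level and a price above marginal cost.

```python
# Cournot duopoly vs. perfect competition with linear inverse demand
# P = a - (q1 + q2) and constant marginal cost c. The numbers a = 10 and
# c = 2 are illustrative only, chosen to make the comparison concrete.

a, c = 10.0, 2.0

# Cournot best response: q_i = (a - c - q_j) / 2; in the symmetric
# equilibrium each firm produces (a - c) / 3.
q_cournot = (a - c) / 3          # each firm's quantity
Q_cournot = 2 * q_cournot        # total quantity supplied
P_cournot = a - Q_cournot        # resulting market price

# Perfect competition: price is driven down to marginal cost.
Q_comp = a - c                   # competitive total quantity
P_comp = c                       # competitive price

print(Q_cournot < Q_comp)        # True: firms jointly restrict supply
print(P_cournot > P_comp)        # True: price exceeds marginal cost
```

Both comparisons hold for any a > c: by individually keeping quantity low, each firm raises the market price and makes itself (and its rival) better off than under perfect competition.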
In the incentive auction, a multi-license holder that withdraws a license from the auction could similarly increase the price for the remaining broadcast TV licenses that it owns (as well as the price of other broadcast TV license owners). However, in contrast to the aforementioned economic models, in which firms effectively reduce supply by underproducing, a firm engaging in strategic supply reduction is left with a TV station that it might have otherwise sold in the auction. The firm is OK with this if the gain from raising the closing price for other stations exceeds the loss from continuing to own a TV station instead of selling it into the auction.
Consider the following highly stylized example of strategic supply reduction: There are two broadcasters, B1 and B2, in a market where the FCC needs to clear three stations (the reverse auction clearing target) and there are three different license “qualities,” A, B, and C, for which broadcasters have different reservation prices and holdings as follows:
License Quality | B1 Quantity | B2 Quantity | Reservation Price
A               | 0           | 2           | $10
B               | 1           | 0           | $6
C               | 2           | 1           | $2
Suppose that the auctioneer does not distinguish between differences in licenses (a tremendous simplification relative to the real world). Consider a reverse descending clock auction in which the auctioneer lowers its price in decrements of $2 starting at $10 (so $10 at time 1, $8 at time 2, and so on until the auction ends), and ceases to lower its price as soon as it realizes that any additional licensee dropouts would not permit it to clear its desired number of stations (as would, for instance, happen when the quality A and B licenses drop out). Suppose also that a broadcaster playing “truthfully” remains in the auction when it is indifferent between selling a license and dropping out (so that, for instance, quality A licenses are not withdrawn until the price falls from $10 to $8).
In a reverse descending clock auction in which broadcasters play “naïve” strategies, each broadcaster would offer all of its licenses and drop some from consideration as the price decreases over time. However, there is another “strategic” option, in which B1 withholds a quality C license from the auction (B1 can do so either by overstating its reservation price for this license—say, claiming that it is $10—or by not including it in the auction to begin with).
The results of the naïve and strategic bidding auctions are quite different. In the naïve bidding auction, the auctioneer can continue to lower its price down to $4, at which point B1 pulls out its quality B license and the auction is frozen (further dropouts would not permit the desired number of licenses to be cleared). Each broadcaster earns $4 for each quality C license it sells, with B1 earning a profit of 2×($4-$2)=$4.
Suppose instead that broadcaster B1 withheld one quality C license. Then the auction would stop at $8 (because only three licenses are left as soon as the quality A licenses are withdrawn). Each broadcaster now earns $8 per license sold, with B1 earning a profit of ($8-$6)+($8-$2)=$8. Moreover, B2 benefits from B1’s withholding, earning a profit of $6 instead of the $2 it earns in the naïve bidding case. The astute reader will notice that B1 could have done even better by withholding its quality B license instead! This is a result of our assumption that the auctioneer treats all cleared licenses equally, which is not true in the actual incentive auction. Finally, notice that even though B2 owns three licenses in this example, strategic withholding could not have helped it more than B1’s strategic withholding did unless it colluded with B1 (this would require B2 to withhold its quality A licenses and B1 to withhold both quality C licenses).
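The stylized example can be checked with a short simulation. This is only a sketch of the simplified clock rule described above, not the FCC’s actual algorithm; the holdings and reservation prices (B1: one quality B license at $6 and two quality C licenses at $2; B2: two quality A licenses at $10 and one quality C license at $2) are inferred from the worked numbers in the text.

```python
# A minimal simulation of the stylized reverse descending-clock auction.
# Simplified rule: the clock falls in $2 steps; a license drops out once the
# price falls below its reservation price (indifferent holders stay in); and
# prices freeze as soon as the remaining licenses only just meet the target.

def run_clock_auction(reservations, target, start_price=10, step=2):
    """Return (closing price, reservation prices of the licenses sold)."""
    price = start_price
    active = list(reservations)
    while True:
        active = [r for r in active if r <= price]  # dropouts at this price
        if len(active) <= target:
            return price, active  # all remaining licenses are needed: freeze
        price -= step

# Naive bidding: all six licenses offered.
price, _ = run_clock_auction([10, 10, 6, 2, 2, 2], target=3)
b1_naive = 2 * (price - 2)                 # B1 sells its two quality C licenses
print(price, b1_naive)                     # prints: 4 4

# Strategic bidding: B1 withholds one quality C license.
price, _ = run_clock_auction([10, 10, 6, 2, 2], target=3)
b1_strategic = (price - 6) + (price - 2)   # B1 sells its B and one C license
b2_strategic = price - 2                   # B2 sells its single C license
print(price, b1_strategic, b2_strategic)   # prints: 8 8 6

# Withholding the quality B license instead is even better for B1:
price, _ = run_clock_auction([10, 10, 2, 2, 2], target=3)
print(2 * (price - 2))                     # prints: 12
```

Under these simplified rules the simulation reproduces the outcomes in the example: a $4 closing price and a $4 profit for B1 under naïve bidding, versus an $8 closing price and an $8 profit (or $12 when withholding the quality B license) under strategic withholding.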
Evidence of Strategic Supply Reduction
Doraszelski et al. (2016) explain that certain types of geographic markets and broadcast licenses are more suitable for strategic supply reduction. They write:
First, ideal markets from a supply reduction perspective are [those] in which the FCC intends to acquire a positive number of broadcast licenses and that have relatively steep supply curves around the expected demand level. This maximizes the impact of withholding a license from the auction on the closing price . . . Second, suitable groups of licenses consist of sets of relatively low value licenses, some with higher broadcast volume to sell into the auction and some with lower broadcast volume to withhold.
What is perhaps disconcerting is the fact that Doraszelski et al. (2016) have found evidence indicating that certain private equity firms spent millions acquiring TV licenses primarily from failing or insolvent stations in distress, often covering the same market and in most instances on the peripheries of major markets along the U.S. coasts. Consistent with their model, the authors found that many of the stations acquired had high broadcast volume and low valuations.
Upon performing a more in-depth analysis that attempts to simulate the reverse auction using ownership data on the universe of broadcast TV stations together with FCC data files related to repacking—the rather interesting details of which we would encourage our audience to read—Doraszelski et al. (2016) conclude that strategic supply reduction is highly profitable. In particular, using fairly conservative tractability assumptions, the authors found that simulated total payouts increased from $17 billion under naïve bidding to $20.7 billion with strategic supply reduction, with much of that gain occurring in markets in which private equity firms were active.
Suppose that, in our example above, the quality C stations held by broadcaster B1 were initially under the control of two separate entities, call these B3 and B4. Then, if B1, B2, B3, and B4 were to participate in the auction, strategic withholding on the part of B1 would no longer benefit it. However, B1 could make itself better off by purchasing one, or potentially both, of the individual quality C licenses held by B3 and B4. Consider the scenario in which B1 offers to buy B3’s license. B3 is willing to sell at $4 or more, the amount it would earn under naïve bidding in the auction, and Bertrand-style competition between B3 and B4 will keep B1 from offering more than that. With a single quality C license, B1 can proceed to withhold either its B or C quality license, raise the price to $8, and benefit both itself and the other broadcasters who make a sale in the auction.
This result, whether realized by the FCC ex ante or not, is problematic for several reasons. First, it raises the prospect that revenues raised in the forward auction will not be sufficient to meet payout requirements in the reverse auction. Indeed, this has already occurred three times, with the FCC having lowered its clearance target from the initial 126 megahertz to 84 megahertz; though we caution that the FCC is currently not permitted to release data regarding the prices at which different broadcasters drop out of the auction, so we cannot verify whether final prices in earlier stages of the reverse auction were affected by strategic supply reduction. Second, as is the case with standard oligopoly models, strategic supply reduction is beneficial for sellers, but not for buyers or consumers.
Third, strategic supply reduction by private equity firms raises questions about the proper role and regulation of such firms. The existence of such firms is generally justified by their role in providing liquidity to asset markets. However, strategic supply reduction seems to contradict this role, particularly if withheld stations are not put to good use—something Doraszelski et al. (2016) do not deliberate on. Moreover, strategic supply reduction relies on what antitrust agencies often term unilateral effects—that is, supply reduction is individually optimal and does not rely on explicit or tacit collusion. However, whereas antitrust laws are intended to deal with cases of monopolization and collusion, it does not seem to us that they can easily mitigate strategic supply reduction.
Doraszelski et al. (2016) propose a partial remedy that does not rely on the antitrust laws: require multi-license owners to withdraw licenses in order of broadcast volume, from highest to lowest. Their simulations show that this leads to a substantial reduction in payouts from strategic bidding (and a glance at our example suggests that it would be effective in preventing strategic supply reduction there as well). Although this suggestion has unfortunately come too late for the FCC’s Incentive Auction, we hope (as surely do the authors) that it will inform future auctions abroad that seek to learn from the U.S. experience.
This post was written in collaboration with Emily Schaal, a student at The College of William and Mary who is pursuing work in mathematics and economics. Emily and I previously worked together at the Federal Communications Commission, where she provided invaluable assistance to a team of wireless economists.
Thursday, December 8th, 2016
The Quello Center is off and running in creating a digital archive of James H. Quello’s papers. Our archive team includes myself (who has never created such an archive), plus Anne Marie Salter at the Center; Valeta Winsloff from Media and Information, who supports our design work and blogging; Scout Calvert with the MSU Library, who is orchestrating this project; and Lauren E. Lincoln-Chavez, who has hands-on experience in developing archives and special collections and is based in Detroit.
The collection contains over 1,000 papers, including speeches, statements, letters, and remarks by James Quello during his long tenure as an FCC Commissioner. To this we will be adding our collection of photographs, and videos, as well as photos of his many awards and honors. This promises to be another of the many fun and rewarding projects of the Center.
The archive will be part of our WordPress blog and publicly accessible to anyone who might want a view of over two decades at the FCC through the words of one of its longest serving and most colorful commissioners. I read one of his papers from 1974 in which he said he was willing to forgive journalists for getting things wrong at times (long before the term ‘fake news’) in order to protect freedom of the press, and I imagine he would say the same thing about the users of social media today.
Generally, sifting through this collection is addictive as you follow the history of such issues as the fairness doctrine, cross-ownership rules, and more. I’ll keep you posted on our progress.
Thursday, December 1st, 2016
As a member of their advisory board, I would also like to invite scholarly and original submissions that broadly relate to the 2017 conference theme on “Social Media for Social Good or Evil.” The organizers welcome both quantitative and qualitative work which crosses interdisciplinary boundaries and expands our understanding of the current and future trends in social media research. See the call for proposals at
Monday, November 21st, 2016
The Information and Media PhD program at Michigan State University seeks outstanding students who wish to join a unique interdisciplinary program of study at the intersection of the social sciences and technical systems. The faculty develop and apply research about media and society and evolving information and communication technologies to important problems. The program engages students to become active scholars, teachers, and leaders in the media and information fields.
The PhD program is offered jointly by the Department of Advertising + Public Relations, the School of Journalism, and the Department of Media and Information, and gives students access to fifty PhD faculty with research interests that span important current and emerging issues in media and information studies. Students get involved early on in projects, complementing theoretical coursework with hands-on research experiences.
Particularly strong research interests of our faculty include:
• Internet Studies
• Social media and social computing
• Human-computer interaction
• Socio-technical systems and collective intelligence
• Management information systems
• ICT and health
• Information and Communication Technologies and Development (ICTD)
• Games and meaningful play
• Media effects on individuals and society
• Media, information and Internet policy, with links to Quello Center
The deadline for applications for the Fall 2017 cohort is January 1, 2017. In addition, we invite applications throughout the year as we accept students into the PhD program on a rolling basis. Steps to apply are detailed at http://cas.msu.edu/misphd/.
All of our current students are supported by graduate teaching and research assistantships with generous stipends of $2000+ per month, tuition remission, and health benefits. University fellowships, dissertation completion fellowships, summer research fellowships, and stipends for travel to academic conferences are available for students.
Over three-fourths of our graduates are hired into faculty positions at four-year institutions. They are found in departments of mass media, journalism, advertising, public relations, and information studies across the United States and around the world. Others go on to careers in public service and business.
The 2015 QS World University rankings place MSU 6th in the world and 5th in North America in communication and media studies. The National Communication Association (NCA), in their most recent doctoral program reputation study, ranked MSU’s Ph.D. programs as No. 1 in educating researchers in communication technology, and in the top four in mass communication. Michigan State University ranked third in frequency of faculty publication in communication in a study reported in The Electronic Journal of Communication in 2012.
East Lansing and the greater Lansing area offer a vibrant cultural environment with easy access to a variety of outdoor activities and the scenic beauty of our state year-round. Blending urban and sub-urban living, it is one of the nation’s most affordable places to complete a doctoral program in media and information studies.
To learn more, see our web page, at: http://cas.msu.edu/programs/graduate-studies/apply/
Monday, October 24th, 2016
Christine L. Borgman’s Quello Lecture on ‘Motivations for Sharing and Reusing Data: Complexities and Contradictions in the Use of a Digital Data Archive’ presented for the Quello Center, Michigan State University, on October 5, 2016. The talk draws on her research with DANS, the Data Archiving and Networked Services of the Netherlands, and the UCLA Center for Knowledge Infrastructures.
Researchers face competing challenges regarding access to their data. The first is the pressure to make their data open in response to mandates from funding agencies, journals, and science policy makers. The second is the lack of resources – human, technical, economic, and institutional – to make their data open. The third is that good reasons exist to maintain control of their data, whether to protect the confidentiality of human subjects, to gain competitive advantage over other researchers, or because of the sheer difficulty of extracting data from the contexts in which they originated.

Researchers are encouraged – or required – to contribute their data to archives, yet surprisingly little is known about the uses and users of digital data archives, about relationships between users and the staff of data archives, or about how these behaviors vary by discipline, geographic region, policy, and other factors. Digital data archives are not a single type of institution, however. They vary widely in organizational structure, mission, collection, funding, and relationships to their users and other stakeholders.

This talk draws upon an exploratory study of DANS, the Data Archiving and Networked Services of the Netherlands. We mined transaction logs to draw samples of contributors to DANS and consumers of DANS data (Borgman, Scharnhorst, Van den Berg, Van de Sompel, & Treloar, 2015) and then conducted interviews with DANS archivists, contributors, and consumers to examine who contributes data to DANS and why, who consumes data from DANS and why, and what roles archivists play in acquiring and disseminating data. Early findings suggest that motivations are complex, varied, and often contradictory, and that the uses and users of DANS are far more diverse than anticipated.
Implications of these findings, which draw upon the premises of the presenter’s recent book Big Data, Little Data, No Data: Scholarship in the Networked World (2015), raise concerns for stakeholders in research data such as scholars, students, librarians, funding agencies, policy makers, publishers, and the public.
Borgman, C. L. (2015). Big Data, Little Data, No Data: Scholarship in the Networked World. Cambridge MA: MIT Press.
Borgman, C. L., Scharnhorst, A., Van den Berg, H., Van de Sompel, H., & Treloar, A. (2015). Who uses the digital data archive? An exploratory study of DANS. Presented at the Association for Information Science and Technology, St Louis, MO: Information Today.
Christine L. Borgman, Distinguished Professor and Presidential Chair in Information Studies at UCLA, is the author of more than 250 publications in information studies, computer science, and communication. These include three books from MIT Press: Big Data, Little Data, No Data: Scholarship in the Networked World (2015), winner of the 2015 American Publishers Award for Professional and Scholarly Excellence (PROSE Award) in Computing and Information Sciences; Scholarship in the Digital Age: Information, Infrastructure, and the Internet (2007); and From Gutenberg to the Global Information Infrastructure: Access to Information in a Networked World (2000). The latter two books won the Best Information Science Book of the Year award from the Association for Information Science and Technology (ASIST). Professor Borgman is Chair of the Committee to Visit the Harvard Library and Co-Chair of the CODATA-ICSTI Task Group on Data Citation and Attribution. She is a member of the Library of Congress Scholars Council; the Board of Directors of the Electronic Privacy Information Center (EPIC); the Council of the Interuniversity Consortium for Political and Social Research (ICSPR); the CLARIAH International Advisory Panel; the advisory board to Authorea; and is a Fellow of the American Association for the Advancement of Science and of the Association for Computing Machinery. At UCLA, she directs the Center for Knowledge Infrastructures with funding from the Alfred P. Sloan Foundation and other sources.
Sunday, October 23rd, 2016
by A. Michael Noll
October 23, 2016
© 2016 AMN
One can only wonder in amazement at the Hollywood fever that seems to afflict the communications business in the United States. The proposed acquisition of Time Warner by AT&T shows that the disease is still flourishing.
Time Warner creates and owns content. It is a media entertainment company, owning such content as CNN and Warner Brothers. The former Baby Bells had a strange fascination with Hollywood and the world of content. They did not seem content with their monopolistic control of the conduit, and today do not seem content with their duopolistic control of the conduit. AT&T – the former Baby Bell SBC – wants both: conduit and content, although these businesses are quite different in terms of such areas as technology, economics, and management. The proposed acquisition of Time Warner by AT&T simply does not make sense, other than as a manifestation of past weirdness.
A little over 30 years ago, AT&T broke apart the Bell System by divesting the regional Bell telephone companies. Then, in 1991 AT&T acquired NCR in an attempt to enter the computer business. AT&T knew nothing of computers, and about five years later spun off the computer operations. AT&T’s acquisitions had become a revolving door, and were clear evidence of nonsensical strategic planning.
Ultimately, AT&T itself became such a thin shell of its past grandeur that SBC Communications acquired it and then cloaked itself in the AT&T identity. There was hope that the nonsense that had plagued AT&T would be left behind in the acquisition. However, the proposed acquisition of Time Warner by AT&T indicates that the weird behavior and nonsense is still there.
Back in the early videotex days of the 1980s, AT&T was content with providing terminals and the conduit, while Knight-Ridder provided the database and the content. It seems that this history is yet again being ignored. AT&T recently expanded its conduit by acquiring the satellite TV business of DirecTV. AT&T did not have the broadband needed for the local delivery of TV and thus had to acquire it; Verizon’s FiOS has that bandwidth. With AT&T’s proposed foray into Hollywood, how will Verizon respond?
Perhaps government regulators will decide that the ownership of content and conduit by the duopolistic AT&T is not acceptable. If so, then AT&T will be saved from its own illness. If AT&T has such funds to waste in the pursuit of Hollywood, then perhaps instead it should decrease its rates and more truly compete in its duopolistic businesses – and invest in improving its infrastructure.
Saturday, October 22nd, 2016
Johannes Bauer on Meaningful Play 2016
Thanks to Brian, Casey and Carrie (the three co-chairs of Meaningful Play 2016) as well as all other faculty and students (among them Valeta, Will, Andrew, Luke, Jeremy, Ricardo, Robby, Wei, Constantinos, and many others) who were involved in organizing the conference, the program committee, and the onsite logistics! Beth gave an inspiring keynote that concluded the conference on a high note! I was equally impressed by the quality of theoretical and applied research and the innovative nature of the many game projects reported and exhibited.
The conference was a great forum for the growing number of MSU researchers with a shared interest in games to interact and network with one another and with attendees from the US and abroad. Until this conference, I was not fully aware of the size and diversity of this group of MSU researchers. I interacted with individuals from the Colleges of Social Science; Education; Arts and Letters; Lyman Briggs; and our own college (and I am sure there were more). It was also rewarding to see that several of them are graduates of the Serious Games Certificate Program.
Johannes Bauer, Chair
Department of Media and Information