Faculty and staff of the Quello Center will be actively engaged in this year’s Telecommunication Policy Research Conference (TPRC). The following papers are on the schedule for the 45th TPRC Research Conference on Communications, Information, and Internet Policy, at George Mason University in Arlington, Virginia:
“Social Shaping of the Politics of Internet Search and Networking: Moving Beyond Filter Bubbles, Echo Chambers, and Fake News,” by William H. Dutton and Bianca C. Reisdorf (presenter), Quello Center, Michigan State University; Elizabeth Dubois, Department of Communication, University of Ottawa; and Grant Blank, Oxford Internet Institute, University of Oxford.
“Race and Digital Inequality: Policy Implications,” by C.H. Rhinesmith, Simmons College (presenter), and B.C. Reisdorf, Quello Center.
“Price-Cap Regulation of Firms That Supply Their Rivals,” Omar A. Nayeem, Deloitte Tax; and Aleksandr Yankelevich, Quello Center (presenter).
“Cyber Security Capacity: Does it Matter?” by William H. Dutton, Quello Center; Sadie Creese, Computer Science, Oxford University; Ruth Shillair, Quello Center (presenter); Maria Bada, Oxford Martin School, University of Oxford; and Taylor Roberts, US Office of Management and Budget.
“Regulating the Open Internet: Past Developments and Emerging Challenges,” by Kendall J. Koning, Department of Media and Information, Michigan State University (presenter); and Aleksandr Yankelevich, Quello Center.
We hope you can join the conference and provide feedback on our papers.
Dear Colleagues and Friends of the Quello Center:
Since my last update, our Postdoctoral Researcher, Bianca C. Reisdorf, and Assistant Research Professor, Aleksandr Yankelevich, have come on board. In collaboration with our Research Associates and Assistants, they have enabled us to move forward on new research proposals with some early success.
Developing Research Foci: Digital Inequalities and Net Neutrality
The mission of the James and Mary Quello Center is to conduct high-quality research that will stimulate and inform debate on media, communication, and information policy for our digital age. A wide range of policy issues has been identified for study, but two general areas have emerged from our early work:
Our plan to develop a natural experiment to assess the impact of net neutrality rulings has drawn a number of faculty together across the campus in shaping some preliminary research, and proposals which we hope to submit in the coming months.
Two proposals have been accepted, and several others have been submitted or are underway to study digital divides and inequalities in Detroit, Michigan, and across the United States.
Researching Locally to Speak Globally
The Quello Center is moving ahead in focusing greater attention on new Internet and digital age policy issues with an even more multi-disciplinary set of researchers and strong additions to our remarkable Advisory Board. We address issues that arise from: problems such as risks to privacy and freedom of expression; innovations such as the Internet of Things and wearables; policies such as net neutrality, price cap regulation of access services, and universal broadband; and contexts, such as issues in cities like Detroit, and in households. To do so, we draw from theoretical perspectives, such as the Fifth Estate, sociological and communication perspectives on information inequalities, work on the ecology of games as well as game theoretical economics; and from innovative empirical approaches, such as a novel design for a national broadband availability dataset.
Over this last year, the Center has found a number of local developments that present clear opportunities to pursue issues that are of nationwide and global concern. This has led us to anchor more of our research locally, such as in looking at digital divides in Michigan and Detroit, and in developing ideas for new research on the use of wireless spectrum for last mile access, and for experiments addressing digital inequalities and the future of public broadcasting. In these areas, we plan to work with the local public broadcasting station, WKAR, and faculty across the university. Together, we can realize the opportunities created by MSU choosing to forgo the FCC’s incentive auction of spectrum in favor of turning the station and its spectrum into an even greater resource for research, teaching, and service, such as through an MSU partnership announced with Detroit public broadcasting to create more educational programming.
Quello Center Seminars and Lectures at MSU, in Washington DC & Worldwide
The Center organizes and promotes an active stream of roundtables, seminars, and lectures to stimulate discussion of policy and regulatory issues. Recent lectures and events have focused on Internet policy and regulation, network neutrality, social media and reputation management, digital inequalities, and social accountability. In addition to holding events at the Center and in Washington DC, we have been speaking at a variety of other universities, conferences, and events organized by others. For example, Quello helps support the Telecommunication Policy Research Conference (TPRC), and the director has spoken recently in Canada, Argentina, Denmark, South Africa, Hong Kong, China, Japan, and Mexico.
A list of past and forthcoming events is available at: http://quello.msu.edu/events/ and videos of many of our events are available on Vimeo at: https://vimeo.com/quellocenter.
Selected Working Papers on Research, Policy and Practice
All of our research reports, working papers, and publications are listed on our Web site at: http://quello.msu.edu/publications/. A set of papers that illustrate the range of our work includes:
Bauer, J. M. and Dutton, W. H. (2015), ‘The New Cyber Security Agenda,’ for the World Bank Development Report. Available at SSRN: http://ssrn.com/abstract=2614545 or http://dx.doi.org/10.2139/ssrn.2614545
Dutton, W.H. and Graham, M. (2014), Society and the Internet (Oxford University Press).
Dutton, W. H. (2015), ‘Multistakeholder Governance?,’ for the World Bank Development Report. Available at SSRN: http://ssrn.com/abstract=2615596 or http://dx.doi.org/10.2139/ssrn.2615596
Reisdorf, B. C., & Groselj, D. (2015). ‘Internet (non-) Use Types and Motivational Access: Implications for Digital Inequalities Research,’ New Media & Society, Online First.
Reisdorf, B. C., & Jewkes, Y. (2016). ‘(B)Locked Sites: Cases of Internet Use in Three British Prisons,’ Information, Communication & Society, 1-16.
UNESCO (2015), Keystones to Foster Inclusive Knowledge Societies. Paris: UNESCO.
The English version is available at: http://unesdoc.unesco.org/images/0023/002325/232563E.pdf.
Yankelevich, A., & Vaughan, B. ‘Price-Match Announcements in a Consumer Search Duopoly.’ Forthcoming at Southern Economic Journal.
Access to the Work of the Quello Center
Over the past year, we have also made strides in providing numerous ways to keep in touch with the Quello Center’s work. In addition to this newsletter, we have a:
• Quello Center Blog
• Quello Facebook Page
• Twitter handle @QuelloCenter
• Working Paper Series on SSRN
• Videos of many of our lectures and seminars on Vimeo
Thank you again, and please keep in touch. Follow the work and ideas of the Quello Center on Twitter or Facebook, and write to the Center at Quello@msu.edu if you have any questions, suggestions, or wish to be added to our email list.
William Dutton, Director
Quello Professor of Media and Information Policy
Whether you are new to net neutrality and want to better understand the concept or a seasoned researcher who wants an update regarding open questions, I encourage you to read a recent working paper entitled, “Net Neutrality: A Fast Lane to Understanding the Trade-Offs,” by Shane Greenstein, Martin Peitz, and Tommaso Valletti, a group of economists with a track record of researching and writing about Internet economics. Although the article is rather recent, I believe it presents a very good starting point for those interested in taking a deeper dive into both specific theoretical and general empirical issues surrounding net neutrality.
In this blog post, I attempt to outline the article for prospective readers and provide a few potentially useful links. Although I abstract completely from the math and intuition behind the results, the article is extremely straightforward in this regard.
A good starting point for a discussion of net neutrality begins with an understanding of the uses of the Internet. As the authors see it, there are four relevant categories of use for the Internet:
Although much economic research tends to abstract from the technical issues surrounding use of the Internet, many studies of net neutrality implicitly model the third variant above and the authors follow suit. This makes up the bulk of modern Internet traffic: for instance, together, Netflix, YouTube, and Amazon Prime have consistently made up approximately 50 percent of all North American Internet traffic as of late.
There are three common arrangements for moving data from content providers to users:
The authors focus on two definitions of net neutrality: (1) prohibition of payment from content providers to Internet service providers (referred to as one-sided pricing whereby ISPs can only charge consumers) and (2) prohibition of prioritization of traffic with or without compensation. As Johannes Bauer and Jonathan Obar point out, these are not the only alternatives for governing the Internet (see Bauer and Obar 2014). In a simple world with no competition and homogeneous users, the authors suggest that net neutrality does not affect profits or consumer surplus. A number of real world considerations are taken into account, and the potential ramifications of imposing net neutrality are suggested as follows.
The authors caution against broad policy prescriptions, and rightly so, given the present ambiguity surrounding the impacts of net neutrality. Along the way, the authors inspire a number of open empirical questions that might help policy makers.
I suspect that the first two questions are fairly difficult to answer from an economics perspective because in large part they depend on significant insider knowledge about contracting among market participants. The Quello staff and I are presently contemplating how to rigorously answer questions (3) and (4). We are very interested in your feedback.
Having appreciated my colleague Aleks Yankelevich’s creative use of a “food” metaphor to explain an important aspect of economic analysis, I thought it fitting, on the day of oral arguments in the legal challenge to the FCC’s Open Internet Order, to consider another effective use of such a metaphor: Supreme Court Justice Antonin Scalia’s dissent in the Brand X case. Whereas the majority opinion in that case deferred to an earlier FCC ruling that Internet access was an “information” rather than a “telecommunication” service, Scalia–joined by two liberal justices, Ruth Bader Ginsburg and David Souter–argued that the majority’s view was akin to accepting a claim by the owner of a pizzeria that it delivered pizza, but didn’t “offer pizza delivery service.”
Below are some excerpts from Scalia’s dissent that I find most significant in terms of how the DC Circuit (and perhaps later, the Supreme Court) should and will rule in the latest challenge to the FCC’s Open Internet Order, which is the first in which the Commission has treated Internet access as a Title II “telecommunication” service rather than an “information” service.
The first sentence of the FCC ruling under review reads as follows: “Cable modem service provides high-speed access to the Internet, as well as many applications or functions that can be used with that access, over cable system facilities”…Does this mean that cable companies “offer” high-speed access to the Internet? Surprisingly not, if the Commission and the Court are to be believed.
It happens that cable-modem service is popular precisely because of the high-speed access it provides, and that, once connected with the Internet, cable-modem subscribers often use Internet applications and functions from providers other than the cable company. Nevertheless, for purposes of classifying what the cable company does, the Commission (with the Court’s approval) puts all the emphasis on the rest of the package (the additional “applications or functions”). It does so by claiming that the cable company does not “offe[r]” its customers high-speed Internet access because it offers that access only in conjunction with particular applications and functions, rather than “separate[ly],” as a “stand-alone offering…”
There are instances in which it is ridiculous to deny that one part of a joint offering is being offered merely because it is not offered on a “stand-alone” basis…If, for example, I call up a pizzeria and ask whether they offer delivery, both common sense and common “usage”…would prevent them from answering: “No, we do not offer delivery–but if you order a pizza from us, we’ll bake it for you and then bring it to your house.” The logical response to this would be something on the order of, “so, you do offer delivery.” But our pizza-man may continue to deny the obvious and explain, paraphrasing the FCC and the Court: “No, even though we bring the pizza to your house, we are not actually “offering” you delivery, because the delivery that we provide to our end users is ‘part and parcel’ of our pizzeria-pizza-at-home service and is ‘integral to its other capabilities.’”… Any reasonable customer would conclude at that point that his interlocutor was either crazy or following some too-clever-by-half legal advice.
In short, for the inputs of a finished service to qualify as the objects of an “offer” (as that term is reasonably understood), it is perhaps a sufficient, but surely not a necessary, condition that the seller offer separately “each discrete input that is necessary to providing . . . a finished service…”
Shifting his analogy from pizza to puppies, Justice Scalia adds:
The pet store may have a policy of selling puppies only with leashes, but any customer will say that it does offer puppies because a leashed puppy is still a puppy, even though it is not offered on a “stand-alone” basis.
Despite the Court’s mighty labors to prove otherwise, …the telecommunications component of cable-modem service retains such ample independent identity that it must be regarded as being on offer–especially when seen from the perspective of the consumer or the end user, which the Court purports to find determinative.
Since the majority opinion in Brand X was based primarily on the doctrine of “administrative deference” derived from the 1984 Supreme Court case Chevron U.S.A., Inc. v. Natural Resources Defense Council, Inc., one would hope and expect that the DC Circuit Court judges hearing today’s oral arguments would remember what Justice Thomas wrote in that majority opinion: “If a statute is ambiguous, and if the implementing agency’s construction is reasonable, Chevron requires a federal court to accept the agency’s construction of the statute, even if the agency’s reading differs from what the court believes is the best statutory interpretation.”
When the majority’s Chevron-based deference is coupled with Justice Scalia’s simple but clear and commonsensical analogies to pizza and puppies, it’s hard for me to imagine a strong legal basis for the Circuit Court (or the Supreme Court if it ends up ruling on the case) to rule against the FCC’s Title II-based Open Internet Order. Perhaps today’s oral arguments will provide some additional clues as to whether I’m right or wrong about that. (Update: downloadable audio of the oral arguments is here (wireline) and here (wireless, First Amendment, Forbearance). h/t @haroldfeld, whose initial response to today’s arguments is here.)
I thought I’d write a short follow-up in response to the exchange of comments following my recent post on issues related to impacts of the FCC’s Open Internet order on ISP investment.
I very much appreciate the responses to my post, especially from Hal Singer and Mark Jamison, whose work was the target of my sometimes insufficiently respectful criticism. It helped me understand the substantive issues better and also reminded me that respectful dialog on important and controversial issues may not always be easy, but is certainly worth the effort…and that I’m still somewhat haltingly learning that lesson.
I especially appreciated the content and tone of Mark’s comment, including:
I won’t make the claim that my approach revealed reality and that yours did not. We have too little information for that. And even if we had sufficient data for a proper study, there would still be errors. That said, I would be glad to work with you and/or your colleagues on a study once sufficient data are available.
In my view this pretty well describes the aim of the Quello Center’s investigation of this policy issue: to gather as much useful data as we can and to apply to it a mix of the most useful modes of analysis to understand what’s going on and to refine the models we use to understand and predict policy outcomes. I hope to be part of that process, contributing my best skills and strengths, being humble enough to acknowledge their limits, and learning from others who have different expertise and perspectives.
Mark’s comment reminds me of the story about the blind men trying to describe the elephant, all of them describing it differently based on which part of the massive creature they were feeling with their hands. While I wouldn’t describe all of us focused on this issue as blind, I think it’s fair to say that we (and, as Mark notes, our methods) all suffer from some form of perceptual limitation. Some of us are nearsighted, others farsighted and perhaps others see clearly only with one eye…and occasionally we all may feel compelled to close our eyes to avoid seeing something that makes us very uncomfortable.
Though when it comes to policy research we may never be able to see and agree on “the truth,” my hope is that the Quello Center’s research team can be part of an effort to carefully study this and other policy “elephants” from as many angles as we can, and work together to understand their key dynamics, while at the same time remembering the value of respectful dialog, even when a voice inside our head might be telling us “that guy describing the elephant’s tail must be a fool or a scoundrel.”
In the past week or so I’ve seen several articles that remind me how important the Quello Center’s empirically-grounded study of net neutrality impacts is for clarifying what these impacts will be—especially since net neutrality is one of those policy topics where arguments are often driven by ideology and/or competing financial interests.
As far as I can tell, this series of articles began with an August 25 piece written by economist Hal Singer and published by Forbes under the following headline: Does The Tumble In Broadband Investment Spell Doom For The FCC’s Open Internet Order? Per his Forbes bio, Singer is a principal at Economists Incorporated, a senior fellow at the Progressive Policy Institute, and an adjunct professor at Georgetown University’s McDonough School of Business.
Singer’s piece was followed roughly a week later by two op-ed pieces published on the American Enterprise Institute’s web site. The title of the first AEI piece, authored by Mark Jamison, was Title II’s real-world impact on broadband investment. This was followed a day later by Bronwyn Howell’s commentary Title II is hurting investment. How will – and should – the FCC respond?
What struck me about this series of op-ed pieces published by economists and organizations whose theoretical models and policy preferences appear to favor unregulated market structures was that their claims that “Title II is hurting investment” were all empirically anchored in Singer’s references to declines in ISP capital spending during the first half of 2015. As a member of the Quello Center’s research team studying the impacts of net neutrality, I was intrigued, and eager to dig into the CapEx data and understand its significance.
While my digging has only begun, what I found reminded me how much the communication policy community needs the kind of fact-based, impartial and in-depth empirical analysis the Quello Center has embarked upon, and how risky it is to rely on the kind of ideologically-driven analysis that too often dominates public policy debates, especially on contentious issues like net neutrality.
My point here is not to argue that there are clear signs that Title II will increase ISP investment, but rather that claims by Singer and others that there are already signs that it is hurting investment are not only premature, but also based on an incomplete reading of evidence that can be uncovered by careful and unbiased review of publicly available information.
I hope to have more to say on this topic in future posts, but will make a few points here.
The crux of Singer’s argument is based on his observation that capital spending had declined fairly dramatically for a number of major ISPs during the first half of 2015, dragging down the entire sector’s spending for that period (though it’s not clear from the article, my sense is that Singer’s reference to “all” wireline ISPs refers to the industry’s larger players and says nothing about investment by smaller companies and the growing ranks of publicly and privately owned FTTH-based competitors). He then briefly reviews and dismisses potential alternative explanations for these declines, concluding that their only other logical cause is ISPs’ response to the FCC’s Open Internet Order (bolding is mine):
AT&T’s capital expenditure (capex) was down 29 percent in the first half of 2015 compared to the first half of 2014. Charter’s capex was down by the same percentage. Cablevision’s and Verizon’s capex were down ten and four percent, respectively. CenturyLink’s capex was down nine percent. (Update: The average decline across all wireline ISPs was 12 percent. Including wireless ISPs Sprint and T-Mobile in the sample reduces the average decline to eight percent.)…
This capital flight is remarkable considering there have been only two occasions in the history of the broadband industry when capex declined relative to the prior year: In 2001, after the dot.com meltdown, and in 2009, after the Great Recession. In every other year save 2015, broadband capex has climbed, as ISPs—like hamsters on a wheel—were forced to upgrade their networks to prevent customers from switching to rivals offering faster connections.
What changed in early 2015 besides the FCC’s Open Internet Order that can explain the ISP capex tumble? GDP grew in both the first and second quarters of 2015. Broadband capital intensity—defined as the ratio of ISP capex to revenues—decreased over the period, ruling out the possibility that falling revenues were to blame. Although cord cutting is on the rise, pay TV revenue is still growing, and the closest substitute to cable TV is broadband video. Absent compelling alternatives, the FCC’s Order is the best explanation for the capex meltdown.
I haven’t had a chance to carefully review the financial statements and related earnings material of all the companies cited by Singer, but did take a quick look at this material for AT&T and Charter since, as he notes, they experienced by far the largest percentage drop in spending. What I found doesn’t strike me as supporting his conclusion that the decline was network neutrality-driven. Instead, in both cases it seems to pretty clearly reflect the end of major investment projects by both companies and related industry trends that seem to have nothing to do with the FCC’s Open Internet order.
My perspective on this is based on statements made by company officials during their second quarter 2015 earnings calls, as well as capex-related data in their financial reporting.
During AT&T’s earnings call, a Wall Street analyst asked the following question: “[T]he $18 billion in CapEx this year implies a nice downtick in the U.S. spending, what’s driving that? Are you finding that you just don’t need to spend it or are you sort of pushing that out to next year?” In his response to the question, John Stephens, the company’s CFO, made no mention of network neutrality or FCC policy decisions. Instead he explained where the company was in terms of key wireless and wireline strategic network investment cycles (bolding is mine):
Well, I think a couple of things. And the simplest thing is to say [is that the] network team did a great job in getting the work done and we’ve got 300, nearly 310 million POPs with LTE right now. And we are putting our spectrum to use as opposed to building towers. And so that aspect of it is just a utilization of spectrum we own and capabilities we have that don’t require as much CapEx. Secondly, the 57 million IP broadband and what is now approximately 900,000 business customer locations passed with fiber. Once again, the network guys have done a great job in getting the Project VIP initiatives completed. And when they are done…the additional spend isn’t necessary, because the project has been concluded not for lack of anything, but for success.
Later on in the call, another analyst asked Stephens “[a]s you look out over the technology roadmap, like 5G coming down the pipeline, do you anticipate that we will see another period of elevated investment?”
While Stephens pointed to a potential future of moderated capital spending, he made no reference to network neutrality or FCC policy, focusing instead on the investment implications of the company’s (and the industry’s) evolution to software-defined networks.
I would tell you that’s kind of a longer term perspective. What we are seeing is our move to get this fiber deep into the network and getting LTE out deep into the wireless network and the solutions that we are finding in a software-defined network opportunity, we see a real opportunity to actually strive to bring investments, if you will, lower or more efficient from historical levels. Right now, I will tell you that this year’s investment is going to be in that $18 billion range, which is about 15%. We are certainly – we are not going to give any guidance with regard to next year or the year after. And we will give an update on this year’s guidance, if and when in our analyst conference if we get that opportunity. With that being said, I think there is a real opportunity with some of the activities are going on in software-defined networks on a longer term basis to actually bring that in capital intensity to a more modest level.
Charter’s large drop in capital spending appears to be driven by a similar “investment cycle” dynamic. During its 2Q15 earnings call, CFO Christopher Winfrey noted that Charter’s year-over-year decline in total CapEx “was driven by the completion of All-Digital during the fourth quarter of last year,” referring to the company’s migration of its channel lineup and other content to an all-digital format.
A review of the company’s earnings call and financial statements suggests that a large portion of the “All-Digital” capital spending was focused on deploying digital set-top boxes to Charter customers, resulting in a precipitous decline in the “customer premise equipment” (CPE) category of CapEx. According to Charter’s financial statements, first-half CPE-related CapEx fell by more than half, or $341 million, from $626 million to $285 million. Excluding this sharp falloff in CPE spending driven by the end of Charter’s All-Digital conversion, the remainder of the company’s capital spending was actually up 3% during the first half of 2015. And this included a 7% increase in spending on “line extensions,” which Charter defines as “network costs associated with entering new service areas.” It seems to me that, if Charter was concerned that the Commission’s Open Internet order would weaken its business model, it would be cutting rather than increasing its investment in expanding the geographic scope of its network.
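The percentage changes cited here follow directly from Charter's reported dollar figures. As a quick sanity check, the arithmetic can be sketched as follows (a minimal illustration using only the amounts quoted above, in millions of dollars):

```python
# Sanity-check arithmetic on Charter's first-half CPE CapEx figures
# cited above ($ millions, 1H2014 vs. 1H2015).
cpe_2014 = 626
cpe_2015 = 285

decline = cpe_2014 - cpe_2015            # absolute drop in CPE spending
pct_decline = decline / cpe_2014 * 100   # drop as a share of the 1H2014 base

print(decline)             # 341
print(round(pct_decline))  # 54 -- i.e., "more than half"
```

The same base-period formula (change divided by the prior-year figure) underlies the other year-over-year percentages discussed in this post.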
To understand the significance of Charter’s spending decline, I think it’s important to note that its 29% decline in first half total CapEx was driven by a 54% decline in CPE spending, and that the company’s non-CPE investment—including line extensions—actually increased during that period. I found it odd that, even as he ignored this key dynamic for Charter, Singer seemed to dismiss the significance of Comcast’s CapEx increase during the same period by noting that it was “attributed to customer premises equipment to support [Comcast’s] X1 entertainment operating system and other cloud-based initiatives.”
I also couldn’t help notice that, in his oddly brief reference to the nation’s largest ISP, Singer ignored the fact that every category of Comcast’s capital spending increased by double-digits during the first half of 2015, including its investment in growth-focused network infrastructure, which expanded 24% from 2014 levels. Comcast’s total cable CapEx was up 18% for the first half of the year, while at Time Warner Cable, the nation’s second largest cable operator, it increased 16%.
While these increases may have nothing to do with FCC policy, they seem very difficult to reconcile with Singer’s strongly asserted argument, especially when coupled with the above discussion of company-specific reasons for large CapEx declines for AT&T and Charter. As that discussion suggests, the reality behind aggregated industry numbers (especially when viewed through a short-term window of time) is often more complex and situation-specific than our economic models and ideologies would like it to be. This may make our research harder and messier to do at times, but certainly not less valuable. It also speaks to the value of longitudinal data collection and analysis, to better understand both short-term trends and those that only become clear over a longer term. That longitudinal component is central to the approach being taken by the Quello Center’s study of net neutrality impacts.
One last general point before closing out this post. I didn’t see any reference in Singer’s piece or the AEI-published follow-ups to spending by non-incumbent competitive providers, including municipally and privately owned fiber networks that are offering attractive combinations of speed and price in a growing number of markets around the country. While this category of spending may be far more difficult to measure than investments by large publicly-owned ISPs, it may be quite significant in relation to public policy, given its potential impact on available speeds, prices and competitive dynamics.
Expect to see more on this important topic and the Quello Center’s investigation of it in later posts, and please feel free to contribute to the discussion via comments on this and/or future posts.
On April 1 the Information Technology & Innovation Foundation (ITIF) held an event to discuss its new report entitled “How Techno-Populism Is Undermining Innovation.” The thrust of the report was to contrast the dangers of what it describes as “tech populism” with the virtues of what it calls “tech progressivism.”
The report begins with:
There was a time when technology policy was a game of “inside baseball” played mostly by wonks from government agencies, legislative committees, think tanks, and the business community. They brought sober, technical expertise and took a methodical approach to advancing the public interest on complex issues such as intellectual property rights in the digital era or electronic surveillance of telecommunications networks. But those days are gone. Tech policy debates now are increasingly likely to be shaped by angry, populist uprisings—as when a stunning four million submissions flooded into the Federal Communications Commission in response to its request for public comment on the issue of net neutrality; or when a loose coalition of protesters staged a dramatic blackout of popular websites in January 2012 to halt legislation that was intended to curb online piracy.
The authors seem to view the mass-scale FCC comments and grassroots coalition building on tech issues as dangerous and destructive, in ways I find difficult to recognize:
Populism draws its strength from individuals’ fears, misunderstandings, or distrust, appealing to the prejudices of crowds and relying on demagoguery, distortion, and groupthink. Tech populists focus on maximizing self-interest and personal freedom, even if it comes at the expense of broader public interests.
I find the last reference to the “broader public interests” especially strange, since most of the people I know who support net neutrality rules and strong privacy protections (whether experts or not) strike me as genuinely concerned about the public interest.
While there is plenty of room for thoughtful and respectful debate about how best to serve the public interest, the paper’s heavy use of straw-man arguments strikes me as an unfortunate example of the “demagoguery, distortion, and groupthink” it condemns among those who seek to bring more citizens into the public policy arena (though exercised with a different style and mix of debating techniques).
The paper later notes that:
To be clear, the problem with technology policy debates is not that they have become more open and participatory, but rather that many, if not most of those who are choosing to engage in these debates do so from a position of fear, anger, or misunderstanding.
I strongly agree that communication policy debates should be based on facts, logic, and a focus on the public interest. But I think the paper is quite one-sided in how it assigns responsibility for relying on “fear, anger, or misunderstanding” (perhaps a close relative of FUD).
Related to this is the paper’s suggestion that it is irrational to embrace the “populist” view that:
[E]lites, especially big business and big government, will prevent useful rules from being established—or, if those rules are established, will find ways to bypass them at the expense of the broader public. They distrust the private sector because they believe corporations are driven purely by profit, and they distrust the public sector because they believe government is ineffectual and overbearing.
While this so-called “populist view” might benefit from more elaboration and nuance, I disagree with the report’s suggestion that it is far from the mark in describing the political economy we’ve experienced in this country over the past several decades. When I consider actions taken and statements made by government officials (e.g., related to the Iraq War, NSA activities, and financial reform) and some large corporations (e.g., in their lobbying and PR efforts to restrict municipal fiber network projects and neuter financial reform), I see valid, readily documentable reasons for distrust. And, to use the report’s own language, I’d rank these powerful institutions among the most skilled and well-resourced purveyors of “fear, anger, or misunderstanding.” They can, after all, afford to hire the most skilled practitioners of FUD, “truthiness,” and other communication black arts.
MSU’s Quello Center is launching a study of the impact of net neutrality.
With support for net neutrality regulation at both the FCC and the White House, the debate should quickly move from theoretical speculation to empirical realities: What will be the actual impact of net neutrality regulation?
The net neutrality debate has galvanized a wide variety of stakeholders into opposing camps over the wisdom of this regulation for the future of a global, open, and secure Internet. Proponents argue that net neutrality will keep the Internet open and in line with its early vision by not advantaging those who can pay for fast lanes, while opponents have raised numerous concerns about the role regulation could play in constraining the efficiency, competition, investment, and innovation of the Internet and patterns of its use by individuals, households, business, and industry. The issue has become politically and commercially contentious, increasingly partisan, and commensurately oversimplified around competing positions. From all sides of this debate, however, the implications are expected to be of major importance to the future of the Internet, not only in the US but globally, as other nations will be influenced by policy and regulatory shifts in the United States.
It is therefore important that claims about the value and risk of net neutrality become a focus of independent empirical research. In many ways, the FCC’s decision on net neutrality presents an opportunity for a natural experiment that will provide real evidence on the actual role net neutrality will play for actors across the Internet and telecommunication industries, as well as for users and consumers of Internet services.
Academic research needs to be analytically skeptical and seek to challenge taken-for-granted assumptions on both sides of the debate with empirical research and analysis. The Quello Center is well positioned to conduct this research. It was established by an endowment in honor of former FCC Commissioner James H. Quello to study media and information policy in a neutral and dispassionate way. The Center’s endowment provides the independence and wherewithal to launch this project, with an eye toward expanding it if justified by the support of its Advisory Committee, sponsorship, and other sources of funding, such as foundations concerned with the social and economic futures of the Internet.
The project will be led by Professor Bill Dutton, the new Director of the Quello Center. Before taking this position, Bill was founding Director of the Oxford Internet Institute and Professor of Internet Studies at the University of Oxford. Other MSU and Quello faculty involved in this project include:
Staff of the Quello Center, including Mitchell Shapiro, and an Assistant Research Professor for whom a new search is underway, will be committed to this project, and we will develop collaborations with faculty and practitioners with an interest in supporting and joining this research initiative.
The Quello Center welcomes expressions of support and offers of collaboration or sponsorship on what is an important albeit complex and challenging issue for policy research. If you wish to comment on or support this research initiative, please contact Bill Dutton or any of the faculty associates.
Contact: Professor Dutton at Quello@msu.edu
In a blog entry here yesterday I described FCC Chairman Wheeler’s Title II proposal as “replanting the roots” of communication policy in the digital age. Shortly after I posted it, the Commission released a four-page summary of the proposed “New Rules for Protecting the Open Internet.” Not surprisingly, the document triggered a barrage of public responses from a range of interested parties on both sides of the issue.
After reviewing the FCC’s Fact Sheet and some of these responses, I found myself puzzled about claims regarding risks and problems associated with Title II classification. So I thought I’d invite comments to help clarify what those risks and problems really are.
In yesterday’s post I focused on:
Today I want to focus more on questions of near-term strategy, tactics and risks related to the Commission’s proposed Title II action, and invite comments that clarify how and why the proposed Title II classification is problematic. It’s a claim I’ve heard often, but have difficulty understanding.
Here’s how I see it:
[Update: shortly after this was written, the FCC released details about Chairman Wheeler’s “Protecting the Open Internet” proposal, which will be discussed here in later posts]
With the FCC expected to classify broadband access as a Title II common carrier service, while also preempting state restrictions on municipally-owned access networks, the Commission’s February 26 meeting is poised to launch a new era in U.S. communication policy.
To appreciate the significance of the Commission’s impending Title II decision, it’s useful to step back from the drama and details of today’s regulatory and market battles, and consider the agency’s upcoming vote from a historical perspective, starting with the Communications Act of 1934. I’d suggest that, viewed from that perspective, the FCC’s decision to treat broadband access under Title II is an attempt to replant the roots of communication law in the fertile ground of today’s First Amendment-friendly technology.
The Act’s stated purpose was:
“to make available, so far as possible, to all the people of the United States a rapid, efficient, nationwide, and worldwide wire and radio communication service with adequate facilities at reasonable charges.”
Given the relatively primitive technology of that era, the 1934 Act adopted different regulatory schemes for wireless broadcasting and wireline telephony, each designed to accommodate the technical constraints of the industry it was to regulate. Wireless broadcasting, constrained by technical interference among a cacophony of competing “voices,” was addressed by a system of exclusive licensing. This gave a relative handful of licensees First Amendment megaphones of unprecedented reach and power, in exchange for a vague and difficult-to-enforce set of “public interest” obligations.
Unwieldy at best, enforcement of broadcasting’s public interest regulations was largely abandoned in the 1980s under the Reagan Administration, which viewed deregulation as a much-needed and broadly applicable solution to the nation’s economic problems. From that perspective, the best way to serve the public interest was, in most cases, to rely on the “magic of the market.” To the Administration’s first FCC Chairman, Mark Fowler, the powerful broadcast and cable media were just more markets needing a healthy dose of deregulation. As he famously put it, television was “a toaster with pictures.”