April 16th, 2016
One argument against federal funding to support special access and community broadband networks—or potentially any infrastructure project—is that the federal government “can’t afford it,” especially given the widely held belief that it should prioritize balancing the federal budget and paying down the federal debt.
My suggestion to those holding this view (or being confused and/or intimidated by it in public policy debates) is to begin examining the extensive literature related to Modern Monetary Theory (MMT), perhaps starting with the selection of material to which I provide links at the end of this post (some of which are scholarly in nature, others geared more toward the layperson).
I certainly don’t expect this single blog post to convince skeptics of the validity of MMT, but will discuss it a bit more before moving on to other perspectives that inform the policy approaches I’m attempting to develop here.
One of the most central and policy-significant concepts of MMT is that what we consider to be the federal government’s “deficit” and “debt” are not the equivalent of the debts carried by private households and businesses (or, for that matter, individual states). The key difference—and one with major policy implications—is that the federal government is the “issuer” of our nation’s currency (and thus cannot “run out of dollars”), whereas the rest of us are “users” of that currency (and definitely can run out of dollars). This doesn’t mean that the federal deficit and federal spending levels don’t matter at all; it just means that how they matter isn’t the same as how household and business debts matter. As MMT economist Bill Mitchell put it in a long blog post that I excerpted in a much shorter one (bolding is mine):
[A] nation will have maximum fiscal space:
1) If it operates with a sovereign currency; that is, a currency that is issued by the sovereign government and that is not pegged to foreign currencies; and
2) If it avoids incurring debt in foreign currencies, and avoids guaranteeing the foreign currency debt of domestic entities (firms, households, or state, province, or city debts).
Under these conditions, the national government can always afford to purchase anything that is available for sale in its own currency. This means that if there are unemployed resources, the government can always mobilize them – putting them to productive use – through the use of fiscal policy. Such a government is not revenue-constrained, which means it does not face the financing constraints that a private household or firm faces in framing their expenditure decision.
To put it as simply as possible – this means that if there are unemployed workers who are willing to work, a sovereign government can afford to hire them to perform useful work in the public interest. From a macroeconomic efficiency argument, a primary aim of public policy is to fully utilize available resources.
Back in 2012 I discussed MMT in a number of posts on my personal blog. Another post that strikes me as especially relevant to this discussion is entitled Understanding and Embracing the Sovereign Currency Opportunity. It discusses a post by Dan Kervick on the New Economic Perspectives blog, which I thought did a good job of describing the nature of what I refer to as the “sovereign currency opportunity,” and its relevance to broadband and other infrastructure-related policies.
As Kervick explains:
MMT argues that [what we refer to as a federal budget deficit] should be recognized as the normal operating condition of an intelligent national government pursuing public purposes in an effective way, at least when that government is a sovereign currency issuer that lets its currency float freely on foreign exchange markets. If the government is running a deficit in its currency, then the non-governmental sectors of the economy are running a surplus in that currency and their net stock of financial assets in that currency is growing. If the government is running a surplus, on the other hand, then the net stock of financial assets in the non-governmental sectors is decreasing. We expect a growing economy to be increasing its financial asset stocks, and so we should expect government deficits as a matter of course.
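The accounting identity behind Kervick’s point can be made concrete: the private domestic, government, and foreign sectoral balances must sum to zero, so a government deficit is, dollar for dollar, a surplus accruing to the non-government sectors. Here is a minimal sketch, using illustrative numbers of my own rather than figures from Kervick’s post:

```python
# Sectoral balances: (S - I) + (T - G) + (M - X) = 0, i.e. the private
# domestic, government, and foreign balances must sum to zero.
# All figures are illustrative, in billions of a sovereign currency.

government_spending = 4_000   # G
tax_revenue         = 3_500   # T
exports             = 1_800   # X
imports             = 2_100   # M

government_balance = tax_revenue - government_spending        # T - G: negative = deficit
foreign_balance    = imports - exports                        # M - X: foreign sector's surplus
# The private domestic balance (S - I) is whatever remains:
private_balance    = -(government_balance + foreign_balance)

print(government_balance, foreign_balance, private_balance)
# The -500 government deficit is exactly matched by a +300 foreign
# surplus plus a +200 private domestic surplus: the deficit adds net
# financial assets to the non-government sectors.
assert government_balance + foreign_balance + private_balance == 0
```

Running a government surplus in this sketch would simply flip the sign: the non-government sectors would then be drawing down their net financial assets, which is Kervick’s point about growing economies normally requiring government deficits.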
A related critique of public investment in infrastructure is that it will crowd out private investment. But, as is often the case with special access and local broadband networks, if the private sector entities best positioned to make that investment (mainly because they operated for decades as competitively and financially protected monopolies) require financial returns that lead to the economic harms suggested by both CFA’s and ASR’s analyses, then I’d argue that so-called crowding out of that investment is likely to be a good thing for the economy and society as a whole (I discuss factors related to the interaction between “shareholder value” and “social value” here and here).
I’ll briefly note here that several of Bernie Sanders’ key economic advisers (including Stephanie Kelton, who recently served as Chief Economist for the Democrats on the Senate Budget Committee, where Sanders is ranking member) appreciate the relevance of MMT to today’s policy debates, including the expanded fiscal space it opens up to federal governments that are issuers of sovereign currencies (which, by the way, is sadly no longer the case for nations that use the Euro as their currency). So, even though Sanders typically balances his ambitious infrastructure investment and other proposals with offsetting tax revenue, an understanding of MMT makes it clear that this is not necessary in the way that most politicians and voters (and still too many economists) appear to believe that it is.
For those interested in more information about MMT, I’d recommend the following, in rough descending order of sophistication and time required to digest them: 1) a recently published textbook entitled Modern Monetary Theory and Practice – an Introductory Text, by economists Bill Mitchell and Randall Wray; 2) a Levy Institute working paper entitled Modern Money Theory 101: A Reply to Critics, authored by Wray and Eric Tymoigne; 3) Wray’s MMT Primer, including a link to the published version and the original blog-based discussions on which it was based; 4) my own first exposure to MMT, Seven Deadly Innocent Frauds of Economic Policy, by Warren Mosler; 5) a layperson-friendly, graphics-rich e-book entitled Diagrams & Dollars: Modern Money Illustrated, by J.D. Alt (available as a Kindle e-book or a somewhat abridged two-part blog post); and 6) for those with only a few minutes to spare, a very brief excerpt from early MMT textbook draft material that I cited in a 2012 blog post because I thought it succinctly summarized several key points.
As I discussed in an earlier post, the Consumer Federation of America (CFA) recently released a paper by its Director of Research, Mark Cooper, which made the case that the FCC’s decision to deregulate special access in 1999 was premature and has resulted in large-scale economic harm, including an estimated $150 billion over the past five years. Cooper’s analysis focused on two elements of harm: 1) the direct cost associated with non-competitive, excess-profit-extracting pricing; and 2) the indirect economic costs associated with this pricing regime.
As it turns out, a few days after Cooper presented an overview of his analysis at a New America Foundation event, a paper was published by Economists Inc. Written by EI principal Hal Singer and, according to its cover page, funded at least in part by USTelecom, the nation’s ILEC trade association, the EI paper approached the issue from a different perspective, as explained in its executive summary:
This paper seeks to model the likely impact of the FCC’s recent effort to preserve and extend its special access rules on broadband deployment, as telcos transition from TDM-based copper networks to IP-based fiber networks to serve business broadband customers. The deployment impact of expanded special access rules can be measured as the difference between (1) how many buildings would have been lit with fiber by telcos in the absence of the rules and (2) how many buildings will be lit with fiber by telcos in the presence of the rules. With an estimate of the cost per building, the deployment impact can be converted into an investment impact. And with estimates of broadband-specific multipliers, the fiber-to-the-building network investment impact can be converted into job and output effects.
The executive summary also highlights the study’s key findings:
In the absence of any new regulation (the “Baseline Case”), an ILEC is predicted to increase business-fiber penetration… from 10 to 20 percent over the coming years…Next, we model a scenario where special-access price regulation extends to the ILECs’ fiber networks. Assuming this scenario reduces an ILEC’s expected Ethernet revenue by 30 percent—the typical price effect associated with prior episodes of price-cap regulation and unbundling—the model predicts that ILEC will increase business-fiber penetration from 10 to 14 percent (compared to 20 percent in the Baseline Case)…Thus, the special access obligations under this scenario result in a 55 percent reduction in an ILEC’s CapEx relative to the Baseline Case….Thus, expansion of special access price regulation to Ethernet services is predicted to reduce ILEC fiber-based penetration by 67,300 buildings nationwide—a result that is hard to reconcile with the FCC’s mandate to encourage broadband deployment.
Singer then considers the spillover effects of this reduced ILEC investment in fiber infrastructure. Using “a jobs multiplier of approximately 20 jobs per million dollars of broadband investment” and “a fiber-construction output multiplier of 3.12,” Singer estimates the resulting economic harm of FCC special access rules to be an annual loss of 43,560 jobs and $3.4 billion in economic output over a five-year period.
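The chain of estimates Singer describes (fewer buildings lit, hence less investment, hence fewer jobs and less output) is straightforward multiplier arithmetic. The sketch below uses the deployment impact and multipliers quoted above, but the cost per building is a hypothetical placeholder since the paper’s actual per-building figure isn’t reproduced here; the resulting jobs and output numbers are therefore illustrative only and won’t exactly match Singer’s:

```python
# Sketch of the multiplier chain described in the EI paper's summary.
# The deployment impact (67,300 buildings) and the multipliers
# (20 jobs per $1M of investment, 3.12 output multiplier) are quoted
# above; cost_per_building is ASSUMED for illustration.

buildings_forgone = 67_300      # reduced fiber deployment (from the paper)
cost_per_building = 30_000      # ASSUMED, in dollars
jobs_per_million  = 20          # jobs multiplier (from the paper)
output_multiplier = 3.12        # fiber-construction output multiplier (from the paper)

# Deployment impact -> investment impact
investment_forgone  = buildings_forgone * cost_per_building   # dollars
investment_millions = investment_forgone / 1_000_000

# Investment impact -> job and output effects
jobs_forgone   = investment_millions * jobs_per_million
output_forgone = investment_forgone * output_multiplier       # dollars

print(f"Forgone investment: ${investment_millions:,.0f}M")
print(f"Forgone jobs: {jobs_forgone:,.0f}")
print(f"Forgone output: ${output_forgone / 1e9:,.2f}B")
```

Whatever one assumes for the per-building cost, the structure makes clear that the harm estimate scales linearly with the deployment impact, which is why the 67,300-building figure carries so much weight in Singer’s conclusions.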
It’s worth noting that Singer’s estimate of $17 billion in economic losses over a five-year period due to imposition of special access rules is considerably lower than Cooper’s estimate of $150 billion in economic harm from the unregulated status quo in today’s special access market. While Singer and others will likely take issue with Cooper’s assumptions and estimates, the latter’s paper seems to, at the very least, make a strong case that the economic benefits and harms associated with different special access regulatory regimes don’t only flow in the direction analyzed by Singer, and that policymakers would be wise to carefully consider a full array of harms and benefits associated with alternative regulatory approaches.
An opportunity to explore new policy, funding, ownership models
My sense is that both of these studies raise valid points about the types of economic harm associated with different approaches to (de)regulating special access (and other telecommunications) markets.
I also believe that valuable perspective on this issue can be gained from a review of ASR Analytics’ estimates of economic benefits resulting from BTOP investments in fiber infrastructure (some of which I discussed in a recent post). Not only does the ASR study do a good job of applying prior knowledge and accepted methods in analyzing broadband-related economic impacts, it also suggests to me that, rather than getting caught up in the details of the Cooper/Singer and related debates, a more useful approach is to take a step back from the quantitative details of these dueling studies, and consider broadband public policy from a “public infrastructure” perspective.
In a follow-up post I outline a research project designed to build on the knowledge base developed by ASR’s study of the Comprehensive Community Infrastructure (a.k.a., “middle mile fiber”) component of the BTOP program.
In addition, I’ve prepared several other posts that try to explain some of the threads of scholarship that inform my own view of how—especially in cases lacking sufficient competition—special access and last mile access networks can deliver the most social value if treated as public infrastructure.
An annotated list of links to these posts is provided below. I’d encourage anyone involved and/or interested in policy debates related to issues such as special access, community broadband, network neutrality and universal service to review these posts and perhaps also explore the sources they refer to:
a) the relevance of Modern Monetary Theory (a.k.a. Functional Finance) to policymaking related to federal financial support for investments in telecommunications and other infrastructure;
b) the demand-side analysis of infrastructure resources laid out by Brett Frischmann in his 2012 book, Infrastructure: The Social Value of Shared Resources, and the Internet- and telecom-related policies it suggests;
c) the analytical framework developed by author Marjorie Kelly in her book Owning Our Future, which highlights key differences between what Kelly refers to as “generative” vs. “extractive” ownership models. One post reviews Kelly’s key concepts and considers AT&T as an example of extractive ownership of telecommunications infrastructure. A second post considers how Kelly’s framework applies to the role of community-owned broadband networks in the Internet access sector, and suggests research questions related to this that I believe are worthy of further investigation.
Elizabeth A. Kirley presented a talk for the Quello Center that addressed alternative approaches to protecting reputations online. Professor Adam Candeub served as a respondent. With so much said about protecting reputations online, it was valuable to have a thoughtful and well-informed discussion of international agreements on human rights, national legal doctrines, and online reputation.
In her talk, entitled ‘Trashed: A Comparative Exploration of Law’s Relevance to Online Reputation’, Dr. Elizabeth Kirley uses case studies to explore the cultural and historical influences that have resulted in very distinct legal regimes and political agendas. Her central thesis is that digital speech is sufficiently different in kind from offline speech that it calls for a more 21st-century response to the harms it can inflict on our reputational privacy.
Dr. Elizabeth Kirley is a 2015-16 Postdoctoral Fellow at the Nathanson Centre for Transnational Human Rights, Crime and Security at Osgoode Hall Law School, York University in Toronto, and a frequent lecturer on issues raised by digital speech, technology crimes and robotic journalism. Recent research and presentation activities include the European University Institute, Florence; the Oxford Internet Institute, Oxford UK; the American Graduate School of Paris; Ecole des hautes etudes commerciales de Paris; Sciences-Po University in Paris; Osnabruck University in Germany; and the Limerick School of Law, Ireland. She is a barrister and solicitor, called to the Ontario bar.
Professor Adam Candeub is on the Law Faculty at Michigan State University, and a Research Associate with the Quello Center. He was an attorney-advisor for the Federal Communications Commission (FCC) in the Media Bureau and previously in the Common Carrier Bureau, Competitive Pricing Division. From 1998 to 2000, Professor Candeub was a litigation associate for the Washington D.C. firm of Jones, Day, Reavis & Pogue, in the issues and appeals practice.
In the past week or so I’ve seen several articles that remind me how important the Quello Center’s empirically-grounded study of net neutrality impacts is for clarifying what these impacts will be—especially since net neutrality is one of those policy topics where arguments are often driven by ideology and/or competing financial interests.
As far as I can tell, this series of articles began with an August 25 piece written by economist Hal Singer and published by Forbes under the following headline: Does The Tumble In Broadband Investment Spell Doom For The FCC’s Open Internet Order? Per his Forbes bio, Singer is a principal at Economists Incorporated, a senior fellow at the Progressive Policy Institute, and an adjunct professor at Georgetown University’s McDonough School of Business.
Singer’s piece was followed roughly a week later by two op-ed pieces published on the American Enterprise Institute’s web site. The title of the first AEI piece, authored by Mark Jamison, was Title II’s real-world impact on broadband investment. This was followed a day later by Bronwyn Howell’s commentary Title II is hurting investment. How will – and should – the FCC respond?
What struck me about this series of op-ed pieces published by economists and organizations whose theoretical models and policy preferences appear to favor unregulated market structures was that their claims that “Title II is hurting investment” were all empirically anchored in Singer’s references to declines in ISP capital spending during the first half of 2015. As a member of the Quello Center’s research team studying the impacts of net neutrality, I was intrigued, and eager to dig into the CapEx data and understand its significance.
While my digging has only begun, what I found reminded me how much the communication policy community needs the kind of fact-based, impartial and in-depth empirical analysis the Quello Center has embarked upon, and how risky it is to rely on the kind of ideologically-driven analysis that too often dominates public policy debates, especially on contentious issues like net neutrality.
My point here is not to argue that there are clear signs that Title II will increase ISP investment, but rather that claims by Singer and others that there are already signs that it is hurting investment are not only premature, but also based on an incomplete reading of evidence that can be uncovered by careful and unbiased review of publicly available information.
I hope to have more to say on this topic in future posts, but will make a few points here.
The crux of Singer’s argument is based on his observation that capital spending had declined fairly dramatically for a number of major ISPs during the first half of 2015, dragging down the entire sector’s spending for that period (though it’s not clear from the article, my sense is that Singer’s reference to “all” wireline ISPs refers to the industry’s larger players and says nothing about investment by smaller companies and the growing ranks of publicly and privately owned FTTH-based competitors). He then briefly reviews and dismisses potential alternative explanations for these declines, concluding that their only other logical cause is ISPs’ response to the FCC’s Open Internet Order (bolding is mine):
AT&T’s capital expenditure (capex) was down 29 percent in the first half of 2015 compared to the first half of 2014. Charter’s capex was down by the same percentage. Cablevision’s and Verizon’s capex were down ten and four percent, respectively. CenturyLink’s capex was down nine percent. (Update: The average decline across all wireline ISPs was 12 percent. Including wireless ISPs Sprint and T-Mobile in the sample reduces the average decline to eight percent.) …
This capital flight is remarkable considering there have been only two occasions in the history of the broadband industry when capex declined relative to the prior year: In 2001, after the dot.com meltdown, and in 2009, after the Great Recession. In every other year save 2015, broadband capex has climbed, as ISPs—like hamsters on a wheel—were forced to upgrade their networks to prevent customers from switching to rivals offering faster connections.
What changed in early 2015 besides the FCC’s Open Internet Order that can explain the ISP capex tumble? GDP grew in both the first and second quarters of 2015. Broadband capital intensity—defined as the ratio of ISP capex to revenues—decreased over the period, ruling out the possibility that falling revenues were to blame. Although cord cutting is on the rise, pay TV revenue is still growing, and the closest substitute to cable TV is broadband video. Absent compelling alternatives, the FCC’s Order is the best explanation for the capex meltdown.
I haven’t had a chance to carefully review the financial statements and related earnings material of all the companies cited by Singer, but did take a quick look at this material for AT&T and Charter since, as he notes, they experienced by far the largest percentage drop in spending. What I found doesn’t strike me as supporting his conclusion that the decline was network neutrality-driven. Instead, in both cases it seems to pretty clearly reflect the end of major investment projects by both companies and related industry trends that seem to have nothing to do with the FCC’s Open Internet order.
My perspective on this is based on statements made by company officials during their second quarter 2015 earnings calls, as well as capex-related data in their financial reporting.
During AT&T’s earnings call, a Wall Street analyst asked the following question: “[T]he $18 billion in CapEx this year implies a nice downtick in the U.S. spending, what’s driving that? Are you finding that you just don’t need to spend it or are you sort of pushing that out to next year?” In his response to the question, John Stephens, the company’s CFO, made no mention of network neutrality or FCC policy decisions. Instead he explained where the company was in terms of key wireless and wireline strategic network investment cycles (bolding is mine):
Well, I think a couple of things. And the simplest thing is to say [is that the] network team did a great job in getting the work done and we’ve got 300, nearly 310 million POPs with LTE right now. And we are putting our spectrum to use as opposed to building towers. And so that aspect of it is just a utilization of spectrum we own and capabilities we have that don’t require as much CapEx. Secondly, the 57 million IP broadband and what is now approximately 900,000 business customer locations passed with fiber. Once again, the network guys have done a great job in getting the Project VIP initiatives completed. And when they are done…the additional spend isn’t necessary, because the project has been concluded not for lack of anything, but for success.
Later on in the call, another analyst asked Stephens “[a]s you look out over the technology roadmap, like 5G coming down the pipeline, do you anticipate that we will see another period of elevated investment?”
While Stephens pointed to a potential future of moderated capital spending, he made no reference to network neutrality or FCC policy, focusing instead on the investment implications of the company’s (and the industry’s) evolution to software-defined networks.
I would tell you that’s kind of a longer term perspective. What we are seeing is our move to get this fiber deep into the network and getting LTE out deep into the wireless network and the solutions that we are finding in a software-defined network opportunity, we see a real opportunity to actually strive to bring investments, if you will, lower or more efficient from historical levels. Right now, I will tell you that this year’s investment is going to be in that $18 billion range, which is about 15%. We are certainly – we are not going to give any guidance with regard to next year or the year after. And we will give an update on this year’s guidance, if and when in our analyst conference if we get that opportunity. With that being said, I think there is a real opportunity with some of the activities are going on in software-defined networks on a longer term basis to actually bring that in capital intensity to a more modest level.
Charter’s large drop in capital spending appears to be driven by a similar “investment cycle” dynamic. During its 2Q15 earnings call, CFO Christopher Winfrey noted that Charter’s year-over-year decline in total CapEx “was driven by the completion of All-Digital during the fourth quarter of last year,” referring to the company’s migration of its channel lineup and other content to an all-digital format.
A review of the company’s earnings call and financial statements suggests that a large portion of the “All-Digital” capital spending was focused on deploying digital set-top boxes to Charter customers, resulting in a precipitous decline in the “customer premise equipment” (CPE) category of CapEx. According to Charter’s financial statements, first-half CPE-related CapEx fell by more than half, or $341 million, from $626 million to $285 million. Excluding this sharp falloff in CPE spending driven by the end of Charter’s All-Digital conversion, the remainder of the company’s capital spending was actually up 3% during the first half of 2015. And this included a 7% increase in spending on “line extensions,” which Charter defines as “network costs associated with entering new service areas.” It seems to me that, if Charter was concerned that the Commission’s Open Internet order would weaken its business model, it would be cutting rather than increasing its investment in expanding the geographic scope of its network.
To understand the significance of Charter’s spending decline, I think it’s important to note that its 29% decline in first half total CapEx was driven by a 54% decline in CPE spending, and that the company’s non-CPE investment—including line extensions—actually increased during that period. I found it odd that, even as he ignored this key dynamic for Charter, Singer seemed to dismiss the significance of Comcast’s CapEx increase during the same period by noting that it was “attributed to customer premises equipment to support [Comcast’s] X1 entertainment operating system and other cloud-based initiatives.”
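As a sanity check on the decomposition described above, the quoted CPE figures alone pin down the 54% CPE decline, and combining them with an assumed non-CPE base shows how that drop, alongside modest non-CPE growth, can still produce a total decline near 29%:

```python
# Check of the Charter CapEx arithmetic cited above. The first-half CPE
# figures ($626M -> $285M) and the ~3% non-CPE growth are from Charter's
# reporting as quoted; the non-CPE base is ASSUMED here purely to
# illustrate how a 54% CPE drop plus modest non-CPE growth can yield
# a ~29% decline in total CapEx.

cpe_2014, cpe_2015 = 626, 285           # $M, first-half CPE CapEx (quoted)
non_cpe_2014 = 500                      # $M, ASSUMED for illustration
non_cpe_2015 = non_cpe_2014 * 1.03      # non-CPE spending up ~3% (quoted)

cpe_decline_pct = (cpe_2014 - cpe_2015) / cpe_2014 * 100

total_2014 = cpe_2014 + non_cpe_2014
total_2015 = cpe_2015 + non_cpe_2015
total_decline_pct = (total_2014 - total_2015) / total_2014 * 100

print(f"CPE decline: {cpe_decline_pct:.0f}%")            # ~54%, matching the quoted figure
print(f"Total CapEx decline: {total_decline_pct:.0f}%")  # ~29% under these assumptions
```

The point of the exercise is simply that a headline 29% decline in total CapEx is fully consistent with growing non-CPE (network) investment once the one-time CPE falloff from the All-Digital conversion is accounted for.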
I also couldn’t help notice that, in his oddly brief reference to the nation’s largest ISP, Singer ignored the fact that every category of Comcast’s capital spending increased by double-digits during the first half of 2015, including its investment in growth-focused network infrastructure, which expanded 24% from 2014 levels. Comcast’s total cable CapEx was up 18% for the first half of the year, while at Time Warner Cable, the nation’s second largest cable operator, it increased 16%.
While these increases may have nothing to do with FCC policy, they seem very difficult to reconcile with Singer’s strongly asserted argument, especially when coupled with the above discussion of company-specific reasons for large CapEx declines for AT&T and Charter. As that discussion suggests, the reality behind aggregated industry numbers (especially when viewed through a short-term window of time) is often more complex and situation-specific than our economic models and ideologies would like it to be. This may make our research harder and messier to do at times, but certainly not less valuable. It also speaks to the value of longitudinal data collection and analysis, to better understand both short-term trends and those that only become clear over a longer term. That longitudinal component is central to the approach being taken by the Quello Center’s study of net neutrality impacts.
One last general point before closing out this post. I didn’t see any reference in Singer’s piece or the AEI-published follow-ups to spending by non-incumbent competitive providers, including municipally and privately owned fiber networks that are offering attractive combinations of speed and price in a growing number of markets around the country. While this category of spending may be far more difficult to measure than investments by large publicly traded ISPs, it may be quite significant in relation to public policy, given its potential impact on available speeds, prices and competitive dynamics.
Expect to see more on this important topic and the Quello Center’s investigation of it in later posts, and please feel free to contribute to the discussion via comments on this and/or future posts.
Bill Dutton gave a keynote presentation for Social Media & Society 2015, an international conference held 27-29 July 2015 at the Ted Rogers School of Management at Ryerson University, Toronto, Canada. An abstract, and links to the PowerPoint slides and video of the talk, are posted here.
Legal and regulatory initiatives shaped by moral panics over social media are a microcosm of many general threats to the vitality of a free, open and global Internet. The belief is widespread that social media and related Internet developments are unstoppable and beyond the control of governments and regulators across the world. However, initiatives afoot to address increasingly vocal public support for ‘doing something’ about concerns ranging from cyber-bullying to privacy are pushing politicians and regulators to bring traditional approaches to media regulation to bear on social media and the Internet. These initiatives are unlikely to accomplish their intended aims but could well undermine the vitality of social media and the larger ecology of the Internet. Several types of response are critical. First, academics and practitioners need to come forward with a regulatory model that is purpose-built for social media and related applications of the Internet. Secondly, educational efforts need to be prioritized to help children and others learn how to use social media in more ethical, safe and effective ways. Thirdly, social media need to be designed in ways that enable users to hold other users more socially accountable for their actions.
Slides for the Talk are on Slideshare at: http://www.slideshare.net/WHDutton/society-meets-social-media-at-reyerson2015
Video of the Talk: https://ryecast.ryerson.ca/12/watch/9167.aspx
We have opened a search for a Quello Postdoctoral Fellow in Media and Information Policy at Michigan State University. [MSU Job Posting #1180] This is in addition to the search for an Assistant Research Professor.
William Dutton, the Quello Professor of Media and Information Policy and Director of the Quello Center in the Department of Media and Information at Michigan State University (MSU), is seeking to hire a Postdoctoral Fellow for a one-year position, with the potential for renewal. The position is available beginning as soon as July 1, 2015. The postdoc will work with Professor Dutton on existing Quello research projects and in developing proposals for further research. Projects focus on media, information and Internet policy, regulation and governance, such as the Center’s Network Neutrality Impact Study. The appointment would enable candidates to pursue their own research of relevance to the Center as well as to support ongoing Quello Center research, with the potential for raising support for continuation beyond the first year.
Applicants should explain the relevance of their background and interests to the mission and work of the Quello Center. Candidates must have defended their dissertation prior to beginning the postdoctoral fellowship. A PhD is normally required in one of the many fields that contribute to the development and study of information, media and communication policy, such as Political Science, Law (where a J.D. degree is expected), Policy, Economics, Sociology, Psychology, Communication, New Media, Internet Studies, Information Studies, or a related field. Candidates must have: strong methodological training and skills of relevance to policy research, such as in modeling or specific social research methods; experience writing grant proposals; good organizational and time management skills; and evidence of the ability to work well as part of a team. The quality of prior publications and grant writing experience will be key in evaluating all applications.
The Postdoctoral Fellow is a 12 month, full-time appointment, with salary up to $45,000 depending upon qualifications. Benefits are also provided. See http://grad.msu.edu/pdo/ and http://www.hr.msu.edu/benefits/ for more information on postdoctoral training and benefits at MSU.
Questions may be addressed to the Director of the Quello Center at Quello@msu.edu, but the following application materials must be submitted via the MSU online system for job posting #1180 at the MSU Job Postings Web site: 1) a cover letter describing why you are interested in this position, and what training, skills, research and methodological background you would bring to the work of the Quello Center and this position; 2) an up-to-date and complete curriculum vitae; 3) one or two samples of your best work; and 4) the names and contact information for three references. The review of applications will begin immediately and continue until a suitable candidate is selected.
The Quello Center seeks to stimulate and inform debate on media, communication and information policy for our digital age. It pursues research that questions taken-for-granted assumptions about the implications of technology, policy and regulation, and seeks to collaborate with other centers of excellence in research on the social and economic implications of our digital age and the policy and management issues raised by these developments. Information about the Quello Center: http://quello.msu.edu
The Center is based in the Department of Media and Information, which is home to a dynamic, interdisciplinary faculty internationally known for their research on the uses and implications of information and communication technology and policy.
MSU is an affirmative-action, equal opportunity employer. MSU is committed to achieving excellence through cultural diversity. The university actively encourages applications and/or nominations of women, persons of color, veterans and persons with disabilities.
On April 1 the Information Technology & Innovation Foundation (ITIF) held an event to discuss its new report entitled “How Techno-Populism Is Undermining Innovation.” The thrust of the report was to contrast the dangers of what it describes as “tech populism” with the virtues of what it calls “tech progressivism.”
The report begins with:
There was a time when technology policy was a game of “inside baseball” played mostly by wonks from government agencies, legislative committees, think tanks, and the business community. They brought sober, technical expertise and took a methodical approach to advancing the public interest on complex issues such as intellectual property rights in the digital era or electronic surveillance of telecommunications networks. But those days are gone. Tech policy debates now are increasingly likely to be shaped by angry, populist uprisings—as when a stunning four million submissions flooded into the Federal Communications Commission in response to its request for public comment on the issue of net neutrality; or when a loose coalition of protesters staged a dramatic blackout of popular websites in January 2012 to halt legislation that was intended to curb online piracy.
The authors seem to view the mass-scale FCC comments and grassroots coalition-building on tech issues as dangerous and destructive, in ways I find difficult to recognize:
Populism draws its strength from individuals’ fears, misunderstandings, or distrust, appealing to the prejudices of crowds and relying on demagoguery, distortion, and groupthink. Tech populists focus on maximizing self-interest and personal freedom, even if it comes at the expense of broader public interests.
I find the last reference to the “broader public interests” especially strange, since most of those I know who support net neutrality rules and strong privacy protections (whether expert or non-expert) strike me as genuinely very concerned about the public interest.
While there is plenty of room for thoughtful and respectful debate about how best to serve the public interest, the paper’s heavy use of straw-man arguments strikes me as an unfortunate example of the “demagoguery, distortion and groupthink” it condemns among those who seek to bring more citizens into the public policy arena (though exercised with a different style and mix of debating techniques).
The paper later notes that:
To be clear, the problem with technology policy debates is not that they have become more open and participatory, but rather that many, if not most of those who are choosing to engage in these debates do so from a position of fear, anger, or misunderstanding.
I strongly agree that communication policy debates should be based on facts, logic and a focus on the public interest. But I think the paper is pretty biased in how it assigns responsibility for relying on “fear, anger and misunderstanding” (perhaps a close relative of FUD).
Related to this is the paper’s suggestion that it is irrational to embrace the “populist” view that:
[E]lites, especially big business and big government, will prevent useful rules from being established—or, if those rules are established, will find ways to bypass them at the expense of the broader public. They distrust the private sector because they believe corporations are driven purely by profit, and they distrust the public sector because they believe government is ineffectual and overbearing.
While this so-called “populist view” might be more accurate with a bit more elaboration and nuance, I disagree with the report’s suggestion that it is far from the mark in describing the reality of the political economy we’ve experienced in this country over the past several decades. When I consider actions taken and statements made by government officials (e.g., related to the Iraq War, NSA activities, financial reform, etc.) and some large corporations (e.g., in their lobbying and PR efforts to restrict municipal fiber network projects, neuter financial reform, etc.), I see valid, readily documentable reasons for distrust. And, to use the report’s own language, I’d rank these powerful institutions among the most skilled and well-resourced purveyors of “fear, anger and misunderstanding.” They can, after all, afford to hire the most skilled practitioners of FUD, “truthiness” and other communication black arts.
In this two-part post I’m considering this same topic, but from a public policy perspective.
Viewed in very broad strokes, we have on one hand the potential benefits from what could be a new and attractively priced competitive option in the wireless sector. On the other hand, we have a range of complex and intertwined public policy issues related to the continued expansion of Comcast’s market power across multiple sectors of the communications industry, and the prospects for anti-competitive impacts of that expansion.
In Part 1 I focused on Comcast’s use of dual-SSID in-home gateways to deploy a network of millions of public access hotspots while: 1) charging customers $10 per month to lease these dual-use devices, which also provide them with a private in-home WiFi network; 2) using these customers’ electricity to power gateway devices that are also used as public-access hotspots; 3) activating the gateway’s public hotspot capability with an opt-out (vs. an opt-in) approach that has been criticized as “difficult to use or broken.”
Here in Part 2 I’m going to consider the competitive and public interest impacts of this strategy in the broader context of Comcast’s unique and synergistic mix of market power.
Who controls our “window on the world?”
As suggested in an earlier post, one emerging and important arena for competition is services that make it easier for customers to manage their media consumption across multiple fixed and portable devices, including large screen TVs, computer monitors, tablets and smartphones.
As the dominant provider of both wireline Internet access and traditional multichannel video, Comcast is well positioned to expand the scope of that dominance into this emerging “nomadic multiscreen multimedia” market. This is especially true if it can successfully integrate wireless connectivity and provide customers with a combination of connectivity, content and user-interface that can’t be matched by other companies that lack Comcast’s broad set of competitive tools and assets.
As public comments by Comcast executives have suggested, the company’s deployment of more than 8 million WiFi hotspots is a big step toward achieving the threshold level of wireless connectivity needed to support this kind of strategy (as discussed in an earlier post, ubiquitous coverage and seamless handoff capabilities are less necessary for this type of “nomadic” service).
The multisource, multiscreen user interface arena has attracted a range of large and small companies from related sectors, including tech giants like Apple, Google, Amazon and Sony, as well as Dish Network’s Sling TV, smaller players like Roku, and online content distributors like Netflix and Hulu. But, so far, none has been able to achieve a position as the dominant gateway to the widening world of online media (this recent Wall Street Journal article provides some perspective on the challenges in this arena for both companies and consumers).
Can Apple become “the new Comcast for the Internet?”
With Apple once again in negotiations with major TV content providers, the Washington Post’s Cecilia Kang raises the question of whether the creator of the iPod, iTunes, iPhone and iPad, which fundamentally transformed the music and mobile communications industries, might finally be ready to work similar magic in the TV business.
Television viewers have long yearned for the day they could get their favorite programs streamed online without having to pay a huge cable bill each month. That day has arrived — and it’s confusing…
Enter one big company — Apple — that wants to clear up all the confusion. If it succeeds, Apple could become the biggest gateway to online video — the new Comcast for the Internet. And it has more cash on hand than any of its rivals to secure the most-desired shows.
As others have done before, Kang is speculating that perhaps Apple’s expertise in user interface and product design, coupled with its extremely deep pockets, passionate user base and experience transforming other media and communication industries, may finally be a powerful enough combination to enable the company to become “the new Comcast for the Internet.”
In a Backchannel column, Harvard professor Susan Crawford expresses a different view of this issue, suggesting that, if regulators are not proactive, Comcast itself may become an even more dominant version of “the Comcast for the Internet.”
In this two-part post I’m going to consider this same topic, but from a public policy perspective.
Viewed in very broad strokes, we have on one hand the potential benefits from what could be a new and attractively priced competitive option in the wireless sector. On the other hand, we have a range of complex and intertwined public policy issues related to the continued expansion of Comcast’s market power across multiple sectors of the communications industry, and the prospects for anti-competitive impacts of that expansion.
Here in Part 1 I’m going to focus on Comcast’s use of dual-SSID in-home gateways to deploy a network of millions of public access hotspots while:
1) charging customers $10 per month to lease gateway devices that provide them with a private in-home WiFi network while also being used by Comcast as a public-access hotspot;
2) using these customers’ electricity to power these dual-use gateway devices;
3) activating the gateway’s public hotspot capability with an opt-out (vs. an opt-in) option that has been criticized as “difficult to use or broken.”
In Part 2 I’ll consider the competitive impacts of this strategy in the broader context of Comcast’s unique mix of market power in both the distribution and content sectors.
In my earlier post I noted that, in Comcast’s year-end earnings call, CFO Michael Angelakis told Wall Street analysts that Comcast’s investments in dual-SSID in-home WiFi gateway devices offer “great returns on their own and…seed us for different businesses that are attractive going forward.”
While clearly good for Comcast (as Angelakis’s comment suggests), a separate set of policy-related questions concerns whether the company’s approach to deploying, utilizing and paying for dual-SSID gateways provides net benefits to Comcast customers and the public interest.
Plaintiff Toyer Grear and daughter Joycelyn Harris of Alameda County, California, filed the suit on December 4…in US District Court in Northern California, seeking class action status on behalf of all Comcast customers who lease wireless routers that broadcast Xfinity Wi-Fi hotspots. “Without authorization to do so, Comcast uses the wireless routers it supplies to its customers to generate additional, public Wi-Fi networks for its own benefit,” the complaint states.
Grear and Harris allege that Comcast violated the US Computer Fraud and Abuse Act as well as California laws on unfair competition and computer data access and fraud. They claim that the public hotspots, broadcast from the same equipment used for subscribers’ private Wi-Fi networks, raise customers’ electricity costs and harm network performance…The lawsuit [also] claims that “unauthorized broadcasting of a secondary, public Wi-Fi network from the customer’s wireless router subjects the customer to potential security risks”…
While Comcast says the public hotspots use different bandwidth than is allocated to a customer’s home Internet service, the lawsuit argues that they can create wireless congestion in areas with many Wi-Fi networks.
Comcast acknowledges that there could be a performance hit because the Wi-Fi networks use shared spectrum, but it says it designed the system “to support robust usage” and that there should be only “minimal impact.”
Brodkin also cites complaints at online user forums that Comcast’s home hotspot opt-out functionality is “difficult to use or broken.” But he also notes that Comcast customers retain the option of purchasing their own modem and router to avoid having to deal with this issue.
I’d add to this my own recent experience raising this issue with a Comcast technician who came out to deal with problems I was having with my Internet service (mainly very slow and erratic speeds, especially when using the WiFi connection to Comcast’s dual-SSID gateway device).
MSU’s Quello Center is launching a study of the impact of net neutrality.
With support for net neutrality regulation at the FCC and in the White House, the debate should quickly move from theoretical speculation to empirical realities: What will be the actual impact of net neutrality regulation?
The net neutrality debate has galvanized a wide variety of stakeholders in opposing camps around the wisdom of this regulation for the future of a global, open and secure Internet. Proponents argue that net neutrality will keep the Internet open and in line with its early vision by not advantaging those who can pay for fast lanes, while opponents have raised numerous concerns about the role regulation could play in constraining efficiency, competition, investment, and innovation on the Internet and in patterns of its use by individuals, households, business and industry. It is a politically and commercially contentious issue that has become increasingly partisan and commensurately oversimplified around competing positions. From all sides of this debate, however, the implications are expected to be of major importance to the future of the Internet, not only in the US but globally, as other nations will be influenced by policy and regulatory shifts in the United States.
It is therefore important that claims about the value and risk of net neutrality become a focus of independent empirical research. In many ways, the FCC’s decision on net neutrality presents an opportunity for a natural experiment that will provide real evidence on the actual role net neutrality will play for actors across the Internet and telecommunications industries, as well as for users and consumers of Internet services.
Academic research needs to be analytically skeptical and seek to challenge taken-for-granted assumptions on both sides of the debate with empirical research and analysis. The Quello Center is well positioned to conduct this research. It was established by an endowment in honor of former FCC Commissioner James H. Quello to study media and information policy in a neutral and dispassionate way. The Center’s endowment provides the independence and wherewithal to launch this project with an eye toward expansion of the project if justified by the support of its Advisory Committee, sponsorship and other sources of funding, such as foundations concerned with the social and economic futures of the Internet.
The project will be led by Professor Bill Dutton, the new Director of the Quello Center. Before taking this position, Bill was founding Director of the Oxford Internet Institute and Professor of Internet Studies at the University of Oxford. Other MSU and Quello faculty involved in this project include:
Staff of the Quello Center, including Mitchell Shapiro, and an Assistant Research Professor for whom a new search is underway, will be committed to this project, and we will develop collaborations with faculty and practitioners with an interest in supporting and joining this research initiative.
The Quello Center welcomes expressions of support and offers of collaboration or sponsorship on what is an important, albeit complex and challenging, issue for policy research. If you wish to comment on or support this research initiative, please contact Bill Dutton or any of the faculty associates.
Contact: Professor Dutton at Quello@msu.edu