Anatomy of the FCC’s Network Neutrality Rules, a Webcast by Adam Candeub


This webcast is from a Quello Center seminar conducted by Adam Candeub, a member of the MSU Law faculty and Co-Principal Investigator of the Network Neutrality Impact Study. In the seminar, titled Anatomy of the FCC’s Network Neutrality Rules and held at the Quello Center on 18 May 2015, Professor Candeub provides his preliminary views on the details of the FCC’s network neutrality rules.

Protecting and Promoting an Open Internet by Adam Candeub from Quello Center on Vimeo.

The FCC approved Chairman Tom Wheeler’s network neutrality proposal on 26 February 2015. The 3-2 vote for approval has been hailed as a ‘watershed victory for activists’ in support of an open Internet, but criticized by others as a risk to the vitality of the Internet. Whether you are a proponent, opponent or observer of the net neutrality concept, it is useful to understand the actual ruling. In early April, the FCC sent the rules to the Federal Register for publication, with the rules expected to take effect in June.

Our law colleague, Professor Adam Candeub, had been reading through the rules and offered to share his thoughts and interpretations of the ruling at a Quello Center Seminar for the Net Neutrality Impact Study.

Biographical Sketch

Adam Candeub is a Professor at Michigan State University College of Law, which he joined in 2004, and Director of its Intellectual Property, Information & Communications Law Program. He is a key resource for the Quello Center in areas of media law and policy. He was an attorney-advisor at the Federal Communications Commission (FCC), serving in the Media Bureau and, before that, in the Common Carrier Bureau’s Competitive Pricing Division, where his work involved him in critical decisions in communications law. From 1998 to 2000, Professor Candeub was a litigation associate in the issues and appeals practice of the Washington, D.C. firm Jones, Day, Reavis & Pogue. He also served as a corporate associate with Cleary, Gottlieb, Steen & Hamilton in Washington, D.C. Immediately following law school, he clerked for Chief Judge J. Clifford Wallace of the U.S. Court of Appeals for the Ninth Circuit, and while in law school he was an articles editor for the University of Pennsylvania Law Review. He has published widely in law reviews, and his scholarly interests focus on the intersection of regulation, economics, and communications law and policy.



Unlicensed Spectrum: More Capacity, Flexibility


In an earlier post I discussed the FCC’s recent decision to open up 150 MHz of spectrum in the 3550-3700 MHz band for unlicensed “General Authorized Access” use as the lowest-priority category in a new three-tier model that includes protections for incumbent government users and provides for “Priority Access Licenses” assigned via auction.

In this post I’m going to briefly review two other spectrum bands that the FCC has recently moved to make more available for unlicensed use. Together these changes have potential to increase unlicensed spectrum capacity and flexibility in terms of network designs and business models.

5 GHz

In March 2014 the Commission adopted a Report and Order modifying the rules governing the operation of Unlicensed National Information Infrastructure (U-NII) devices operating in the 5 GHz band. The goal of the changes was to “significantly increase the utility of the 100 megahertz of spectrum” in the 5.150-5.250 GHz band and “streamline existing rules and equipment authorization procedures for devices throughout the 5 GHz band.”

As the FCC’s March 31 press release explained:

Currently U-NII devices operate in 555 megahertz of spectrum in the 5 GHz band, and are used for Wi-Fi and other high-speed wireless connections…The rules adopted today remove the current restriction on indoor-only use and increase the permissible power which will provide more robust access in the 5.150-5.250 GHz band. This in turn will allow U-NII devices to better integrate with other unlicensed portions of the 5 GHz band to offer faster speeds and reduce congestion at crowded Wi-Fi hot spots such as airports and convention centers.

Broadcast White Space

While the 3.5 GHz and 5 GHz bands can provide substantial amounts of spectrum to augment the overcrowded 2.4 GHz and 900 MHz bands used for Wi-Fi and other unlicensed technologies, the propagation characteristics of these higher-frequency bands translate into limited geographic coverage per base station.

This contrasts with the so-called “White Space” spectrum available for unlicensed use in the sub-700 MHz broadcast band. Often referred to as “prime spectrum real estate,” the broadcast band enjoys relatively strong propagation characteristics. But, at the same time (not surprisingly, given its much-coveted status), it has relatively little free spectrum available for unlicensed use, especially in high-demand metro areas, which are served by relatively large numbers of broadcast stations.

The advancement of unlicensed White Space has been a slow process, dating back to 2002. In 2008 the FCC finally issued a set of rules for White Space operation in the broadcast band. This was followed by a series of refinements, including the authorization of “TV bands database systems” to support non-interfering White Space usage, starting with the Commission’s first authorization of a database operated by Spectrum Bridge in late 2011.
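To make the database concept concrete, here is a minimal Python sketch of the basic idea: a device reports its location, and the database returns the channels it can use without interfering with protected broadcasters. The names and data below are illustrative assumptions, not an actual TV bands database API.

```python
# Hypothetical sketch of a TV bands database lookup. Real White Space
# devices query FCC-authorized geolocation databases; everything below
# is an illustrative stand-in, not an actual database interface.

from dataclasses import dataclass

# Toy registry of TV channels protected for broadcasters in each area.
# A real database computes protected contours from FCC licensing data.
PROTECTED_CHANNELS = {
    "dense-metro": {14, 15, 17, 20, 23, 26, 31, 36, 41, 44},
    "rural": {17, 36},
}

@dataclass
class WhiteSpaceDevice:
    device_id: str
    area: str  # stand-in for the precise geolocation a real device reports

    def available_channels(self, band=range(14, 52)):
        """Return the TV channels this device may use at its location."""
        protected = PROTECTED_CHANNELS.get(self.area, set())
        return [ch for ch in band if ch not in protected]

# Metro areas, served by many broadcasters, leave few channels free...
print(WhiteSpaceDevice("wsd-001", "dense-metro").available_channels())
# ...while rural areas typically offer far more unlicensed headroom.
print(WhiteSpaceDevice("wsd-002", "rural").available_channels())
```

As the toy output suggests, the unlicensed headroom available to a device depends heavily on how many broadcasters the database must protect in a given market.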

In early 2012 the White Space saga took another turn when Congress passed a law authorizing the FCC to conduct spectrum auctions to reclaim parts of the TV spectrum for wireless use. This was followed in May 2014 by FCC rules for conducting an “incentive auction” designed to motivate broadcasters to voluntarily give up their spectrum in exchange for a portion of auction revenues. Given the complexity and sensitivity of this ambitious auction plan, it’s not too surprising that its scheduled date has been pushed back twice and is now planned for early 2016.

The planned incentive auction and related “repacking” of the broadcast band raised the prospect of a reduction in spectrum available for unlicensed White Space devices (WSD). In response to this, the FCC’s incentive auction rules revisited the Commission’s earlier plan for allocating spectrum to unlicensed use. Though less spectrum will now be available for this purpose, the new plan is designed to ensure that at least three or four 6 MHz channels are available on a nationwide basis, with significantly more spectrum likely to be available in some smaller and more rural markets, where existing high-speed connectivity options are particularly scarce.
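To give a feel for what “repacking” means, here is a toy Python sketch under my own simplifying assumptions: remaining stations are squeezed into the lowest channels compatible with pairwise interference constraints, freeing the upper channels to be cleared. The FCC’s actual repacking relies on large-scale optimization over detailed engineering data; this greedy version is illustrative only.

```python
# Toy illustration of broadcast-band "repacking": assign each station the
# lowest channel not used by a conflicting (interfering) station, so the
# upper channels can be cleared for auction. Illustrative only; the real
# process optimizes over detailed interference data.

def repack(stations, conflicts, channels):
    """Greedily assign channels, avoiding same-channel conflicts.

    stations:  list of station IDs.
    conflicts: set of frozensets naming station pairs that would interfere
               if placed on the same channel (e.g., nearby markets).
    channels:  available channel numbers, lowest first.
    """
    assignment = {}
    for s in stations:
        for ch in channels:
            clash = any(frozenset((s, t)) in conflicts
                        for t, t_ch in assignment.items() if t_ch == ch)
            if not clash:
                assignment[s] = ch
                break
    return assignment

stations = ["WAAA", "WBBB", "WCCC", "WDDD"]
conflicts = {frozenset(("WAAA", "WBBB")), frozenset(("WBBB", "WCCC"))}
packed = repack(stations, conflicts, channels=list(range(14, 52)))
print(packed)                 # stations crowd into channels 14 and 15...
print(max(packed.values()))   # ...leaving everything above free to clear
```

In the FCC’s plan, this packing is constrained so that a minimum number of 6 MHz channels remain free for White Space use nationwide.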

This focus on providing a minimum amount of bandwidth nationwide reflects the view expressed by multiple commenters (e.g., see Reply Comments from the Open Technology Institute and Public Knowledge) that “the emergence of a mass market for unlicensed chips, devices and services” based on the 802.11af (“White-Fi” or “Super Wi-Fi”) standard will require the nationwide availability of at least three 6 MHz channels.

The May Incentive Auction Report and Order was followed in late September by a Notice of Proposed Rulemaking revising the Commission’s Part 15 rules governing unlicensed use. The proposed rules would loosen some restrictions on power levels and guard band requirements, a change welcomed by White Space advocates but not by licensed users in adjacent spectrum.

****

Taken together, the Commission’s actions to significantly expand the amount of unlicensed spectrum in frequency bands with diverse propagation characteristics should provide important technical capabilities to support the new generations of unlicensed providers and services discussed in this series of blog posts.

While the 3.5 GHz and 5 GHz bands will provide a substantial amount of new capacity for small-cell deployments, the more limited amount of White Space spectrum will support larger cells that reach longer distances and provide much-improved in-building penetration from outdoor base stations. And, when combined, this mix of spectrum options should enable unlicensed service providers to architect next-generation networks that cost-effectively deliver significantly faster speeds and more extensive and reliable coverage.



The FCC’s 3.5 GHz Tiered-Use Plan: Paradigm Shift, Experiment or Both?


In a series of posts over the past two months I’ve discussed a range of initiatives aimed at using unlicensed spectrum to support the growing demand for wireless connectivity. To put these efforts in a forward-looking context, it’s helpful to get a sense of what changes are in the works in terms of expanding the amount of spectrum available for unlicensed use.

In this post I’m going to focus on the most recent development on this front: the FCC’s April 17 decision to make 150 MHz of spectrum (3550-3700 MHz) available for new licensed and unlicensed commercial use, while retaining protections for existing military and other incumbent users of this spectrum.

In a statement accompanying the Commission’s April 17 vote, Chairman Tom Wheeler described three key principles underlying the agency’s move to establish the Citizens Broadband Radio Service:

First, we are leveraging advances in computing technology to rely on an innovative Spectrum Access System to automatically coordinate access to the band. It’s the traditional frequency coordination role, but modernized using advanced technologies to maximize efficiency.

Second, we are using auctions to grant exclusionary interference protections only when the spectrum is actually scarce. Under our rules, anyone with a certified device can use the spectrum, sharing it with others. In areas where the spectrum is scarce, users can participate in an auction to seek a license to gain priority access to the band.

Third, in cooperation with our federal partners, we are creating a new way to share spectrum with federal users. By leveraging the Spectrum Access System and technologies to monitor and sense when a federal user is present, we can move toward true dynamic sharing of the band between federal and non-federal users.

As a reflection of this cooperation, the Commission, working with NTIA and Department of Defense spectrum users, reduced the latter’s coastal protection zones by roughly 77%.

In her statement, Commissioner Mignon Clyburn pointed to a “paradigm shift [in] the [FCC’s] move away from highly fragmented long term exclusive use licenses to shorter term Priority Access Licenses [PAL] with a rule to use it or share it with General Authorized Access users.”

These new regulatory approaches will create enough certainty to fuel investment in equipment for the 3.5 GHz band and the new PAL license will have lower administrative costs and allow for micro-targeted network deployments. Service providers will have flexibility in designing networks to address unique challenges posed by rural and other areas, and by using a Spectrum Access System database to dynamically assign frequencies in the band for both PAL licenses and GAA users, there will be more efficient use of spectrum in heavily populated areas.
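To illustrate the three-tier concept in code, here is a simplified Python sketch. The tier ordering follows the Commission’s framework (incumbents above Priority Access, Priority Access above GAA), but the resolution function and its interface are my own illustrative assumptions, not the actual Spectrum Access System, whose technical details are still being worked out.

```python
# Simplified sketch of three-tier sharing in the 3.5 GHz band. Tier names
# follow the Order; the access-resolution logic is an illustrative
# assumption, not the actual Spectrum Access System.

from enum import IntEnum

class Tier(IntEnum):
    INCUMBENT = 0        # federal/military users: always protected
    PRIORITY_ACCESS = 1  # PAL holders: protected from GAA, yield to incumbents
    GAA = 2              # General Authorized Access: opportunistic, unlicensed

def may_transmit(active_tiers, requester):
    """Grant access only if no active user outranks the requester.

    active_tiers: tiers of users currently on the channel.
    Lower numbers outrank higher ones; equal tiers share the channel.
    """
    return all(requester <= active for active in active_tiers)

# A GAA device must stay off a channel an incumbent is using...
print(may_transmit([Tier.INCUMBENT], Tier.GAA))          # False
# ...but can share a channel occupied only by other GAA users.
print(may_transmit([Tier.GAA, Tier.GAA], Tier.GAA))      # True
# A PAL holder outranks GAA users within its licensed area.
print(may_transmit([Tier.GAA], Tier.PRIORITY_ACCESS))    # True
```

A real SAS would also handle sensing of federal users, geographic protection zones, and the eviction of lower-tier users when a higher tier appears, none of which this toy function models.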

Though the ruling was approved in part and concurred in part by the agency’s two Republican Commissioners, their statements described it not as a “paradigm shift” but rather as an “experiment” that may or may not succeed, and could have been improved in several respects.

For example, Commissioner Michael O’Rielly appeared to disagree with Clyburn about whether the new rules provided enough clarity and incentives for potential PAL licensees to invest:

I am concerned that some rules may hinder development of the Priority Access Licenses, known as PALs. I question whether auctioning PALs for three year terms with no renewal expectancy will create a meaningful incentive to entice auction participants. Similarly, while I thank the Chairman for agreeing to changes that facilitate PALs in areas where there is more than one auction bidder, I had hoped our rules would include a mechanism whereby any entity could receive a PAL even if mutually exclusive applications, which are necessary to trigger an auction, are not filed in a particular census tract. The Commission ought to encourage a diverse array of business models. Many entrepreneurs, even those living in rural communities, have told me of their strong preference for PALs, which they explain would ensure better reliability and quality of service. Our rules must not foreclose these prospective licensees from obtaining PALs just because they are the only one in a given census tract wanting priority access. We need to fix this in the near term.

And, according to Commissioner Ajit Pai:

This Order leaves many important details and complex questions to be resolved, including whether technologies will develop that can manage the complicated and dynamic interference scenarios that will result from our approach. It therefore remains to be seen whether we can turn today’s spectrum theory into a working reality. Moreover, exclusion zones still cover about 40% of the U.S. population, and we leave the door open for the introduction of new federal uses across the country, neither of which is ideal.

Regardless of which description—“paradigm shift” or “experiment”—is most apt, the Commission’s new approach to the 3.5 GHz band strikes me as a worthy effort to move beyond the longstanding spectrum management status quo, and to creatively use technology to explore new models that enable both licensed and unlicensed users to deliver more value from existing spectrum. And, even if some aspects of this new model do prove problematic, it should at least provide valuable lessons to inform future efforts to craft spectrum policy appropriate for the 21st century.

And some aspects of the 3.5 GHz rules remain subject to further refinement, pursuant to a Further Notice of Proposed Rulemaking also issued by the Commission.



Shareholder value, the public interest & the Comcast/TWC deal


Over the past several days I’ve seen a number of post-mortems on the decision by Comcast to drop its bid to acquire Time Warner Cable after it became clear regulators weren’t going to approve the deal. Two items in particular caught my attention over the weekend: a piece by Eric Lipton in the New York Times discussing Comcast’s not-so-successful lobbying effort in Congress, and an interview with Comcast Chairman and CEO Brian Roberts on Squawk Box, a program carried on CNBC, a cable network owned by Comcast since it acquired NBCUniversal roughly two years ago.

One of the things that struck me about the CNBC interview is that it clearly illustrates one perspective on the deal and Comcast’s impressive growth, and on the net value of regulation. I’d call this the “investor” perspective.  From this perspective, the key metrics for evaluating Comcast, its actions and external factors impacting the company (e.g., regulation) are tied directly to the company’s ability to “maximize shareholder value,” something Brian Roberts and his team have been very good at over the years.

In contrast, the focus of the Times piece was concerns about the merger’s likely impact on the public interest rather than on shareholder value.

Market power skews shareholder value away from public interest

While some (perhaps some libertarian-leaning economists and CNBC commentators) might equate these two values, I suspect most people (experts and non-experts alike) would agree they are not the same, nor always positively correlated.

In fact, I’d argue that shareholder value and the public interest are likely to be inversely correlated when the company in question wields extensive market (and political) power and has a history of using it aggressively to gain competitive advantage and additional market power.  All the more so when First Amendment issues are part of the equation, as is very much the case with regard to Comcast.

In a market with healthy competition and low barriers to entry, companies can only succeed if they satisfy their customers. In such markets I wouldn’t be surprised to find a meaningful correlation between shareholder value and the provision of high-quality service.

But, as FCC data (see graph on pg. 12) makes clear, many customers seeking high-speed Internet connections lack an attractive (or any) competitive option to cable-delivered broadband service.  And the cost of market entry into this very capital-intensive sector remains very high.

And, as someone who has been both a Comcast and AT&T Internet customer, and has visited many an online user forum, my view is that, even when there is a choice between these two industry giants (or their peers), switching from one to the other is akin to jumping from the frying pan into the fire. And even if you’re eager to make that jump, the transition may involve a series of frustrating interactions with the CSRs, IVRs, techs, wait-times, billing mistakes and equipment returns/pickups of not one, but two companies.  For an extreme—and hopefully rare—example of this type of experience, spend a few minutes listening to this recording of a Comcast customer attempting to drop his service.

This lack of attractive options and reluctance to jump through the hoops needed to switch between them may help explain why providers of Internet and bundled services often offer big rebates and steep short-term discounts to get customers to switch. Perhaps they’re hoping that, this time, a customer will stick around after the discount expires, since they’ll know that their only option at that point would be to jump back into the same frying pan they left a short while ago. Switching back and forth may be a game some consumers are willing to continue playing, but I suspect it’s too time-consuming and frustrating for most (at least it would be for me).  Most, I suspect, simply want fast and reliable speeds, and responsive customer service and tech support.  Unfortunately, providing that may cost a bit more than offering switching rebates and discounts (or so it seems based on companies’ actions).

An admirable focus on the public interest

To their credit, the FCC and Justice Department took seriously their responsibilities related to determining the competitive and public interest impacts of the proposed deal. And though Congress had no direct say in these decisions, it seems that many of its members also remained unconvinced that “what’s good for Comcast is good for the country,” even after months of heavy lobbying.  As Lipton reports in the Times:

Despite the distribution of $5.9 million in campaign contributions by the two companies during the 2014 election cycle, and the expenditure of an extraordinary $25 million on lobbying last year, no more than a handful of lawmakers signed letters endorsing the deal…Congress has no direct power to approve or disapprove any merger, but endorsements, particularly if they come from black and Hispanic leaders, can send a subtle but important message to regulators that the deal is in the public interest and should be cleared…

Lawmakers cited a variety of reasons as to why Comcast’s elaborate pitch failed to gain traction this time: The miserable customer service ratings the company earns, for instance, made politicians leery of helping it out. In addition, there were much more substantial antitrust concerns associated with this deal, and some members of Congress said they thought Comcast had failed to live up to its promises in the NBCUniversal deal, and so could not be trusted this time.

Other lawmakers and staff members on Capitol Hill, in interviews Friday, cited Comcast’s swagger in trying to promote this deal. They said they felt that Comcast was so convinced in the early stages that the deal would be approved that it was dismissing concerns about the transaction, or simply taking the conversation in a different direction when asked about them…

“They talked a lot about the benefits, and how much they were going to invest in Time Warner Cable and improve the service it provided,” said one senior Senate staff aide…“But every time you talked about industry consolidation and the incentive they would have to leverage their market power to hurt competition, they gave us unsatisfactory answers.”

Together, the CNBC interview and NYT article highlight the difference between a thoughtful and holistic perspective on communication policy and public policy in general, and what I’d call the CNBC/libertarian/Wall Street perspective (for an extreme example of the latter, see Rick Santelli’s infamous trading floor rant attacking “losers” seeking mortgage modifications while ignoring trillion-dollar bank bailouts and Wall Street criminality).

Having observed Brian Roberts’ career since its early days, I see him as an extremely capable strategist, manager and dealmaker, and also a person of integrity.  And he has plenty of reason to be proud of the company he and his father have built.  It’s been impressive to watch.

But I also believe that he sees his primary role as maximizing shareholder value, and his primary constituency as being Wall Street analysts and investors, not Comcast’s customers. This perspective might not trigger regulatory problems if his company didn’t enjoy high levels of market power in key bottleneck sectors of the communications industry. But, as the FCC and DOJ rightly concluded, Comcast does wield such market power and was seeking to augment it significantly with the TWC deal.

Customer satisfaction as a key indicator

As one longstanding piece of evidence to support my view of Comcast’s priorities, I’d point to its history of being consistently among the lowest-ranked companies in its industry (and among all U.S. companies) in terms of customer satisfaction.

Though I can understand the Squawk Box hosts choosing not to confront their “boss” with tough questions, I would have liked to see one of them ask him why this prolonged history of poor customer service has not yet been remedied, and how much Comcast planned to spend to address the issue in the future. Instead, we see the discussion about what’s next for the company leading to Roberts’ comment that:

The deal was going to slightly increase our leverage. That is now not happening. So that opens up room for further stock buybacks. And I think that’s an area that certainly we’re open to thinking about and talking about with the board.

I would have liked to see Roberts instead (or at least also) say that investing heavily to improve customer service was something he was going to discuss with the board, and that he was seriously committed to turning his company into a leader rather than a laggard in satisfying its customers, as measured by independent surveys.

But, just as the hiker only had to outrun his fellow hiker, not the bear, Comcast, to augment its shareholder value, need only leverage the fact that its local access pipe is much faster than those of most of its telco competitors, and invest just enough to ensure that the poor quality of its customer service doesn’t outweigh its speed advantage for too many customers.

And even if Roberts actually did announce a seriously-funded customer service initiative (or a large-scale commitment to all-fiber networks), Wall Street analysts would most likely respond with downgrades of Comcast’s stock, and pressure to direct cash flow to buybacks and dividends rather than to improved customer service and investments that could yield positive externalities with great social value but uncertain prospects for monetization by the company.

This speaks to the difference between what Marjorie Kelly calls “generative” and “extractive” business and ownership models, which I wrote about in relation to Internet access here and here (and may write about on the Quello Center blog in the future).



A Viewpoint on the FCC Decision on Network Neutrality by A. Michael Noll


Net Neutrality by A. Michael Noll

Do not become confused by the debate over net neutrality. The definitions and principles are all quite basic – and simple.

There is a tendency to define the Internet by the services that can be obtained over it, for example: email, telephone, video, information. But that definitional approach would work just as well for the old telephone network, which provided such services as voice, data, fax, and information. This is not how to define the Internet — or the telephone network.

The network is how access to various services is obtained — it is not the services obtained. Access should be equal for all — thus networks should be treated, and regulated, as common carriers. A big conflict of interest arises when the network provider is also a service provider, as with the cable companies, and that is why they should be prevented from doing both.

Decades ago, in the early 1970s, when the precursor of today’s Internet was invented, government made a big mistake in declaring this kind of network to be a computer-information service – and thus not to be regulated as a common carrier. And now that mistake has become obvious, even to those who lobbied for no regulation. In the end, it is all about video entertainment and the ability to use the Internet to go directly to the video providers, thereby bypassing cable TV.

Finally, the FCC has corrected the past – and the cable companies and Verizon will be gnashing their teeth and threatening chaos and doom. If the FCC has any courage left, it should tackle the separation of content (video entertainment) from the conduit (network access) – and in the cause of competition, break up the media goliaths.

© 2015 A. Michael Noll
February 26, 2015



Launching the Net Neutrality Impact Study


MSU’s Quello Center is launching a study of the impact of net neutrality.

With support for net neutrality regulation at the FCC and in the White House, the debate should quickly move from theoretical speculation to empirical realities: what will be the actual impact of net neutrality regulation?

The net neutrality debate has galvanized a wide variety of stakeholders into opposing camps around the wisdom of this regulation for the future of a global, open and secure Internet. Proponents argue that net neutrality will keep the Internet open and in line with its early vision by not advantaging those who can pay for fast lanes, while opponents have raised numerous concerns about the role regulation could play in constraining the efficiency, competition, investment, and innovation of the Internet and patterns of its use by individuals, households, business and industry. The issue has become politically and commercially contentious, increasingly partisan, and commensurately oversimplified around competing positions. From all sides of the debate, however, the implications are expected to be of major importance to the future of the Internet, not only in the US but also globally, as other nations will be influenced by policy and regulatory shifts in the United States.

It is therefore important that claims about the value and risk of net neutrality become a focus of independent empirical research. In many ways, the FCC’s decision on net neutrality presents an opportunity for a natural experiment that will provide real evidence on the actual role net neutrality will play for actors across the Internet and telecommunication industries, as well as for users and consumers of Internet services.

Academic research needs to be analytically skeptical and seek to challenge taken-for-granted assumptions on both sides of the debate with empirical research and analysis. The Quello Center is well positioned to conduct this research. It was established by an endowment in honor of former FCC Commissioner James H. Quello to study media and information policy in a neutral and dispassionate way. The Center’s endowment provides the independence and wherewithal to launch this project, with an eye towards expanding it if justified by the support of its Advisory Committee, sponsorship and other sources of funding, such as foundations concerned with the social and economic futures of the Internet.

The project will be led by Professor Bill Dutton, the new Director of the Quello Center. Before taking this position, Bill was founding Director of the Oxford Internet Institute and Professor of Internet Studies at the University of Oxford. Other MSU and Quello faculty are also involved in this project.

Staff of the Quello Center, including Mitchell Shapiro, and an Assistant Research Professor for whom a new search is underway, will be committed to this project, and we will develop collaborations with faculty and practitioners with an interest in supporting and joining this research initiative.

The Quello Center welcomes expressions of support and offers of collaboration or sponsorship on what is an important, albeit complex and challenging, issue for policy research. If you wish to comment on, or support, this research initiative, please contact Bill Dutton or any of the faculty associates.

Contact: Professor Dutton at Quello@msu.edu

Notes:

About the Quello Center
Quello Center Advisory Board



Johannes Bauer on Communication Policy Processes in the US


This brief interview provides insights into key features of communication policy and regulatory processes in the US. It was recorded with Professor Johannes Bauer following a lecture he gave to visiting executives, which allowed him to pursue these issues in depth. Summarizing the more detailed presentation he gave to a Quello seminar, Professor Bauer argues that there is a new phase of experimentation around the development of principles and frameworks for the new media and information ecologies being shaped by the Internet and related innovations in information and communication technologies. Globally, many approaches are developing from the bottom up, and there is, according to Professor Bauer, a distinctly American policy-making framework, which he outlines here.

Bauer Quello Interview from Quello Center on Vimeo.

Professor Bauer is Chair and Professor in the Department of Media and Information at Michigan State University. He is trained as an engineer and economist, holding MA and PhD degrees in economics from the Vienna University of Economics and Business Administration, Austria. While at MSU, he has also held appointments as a visiting professor at the Technical University of Delft, Netherlands (2000-2001), the University of Konstanz, Germany (Summer 2010), and most recently the University of Zurich, Switzerland (2012). Much of his research centers on policy issues critical to the Quello Center, such as the regulation of telecommunications and the Internet, including work on net neutrality and cybersecurity.



Community Fiber Networks: Bringing Competition and World-Class Infrastructure to American Cities


In the following video, President Obama announces several steps his Administration is taking to encourage municipally-owned broadband networks and explains the rationale for taking them.

A few weeks after the president’s January 14 speech, the FCC announced it would be voting on a similar approach to municipal broadband at its February 26 meeting, where it will also vote on a proposal to classify broadband access as a Title II common carrier service.  Since community broadband is a topic I hope to write about here in the future, and the Commission’s meeting is only two weeks away, I thought I’d share some initial thoughts on the subject, using the President’s plan and its significance as a focal point.

As always, feedback (especially from those who see this issue differently) is welcome…

(more…)



Please Help Me Understand What’s Wrong with Title II


In a blog entry here yesterday I described FCC Chairman Wheeler’s Title II proposal as “replanting the roots” of communication policy in the digital age. Shortly after I posted it, the Commission released a four-page summary of the proposed “New Rules for Protecting the Open Internet.” Not surprisingly, the document triggered a barrage of public responses from a range of interested parties on both sides of the issue.

After reviewing the FCC’s Fact Sheet and some of these responses, I found myself puzzled about claims regarding risks and problems associated with Title II classification. So I thought I’d invite comments to help clarify what those risks and problems really are.

In yesterday’s post I focused on the historical significance of the Commission’s move.

Today I want to focus more on questions of near-term strategy, tactics and risks related to the Commission’s proposed Title II action, and to invite comments that clarify how and why the proposed Title II classification is problematic, a claim I’ve heard often but have difficulty understanding.

Here’s how I see it:

(more…)



Replanting the Roots of Communication Policy


[Update: shortly after this was written, the FCC released details about Chairman Wheeler’s “Protecting the Open Internet” proposal, which will be discussed here in later posts]

With the FCC expected to classify broadband access as a Title II common carrier service, while also preempting state restrictions on municipally-owned access networks, the Commission’s February 26 meeting is poised to launch a new era in U.S. communication policy.

To appreciate the significance of the Commission’s impending Title II decision, it’s useful to step back from the drama and details of today’s regulatory and market battles, and consider the agency’s upcoming vote from a historical perspective, starting with the Communications Act of 1934. I’d suggest that, viewed from that perspective, the FCC’s decision to treat broadband access under Title II is an attempt to replant the roots of communication law in the fertile ground of today’s First Amendment-friendly technology.

The Act’s stated purpose was:

“to make available, so far as possible, to all the people of the United States a rapid, efficient, nationwide, and worldwide wire and radio communication service with adequate facilities at reasonable charges.”

Given the relatively primitive technology of that era, the 1934 Act adopted different regulatory schemes for wireless broadcasting and wireline telephony, each designed to accommodate the technical constraints of the industry it was to regulate. Wireless broadcasting, constrained by technical interference among a cacophony of competing “voices,” was addressed by a system of exclusive licensing. This gave a relative handful of licensees First Amendment megaphones of unprecedented reach and power, in exchange for a vague and difficult-to-enforce set of “public interest” obligations.

Unwieldy at best, enforcement of broadcasting’s public interest regulations was largely abandoned in the 1980s under the Reagan Administration, which viewed deregulation as a much needed and broadly applicable solution to the nation’s economic problems. From that perspective, the best way to serve the public interest was, in most cases, to rely on the “magic of the market.” To the Administration’s first FCC Chair, Mark Fowler, the powerful broadcast and cable media were just more markets needing a healthy dose of deregulation.  As he famously put it, television was “a toaster with pictures.”

(more…)


