Washington Post debates how best to regulate the Internet
Setting aside Chairman Genachowski’s hair-splitting over the word “Internet” when he claims he has no intention of regulating it, the Washington Post made an earnest attempt to host a thoughtful debate on how the FCC should best regulate the Internet. But to have that thoughtful debate, the Post’s editorial is sorely in need of some basic fact-checking.
The Post’s editorial immediately begins with the assertion that:
“‘net neutrality’ — a commitment not to discriminate in the transmission of Internet content — has been a rule tacitly understood by Internet users and providers alike.”
That’s a very interesting assertion, but I’ve worked with Internet Service Providers both as a consumer and as a network engineer responsible for business-class Internet connectivity, and I’ve never heard of such a “tacitly understood” rule until recently, and then only from people who don’t actually know how the Internet works. It’s also notable that the editors of the Washington Post later admitted in their editorial that this “rule” has never been codified anywhere.
Furthermore, I know that just the opposite is true: broadband providers do offer faster, prioritized data delivery to content and application providers, i.e., Business-to-Consumer (B2C) websites. These “prioritized and enhanced” paid peering agreements not only come with much faster dedicated pipes, but they also offer implied priority due to lower latency, or explicit priority due to a higher DiffServ classification (DiffServ being a router prioritization standard for the Internet).
The Internet standards explicitly permit differentiation
Most importantly, the DiffServ Internet standard, RFC 2475 – governed by the Internet Engineering Task Force (IETF), an international body responsible for regulating the Internet from the beginning – explicitly permits such commercial differentiation (what the Post refers to as “discrimination”) by application type or by source. The document states:
This document defines an architecture for implementing scalable service differentiation in the Internet. A “Service” defines some significant characteristics of packet transmission in one direction across a set of one or more paths within a network. These characteristics may be specified in quantitative or statistical terms of throughput, delay, jitter, and/or loss, or may otherwise be specified in terms of some relative priority of access to network resources. Service differentiation is desired to accommodate heterogeneous application requirements and user expectations, and to permit differentiated pricing of Internet service.
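The DiffServ classification described above is not exotic; any application can request it. Below is a minimal sketch, assuming a Linux host and Python, that marks a UDP socket’s traffic with the Expedited Forwarding (EF) code point (DSCP 46), the class typically requested for latency-sensitive traffic. Whether routers along the path honor the mark is entirely up to each network’s policy, which is precisely where paid prioritization agreements come in.

```python
import socket

# DSCP values occupy the upper six bits of the IP TOS byte (RFC 2474);
# the lower two bits are reserved for ECN.
DSCP_EF = 46            # Expedited Forwarding: low-loss, low-latency class
tos = DSCP_EF << 2      # shift past the two ECN bits -> 184 (0xB8)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, tos)

# Read the value back to verify the kernel accepted the marking; every
# packet this socket sends will now carry DSCP 46 in its IP header.
readback = sock.getsockopt(socket.IPPROTO_IP, socket.IP_TOS)
print(readback)
sock.close()
```

On Linux the readback should be 184; the mark requests priority, it does not guarantee it.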
So not only is there no codified rule against differentiation on the Internet by any regulatory body, there are codified rules that permit differentiation from the international body responsible for regulating the Internet.
The legal and economic case for differentiation
There’s an even more fundamental problem with the Washington Post’s use of the term “discrimination,” and economist Dr. George Ford explains it best. By choosing a word with sinister connotations, the Post ignores the accepted legal and economic definition of what constitutes a harmful form of discrimination. A harmful form of price discrimination (what economists call first-degree price discrimination) is charging different prices for the same product or service based on each customer’s willingness and ability to pay. But there is no established economic or legal principle that forbids charging different prices for different products.
Even the more burdensome “common carrier” regulatory framework for telecommunication services, under which the Internet was once regulated, would not forbid price differentiation across different products. Common carrier status does not fundamentally forbid price differentiation, and the United States Postal Service, which is certainly a common carrier, is a prime example. One would not expect to pay the same rates for “express,” “priority,” or “first class” mail or parcel delivery, so why would anyone expect the Internet to work without standard price differentiation?
Users have a right to access all websites at the same speed?
The Washington Post seems to be on a roll when it comes to inventing new rules and rights that no knowledgeable person has ever heard of. The Post claims that:
“for Internet users, having equal ease of access to all content has been perceived as a basic right”
Oh really? I was an Internet user before most people had even heard the word “Internet,” and I never knew I had a “basic right” to access all websites and content at equal speeds. Anyone who regularly uses the Internet knows this simply isn’t the case: websites perform differently from one another, and they perform differently at certain times of day. Perhaps not everyone understands all the factors behind these speed differences, but most people know that some websites just plain suck compared to others, and it has nothing to do with the ISP.
The reality is that many factors go into the performance of a website. Without getting into too much detail, website performance comes down to the following factors.
- Server performance. A “server” is a computer dedicated to serving content or applications.
- Website platform, e.g., WordPress; Apache or IIS; Linux or Windows.
- Database performance.
- Developer and web administrator competence.
- Delivery platform performance. This could be Content Delivery Network (CDN) based, Paid Peering, or traditional IP transit. IP transit is the small-scale option: its bandwidth is slower and more expensive than Paid Peering, but it doesn’t require the distributed server infrastructure that only pays off at larger scale.
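To make the point concrete, here is a rough sketch (in Python, against a throwaway local server standing in for a real website) of how a single page fetch decomposes into stages, each dominated by a different factor from the list above: name resolution, the connect round-trip (network distance, which CDNs and peering shorten), and the server’s own think-plus-transfer time (server, platform, database, and developer competence). The server and its content here are hypothetical stand-ins.

```python
import http.server
import socket
import threading
import time

# A tiny throwaway local server standing in for a real website.
class Handler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"hello"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
host, port = server.server_address

t0 = time.perf_counter()
addr = socket.getaddrinfo(host, port)[0][-1]   # name resolution stage
t1 = time.perf_counter()
sock = socket.create_connection(addr)          # TCP connect (network latency)
t2 = time.perf_counter()
sock.sendall(b"GET / HTTP/1.1\r\nHost: 127.0.0.1\r\nConnection: close\r\n\r\n")
response = b""
while chunk := sock.recv(4096):                # server think time + transfer
    response += chunk
t3 = time.perf_counter()
sock.close()
server.shutdown()
server.server_close()

print(f"resolve: {(t1 - t0) * 1e3:.2f} ms, connect: {(t2 - t1) * 1e3:.2f} ms, "
      f"serve+transfer: {(t3 - t2) * 1e3:.2f} ms")
```

Against a real site across the Internet, the connect and transfer stages are where delivery platforms differ; the rest is entirely in the website operator’s hands, not the ISP’s.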
The concern that ISP-enabled content delivery is too dangerous is unfounded given how many competitors there are. Why is it any more dangerous to allow ISPs to offer enhanced delivery services to B2C websites than for CDN services like Akamai or LimeLight to do so? If anything, the extra competition means more options for websites to choose from and lower prices.
For the vast majority of websites on the Internet, the cost of bandwidth is the least of their worries, since $50/month of server hosting supports ten million webpage views per month. The real challenge is acquiring content, employing talented content creators, and paying for the advertising needed to attract ten million page views per month. Bandwidth for high-resolution video costs a lot more, but even a thousand 720P HD videos can be delivered at a bandwidth cost of 30 to 93 cents (priced lower at higher volume), and even full outsourcing to Vimeo costs $2.40 per 1,000 720P videos delivered. For typical content, worth anywhere between $1 and $40 per thousand page views to advertisers, the economics of using your own infrastructure may or may not make sense depending on scale and the advertising value of the content.
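The back-of-envelope math behind those figures can be sketched as follows. The clip size and the per-gigabyte rates below are my own illustrative assumptions, chosen to land in the 30-to-93-cent range cited above; they are not quoted prices from any provider.

```python
# Hypothetical assumptions (not quoted prices): a short 720P clip of
# roughly 150 MB, and bulk bandwidth rates between $0.002/GB (high-volume
# buyer) and $0.0062/GB (low-volume buyer).
VIDEO_SIZE_GB = 0.15
RATE_LOW = 0.002      # dollars per GB delivered, high volume
RATE_HIGH = 0.0062    # dollars per GB delivered, low volume
VIDEOS = 1_000

cost_low = VIDEOS * VIDEO_SIZE_GB * RATE_LOW
cost_high = VIDEOS * VIDEO_SIZE_GB * RATE_HIGH

print(f"Delivering {VIDEOS} videos: ${cost_low:.2f} to ${cost_high:.2f}")
# Even the fully outsourced figure cited in the text ($2.40 per 1,000
# videos via Vimeo) is small next to the $1-$40 per 1,000 views that
# typical content is worth to advertisers.
```

The point of the exercise: delivery cost is a rounding error next to content value, which is why “the little guy can’t afford delivery” arguments don’t hold up.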
For those who cannot justify their own infrastructure, content (no matter how mundane or stupid) is so valued that companies like Google will not only deliver it over YouTube at no charge, they’ll even pay you for the honor of hosting it if it is sufficiently popular. Just doing some zero-budget videos of stupid “gangsta” tricks has earned some people over $150,000 from YouTube, and ten people make well over $100,000, and as much as $300,000, a year as YouTube partners. Here at DigitalSociety we can’t afford to host video content, so we host our channel on YouTube. We were approached by YouTube to become partners, but we opted for free content delivery without exposing our valued readers to additional ads.
So this notion that enhanced delivery services would somehow price out the “little guy” and prevent him from being heard on the Internet is nonsense. It’s true that the little guy is priced out of hosting his own high-resolution video content, as he always has been, but it’s an irrelevant concern because he can get free delivery, or even get Google to pay him for the honor of using his content to attract viewers. So at the end of the day, there is no logical argument against ISP-enabled enhanced delivery any more than there is one against CDN-enhanced delivery. If anything, CDN-enhanced delivery is vastly more harmful to non-enhanced content than ISP-enhanced services are.
There they go again with that “no FCC authority” assertion
A few months ago, FCC Commissioner Clyburn wrongly accused Internet Service Providers of claiming that the DC Circuit Court of Appeals had stripped the FCC of any authority to regulate ISPs. Not only did the ISPs argue just the opposite, that the FCC still has ancillary authority; Commissioner Clyburn made this accusation at an event held by Free Press, one of the very groups guilty of perpetuating the no-authority myth. Even more ironic, the same Washington Post columnist, Cecilia Kang, who helped perpetuate the no-authority myth was the one who reported on Clyburn blaming the ISPs for it.
Ignoring the lessons of the past, the current Washington Post editorial again asserts that:
“in April, a court ruled that the Federal Communications Commission has no regulatory authority over Internet service providers.”
Sigh, I wonder what Commissioner Clyburn thinks about this.
The Post goes on to conclude that the FCC needs the authority to “nudge” ISPs to behave properly, but the facts indicate that the FCC never lost its ability to influence ISPs. In fact, in 2005 the FCC managed to “nudge” Madison River Communications into ending its blockade of the Vonage Internet telephony service and “voluntarily” contributing $15,000 to the US Treasury, despite the lack of Title II authority.
Furthermore, the FCC never lost against Comcast, and it got every concession it sought even before issuing a formal ruling against the company. Comcast’s controversial actions were done with good intentions but poor implementation, so it corrected them and complied fully with the FCC. The FCC issued no fine against Comcast but did issue a nasty ruling that was unjustified and unfounded. Were it not for the grossly inaccurate portrayal of Comcast’s actions in that ruling and a flagrant disregard for due process, Comcast may well have let it go instead of initiating a costly lawsuit over a seemingly moot ruling. The original complaint against Comcast was so weak that the complainants never even signed a formal complaint, despite pleas from Commissioner Robert McDowell for one.
The courts ultimately ruled in Comcast’s favor, finding that the FCC failed to justify its authority in that particular case, but they did not issue a blanket statement that the FCC has no authority. So despite the mass hysteria over the DC Circuit’s ruling against the FCC’s handling of the Comcast case, the FCC seems to have plenty of “nudge” authority to get everything it wants without even issuing a formal ruling.
The Washington Post’s conclusion
The Washington Post concluded that something like the Verizon-Google compromise would be a better route for Internet regulation. This conciliatory tone and effort to be amenable to both sides of the debate is certainly a huge improvement over the vulgar reactions in much of the media. Despite all its flaws and mischaracterizations of the facts, the Washington Post editorial at least makes an effort to have a sensible debate. But the devil is in the details, and details are what the Post is sorely lacking.
Without completely rehashing my analysis of the Verizon-Google compromise, the agreement gets many things right, such as its acknowledgement that wireless networks are fundamentally different from wired ones. Its biggest mistake is that it presumes price and service differentiation to be illegal unless proven otherwise, under some arbitrary definition of what is and isn’t an “Internet” service, even when both traverse the same broadband infrastructure. By dodging the important debate over whether to follow the Internet’s established business practices and allow reasonable differentiation, the Verizon-Google agreement only pushes that debate into the future, even if it is accepted as law. If these problems can be worked out, the Google-Verizon compromise may eventually morph into the right one.
Despite the lack of any evidence of impending harm and a long history of the FCC being able to “nudge” broadband providers, one can still make a reasonable argument that the FCC should have some authority to oversee them. If an ISP threatens a content provider with degradation of service, beyond the normal congestion it already faces, to pressure that content provider into hosting with the ISP, I see no problem with the FCC having a less-than-hospitable chat with that ISP. If the FCC wants to monitor existing best-effort services to ensure that ISPs aren’t reducing their performance below today’s levels, or wants to ensure a steady improvement in best-effort Internet service, there might be a good case to be made. The problem arises when the current FCC majority wants to issue a blanket prohibition against any enhanced or prioritized services an ISP might offer a B2C website; that is just lazy and harmful public policy.