Slippery slopes can be a logical fallacy, but they’re also a fact of regulatory life. See, for example, the congressional power to regulate interstate commerce, which has become the congressional power to “regulate activity that is neither interstate nor commerce” (Justice Thomas, 2005).
Like Berin Szoka, I fear we’re heading down that slippery slope in the year ahead. Two recent examples…
Tim O’Reilly’s “The War For the Web” is one of the most profound – and profoundly dangerous – articles of 2009. O’Reilly makes a keen observation about the web’s tendency towards consolidation: “it is the design of systems that get better the more people use them, and … over time, such systems have a natural tendency towards monopoly.” But he also says “we are heading into a bloody period of competition that could be extremely unfriendly to the interoperable web as we know it today” and “We’re heading into a war for control of the web. And in the end, it’s more than that, it’s a war against the web as an interoperable platform.” There’s more than a whiff of “only regulators can save us now!” in that. I fear an environment where “the commons” is used to prevent competition; where “open” is not an option but a mandate – a tool to freeze platforms in place.
The New York Times gave us another good example of that slippery slope yesterday with an op-ed proposing “search neutrality”.
Today, search engines like Google, Yahoo and Microsoft’s new Bing have become the Internet’s gatekeepers, and the crucial role they play in directing users to Web sites means they are now as essential a component of its infrastructure as the physical network itself. The F.C.C. needs to look beyond network neutrality and include “search neutrality”: the principle that search engines should have no editorial policies other than that their results be comprehensive, impartial and based solely on relevance.
This isn’t the first time that’s come up. Ars Technica wrote about “Google Neutrality” a couple months ago. I’m not sure how that would even work. How can a search engine possibly avoid discriminating when the very nature of search results is ordinal? Ranking one site first necessarily “discriminates” against every site ranked below it.
Now, I have some sympathy for concerns that search engines (Google, Bing, Yahoo, Ask, etc.) be more transparent – or at least responsive – regarding their practices. Nobody wants to build a business around a high-traffic area, only to find traffic suddenly disappear and nobody answering the phone at the Department of Directions & Billboards.
But “they should do X” and “they should be required to do X” are very different statements. The latter – a demand for regulatory intervention – is a tiger… and tigers are unpredictable. As Winston Churchill said, “Dictators ride to and fro upon tigers which they dare not dismount.”
We’ve already gone a little way down this path with net neutrality, which purports to prevent things that nobody really wants to do, but will also have much larger, unintended consequences. I would hope the FCC limits net neutrality to rules that simply increase transparency and prevent genuinely anti-competitive or harmful behavior.
Unfortunately, bad ideas like this are what happens when you mix your metaphors and ride a tiger down a slippery slope. Technosailor is exactly right about a lot of “neutrality” policies ultimately being regulatory rent-seeking or attempts to turn competitors into commodities. Jim Harper is usually right on target, and this is no exception…
Though good ironic comeuppance for Google, “search neutrality” regulation would ossify an innovative business and deprive consumers of the benefits of competition.