Why content stays narrow when broadband goes fat
I was chatting with a friend about the FCC announcement that it would cost $350B to bring fiber to every home, and some interesting thoughts came up that might upset some of my friends who want to see fiber to every home yesterday.
It occurred to me that we might end up just like Japan, where cheap, fast, shared 100 Mbps broadband is available to the residential market, but dedicated 100 Mbps data center bandwidth is extremely expensive (far more than in the U.S.). The result is that you have all this fast broadband with no applications or content that will actually use it. Free video “tube” sites in Japan like Nico Nico Douga are rarely as “generous” with video bandwidth as their US counterparts (and even YouTube’s 2.25 Mbps isn’t that generous by HD video standards). While I was researching the paper “Need for speed”, a Japanese analyst told me earlier this year that only the subscription video services and premium porn sites offer 4 Mbps download streams (I swear to you this is how I know this). By comparison, US-based services like Microsoft’s Xbox Live Marketplace have been offering 6 Mbps content for years. This paper has some good information on the plateauing of bandwidth usage in Japan.
Note: Shared 100 Mbps broadband in Japan might cost less than $70 per month, but a dedicated unlimited server circuit would cost $11,000 in Tokyo based on a quotation I received last month. Dedicated circuit prices in US data centers are much better at $1,100 for a dedicated 100 Mbps connection, but even that gives me a strong incentive to keep my application bandwidth to a minimum if I had to serve 100,000 people.
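To make that incentive concrete, here is a back-of-envelope sketch. It uses only the numbers in this post (the $1,100/month US quote for a dedicated 100 Mbps circuit and the stream bitrates mentioned here), and it pessimistically assumes all 100,000 people are watching at once; the function name and structure are mine, not anything a carrier publishes:

```python
import math

CIRCUIT_KBPS = 100_000   # one dedicated 100 Mbps circuit
CIRCUIT_COST = 1_100     # USD/month, the US data-center quote above
                         # (the Tokyo quote was $11,000 for the same pipe)

def monthly_serving_cost(concurrent_viewers, stream_kbps):
    """Monthly circuit cost to serve that many simultaneous streams,
    rounded up to whole 100 Mbps circuits."""
    needed_kbps = concurrent_viewers * stream_kbps
    circuits = math.ceil(needed_kbps / CIRCUIT_KBPS)
    return circuits * CIRCUIT_COST

# 100,000 concurrent viewers at the bitrates discussed in this post:
for kbps in (320, 2_250, 6_000):   # YouTube 2008 / YouTube "HD" / Xbox Live
    print(f"{kbps / 1000:.2f} Mbps streams: "
          f"${monthly_serving_cost(100_000, kbps):,}/month")
```

At these prices, 0.32 Mbps streams to 100,000 concurrent viewers already run to $352,000 a month, and 6 Mbps streams to $6.6 million, which is why a free content site thinks hard before raising bitrates.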
Now what happens if I wave a magic wand and every home has 100 Mbps broadband access like Japan tomorrow? Will content providers instantly jump to provide us higher-bandwidth content? Based on what history tells us, no. The applications and content we have access to won’t change. YouTube, for example, stayed with 0.32 Mbps video streams as late as early 2008; only then did it officially start offering “HQ” streams at 0.64 Mbps, and in 2009 came so-called “HD” streams at 2.25 Mbps. But Google’s YouTube or someone else could have been offering 1.3 to 5 Mbps content as far back as 2003, because many broadband users could get those speeds, yet it didn’t happen. Why? Because server bandwidth costs are the limiting factor. Dedicated server capacity is much more expensive than client-side shared broadband capacity, and it costs money to offer high-bandwidth content, which is difficult to justify for free content sites.
So when we really think about it, US residential broadband performance is less unbalanced against our ability to serve content, at the true cost of dedicated bandwidth, because both evolved through free market forces. Countries like Japan and Korea, where the government artificially boosted broadband performance with subsidies and tax incentives, effectively have less balanced systems where broadband performance far exceeds what the content/application companies can afford to serve.
What does this mean? It means the utility of fast bandwidth in Japan is limited by the lack of cheap bandwidth in the rest of the network. P2P usage is higher because the “server” bandwidth comes from residents with cheap bandwidth, but even that had to be severely limited because transit costs on the Internet backhaul were killing the ISPs and rampant piracy was murdering the content creators. The solution? They left the downstream “unlimited” with no caps but capped the upstream to a 2.5% duty cycle. So in practice a consumer’s P2P upstream averages no more than about 2 Mbps, and since every P2P download comes from some other peer’s upload, P2P download speeds average no more than about 2 Mbps as well.
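The duty-cycle arithmetic works out as follows. This is a quick sketch using only the figures in the paragraph above (a 100 Mbps line and a 2.5% cap); the variable names are mine:

```python
LINK_MBPS = 100          # shared residential line
DUTY_CYCLE = 0.025       # the 2.5% upstream cap described above
SECONDS_PER_DAY = 86_400

# Hard ceiling on average upstream under the cap:
avg_upstream_mbps = LINK_MBPS * DUTY_CYCLE   # 2.5 Mbps

# The same cap expressed as a daily upload allowance (megabits -> GB):
daily_upload_gb = avg_upstream_mbps * SECONDS_PER_DAY / 8 / 1_000

print(f"average upstream ceiling: {avg_upstream_mbps} Mbps")
print(f"daily upload allowance:   {daily_upload_gb} GB/day")
```

Since every bit downloaded in a P2P swarm was uploaded by some other peer, this same ~2.5 Mbps ceiling bounds average P2P download speeds, which is why consumers end up around the 2 Mbps average figure.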
While I’m certainly not suggesting that we don’t need more bandwidth (we do, and we are in the process of getting it), this concept of balanced bandwidth between client and server is food for thought as we debate whether proactive policies are needed to accelerate broadband performance and adoption curves. If we do decide that more proactive policies are needed, it is important to understand that broadband is only half the equation on the Internet.