
Why content stays narrow when broadband goes fat

1 October 2009 · 10 Comments

I was chatting with a friend about the FCC announcement that it would cost $350B to bring fiber to every home, and some interesting thoughts came up that might upset some of my friends who want to see fiber to every home yesterday.

It occurred to me that we might just end up like Japan, where they have cheap, fast, shared 100 Mbps broadband for the residential market, but dedicated 100 Mbps data center bandwidth is extremely expensive (a lot more so than in the U.S.).  The result is that you have all this fast broadband with no applications or content that will actually use it.  The free video “tube” sites in Japan, like Nico Nico Douga, are rarely as “generous” with video bandwidth as their US counterparts (and 2.25 Mbps on YouTube isn’t that generous by HD video standards).  While I was researching the paper “Need for speed”, a Japanese analyst told me earlier this year that only the subscription video services and premium porn sites offer 4 Mbps download streams (I swear to you this is how I know this).  By comparison, US-based services like the Microsoft Xbox Live Marketplace have been offering 6 Mbps content for many years.  This paper has some good information on the plateauing of bandwidth usage in Japan.

Note: Shared 100 Mbps broadband in Japan might cost less than $70 per month, but a dedicated unlimited 100 Mbps server circuit in Tokyo would cost $11,000, based on a quotation I received last month.  Dedicated circuit prices in US data centers are much better at $1,100 for a 100 Mbps connection, but even that gives me a strong incentive to keep my application bandwidth to a minimum if I had to serve 100,000 people.
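
To put those quotes in perspective, here is a minimal back-of-the-envelope sketch (assuming both quotes are monthly prices; the stream bitrates and the 100,000-viewer figure are illustrative assumptions, not measurements):

    # Back-of-the-envelope: monthly cost of serving concurrent video streams
    # from dedicated 100 Mbps circuits. Prices are the quotes cited above;
    # the bitrates and the 100,000-viewer figure are illustrative assumptions.
    import math

    CIRCUIT_MBPS = 100              # capacity of one dedicated circuit
    PRICE_PER_CIRCUIT = {           # assumed monthly price per 100 Mbps circuit (USD)
        "Tokyo": 11_000,
        "US": 1_100,
    }

    def monthly_cost(viewers: int, stream_mbps: float, price: float) -> float:
        """Cost of enough 100 Mbps circuits to carry this many simultaneous streams."""
        circuits = math.ceil(viewers * stream_mbps / CIRCUIT_MBPS)
        return circuits * price

    for city, price in PRICE_PER_CIRCUIT.items():
        for mbps in (0.32, 2.25, 6.0):   # YouTube circa 2007, YouTube "HD", Xbox Live Marketplace
            cost = monthly_cost(100_000, mbps, price)
            print(f"{city}: {mbps} Mbps x 100,000 viewers -> ${cost:,.0f} per month")

Even at the cheaper US price, moving from 0.32 Mbps to 2.25 Mbps streams multiplies the monthly serving bill roughly sevenfold, which is exactly the pressure that keeps free sites at low bitrates.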

Now what happens if I wave a magic wand and all of a sudden every home has 100 Mbps broadband access like Japan tomorrow?  Will the content providers instantly jump to provide us higher-bandwidth content?  Based on what history tells us, no.  The applications and content that we have access to won’t change.  YouTube, for example, stayed with 0.32 Mbps video streams as late as early 2008.  It was only in 2008 that they started officially offering “HQ” streams at 0.64 Mbps, and then in 2009 came so-called “HD” streams at 2.25 Mbps.  But Google’s YouTube or someone else could have been offering 1.3 to 5 Mbps content as far back as 2003, because many broadband users could already get those speeds, but it didn’t happen.  Why?  Because server bandwidth costs are the limiting factor.  Dedicated server capacity is much more expensive than client-side shared broadband capacity, and it costs money to offer high-bandwidth content, which is difficult to justify for free content sites.

So when we really think about it, residential broadband performance in the US is less unbalanced relative to the true cost of serving content from dedicated bandwidth, because it evolved through free-market forces.  Countries like Japan and Korea, where the government artificially boosted broadband performance with subsidies and tax incentives, effectively have less balanced systems, where broadband performance far exceeds what the content and application companies can afford to serve.

What does this mean?  It means the utility of fast bandwidth in Japan is limited by the lack of cheap bandwidth in the rest of the network.  P2P usage is higher because the “server” bandwidth is coming from residents with cheap bandwidth, but even that had to be severely limited, because the transit bandwidth costs on the Internet backhaul were killing the ISPs and rampant piracy was murdering the content creators.  The solution?  They left the downstream “unlimited” with no caps but capped the upstream to a 2.5% duty cycle.  So at most, a consumer can sustain a P2P upstream of about 2 Mbps on average, and since every P2P download is some other peer’s upload, average P2P download speeds can be no more than about 2 Mbps as well.
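
As a quick sanity check on that arithmetic, here is a minimal sketch. The 100 Mbps line rate and the 2.5% duty cycle are the figures above; treating the swarm-wide average download rate as bounded by the average upload rate is the usual P2P assumption.

    # Effective P2P throughput under an upstream duty-cycle cap.
    # Figures from the post: 100 Mbps line rate, ~2.5% upstream duty cycle.
    LINE_RATE_MBPS = 100
    UPSTREAM_DUTY_CYCLE = 0.025

    avg_upload_mbps = LINE_RATE_MBPS * UPSTREAM_DUTY_CYCLE    # 2.5 Mbps sustained
    # Every downloaded bit in a swarm is some peer's uploaded bit, so the
    # swarm-wide average download rate cannot exceed the average upload rate.
    avg_download_mbps = avg_upload_mbps

    daily_upload_gb = avg_upload_mbps / 8 * 86_400 / 1_000    # Mbps -> MB/s -> GB/day
    print(f"Sustained upstream: {avg_upload_mbps:.1f} Mbps")
    print(f"Average download ceiling: {avg_download_mbps:.1f} Mbps")
    print(f"Implied upload volume: ~{daily_upload_gb:.0f} GB per day")

That works out to roughly 2.5 Mbps sustained, in line with the roughly 2 Mbps average above.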

While I’m certainly not suggesting that we don’t need more bandwidth (we do, and we are in the process of getting it), this concept of balanced bandwidth between client and server is food for thought as we debate whether proactive policies are needed to accelerate broadband performance and adoption curves.  If we do decide that more proactive policies are needed, it is important to understand that broadband is only half of the equation on the Internet.

10 Comments »

  • Paul William Tenny said:

    I was chatting with a friend over the FCC announcement that it would cost $350B to bring every home fiber and some interesting thoughts came up which might upset some of my friends who want to see fiber to every home yesterday.

    So about $3,125 per home on average, with some homes costing a lot more and some costing a lot less. Or roughly half of what we’ve already spent in Iraq. Call me nuts, but even at $350B it still sounds like a bargain for what it would do for the country.

    Even if you can’t use it to its full potential, it’d still be the closest we may ever get to a future-proof upgrade.

    Although I’d probably want to sell my stock — if I had any — in every cable company and non-cellular telco. Wouldn’t need them anymore, would we.

  • Leonard Grace said:

    Why Cable-Telco’s Should Not Ignore FTTH!
    by Leonard Grace on October 23, 2009

    http://www.thecablepipeline.com

    Has Verizon seen past the critics of FTTH in realizing that it will ultimately be the end game? Do not try to convince the cable industry, like Comcast or Time Warner Cable, or telcos like AT&T, who have their own version of what the broadband landscape should look like, at least in the near term. Not to be too overoptimistic: FTTH is costing Verizon on the front end.

    If you peer into where broadband, video, and wireless are headed, both in the short- and long-term scheme of things, one must wonder how their networks will stand up to the onslaught of applications, file sharing, video content, and femtocells. Their current platforms were engineered to carry data, not all of the above, and not at the speeds consumers and businesses will demand. Unfortunately, HFC relies heavily on shared bandwidth, with linear programming limiting its ability to offer flexible bandwidth for broadband. DOCSIS 3 was supposed to eliminate this problem, but not to the extent of competing soundly with FTTH.

    My research indicates that FTTH is about 20% more expensive to deploy than HFC on the front end. However, the added costs are where the differences end. FTTH take rates for direct overbuild competitors are about 70%, up from the normal 30-40% for HFC. Reliability and higher quality drive those take rates, along with the competitive advantage of adding new revenue-generating services down the road. In essence, it is both a short- and long-term strategy.

    Notably, making a case for this type of capital commitment has to be a hard sell in the confines of the boardroom. After all, companies have been making stellar profits, even in a down economy, while continuing to believe that building capacity, redundancy, and low latency does not add to shareholder value. A switch to FTTH would take time and money, something executives should be planning for now, not just relying on CableLabs’ DOCSIS 3 or AT&T’s U-verse to carry them through to the end. Unfortunately, executives continue to believe in the short term, which correlates to the quarterly profits Wall Street has mandated for them.

    However, looking forward technologically is seemingly not on the cable industry’s immediate agenda; but it should be, and faster than anyone imagined a few years ago. Consider that municipalities, and yes, countries are getting on the FTTH bandwagon, bypassing traditional cable construction while realizing the many benefits of this type of technology.

    Other benefits include being a green technology and a cheaper alternative to maintaining copper wires, while not running out of capacity for a long time to come. Yet there are still naysayers asking why spend the money if immediate profits are good and consumers are not jumping off the bandwagon en masse.

    Is this not why the FCC is currently revisiting the rules governing broadband, now that restricting capacity on company networks has initiated a firestorm of concerns, not only from consumers but also from businesses with a stake in the outcome, like Google? The point is that an investment in a long-term technology like FTTH might keep governmental agencies from regulating your industry out of business, and at the same time may keep immediate and future competitors at bay. Does that not make good business sense in the long term?