An Analysis of Internet Content Delivery Systems

From: Danny Wyatt (danny@cs.washington.edu)
Date: Sun Nov 07 2004 - 22:54:29 PST

    An Analysis of Internet Content Delivery Systems
    Stefan Saroiu, Krishna Gummadi, Richard Dunn, Steve Gribble, Hank Levy

    This paper presents nine days' worth of HTTP data observed between UW
    and the Internet. The analysis shows that P2P traffic, despite
    involving fewer requests and fewer users than the web, still accounts
    for most of the network traffic because its objects are far larger. I
    found two aspects of the analysis particularly interesting.

    First, their data supports a "some peers are more equal than others"
    interpretation of P2P traffic. They observe that 600 (out of 281,026)
    external P2P servers provide over a quarter of the incoming P2P bytes.
    They also observe that UW P2P servers have an export/import ratio of
    almost 7 to 1. P2P services allow users to choose the hosts from which
    they want to download objects. Typically, these hosts are ranked by
    observed bandwidth. My intuition suggests that hosts within UW have
    more uplink bandwidth than home users (who typically have asymmetric
    uplink/downlink speeds), and thus they appear more attractive as P2P
    servers. Similarly, the top external P2P servers probably appear more
    attractive by some criterion and are thus more popular. This
    interpretation suggests that network inequities push P2P services
    toward a more traditional client-server behavior.
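
    As a rough illustration of the kind of aggregation behind those two
    numbers, here is a small Python sketch (the transfer log and its
    fields are invented for illustration, not taken from the paper) that
    computes an export/import byte ratio and the share of incoming bytes
    served by the top external peers:

        from collections import defaultdict

        # Hypothetical transfer records: (external_peer, direction, bytes),
        # where direction is "in" (bytes entering UW) or "out" (leaving UW).
        transfers = [
            ("peerA", "in", 700_000_000),
            ("peerB", "in", 50_000_000),
            ("peerA", "out", 4_900_000_000),
            ("peerC", "out", 100_000_000),
        ]

        in_bytes = defaultdict(int)   # incoming bytes per external peer
        out_total = 0                 # total bytes exported from UW
        for peer, direction, nbytes in transfers:
            if direction == "in":
                in_bytes[peer] += nbytes
            else:
                out_total += nbytes

        in_total = sum(in_bytes.values())
        print("export/import ratio: %.1f to 1" % (out_total / in_total))

        # Fraction of incoming bytes contributed by the top N external peers.
        N = 1
        top = sorted(in_bytes.values(), reverse=True)[:N]
        print("top-%d share of incoming bytes: %.0f%%"
              % (N, 100 * sum(top) / in_total))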

    Second, they suggest that a local cache could provide all the benefit of
    Akamai for content providers. They show that a small number of objects
    account for most of the Akamai traffic. Additionally, though it goes
    unmentioned in the text, Figure 11a shows that Akamai is doing its job
    well: all traffic from UW is routed to a small handful of Akamai
    servers. These two facts together suggest that Akamai is acting as a
    local cache server, albeit one probably not tuned specifically to
    UW's request signature. This conclusion is appealing, but not too
    surprising. If there were adequate, reliable, trustworthy local
    caches, then content providers would not need to hire Akamai. But
    because there are none that fulfill all three of these needs
    (particularly trustworthiness, or a certain degree of liability),
    the providers must supply these themselves, via Akamai.
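
    One quick way to quantify how much of the Akamai traffic a local
    cache could absorb is to replay the request trace through an ideal
    (infinite, never-stale) cache and measure what fraction of bytes
    would have been served locally. A minimal sketch, with a made-up
    trace format of (object_url, object_bytes) pairs:

        # Estimate the byte hit rate an ideal local cache could achieve by
        # replaying a request trace: the first request for each object is a
        # miss, every later request is a hit.
        def ideal_cache_byte_hit_rate(requests):
            seen = set()
            hit_bytes = total_bytes = 0
            for url, nbytes in requests:
                total_bytes += nbytes
                if url in seen:
                    hit_bytes += nbytes   # object already cached locally
                else:
                    seen.add(url)         # first request: fetch and cache
            return hit_bytes / total_bytes if total_bytes else 0.0

        trace = [("a", 10_000), ("b", 5_000), ("a", 10_000), ("a", 10_000)]
        print("ideal byte hit rate: %.0f%%"
              % (100 * ideal_cache_byte_hit_rate(trace)))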

