Review for "A Digital Fountain Approach to Reliable Distribution of Bulk Data"

From: Tyler Robison (trobison@cs.washington.edu)
Date: Sun Oct 10 2004 - 23:29:23 PDT

  • Next message: Danny Wyatt: "A Digital Fountain Approach to the Reliable Distribution of Bulk Data"

            The paper describes a 'digital fountain' protocol for sending a
    piece of data to a large number of clients, possibly on many different
    types of networks, who may not be listening at the beginning of the
    transmission. Furthermore, it is assumed that there is little to no
    communication from the client to the server, and that the overall process
    should be fairly efficient. The solution uses erasure codes: the server
    transmits the k packets of source data along with k packets of redundant
    data generated by the erasure code, and once any k of these packets have
    been received, the original data can be reconstructed. The server
    repeatedly cycles through these packets as long as someone is trying to
    download them, and there is no need for clients to send anything back.
    So even if they start late, or miss a few packets here and there, clients
    won't necessarily have to wait until the transmission cycles back to the
    beginning in order to get the data.
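    To make the "any k of n" property concrete, here is a toy sketch in
    Python. This is NOT the paper's Tornado codes; it is the simplest
    possible erasure code (k data packets plus one XOR parity packet, so
    any k of the k+1 packets suffice), just to illustrate the
    reconstruction idea. All names here are illustrative.

    ```python
    # Toy erasure code sketch (not Tornado codes): k data packets plus one
    # XOR parity packet. Any k of the k+1 encoded packets can rebuild the
    # original data, which is the property the digital fountain relies on.

    def encode(packets):
        """Return the original packets plus one XOR-parity packet."""
        parity = bytes(len(packets[0]))
        for p in packets:
            parity = bytes(a ^ b for a, b in zip(parity, p))
        return packets + [parity]

    def decode(received, k):
        """Rebuild the k data packets from any k of the k+1 encoded packets.

        `received` maps packet index -> payload; index k is the parity packet.
        """
        if all(i in received for i in range(k)):
            return [received[i] for i in range(k)]
        # Exactly one data packet is missing: XOR everything held to recover it.
        missing = next(i for i in range(k) if i not in received)
        rebuilt = bytes(len(next(iter(received.values()))))
        for payload in received.values():
            rebuilt = bytes(a ^ b for a, b in zip(rebuilt, payload))
        out = dict(received)
        out[missing] = rebuilt
        return [out[i] for i in range(k)]

    data = [b"AAAA", b"BBBB", b"CCCC"]                     # k = 3 source packets
    encoded = encode(data)                                 # 4 packets on the wire
    lossy = {0: encoded[0], 2: encoded[2], 3: encoded[3]}  # packet 1 lost
    print(decode(lossy, 3))                                # -> [b'AAAA', b'BBBB', b'CCCC']
    ```

    Tornado codes generalize this with sparse random graphs of XOR
    constraints, trading a slightly larger reception overhead for much
    faster encoding and decoding than Reed-Solomon.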
            The idea itself seems to be a very good one, and the paper seems
    pretty solid: it compares Tornado codes against the alternative
    Reed-Solomon and interleaved codes, shows benchmark results for each,
    and overall gives a good sense of the tradeoffs involved. In addition,
    they implement a working prototype to demonstrate that the approach
    works in practice, and they evaluate their congestion control scheme.
            One rather significant limitation is that it wouldn't necessarily
    work well for time-critical cases, like streaming video, though the
    authors do say it isn't intended for real-time applications. It might be
    possible, though, to use similar techniques on real-time data: apply the
    codes not to the entire stream but to small segments, maybe a few
    seconds' worth of data each, and send a couple of redundant packets to
    help fill in any gaps caused by lost data. Of course, decoding times and
    the potentially small benefit might make the whole thing pointless.
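    That per-segment idea (which is the reviewer's speculation, not
    something from the paper) could be sketched as follows, again using a
    single XOR parity packet per segment as a stand-in for a real code; all
    function names are hypothetical.

    ```python
    # Hypothetical per-segment coding for a real-time stream: split the
    # stream into segments of k packets and append one XOR parity packet to
    # each, so a single loss inside a segment is repairable locally without
    # waiting for the whole stream or a retransmission.

    def xor_parity(packets):
        """XOR all packets in a segment together into one parity packet."""
        parity = bytes(len(packets[0]))
        for p in packets:
            parity = bytes(a ^ b for a, b in zip(parity, p))
        return parity

    def encode_stream(stream_packets, k):
        """Yield segments of up to k packets, each with its parity appended."""
        for i in range(0, len(stream_packets), k):
            segment = stream_packets[i:i + k]
            yield segment + [xor_parity(segment)]

    stream = [bytes([n]) * 4 for n in range(6)]    # 6 small stream packets
    segments = list(encode_stream(stream, 3))      # 2 segments of 3 + 1 packets
    print([len(s) for s in segments])              # -> [4, 4]
    ```

    Whether this pays off in practice would depend on exactly the concerns
    raised above: per-segment decoding latency versus how much loss the
    added redundancy can actually mask.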

