Review-9

From: Pravin Bhat (pravinb@u.washington.edu)
Date: Mon Nov 01 2004 - 05:43:39 PST

    Paper summary: The paper proposes a new metric for computing shortest routes
    in intradomain routing algorithms like SPF. The proposed metric fixes several
    known issues with the previous delay-based metric, which performs
    sub-optimally under heavy traffic.

    Strengths:
    The new metric normalizes several criteria into the link-cost evaluation
    function, including link bandwidth (instead of simply link delay) and the
    current network load. This makes the routing algorithm robust to the presence
    of heterogeneous links in the network domain and to varying network load.
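
    As a toy illustration of that idea (my own sketch, not the paper's actual
    transform; the reference speed and the load term below are made-up
    placeholders):

      # Hypothetical sketch: fold bandwidth and load into one link cost.
      # A fast idle link is cheap; a slow or heavily loaded link is costly.
      def link_cost(capacity_bps, utilization, ref_bps=56000.0):
          base = ref_bps / capacity_bps  # bandwidth term: favors fast links
          load = utilization / (1.0 - min(utilization, 0.95))  # steep near saturation
          return base + load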

    By smoothing out changes in the computed link costs over time, the algorithm
    strengthens the stability of the routing tables and, in turn, the stability
    of the entire network domain (a rough sketch of such an update rule follows
    this list). This has several desirable side effects:
      - Increases overall network utilization
      - Avoids unnecessary congestion during heavy traffic
      - Reduces oscillation in link costs, which cuts unnecessary routing-related
        traffic and the CPU cycles wasted on updates to the SPF data structures
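
    A minimal sketch of the kind of smoothed, bounded update this implies (the
    smoothing weight, step cap, and bounds below are placeholders, not the
    paper's hand-tuned values):

      # Hypothetical sketch of one routing cycle's cost update: average the
      # new measurement into the old cost, cap how far the cost may move in
      # a single cycle, and clamp it to the link type's allowed range.
      def update_cost(prev_cost, measured_cost,
                      alpha=0.5,        # smoothing weight (placeholder)
                      max_step=0.5,     # max change per cycle (placeholder)
                      lo=1.0, hi=4.0):  # per-link-type bounds (placeholder)
          smoothed = alpha * measured_cost + (1.0 - alpha) * prev_cost
          step = max(-max_step, min(max_step, smoothed - prev_cost))
          return max(lo, min(hi, prev_cost + step))

    Capping the per-cycle step is what damps the oscillation noted above, while
    the bounds keep a congested link from ever looking arbitrarily expensive.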

    Since the benefits are aligned with the interests of network administrators
    in addition to those of end users, the metric is more likely to be adopted.
    The proposed changes are simple to implement and affect only domain-level
    routes, so they can be made independently of other domains. The changes
    translate into immediate results for all intradomain messages and, over
    time, provided others adopt the algorithm, interdomain routing is also
    improved.

    The paper provides a fairly thorough simulation-based comparison of
    performance across various criteria (stability, utilization, etc.) with
    competing implementations of SPF.

    Limitations and Areas for improvement:
    While the paper provides some good insights into why D-SPF fails to work
    under certain conditions, it fails to provide any good insight into the
    underlying reasons why HN-SPF works under those same conditions. What is
    provided is a set of heuristics hand-tuned for the ARPANET, without any
    technique for adapting these heuristics to other networks or to network
    changes - e.g., how does one decide the upper- and lower-bound constants
    for a new link type?

    The new metric greatly reduces the oscillation in link costs; however,
    HN-SPF still oscillates around the equilibrium with a bounded amplitude.
    This causes a certain class of flows (e.g., a flow whose second-best route
    is only slightly longer than its optimal route) to oscillate among several
    routes. This keeps routers from developing any soft per-flow state for
    these flows, state that could help them provide certain features to the
    network - e.g., congestion avoidance, fair queuing, QoS, etc.

    The paper could be better polished. For example, it makes references to the
    M/M/1 queuing model without providing a reference to any documentation that
    would explain this queuing model to an unfamiliar reader like myself. Some
    of the figures (7, 8, etc.) have no legends explaining what each of the
    line styles means.
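
    (For reference, and as background the paper itself omits: M/M/1 is the
    standard single-server queue with Poisson arrivals at rate lambda and
    exponentially distributed service times at rate mu. Its mean time in
    system, T = 1 / (mu - lambda), grows without bound as the utilization
    rho = lambda / mu approaches 1, which is presumably why the authors use
    it to model load-dependent link delay.)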

    Relevance and Future work:
    In my opinion this paper has some historic value that remains relevant to
    this day. The paper belongs to a class of papers published in the late 80s
    (e.g., the Jacobson paper on congestion control) that marked a change in
    the philosophy of how algorithms for the internet were designed to work:
    the algorithms shifted from working in the best interests of individual
    hosts to working towards the best interests of the hosts in concert with
    the interests of the entire network. As game theory suggests, this is a
    shift towards a more stable system.

    For future work, the authors need to define the constants used in their
    paper in terms of some intrinsic properties of a link (e.g., delay,
    bandwidth, noise) so that new link types could be easily incorporated into
    existing networks. Also, a mathematical analysis of how the
    heuristics/weights depend on the network's properties (e.g., size,
    topology, load) would help better explain the guarantees, if any, provided
    by HN-SPF.
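
    One possible shape of such a parameterization (entirely hypothetical; the
    coefficients below have no basis in the paper):

      # Hypothetical: derive a link type's cost bounds from intrinsic
      # properties, so that new link types need no hand-tuning.
      def cost_bounds(prop_delay_s, capacity_bps, ref_bps=56000.0):
          lo = 1.0 + 25.0 * prop_delay_s  # idle cost grows with propagation delay
          hi = lo + 3.0 * min(1.0, ref_bps / capacity_bps)  # slower links get more headroom
          return lo, hi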

