From: Masaharu Kobashi (mkbsh@cs.washington.edu)
Date: Sun Oct 31 2004 - 14:38:59 PST
1. Main result of the paper
The paper presents the revised routing metric of the ARPANET, tracing
the history of its evolution up to HN-SPF. It also analyzes how HN-SPF
outperforms the previous alternatives, min-hop and D-SPF, and supports
the analysis with actual observations of the performance gains during
the transition from the old metric to the new (HN-SPF) one: a 46%
reduction in round-trip delay and a 13% increase in network throughput.
2. Strengths of this paper
The greatest achievement of the proposed idea is that it substantially
improved the performance of the ARPANET in practice.
The real-world effectiveness of the revised metric, HN-SPF, comes
essentially from three very simple but robust methods (sketched in
code below):
(1) Averaging the measured link delay
(2) Limiting the maximum change in the updated link cost
(3) Limiting the minimum change in the updated link cost
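To illustrate, here is a minimal sketch in Python of a link-cost
update built on these three methods. It is an illustration under
assumed constants, not the paper's actual algorithm: the step limits,
the cost bounds, and the delay-to-cost mapping are all placeholders.

# Minimal sketch of a link-cost update using the three methods above.
# All constants are illustrative, not the paper's values.

MAX_STEP = 0.5                   # (2) largest allowed cost change per update
MIN_STEP = 0.05                  # (3) changes smaller than this are suppressed
MIN_COST, MAX_COST = 1.0, 10.0   # assumed bounds on the cost itself

def update_link_cost(prev_cost, delay_samples):
    # (1) Average the delay measurements over the update period instead
    # of reacting to any single, possibly transient, sample.
    avg_delay = sum(delay_samples) / len(delay_samples)

    # Stand-in for the paper's delay-to-cost transformation.
    raw_cost = avg_delay

    # (2) Clamp how far the cost may move in one update, so a sudden
    # spike cannot swing all traffic off the link at once.
    step = max(-MAX_STEP, min(MAX_STEP, raw_cost - prev_cost))

    # (3) Suppress updates below the minimum step, interpreting the
    # minimum-change limit as a threshold against flooding the network
    # with insignificant updates.
    if abs(step) < MIN_STEP:
        return prev_cost

    # Keep the advertised cost within its allowed dynamic range.
    return max(MIN_COST, min(MAX_COST, prev_cost + step))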
A virtue of the revised algorithm that the paper does not state is
that it focuses on eliminating downside risks rather than trying to
achieve optimal use of bandwidth at all times. This matters because
the real world is a constantly changing environment, full of
unpredictable phenomena such as misbehaving routers and poorly
designed networks, and any algorithm that depends on every node and
link behaving accurately is brittle. The methods above form a rough
algorithm, not one that guarantees optimal use of bandwidth at all
times. Instead, it concentrates on eliminating the downside risks:
overreaction to small changes, oscillation, and the formation of
persistent loops. This strategy works, and it does so robustly.
3. Limitations and suggested improvements
One weakness in the argument is that the authors give almost no
justification or derivation for the values of the coefficients in
the proposed formulas.
Second, the minor oscillation around the equilibrium point could be
eliminated in the following way. Since it is known in which region of
the cost-utilization map the oscillation around the equilibrium point
starts, the network's response could be changed by using a two-tier
system: the proposed HN-SPF, plus an extra response function and cost
function applied only when the reported cost and utilization are near
the equilibrium point. The extra functions would be designed to let
the network settle at the equilibrium point once it enters the
oscillation region, which can be achieved by tightening the update
limits, among other means. Of course, this requires more precise
settings of the coefficients and limits, but it conveys the rough
idea of the two-tier system.
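To make the idea concrete, here is a minimal sketch of the two-tier
update in Python. It is my own illustration, not an algorithm from
the paper: the equilibrium utilization, the oscillation-region width,
and the step limits are all assumed values chosen for the example.

# Sketch of the two-tier idea: the normal update limit applies in
# general, but a tighter (damped) limit takes over inside the region
# where oscillation around the equilibrium is known to start.
# All constants below are hypothetical, not values from the paper.

EQUILIBRIUM_UTIL = 0.8    # assumed utilization at the equilibrium point
OSCILLATION_BAND = 0.1    # assumed half-width of the oscillation region
NORMAL_MAX_STEP = 0.5     # tier 1: ordinary per-update cost limit (assumed)
DAMPED_MAX_STEP = 0.1     # tier 2: tighter limit near equilibrium (assumed)

def in_oscillation_region(utilization):
    # Tier selector: close enough to equilibrium that oscillation starts.
    return abs(utilization - EQUILIBRIUM_UTIL) < OSCILLATION_BAND

def two_tier_update(prev_cost, target_cost, utilization):
    # Move the advertised cost toward its target, but cap the step more
    # tightly near equilibrium so the cost settles instead of repeatedly
    # overshooting the equilibrium point.
    max_step = (DAMPED_MAX_STEP if in_oscillation_region(utilization)
                else NORMAL_MAX_STEP)
    step = max(-max_step, min(max_step, target_cost - prev_cost))
    return prev_cost + step

With the damped limit, updates near equilibrium can no longer jump
across the equilibrium point on each cycle, which is the settling
behavior described above.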
As for the equilibrium argument, in the real world equilibrium itself
is not as vital as it sounds. What matters is eliminating drastic
fluctuations during the adjustment toward equilibrium, since most of
the time the network is in a non-equilibrium state, i.e., still
adjusting. In this sense HN-SPF is quite usable in the real world
despite the fact that it oscillates around the equilibrium.
4. Relevance today and in the future
The next step in further improving performance is the use of
multi-path routing algorithms. (The paper also refers to them when it
touches on the weaknesses of HN-SPF.) Even so, the strategy of
emphasizing the elimination of downside risks, at the cost of giving
up perfect optimization, will remain the right way forward.