From: Alan L. Liu (aliu@cs.washington.edu)
Date: Tue Nov 30 2004 - 23:41:14 PST
The paper consists of two main parts: examining the potential threat of
future Internet worms, and then proposing a computer-worm equivalent of
the Centers for Disease Control to act as a central authority in
combating worms.
I found the paper to be quite readable. It presented how some existing
worms were able to attack the Internet and raised some interesting
points about their characteristics, such as the different effects worms
had on different platforms. This raises an issue analogous to the
spread of biological diseases, namely that heterogeneity makes for a
natural defense against epidemics. One could see that widespread
reliance on Windows 2000, with its lax out-of-the-box security posture,
was the main culprit in the Code Red I/II outbreaks.
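The heterogeneity point can be sketched with a toy random-scanning worm
model (my own illustration, not from the paper; all the numbers are
made-up assumptions): each infected host probes random addresses, and
only the susceptible fraction of the population can be infected, so a
smaller monoculture both slows the early exponential spread and caps the
total damage.

```python
# Toy random-scanning worm model: deterministic logistic growth.
# Host counts, scan rate, and tick count are illustrative assumptions.

def simulate(total_hosts, susceptible, scans_per_tick, ticks):
    """Each infected host probes `scans_per_tick` random addresses per
    tick; a probe succeeds only if it hits a susceptible host that is
    not yet infected. Returns the infected count after each tick."""
    infected = 1.0  # one initial infection seeds the outbreak
    history = [infected]
    for _ in range(ticks):
        # Chance that a random probe lands on a fresh susceptible host.
        hit_rate = (susceptible - infected) / total_hosts
        infected = min(susceptible,
                       infected + infected * scans_per_tick * hit_rate)
        history.append(infected)
    return history

# Monoculture: every host runs the vulnerable platform.
mono = simulate(total_hosts=1_000_000, susceptible=1_000_000,
                scans_per_tick=2, ticks=60)

# Heterogeneous population: only 20% of hosts are vulnerable.
hetero = simulate(total_hosts=1_000_000, susceptible=200_000,
                  scans_per_tick=2, ticks=60)
```

In this sketch the monoculture saturates the entire population within a
couple dozen ticks, while the mixed population spreads more slowly early
on and tops out at its smaller susceptible pool; that is the sense in
which heterogeneity is a natural, though partial, defense.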
Of course, even having only a fraction of the machines on the Internet
compromised can negatively impact the network, if only through consumed
bandwidth and other side effects, so heterogeneity isn't a total cure.
The natural next step, then, is to examine the software vulnerabilities
themselves. Contrary to the main focus of the paper, the direction
computing seems to have taken is toward security mechanisms that
prevent unintended behavior in the first place (Windows XP SP2's
default-on firewall and the NX bit being two prominent examples). This
is not to say that the paper's point of
having something like a rapid response team is unmerited, merely that
the momentum is in a different place. Software mechanisms provide the
"ounce of prevention" while the computer CDC provides the "pound of
cure" when prevention isn't enough.
The paper leaves quite a few questions in the air regarding the role of
the CDC. This may be a weakness or a strength, depending on one's
expectations for the paper. To me, it was a strength in the sense that
good papers should suggest open research issues. It's a weakness in
that the discussion felt like long-winded speculation, disconnected
from the first part on worm characteristics.
One of the assumptions the paper makes is that a CDC would be feasible
to maintain. It seems to me that worm outbreaks occur on a far smaller
scale than disease outbreaks. There are many diseases, and collecting
and analyzing the wealth of information associated with them is a
large, ongoing task. Worms are comparatively simple to analyze -- they
don't seem to require a large group of co-located security experts to
examine them. In other words, better coordination among security
experts may be what we really need, not necessarily a big laboratory
somewhere for them to work 9 to 5.
Actually, I just realized another area where the paper falls quite
short: it makes no reference to current practices in virus analysis,
and offers no comparison between them and the proposed CDC. The paper
could be improved by explaining where current practice falls short and
how the CDC addresses it, as well as where current practice works well
enough that the CDC should leave it alone. Unfortunately, I am still
left with the impression that this stuff is more of a black art than a
science.
The paper drove home the point that it is not a good idea to rest on
our laurels and assume that we are basically safe. It is entirely feasible
for new classes of worms to rapidly overwhelm the Internet, and I am
convinced that understanding how to identify and fight compromises is
increasingly crucial.