University of Washington Department of Computer Science & Engineering
 CSEP 521 - Topics - Winter 2007

  1. N-Body Simulation: Astronomers are very interested in the evolution of the universe. In the past few years they have developed techniques to simulate large numbers of stars. They have investigated what happens when galaxies collide. What makes this problem hard is that the naive approach to simulating N bodies is to make N^2 force computations to calculate the new velocities and positions of the bodies. This is just too much computation. There are ways to do much less computation yet achieve acceptable, low-error simulations; the sketch below shows the naive approach they improve on.
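
     A minimal Python sketch of the naive O(N^2) step (the constant G is the real gravitational constant, but the time step and body format here are placeholder choices, not any particular simulator's):

        import math

        G = 6.674e-11   # gravitational constant (SI units)
        DT = 1.0        # time step; placeholder value

        def step(bodies):
            """One naive O(N^2) step. Each body is a dict with mass "m",
            position ("x", "y"), and velocity ("vx", "vy")."""
            for b in bodies:
                ax = ay = 0.0
                for other in bodies:
                    if other is b:
                        continue
                    dx = other["x"] - b["x"]
                    dy = other["y"] - b["y"]
                    r = math.hypot(dx, dy)
                    a = G * other["m"] / (r * r)   # acceleration toward other
                    ax += a * dx / r
                    ay += a * dy / r
                b["vx"] += ax * DT
                b["vy"] += ay * DT
            for b in bodies:
                b["x"] += b["vx"] * DT
                b["y"] += b["vy"] * DT

     Tree codes such as Barnes-Hut cut the cost to roughly O(N log N) by approximating the force from distant clusters of bodies.
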
  2. Program Analysis using Graphs: Compiler writers are always looking for ways to generate better code. Interestingly, they have found ways to represent code as graphs, or as graphs with weights on the edges indicating how often one piece of code is executed immediately after another. Using some graph algorithms the code can be mapped to memory so that it has good instruction cache performance; the sketch below shows where the edge weights come from.
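
     A minimal sketch of the first step, counting transitions in an execution trace of basic-block ids (the published layout algorithms, such as Pettis-Hansen, then place heavily connected blocks adjacently in memory):

        from collections import Counter

        def transition_counts(trace):
            """Edge weights for the layout graph: the weight of (a, b) is
            how often block b ran immediately after block a."""
            return Counter(zip(trace, trace[1:]))

        # transition_counts(['A', 'B', 'A', 'B', 'C'])
        #   -> Counter({('A', 'B'): 2, ('B', 'A'): 1, ('B', 'C'): 1})
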
  3. Tree Drawing: There are many cases when we would like to automatically generate a good drawing of a tree. For example, a big company does a reorganization, so the management hierarchy must be redrawn. A company that prepares family trees would like to generate them automatically. Drawing trees is just one example of graph drawing.
  4. Web Page Layout: It is more and more popular to have a personal web page where you receive things you are interested in. For example, I get my stock prices, local team sports scores, and weather on my personal web page. It would be nice to have a good layout for my web page. How can that be done automatically?
  5. VLSI Layout: Another layout problem that is even more complicated is laying out components on a chip automatically. In this case we have to consider not only where the components will go but where the wires will go that connect them. This is a huge problem area, but there are subareas that are in themselves quite interesting. For example, suppose the components are already placed; what then is the best way to route the wires?
  6. Exact String Matching: A common problem that comes up in text databases is, given a relatively short query string, find all instances of it in a very long string. For example, the classic unix utility "grep" uses exact string matching. If the length of the query string is m and the length of the database string is n, then we would like to answer the query in O(m+n) time. There are techniques that do even better on real data; that is, the time is sublinear!
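
     The Knuth-Morris-Pratt algorithm achieves the O(m+n) bound; the sublinear behavior on real data comes from algorithms like Boyer-Moore, which can skip over parts of the text. A compact KMP sketch:

        def kmp_search(pattern, text):
            """Return the start index of every occurrence of pattern in
            text, in O(m + n) time (Knuth-Morris-Pratt)."""
            m = len(pattern)
            # fail[i]: length of the longest proper prefix of
            # pattern[:i+1] that is also a suffix of it
            fail = [0] * m
            k = 0
            for i in range(1, m):
                while k > 0 and pattern[i] != pattern[k]:
                    k = fail[k - 1]
                if pattern[i] == pattern[k]:
                    k += 1
                fail[i] = k
            hits, k = [], 0
            for i, c in enumerate(text):
                while k > 0 and c != pattern[k]:
                    k = fail[k - 1]
                if c == pattern[k]:
                    k += 1
                if k == m:
                    hits.append(i - m + 1)
                    k = fail[k - 1]
            return hits
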
  7. Approximate String Matching: Another common problem that comes up in some text databases and in DNA sequence databases is that of approximate string matching. Here, for each query string, we want to find all instances of very similar strings in a very long string. A key problem is to make concrete the concept of "very similar". Udi Manber invented a companion to grep called "agrep" that does approximate string matching. In this case the measure of similarity between strings is edit distance: the number of insert and delete operations it takes to transform one string into the other.
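
     Edit distance itself is a classic dynamic program. A sketch using the insert/delete measure described above (many variants also charge for substitutions):

        def edit_distance(a, b):
            """Minimum number of inserts and deletes needed to turn
            string a into string b, by dynamic programming."""
            m, n = len(a), len(b)
            # dist[i][j]: cost of transforming a[:i] into b[:j]
            dist = [[0] * (n + 1) for _ in range(m + 1)]
            for i in range(m + 1):
                dist[i][0] = i               # delete all of a[:i]
            for j in range(n + 1):
                dist[0][j] = j               # insert all of b[:j]
            for i in range(1, m + 1):
                for j in range(1, n + 1):
                    if a[i - 1] == b[j - 1]:
                        dist[i][j] = dist[i - 1][j - 1]
                    else:
                        dist[i][j] = 1 + min(dist[i - 1][j],    # delete
                                             dist[i][j - 1])    # insert
            return dist[m][n]
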
  8. Algebraic Simplification: In systems like Mathematica, Maple, and MatLab, there are functions that simplify algebraic expressions, to make them shorter or more elegant. For example, the expression (x^2 + 3*x + 2)/(x+1) can be simplified to (x+2). Simplification in general is more complicated than the factoring that handles this case. How do different simplification algorithms compare?
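
     For a feel of what such functions do, this example can be reproduced with the SymPy library (an assumption here; it is not one of the systems named above):

        from sympy import symbols, cancel

        x = symbols("x")
        print(cancel((x**2 + 3*x + 2) / (x + 1)))   # prints x + 2

     Here cancel factors numerator and denominator and removes the common factor (x + 1).
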
  9. Intersection Detection: A common problem in computer game design is, given a set of objects, to detect all intersections quickly. Given a missile and a target, each defined as a polygon, how do you detect an intersection between the two very quickly? This is a classic problem in the field of computational geometry, where there are many other interesting problems to find. As a starter problem, can you come up with a fast algorithm that will detect if a circle (given by its center and radius) and a rectangle (given by its lower left and upper right corners) intersect? Now do this for hundreds of such objects.
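
     One standard answer to the starter problem is to clamp the circle's center to the rectangle and compare the resulting distance to the radius:

        def circle_rect_intersect(cx, cy, r, x1, y1, x2, y2):
            """True if the circle centered at (cx, cy) with radius r meets
            the axis-aligned rectangle with lower-left corner (x1, y1)
            and upper-right corner (x2, y2)."""
            # The closest rectangle point to the center is found by
            # clamping the center's coordinates to the rectangle.
            nx = min(max(cx, x1), x2)
            ny = min(max(cy, y1), y2)
            dx, dy = cx - nx, cy - ny
            return dx * dx + dy * dy <= r * r

     Testing hundreds of objects pairwise costs quadratic time; sweep-line and spatial-partitioning methods from computational geometry do much better.
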
  10. Surface Reconstruction from Polygons: An MRI machine produces a sequence of parallel 2-dimensional slices. Slices can be analyzed to find the boundary of an object like a bone. The boundary can be represented as a sequence of little line segments called a polygon. The problem is to construct a true 3-dimensional surface of the bone. There are several techniques that do this by taking neighboring slices and constructing a surface between them. This surface can be just a set of triangular patches.
  11. Nearest Neighbor Search: This is a classic problem that is seen in many different contexts. There is a fixed database of points. A query point is given and the job is to find the database point that is nearest to the query point. Take as an example the FBI database of fingerprints. A query fingerprint arrives and we want to find the nearest fingerprint in the database. We probably want the top 20, but this is just an example. Naturally, we need to make concrete what we mean by the distance between two fingerprints.
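
     The baseline is a linear scan over the database, as sketched below for points under Euclidean distance (an assumption; fingerprints need their own distance measure). The interesting algorithms, such as k-d trees and locality-sensitive hashing, try to beat this scan:

        import math

        def nearest_neighbors(database, query, k=1):
            """Return the k database points closest to query.
            Brute force: one distance computation per database point."""
            def dist(p, q):
                return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))
            return sorted(database, key=lambda p: dist(p, query))[:k]
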
  12. Prime Testing: For RSA public key encryption we need a way of finding large prime numbers. There are several algorithms that do a good job of this. All of them use randomness. In 2002 a deterministic polynomial time algorithm (due to Agrawal, Kayal, and Saxena) was discovered. Its time complexity appears to be too high to compete with the randomized algorithms.
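
     The best known randomized test is Miller-Rabin; a sketch:

        import random

        def is_probable_prime(n, trials=40):
            """Miller-Rabin test: a composite n survives one trial with
            probability at most 1/4, so error drops as 4**-trials."""
            if n < 2:
                return False
            for p in (2, 3, 5, 7, 11, 13):
                if n % p == 0:
                    return n == p
            # write n - 1 as d * 2**s with d odd
            d, s = n - 1, 0
            while d % 2 == 0:
                d //= 2
                s += 1
            for _ in range(trials):
                a = random.randrange(2, n - 1)
                x = pow(a, d, n)
                if x == 1 or x == n - 1:
                    continue
                for _ in range(s - 1):
                    x = pow(x, 2, n)
                    if x == n - 1:
                        break
                else:
                    return False      # a is a witness: n is composite
            return True
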
  13. Public Key Encryption: There are several competing algorithms to provide public key encryption. The first and most famous is RSA. There is a newer method, based on elliptic curves, that shows a lot of promise.
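
     A toy textbook-RSA sketch shows the moving parts (real implementations use enormous primes and careful padding; the numbers below are illustrative only):

        def make_keys(p, q, e):
            """Textbook RSA key generation from primes p and q."""
            n = p * q
            phi = (p - 1) * (q - 1)
            d = pow(e, -1, phi)    # private exponent: d*e = 1 (mod phi)
            return (n, e), (n, d)

        def encrypt(pub, m):
            n, e = pub
            return pow(m, e, n)

        def decrypt(priv, c):
            n, d = priv
            return pow(c, d, n)

        public, private = make_keys(61, 53, e=17)   # toy primes
        assert decrypt(private, encrypt(public, 42)) == 42
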
  14. Network Routing Algorithms: There are several competing network routing algorithms. Network people call them shortest path (link state) routing and distance vector routing, and there may be others. Both are in common use today. Why are both still in use? Isn't one better than the other?
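
     The two families rest on two classic shortest path algorithms: link state protocols (e.g., OSPF) run Dijkstra's algorithm over a full map of the network, while distance vector protocols (e.g., RIP) are in effect a distributed Bellman-Ford. A centralized Bellman-Ford sketch:

        def bellman_ford(edges, n, source):
            """Distances from source in a graph with nodes 0..n-1.
            edges: list of (u, v, weight) triples. The repeated
            relaxation below is what a distance vector protocol performs
            in distributed form, one routing table update at a time."""
            dist = [float("inf")] * n
            dist[source] = 0
            for _ in range(n - 1):    # n-1 relaxation rounds suffice
                for u, v, w in edges:
                    if dist[u] + w < dist[v]:
                        dist[v] = dist[u] + w
            return dist
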
  15. External Memory Sorting: Most of us know about the classic "internal sorting algorithms" like quicksort, mergesort and heapsort. What do you do if the data to sort does not fit in memory? I was reminded of this problem when I visited AT&T Research Labs several years ago. AT&T must sort a huge number of phone call records every day to generate our phone bills.
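
     The classic answer is external merge sort: sort memory-sized chunks, spill each to disk as a sorted run, then merge the runs. A sketch for a file of text lines (the chunk size stands in for available memory):

        import heapq
        import itertools
        import tempfile

        def external_sort(infile, outfile, chunk_lines=100000):
            """Sort the lines of a file too big for memory."""
            runs = []
            with open(infile) as f:
                while True:
                    chunk = list(itertools.islice(f, chunk_lines))
                    if not chunk:
                        break
                    chunk.sort()                  # in-memory sort of one run
                    run = tempfile.TemporaryFile("w+")
                    run.writelines(chunk)
                    run.seek(0)
                    runs.append(run)
            with open(outfile, "w") as out:
                out.writelines(heapq.merge(*runs))   # k-way merge of runs
            for run in runs:
                run.close()
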
  16. Web Page Authority: Many current search engines such as Google and Clever use information about the link structure of the web in order to compute the authority of a web page in the internet community. Simply setting the authority of a web page to be the number of other pages that link to it does not work well since it's easy to create a large number of web pages which do nothing other than link to a given page in order to artificially boost the authority of that page. The current methods used to define the authority of web pages treat the web as a graph and assign authority scores to web pages based on the eigenvectors of the adjacency matrix of that graph.
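
     Google's PageRank is the canonical example: the scores are the principal eigenvector of a damped, normalized version of the link matrix, and simple power iteration converges to them. A sketch:

        def pagerank(links, damping=0.85, iterations=50):
            """Power iteration for PageRank. links[u] is the list of
            pages u links to (every linked page must be a key too).
            Returns authority scores that sum to about 1."""
            pages = list(links)
            n = len(pages)
            rank = {p: 1.0 / n for p in pages}
            for _ in range(iterations):
                new = {p: (1.0 - damping) / n for p in pages}
                for u in pages:
                    out = links[u]
                    if not out:               # dangling page: spread evenly
                        for p in pages:
                            new[p] += damping * rank[u] / n
                    else:
                        for v in out:
                            new[v] += damping * rank[u] / len(out)
                rank = new
            return rank
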
  17. Splay Trees: Splay trees are perhaps the most frequently used non-trivial data structure in computer science. They are used in the GCC compiler, Windows NT, and malloc libraries. Splay trees are easier to implement than red-black trees or other balanced binary tree data structures, yet insertion still takes O(log n) time, albeit in an amortized sense. Surprisingly, there are still some fundamental conjectures about splay trees, most famously the dynamic optimality conjecture, that are believed to be true but have never been proven.
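
     The heart of the data structure is the splay operation, which rotates an accessed key to the root so that recently used keys stay cheap to reach. A recursive sketch:

        class Node:
            def __init__(self, key):
                self.key, self.left, self.right = key, None, None

        def rotate_right(x):
            y = x.left
            x.left, y.right = y.right, x
            return y

        def rotate_left(x):
            y = x.right
            x.right, y.left = y.left, x
            return y

        def splay(root, key):
            """Move the node holding key (or the last node on its search
            path) to the root via zig, zig-zig and zig-zag rotations."""
            if root is None or root.key == key:
                return root
            if key < root.key:
                if root.left is None:
                    return root
                if key < root.left.key:                     # zig-zig
                    root.left.left = splay(root.left.left, key)
                    root = rotate_right(root)
                elif key > root.left.key:                   # zig-zag
                    root.left.right = splay(root.left.right, key)
                    if root.left.right is not None:
                        root.left = rotate_left(root.left)
                return rotate_right(root) if root.left else root   # zig
            else:
                if root.right is None:
                    return root
                if key > root.right.key:                    # zig-zig
                    root.right.right = splay(root.right.right, key)
                    root = rotate_left(root)
                elif key < root.right.key:                  # zig-zag
                    root.right.left = splay(root.right.left, key)
                    if root.right.left is not None:
                        root.right = rotate_right(root.right)
                return rotate_left(root) if root.right else root   # zig

        def insert(root, key):
            """Splay key's neighborhood to the root, then split there."""
            if root is None:
                return Node(key)
            root = splay(root, key)
            if root.key == key:
                return root
            node = Node(key)
            if key < root.key:
                node.right, node.left, root.left = root, root.left, None
            else:
                node.left, node.right, root.right = root, root.right, None
            return node
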
  18. Error Correcting Codes: By blowing up the length of a message by a certain amount, it is possible to reconstruct the message exactly as long as not too many of its bits are corrupted. In general, we would like to blow up the length of the message as little as possible and to be able to correct as many errors as possible. Error correcting codes give a systematic way to achieve this goal. These codes are used in a wide variety of real world applications including networking, compact discs, and disk drives.
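
     The classic small example is the Hamming(7,4) code, which expands 4 data bits into 7 and corrects any single flipped bit:

        def hamming74_encode(d):
            """Encode 4 data bits (0/1 list) as a 7-bit codeword, with
            parity bits at (1-indexed) positions 1, 2 and 4."""
            d1, d2, d3, d4 = d
            p1 = d1 ^ d2 ^ d4
            p2 = d1 ^ d3 ^ d4
            p3 = d2 ^ d3 ^ d4
            return [p1, p2, d1, p3, d2, d3, d4]

        def hamming74_decode(c):
            """Correct up to one flipped bit, then return the data bits.
            The syndrome is the 1-indexed error position (0 = no error)."""
            c = list(c)
            s1 = c[0] ^ c[2] ^ c[4] ^ c[6]   # covers positions 1,3,5,7
            s2 = c[1] ^ c[2] ^ c[5] ^ c[6]   # covers positions 2,3,6,7
            s3 = c[3] ^ c[4] ^ c[5] ^ c[6]   # covers positions 4,5,6,7
            syndrome = s1 + 2 * s2 + 4 * s3
            if syndrome:
                c[syndrome - 1] ^= 1         # flip the bad bit back
            return [c[2], c[4], c[5], c[6]]
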
  19. SAT Solvers: In the past few years AI researchers have developed a number of satisfiability solvers that can be used to solve planning or general optimization problems. Given that SAT is an NP-complete problem, how is it that SAT solvers seem to do so well in solving some problems?
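
     Modern solvers elaborate on the DPLL backtracking procedure with clause learning and clever heuristics. A bare-bones DPLL sketch, with a formula given as a list of clauses and each clause a list of nonzero integers (negative means negated):

        def dpll(clauses, assignment=frozenset()):
            """Return a satisfying set of true literals, or None."""
            simplified = []
            for clause in clauses:
                if any(lit in assignment for lit in clause):
                    continue                   # clause already satisfied
                reduced = [l for l in clause if -l not in assignment]
                if not reduced:
                    return None                # clause falsified: backtrack
                simplified.append(reduced)
            if not simplified:
                return assignment              # every clause satisfied
            for clause in simplified:          # unit propagation
                if len(clause) == 1:
                    return dpll(simplified, assignment | {clause[0]})
            lit = simplified[0][0]             # branch on a literal
            return (dpll(simplified, assignment | {lit})
                    or dpll(simplified, assignment | {-lit}))
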
  20. Primal-Dual Schema: Recently there has been a lot of success in approximately solving some NP-hard problems using the linear programming primal-dual schema. These algorithms are quite simple, but their analysis relies on understanding the primal-dual nature of linear programming. It would be interesting to compare these approaches with other less mathematical techniques such as local search.
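
     A small taste, for unweighted vertex cover: raising the dual variable of an uncovered edge makes both of its endpoint constraints tight at once, so both endpoints enter the cover, giving a 2-approximation. The resulting algorithm is strikingly simple:

        def vertex_cover_2approx(edges):
            """Primal-dual 2-approximation for unweighted vertex cover:
            each chosen edge forces both endpoints into the cover, and
            the chosen edges form a matching, so the optimum needs at
            least one endpoint of each, i.e. at least half as many."""
            cover = set()
            for u, v in edges:
                if u not in cover and v not in cover:
                    cover.update((u, v))
            return cover
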

