Timing Fast Things

The timers on the PCs are accurate only to milliseconds. This means that if you are timing something that takes less than a millisecond (and lots of things on computers do), the timer may report "0" as the amount of time it took.
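To see the problem concretely, here is a minimal sketch that times something very fast. It uses the provided Timer class from timer.h (the same class the full program below uses); I'm assuming Reset() restarts the timer and Check() returns elapsed seconds. At millisecond resolution, this will almost certainly print 0.

#include <iostream>
#include "timer.h"   // course-provided Timer class

using namespace std;

int main() {
  Timer myTimer;
  int sum = 0;

  myTimer.Reset();
  for (int i = 0; i < 100; i++) {
    sum += i;        // finishes in well under a millisecond
  }
  double elapsed = myTimer.Check();

  // With a millisecond-resolution timer, elapsed is very likely 0.
  cout << "elapsed: " << elapsed << " seconds (sum = " << sum << ")" << endl;
  return 0;
}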

To time something faster than a millisecond, you will have to use a standard CS research trick: wrap a loop around the thing. That way, rather than timing a single run of it, you are timing "n" runs of it. How big should "n" be? Whatever it takes to push the total execution time above a millisecond. Then, once you've got your time, divide it by "n" to find out how long a single run of the algorithm took.

For example, let's say I time my binary search on a problem size of 100,000. I get back a time of 0. I try bigger and bigger problem sizes, but no matter how big a problem I use, the time is always 0. Eventually I run out of memory. I couldn't find a problem size big enough to give me times greater than a millisecond. But I need to show that binary search has a sublinear running time, and I don't think a plot of all 0's will prove that very well.

So, I wrap a loop around my binary search, timing it for 10 iterations, then 100 iterations, then 1,000 iterations, until I find a number of iterations that gives me a time greater than a millisecond. Let's say, for example, that if I run for 1,000,000 iterations my timer reports 0.21423 seconds. I can then divide 0.21423 by 1,000,000 to get the number of seconds a single run of my binary search took (0.00000021423 seconds, or 0.21423 microseconds).
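You can even automate this search for a big enough number of iterations. Below is a minimal sketch that multiplies the trial count by 10 until the measured time climbs above a millisecond, then divides to get the per-run time. It assumes the provided Timer class (Reset() restarts the timer, Check() returns elapsed seconds), and RunAlgorithm is a placeholder for the procedure you want to time.

#include <iostream>
#include "timer.h"    // course-provided Timer class

using namespace std;

void RunAlgorithm();  // supply this: the algorithm you want to time

int main() {
  Timer myTimer;
  int num_trials = 1;
  double elapsed;

  // Keep multiplying the trial count by 10 until the total time
  // is above the timer's millisecond resolution.
  do {
    num_trials *= 10;
    myTimer.Reset();
    for (int i = 0; i < num_trials; i++) {
      RunAlgorithm();
    }
    elapsed = myTimer.Check();
  } while (elapsed < 0.001);

  cout << "One run takes " << elapsed / num_trials
       << " seconds (measured over " << num_trials
       << " iterations)" << endl;
  return 0;
}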

Now that you have the general idea, let's look at some C++ code that does this trick. Note that I've included the timer.h file so I can use the provided Timer class. In the code below, replace the call to the "RunAlgorithm" procedure with a call to a procedure that runs the algorithm you want to time.

#include <iostream>
#include "timer.h"

using namespace std;

void RunAlgorithm();        // replace with the algorithm you want to time

int main() {
  int num_trials = 100000;  // you may want to get this as an input parameter
  double total_runtime;
  double actual_runtime;
  Timer myTimer;            // use the provided timer class

  // restart the timer before running the algorithm
  myTimer.Reset();

  // run the algorithm num_trials times
  for (int i = 0; i < num_trials; i++) {
    RunAlgorithm();
  }

  // stop the clock: total time for all num_trials runs
  total_runtime = myTimer.Check();

  // compute the time for one run through and print it out
  actual_runtime = total_runtime / num_trials;
  cout << "The algorithm takes " << actual_runtime << " seconds" << endl;
  return 0;
}
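If you don't have timer.h handy, a sketch of the same trick using only the standard <chrono> header (C++11 and later) would look like this; again, RunAlgorithm is a placeholder for whatever you want to time.

#include <chrono>
#include <iostream>

using namespace std;

void RunAlgorithm();  // supply this: the algorithm you want to time

int main() {
  const int num_trials = 1000000;

  auto start = chrono::steady_clock::now();
  for (int i = 0; i < num_trials; i++) {
    RunAlgorithm();
  }
  auto stop = chrono::steady_clock::now();

  // total elapsed time, in seconds, for all num_trials runs
  chrono::duration<double> total = stop - start;

  cout << "The algorithm takes " << total.count() / num_trials
       << " seconds" << endl;
  return 0;
}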