University of Washington Computer Science & Engineering
  CSE 427Au '18:  The Course FAQ

Common questions:

  1. How do I time my program?
  2. How do I time my really, really fast program?


  1. Q.  How do I time my program?

    You've asked us to measure the time taken by our algorithms. How do I do this? Should I buy that $3.99 Rolex I keep getting emails about? --- T. Treasure

    A.   Dear Timeless,
    No, save the Rolex to impress that Certain Someone. Computers generally have built-in clocks. How you access them depends on the language. For example, in C/C++, you do:

          #include <stdio.h>
          #include <time.h>

          int main(void) {
            clock_t start_tick, end_tick;
            double elapsed;

            start_tick = clock();            /* read the clock before the work... */
            my_really_fast_algorithm(42);
            end_tick = clock();              /* ...and again after it */
            elapsed = (end_tick - start_tick) / (double)CLOCKS_PER_SEC;
            printf("My Really Fast Algorithm took %f seconds!\n", elapsed);
            return 0;
          }
    
    CLOCKS_PER_SEC, defined in time.h, tells you how to scale the system-dependent clock() results to seconds. (It is often 1000, or 1,000,000 on POSIX systems, but its value does not tell you the resolution of the timer, which is often 10 milliseconds, or sometimes as fine as .001 milliseconds. Old-timers remember 16.7 millisecond clocks, i.e., the 60 hertz powerline frequency...)
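
    If you're curious what the tick size actually is on your own machine, one quick-and-dirty way to estimate it is to spin until clock() changes value and look at the size of the jump. This is just a rough probe, along these lines:

          #include <stdio.h>
          #include <time.h>

          /* Spin until clock() changes value twice; the gap between two
             consecutive distinct readings approximates the timer's tick. */
          int main(void) {
            clock_t t0 = clock(), t1, t2;

            do { t1 = clock(); } while (t1 == t0);
            do { t2 = clock(); } while (t2 == t1);

            printf("CLOCKS_PER_SEC = %ld\n", (long)CLOCKS_PER_SEC);
            printf("observed tick  ~ %f milliseconds\n",
                   1000.0 * (t2 - t1) / (double)CLOCKS_PER_SEC);
            return 0;
          }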

    Similarly, in Java, System.currentTimeMillis() (declared public static long currentTimeMillis()) returns the current time in millisecond units (but not necessarily with millisecond accuracy).

    Specific systems may have additional features. These will be less portable, but may be more convenient. E.g., a former student pointed out the "high performance timer" functions for Windows, QueryPerformanceCounter and QueryPerformanceFrequency.
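
    Roughly, you read the counter before and after the work and divide by its frequency; a minimal sketch (see the Windows documentation for the fine print):

          #include <stdio.h>
          #include <windows.h>

          int main(void) {
            LARGE_INTEGER freq, start, end;
            double elapsed;

            QueryPerformanceFrequency(&freq);     /* counter ticks per second */
            QueryPerformanceCounter(&start);
            /* ... the code you want to time goes here ... */
            QueryPerformanceCounter(&end);
            elapsed = (end.QuadPart - start.QuadPart) / (double)freq.QuadPart;
            printf("Took %f seconds\n", elapsed);
            return 0;
          }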

  2. Q.  How do I time my really, really fast program?

    I did what you outlined above, but I always get 0 seconds?!? ---S. LaCode

    A.   Dear Swifty,
    Congratulations! Your algorithm is a blazing marvel of speed. But just to be on the safe side, I always do this, too: since the system clock may tick only once every few milliseconds, and my computer executes about a billion operations per second, a fair bit of work gets done between clock ticks. To time an algorithm more accurately for small n, repeat it r times, for some number r ≥ 1 judiciously chosen so that the elapsed time for all r repetitions is, say, 500 milliseconds or more, then report the elapsed time divided by r. It's OK to use a different r for different input sizes, as long as each chunk you time is bigger than 500 ms or so. A big-O estimate of running time might suggest how r should scale with n. A sketch of this strategy follows.
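
    For example, here is one simple way to choose r automatically: keep doubling it until the total elapsed time crosses the 500 ms threshold, then report the average per call. The my_really_fast_algorithm below is just a stand-in for whatever you are actually timing:

          #include <stdio.h>
          #include <time.h>

          /* Placeholder for the routine you actually want to time. */
          static void my_really_fast_algorithm(int n) {
            volatile int x = 0;
            int i;
            for (i = 0; i < n; i++) x += i;
          }

          int main(void) {
            long r, i;
            clock_t start, end;
            double elapsed;

            /* Double r until the total elapsed time reaches ~500 ms. */
            for (r = 1; ; r *= 2) {
              start = clock();
              for (i = 0; i < r; i++)
                my_really_fast_algorithm(42);
              end = clock();
              elapsed = (end - start) / (double)CLOCKS_PER_SEC;
              if (elapsed >= 0.5) break;
            }
            printf("%ld repetitions took %f seconds: %g seconds/call\n",
                   r, elapsed, elapsed / r);
            return 0;
          }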

