
O(n log n) sorts

Mergesort

IDEA: "If there are two or fewer elements in the array, order them and return. If the array has more than two elements, break the array in half, recursively sort the two halves, and then merge them."

I could have dumped some mergesort code onto this page, but the best way to understand mergesort is probably to just do it for a small array:

10 19 3 120 42 33 81 14

split:   10 19 3 120   |   42 33 81 14

split:   10 19  |  3 120  |  42 33  |  81 14

sort:    10 19  |  3 120  |  33 42  |  14 81

merge:   3 10 19 120   |   14 33 42 81

merge:   3 10 14 19 33 42 81 120
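
Once you have traced it by hand, the code is short. Here is one way it might
look in Java: a minimal sketch with made-up names (mergesort, merge), not
the course's own implementation.

    class MergesortSketch {
        // Sort a[lo..hi] (inclusive).  Subarrays of two or fewer elements are
        // ordered directly; anything larger is split, sorted, and merged.
        static void mergesort(int[] a, int lo, int hi) {
            int n = hi - lo + 1;
            if (n <= 2) {
                if (n == 2 && a[lo] > a[hi]) {
                    int tmp = a[lo]; a[lo] = a[hi]; a[hi] = tmp;
                }
                return;
            }
            int mid = (lo + hi) / 2;
            mergesort(a, lo, mid);        // recursively sort the left half
            mergesort(a, mid + 1, hi);    // recursively sort the right half
            merge(a, lo, mid, hi);        // merge the two sorted halves
        }

        // Merge the sorted runs a[lo..mid] and a[mid+1..hi] into one sorted run,
        // working in a fresh copy of the range (as in the trace above).
        static void merge(int[] a, int lo, int mid, int hi) {
            int[] copy = new int[hi - lo + 1];
            int i = lo, j = mid + 1, k = 0;
            while (i <= mid && j <= hi)
                copy[k++] = (a[i] <= a[j]) ? a[i++] : a[j++];
            while (i <= mid) copy[k++] = a[i++];
            while (j <= hi)  copy[k++] = a[j++];
            for (k = 0; k < copy.length; k++)
                a[lo + k] = copy[k];
        }

        public static void main(String[] args) {
            int[] a = {10, 19, 3, 120, 42, 33, 81, 14};
            mergesort(a, 0, a.length - 1);
            System.out.println(java.util.Arrays.toString(a));
            // prints [3, 10, 14, 19, 33, 42, 81, 120]
        }
    }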

Time cost

We analyze the running time of mergesort by considering its three stages in turn:

Splitting the array
    Each "split" requires a function call. Notice that we have n elements and n/2 "splits". Therefore, splitting is O(n).
Sorting subarrays (of 2 or fewer elements)
    For every pair of elements, we execute a constant number of operations. Therefore, this step is O(n).
Merging subarrays
    Each "level" of merge requires O(n) steps, because we must examine each element of each subarray to be merged. How many levels are there?

Therefore, the asymptotic running time of merge sort is ... ?
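
One way to make those two questions precise (a sketch, writing c for the
constant amount of work done per element during a merge): the subarrays
double in size from 2 up to n, so there are about $\log_2 n$ levels of
merging, and the running time satisfies

\[
T(n) = 2\,T(n/2) + c\,n, \qquad T(n) = O(1) \text{ for } n \le 2,
\]

which unrolls to roughly $c\,n$ work on each of the $\log_2 n$ levels.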

Space cost

In the paper version above, we made a fresh copy of the array each time, using O(n log n) space. Actually, this isn't necessary, and a real implementation of merge sort can get away with less space. How?
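
One common answer (a sketch of one possibility, not necessarily the one the
question has in mind): allocate a single scratch array of length n up front
and reuse it for every merge, so the extra space for merging is O(n) rather
than a fresh copy at every level.

    class MergesortLessSpace {
        static void mergesort(int[] a) {
            int[] scratch = new int[a.length];   // one O(n) buffer, shared by all merges
            sort(a, scratch, 0, a.length - 1);
        }

        static void sort(int[] a, int[] scratch, int lo, int hi) {
            if (hi - lo + 1 <= 2) {              // two or fewer elements: order directly
                if (hi > lo && a[lo] > a[hi]) {
                    int tmp = a[lo]; a[lo] = a[hi]; a[hi] = tmp;
                }
                return;
            }
            int mid = (lo + hi) / 2;
            sort(a, scratch, lo, mid);
            sort(a, scratch, mid + 1, hi);

            // Merge a[lo..mid] and a[mid+1..hi] into scratch[lo..hi], then copy back.
            int i = lo, j = mid + 1, k = lo;
            while (i <= mid && j <= hi)
                scratch[k++] = (a[i] <= a[j]) ? a[i++] : a[j++];
            while (i <= mid) scratch[k++] = a[i++];
            while (j <= hi)  scratch[k++] = a[j++];
            for (k = lo; k <= hi; k++)
                a[k] = scratch[k];
        }
    }

(The recursion itself still uses O(log n) stack space, and merging with no
extra array at all is possible but considerably trickier.)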


Last modified: Tue Aug 8 10:28:48 PDT 2000