
Some simple algorithms and their running times

BubbleSort, SelectSort

// BubbleSort: repeatedly swap adjacent out-of-order neighbors;
// each pass bubbles the largest remaining element to the end.
for (int i = 0; i < len; i++)
    for (int j = 0; j < len-i-1; j++)
        if (array[j] > array[j+1])
            swap(array[j], array[j+1]);

// SelectSort: on each pass, find the smallest remaining element
// and swap it into position i.
for (int i = 0; i < len; i++) {
    int min_idx = i;
    for (int j = i+1; j < len; j++) {
        if (array[j] < array[min_idx])
            min_idx = j;
    }
    swap(array[i], array[min_idx]);
}

Both sorts run two nested loops over the array, so each makes on the order of n^2 comparisons; their running time is O(n^2).

Binary search

The idea behind binary search: each time, we probe the element in the middle of the current extent. At each probe, the remaining extent of the array to be searched splits in half. Therefore, given any array of size n, the series of array sizes is given by:

n n/2 n/4 n/8 n/16 ...

Clearly, every time we double the size of the initial array, we add only one more probe to the worst-case search. The running time of the algorithm is therefore O(log n).
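As a concrete illustration, here is one way binary search might be written in C++ (a minimal sketch; the function name and the assumption of a sorted int array are illustrative, not taken from the course code):

// Return the index of key in the sorted array a[0..len-1], or -1 if absent.
int binarySearch(const int a[], int len, int key) {
    int lo = 0, hi = len - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;  // probe the middle of the current extent
        if (a[mid] == key)
            return mid;
        else if (a[mid] < key)
            lo = mid + 1;              // discard the left half
        else
            hi = mid - 1;              // discard the right half
    }
    return -1;                         // extent is empty: key not present
}

Each pass through the loop makes one probe and halves the extent [lo, hi], giving the O(log n) behavior described above.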

Question: would "ternary search" (at each iteration, make 2 probes, thus decreasing the array extent to 1/3 of its original size) be an improvement on binary search?
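To make the question concrete, here is a sketch of what such a ternary search might look like (hypothetical code, assuming the same sorted int array as above):

// Hypothetical ternary search: two probes per iteration,
// keeping 1/3 of the extent after each iteration.
int ternarySearch(const int a[], int len, int key) {
    int lo = 0, hi = len - 1;
    while (lo <= hi) {
        int third = (hi - lo) / 3;
        int m1 = lo + third;            // first probe
        int m2 = hi - third;            // second probe
        if (a[m1] == key) return m1;
        if (a[m2] == key) return m2;
        if (key < a[m1])
            hi = m1 - 1;                // keep the left third
        else if (key > a[m2])
            lo = m2 + 1;                // keep the right third
        else {
            lo = m1 + 1;                // keep the middle third
            hi = m2 - 1;
        }
    }
    return -1;
}

Note that each iteration now costs 2 probes but shrinks the extent to 1/3; comparing total probe counts against binary search answers the question.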

Linked list copy

Several people implemented linked list copy on homework 4, part 1 as follows:

void LinkedList::copy(const LinkedList& other) {
    for (int i = 0; i < other.size; i++) {
        carPtr toCopy = other.atIndex(i);
        enqueue(toCopy);
    }
}

The code has a charming simplicity, but what is the algorithmic complexity? (Consider the cost of the atIndex() operation.)

How could we eliminate this cost?
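One answer (a sketch only; it assumes the list keeps a head pointer and that each carPtr node carries a next field, names which may differ from the actual homework class):

void LinkedList::copy(const LinkedList& other) {
    // One pass down the other list: advance a cursor node by node,
    // so reaching each element costs O(1) rather than a fresh walk
    // from the head, as each atIndex() call must do.
    for (carPtr cur = other.head; cur != NULL; cur = cur->next)
        enqueue(cur);
}

The copy now runs in O(n) overall, rather than the O(n^2) of the version above, whose repeated atIndex() calls walk about 1 + 2 + ... + n nodes in total.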

