(a) What would be the worst-case running time if you used Insertion Sort to fix things up? Why?
(b) What would be the worst-case running time if you were to use Mergesort on the entire (n+k)-element array? Why?
(c) What would be the worst-case running time if you were to use Mergesort on just the final k elements and then a single merge operation to combine the original list and the k new elements? (A sketch of this strategy appears below.)
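To make the part (c) strategy concrete, here is a minimal C++ sketch, assuming the combined array holds the n already-sorted elements followed by the k new ones. The names fixUp, a, and n are mine, not part of the problem statement.

    #include <algorithm>
    #include <vector>

    // Sketch of the part (c) strategy: the first n elements of `a` are
    // already sorted; the k new elements sit at positions n..n+k-1.
    void fixUp(std::vector<int>& a, std::size_t n) {
        // Mergesort the k new elements (std::stable_sort is typically
        // implemented as a mergesort variant): O(k log k).
        std::stable_sort(a.begin() + n, a.end());
        // One merge pass combines the two sorted runs: O(n + k).
        std::inplace_merge(a.begin(), a.begin() + n, a.end());
    }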
given               stable algorithm      unstable algorithm
Aue: ???
Brush: 3.2          ...                   ...
Buys: ???           Brush: 3.2            Chamberlain: 3.2
Chamberlain: 3.2    Chamberlain: 3.2      Sun: 3.2
Gross: ???          Sun: 3.2              Brush: 3.2
Sun: 3.2            ...                   ...
Ware: ???
(a) What needs to be done when writing the Mergesort algorithm to ensure that it is stable?
(b) Quicksort (as we've discussed it) is unstable due to its partition step. Illustrate why this is by giving an input to a single Quicksort partition step and showing how it loses its stability. Assume that we use the middle element as the pivot. Please label identical values in some manner that keeps them distinct (e.g., 10A, 10B, etc.). Note: you should assume that the pivot value is unique -- having multiple values equal to the pivot can create instability, but also raises more fundamental questions that we haven't addressed.
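As a hint for part (a), the stability of Mergesort hinges on a single comparison in the merge step. Below is a minimal sketch of that step; the names are mine, and plain ints stand in for records compared on a key.

    #include <vector>

    // Merge two sorted runs. The detail that preserves stability:
    // on a tie, take from the LEFT run (<=), so equal elements keep
    // their original relative order.
    std::vector<int> merge(const std::vector<int>& left,
                           const std::vector<int>& right) {
        std::vector<int> out;
        out.reserve(left.size() + right.size());
        std::size_t i = 0, j = 0;
        while (i < left.size() && j < right.size()) {
            if (left[i] <= right[j])   // a strict '<' would break stability
                out.push_back(left[i++]);
            else
                out.push_back(right[j++]);
        }
        while (i < left.size())  out.push_back(left[i++]);
        while (j < right.size()) out.push_back(right[j++]);
        return out;
    }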
(a) Show a 7-element input to Quicksort with no duplicate values that results in a worst-case runtime for this choice of pivot. For simplicity, you may do the partitioning conceptually (as I did in my "Quicksort Example" slide), rather than using the precise in-place algorithm (which complicates the design of a worst-case input by moving the pivot around in the array).
(b) Illustrate that this is a worst-case run by showing how the array would be partitioned at each step of the Quicksort.
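This is not required, but if you want to sanity-check a candidate input, a small trace harness like the following (my own code, using the conceptual partition described above with unique values) makes lopsided splits visible: a worst case produces sides of size 0 and size n-1 at every level.

    #include <cstdio>
    #include <vector>

    // Conceptual partition with the middle element as pivot; prints the
    // size of each recursive call so worst-case behavior is easy to spot.
    void quicksortTrace(std::vector<int> a, int depth = 0) {
        if (a.size() <= 1) return;
        int pivot = a[a.size() / 2];        // middle element as pivot
        std::vector<int> lo, hi;
        for (int x : a)
            if (x < pivot) lo.push_back(x);
            else if (x > pivot) hi.push_back(x);
        std::printf("depth %d: pivot %d, sides %zu and %zu\n",
                    depth, pivot, lo.size(), hi.size());
        quicksortTrace(lo, depth + 1);
        quicksortTrace(hi, depth + 1);
    }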
void PrintSortedGPAs(float GPA[], int numGPAs);
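For reference, here is one possible sketch of how this prototype could be implemented. The surrounding problem text is not shown, so the choice of sort and the output format are assumptions; it sorts a copy so the caller's array is left untouched.

    #include <algorithm>
    #include <cstdio>
    #include <vector>

    // Print the GPAs in ascending order without modifying the input array.
    void PrintSortedGPAs(float GPA[], int numGPAs) {
        std::vector<float> sorted(GPA, GPA + numGPAs);
        std::sort(sorted.begin(), sorted.end());
        for (float g : sorted)
            std::printf("%.2f\n", g);
    }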