Worksheet, Thursday, Jan 27, 2000.

 

This worksheet will probably take you more than one section.  Feel free to pick & choose which questions you want to try in section (with the TA hovering around).  You can also, obviously, try questions later.

 

The questions are marked with some of the following:

·        Exam.  I think that something like this question is likely to be on an exam.  (Note: I haven't talked with Darth Wolfman about this, though.)

·        Non-Exam.  The question tests material that I’m pretty sure is not fair game for the exam.

·        Hard.  The question seems particularly hard, given what we’ve learned this quarter.

 

Q1 (exam)

 

What is wrong with the following 2-Heap?

 

Q2 (exam)

 

Consider the following (legal) 3-Heap.  What happens when you insert 4?  What if, instead, you called deleteMin?
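For reference, here is a minimal Python sketch of insert and deleteMin in an array-based d-heap (the helper names are my own, and this is a generic sketch rather than the specific heap pictured):

```python
def parent(i, d):
    """Index of the parent of node i in an array-based d-heap (root at index 0)."""
    return (i - 1) // d

def children(i, d, size):
    """Indices of node i's children that actually exist."""
    first = d * i + 1
    return range(first, min(first + d, size))

def insert(heap, d, value):
    """Append at the end (keeping the tree complete), then percolate up."""
    heap.append(value)
    i = len(heap) - 1
    while i > 0 and heap[i] < heap[parent(i, d)]:
        p = parent(i, d)
        heap[i], heap[p] = heap[p], heap[i]
        i = p

def delete_min(heap, d):
    """Remove the root; move the last item to the root and percolate it down."""
    smallest = heap[0]
    last = heap.pop()
    if heap:
        heap[0] = last
        i = 0
        while True:
            kids = children(i, d, len(heap))
            if not kids:
                break
            c = min(kids, key=lambda k: heap[k])  # smallest child
            if heap[c] < heap[i]:
                heap[i], heap[c] = heap[c], heap[i]
                i = c
            else:
                break
    return smallest
```

With d=3 this mimics a 3-heap: insert percolates the new value up past larger parents, and deleteMin moves the last element to the root and percolates it down.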

 

 

Q3 (exam)

 

Suppose you wanted a priority queue that merges sometimes.  List some factors that might contribute to making you pick a d-Heap with a high value of d.  List factors that might make you pick a Leftist heap. 

 

For twice the marks, describe a specific scenario where you’d want a d-Heap, and another specific scenario where you’d want a Leftist heap (Note: this worksheet is not graded, so getting twice the marks might not be that great).

 

Q4 (Exam.)

 

Merge the following two Leftist heaps.  Now merge them as Skew Heaps.

 

Q5 (Exam)

 

Write a recurrence formula for the following pseudocode.  Solve the recurrence.

 

procedure Yorrick (int N)

            if N=1 then

                        return

            end if

            for i=1 to N

                        print “Alas, poor function.”

            end for

            Yorrick(N/2)
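A runnable version of Yorrick in Python, counting printed lines instead of printing them (a sketch; integer division stands in for N/2):

```python
def yorrick(n):
    """Count the lines Yorrick(n) prints: the for-loop prints n lines,
    then we recurse on n/2.  At n=1 it returns without printing."""
    if n <= 1:
        return 0
    return n + yorrick(n // 2)
```

Counting the prints for a few values of n is a good way to check your solved recurrence.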

 

Q6

 

The following pseudocode is a binary search algorithm for a sorted array Array.  It returns the position at which the parameter ‘findMe’ appears in Array, or it returns –1 if it’s not there. 

 

Determine its running time.  Give a proof that this is its running time.  (Optionally) Using the preconditions, postconditions and loop invariants, prove that the algorithm is a correct binary search.

 

Note: a precondition is something that is true before a function (or something else) starts.  A postcondition is true after a function (or something else) finishes, assuming all of its preconditions were met.  A loop invariant is something that's true at the beginning of every iteration of a loop.

Note 2: to the best of my knowledge, this code is correct.  If you find problems, fix them, and continue the proofs.

 

            function BinarySearch (Array[0..N-1],int findMe) : int

                        precondition: the elements in Array are sorted from least to greatest

 

                        leftCursor=0

                        rightCursor=N

 

                        Loop invariant: findMe is in the range leftCursor,leftCursor+1,…,rightCursor-1, or it’s not in the array

                        while leftCursor<rightCursor-1 do

                                    mid=(leftCursor+rightCursor)/2

                                    if findMe==Array[mid] then

                                                return mid

                                    end if

                                    if findMe < Array[mid] then

                                                rightCursor=mid

                                    else

                                                leftCursor=mid

                                    end if

                        end while

                        if leftCursor<N and findMe==Array[leftCursor] then

                                    result=leftCursor

                        else

                                    result=-1

                        end if

 

                        postcondition: Array[result] == findMe

OR (result == -1 and Array[i]!=findMe for all i=0..N-1)

 

                        return result
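Here is the same search rendered as runnable Python (a sketch; note that any correct version must make progress on every iteration, return a found index immediately, and guard the final array access against an empty array):

```python
def binary_search(array, find_me):
    """Return an index where find_me appears in the sorted list array, or -1."""
    left, right = 0, len(array)
    while left < right - 1:            # stop when at most one candidate remains
        mid = (left + right) // 2
        if array[mid] == find_me:
            return mid
        if find_me < array[mid]:
            right = mid                # candidates are now array[left .. mid-1]
        else:
            left = mid                 # candidates are now array[mid .. right-1]
    if left < len(array) and array[left] == find_me:
        return left                    # check the last remaining candidate
    return -1
```

Each iteration roughly halves the candidate range, which is where the T(n) = 1 + T(n/2) recurrence comes from.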

 

Q7 (exam)

 

What are the advantages of implementing a Stack using an array?  How about a linked list?

 

Q8 (exam)

 

Find 3 functions f(n), g(n), h(n) that are Ω(n^2) and also o(n^3), but which are not in big-Theta of each other.

Find 2 functions f(n), g(n) that are o(n^4) but are not O(n^3), and which are not in big-Theta of each other.  Hint: you might be able to re-use some of your work for the previous part of this question.

 

 

Q9 (exam)

 

What's the running time of the following function, in big-Theta terms (tight asymptotic bound)?

 

procedure Mendacious (int n)

            for i from n to 4*n do

                        for j from n to 4+n do

                                    x=x*2
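To check the analysis concretely, here is a Python sketch that counts how many times the innermost statement runs (assuming the "for ... to" bounds are inclusive):

```python
def mendacious_ops(n):
    """Count executions of the inner statement of Mendacious(n)."""
    count = 0
    for i in range(n, 4 * n + 1):      # i from n to 4n inclusive: 3n+1 iterations
        for j in range(n, 4 + n + 1):  # j from n to n+4 inclusive: 5 iterations
            count += 1                 # stands in for x=x*2
    return count
```

The count is exactly (3n+1)*5, i.e. linear in n.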

 

Q10 (exam)

 

What is the running time of the following program, in big-Theta terms:

 

procedure Spilliatory (int m, int n)

            for i from 1 to m do

                        x=1

                        while x < n do

                                    x=x*3.1415926535897
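Again, a Python sketch that counts the inner-loop iterations (x grows geometrically, so each outer iteration does about log-base-pi-of-n inner iterations):

```python
def spilliatory_ops(m, n):
    """Count inner-loop iterations of Spilliatory(m, n)."""
    count = 0
    for i in range(1, m + 1):       # m outer iterations
        x = 1.0
        while x < n:                # x grows geometrically: ~log n iterations
            x *= 3.1415926535897
            count += 1
    return count
```

The total is m times a logarithmic quantity, matching the Θ(m*log n) answer.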

 

Q11.

 

Consider the following pseudocode:

 

            procedure Zamboni (int N)

                        if N=0 then

                                    return

                        end if

                        for i=1 to N

                                    print “The rain in Spain stays mainly on the plane”

                        end for

                        Zamboni (N-1)
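A runnable Python version, counting prints instead of printing (a sketch; it assumes a base case at N=0, which the recursion needs in order to terminate):

```python
def zamboni(n):
    """Count lines printed by Zamboni(n): the for-loop prints n lines,
    then we recurse on n-1.  A base case at n<=0 stops the recursion."""
    if n <= 0:
        return 0
    return n + zamboni(n - 1)
```

The count is 1 + 2 + ... + n = n(n+1)/2, which is what the O(n^2) bound below is really saying.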

 

Write a recurrence for this procedure.

 

Show that the recurrence is bounded by O(n2), BUT do this without expanding the formula.  Instead do it by using an inductive proof that T(n) <= c*n^2 for some c>0 (you can pick it, or just figure out what it is).  Plug the inductive assumption into the recurrence, and see what happens.

 

Q12 (Hard)

 

Suppose .  Show that T(n) = ω(n^c) for any c (in other words, T(n) is asymptotically larger than any polynomial).

 

The answer to this isn’t at the back, but you can come to my office hours to discuss (I wanted to spend my time on the questions more likely to be on an exam.)

 

Hints: try Q11 first.  Remember the calculus definitions of asymptotic bounds.  Remember that if a < -b, then -a > b.

 

Q13

 

DELETED.  I removed this question.  If you want, why not make up a question, and then answer it?

 

Q14 (Exam)

 

For the following ADTs, list the operations they support, implementations of them that you know, and also list any requirements they make of the data.

·        List

·        Stack

·        Queue

·        Priority Queue

·        Priority Queue with Merge

·        Dictionary

 

Q15 (Exam)

 

At a high level, what is the idea of a dynamic programming algorithm?  (On an exam, the question would probably be more specific.)

 

Q16 (exam)

 

For the following tree:

 

What is the parent of node (a)?

What are the children of (d)?

What is the sibling(s) of node (g)?

What are the 2nd cousins, once removed, of node (m)? :-)

List the leaves of this tree.

Assuming this is a Binary Search Tree, what is the output of an in-order traversal (note: pretend that nodes (h) and (i) don’t exist for this question)?

What is the maximum branching factor of the tree?

What is the height and depth of node (h)?


Answers

 

Q1

 

  1. The 5 node is in the wrong position; the tree is not complete.  The 5 node should be the left child of the 7 (well, except that would break the ordering).
  2. The 2 is less than its parent (3).
  3. The 2 node has 3 children.  Must remove one of them.

 

NOTE: the 3 that is a child of 3 is perfectly valid.  Nodes must be less-than or equal to their children (this was a source of confusion last quarter).

 

Q2.

 

Insert 4 (has to switch place with its parent, 5, but then it’s okay).

 

DeleteMin (When you put the 5 down, it percolates down twice).

 

Q3.

 

d-Heap:

·        Don’t need merge, or need it very infrequently.

·        Your heap will be so large, that much of it will need to be on disk; choosing a high value of d is good.

·        The number of items in the heap stays more or less the same, and minimizing memory is vital (d-Heaps don’t require the extra pointers that Leftist heaps do)

·        DeleteMin needs to be fast – Leftist heaps make it a bit slower.

 

Leftist heap:

·        Darn it, that merge has got to be O(log n), or else you’re in big trouble.

·        The heap is in memory, so access to random parts of RAM is not too big a deal.

·        The number of items in the heap fluctuates up and down very frequently, and you can’t afford the time to keep re-allocating an array or wasting RAM with useless parts of arrays (and overtaxed memory allocators).

·        DeleteMin and Insert matter, but not as much; it's merge that's the killer.

·        You are a member of the Communist Party of America.

·        You are a nationalistic fascist, and sometimes get your Left and Right confused.

 

Specific Scenarios:

If you gave a good answer, please tell me.

 

Q4.

 

Leftist Heaps.

Step 1: result from merging down (with Null-Path Lengths shown).  Remember that the NPL(NULL)=-1, so some nodes are not leftist because of a left child they don’t have (e.g. 6).

 

Final: result from going up, and fixing Null-Path Length property.

 

 

As Skew Heaps (always flip left & right children, ignoring actual null path lengths)
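The skew-heap merge itself is short enough to sketch in Python (a minimal sketch with made-up helper names; a node is represented as a [key, left, right] list):

```python
def skew_merge(a, b):
    """Merge two skew heaps: the smaller root wins, the other heap is merged
    into its right child, then the children are swapped unconditionally
    (no null-path lengths are tracked)."""
    if a is None:
        return b
    if b is None:
        return a
    if b[0] < a[0]:
        a, b = b, a                # make a the heap with the smaller root
    a[2] = skew_merge(a[2], b)     # merge the rest into a's right subtree
    a[1], a[2] = a[2], a[1]        # the skew step: always swap the children
    return a

def skew_insert(root, key):
    """Insert is just a merge with a one-node heap."""
    return skew_merge(root, [key, None, None])

def skew_delete_min(root):
    """Remove the root; the new heap is the merge of its two children."""
    return root[0], skew_merge(root[1], root[2])
```

A leftist-heap merge looks the same except the swap is done only when the right child's null-path length exceeds the left's.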

 

 

Q5.

 

T(n) is the time for Yorrick.

 

T(1)=1 (base case)

T(n)=n+T(n/2)   (from the code)

T(n)=n+n/2+T(n/4)

T(n)=n+n/2+n/4+T(n/8)

T(n)=n*(1+1/2+1/4+1/8+1/16+…+1/n)

 

You can recognize this geometric series as summing to a constant (less than 2), so T(n) = Θ(n).

 

Or if you want to derive the formula yourself, here it is (a geometric series with ratio c, summed from i=m to i=k):

c^m + c^(m+1) + … + c^k = (c^m - c^(k+1)) / (1 - c)

So, plugging things into our formula:

m=0,

k=log n

c=1/2

c^(k+1) goes to 0 as n gets large, so asymptotically we can ignore it, and the sum approaches (1 - 0)/(1 - 1/2) = 2.  So the parenthesized series is a constant, and T(n) = Θ(n).

 

Q6:

 

Well, you’ve probably heard that binary search is O(log n).  The idea is that every time we go through the loop, the number of elements between leftCursor and rightCursor is decreased by a factor of 2.  In other words,

T(n)=1+T(n/2)

which is logarithmic.

 

The proof requires demonstrating that the number of candidate elements decreases by a factor of two no matter what (correct) path through the code we take.

 

The proof of correctness uses the precondition, and inductively shows that the loop invariant is always true.  Once we know that the loop invariant is always true, then we just need a bit of detail work to show that the algorithm must be correct.

 

Q7.

 

Stack as array:

·        Avoid the extra memory overhead of pointers in the linked list.

·        Don’t have to dereference pointers, which may save time.

·        (more advanced) Better cache locality.  When you have an array stack, you tend to access memory you've used recently.  This is true of a linked list implementation too, except that the memory isn't nicely bunched together (in the same part of the cache) when it's in linked list nodes.

·        Probably easier to implement.

 

Stack as linked list:

·        Doesn't limit the number of things you can put in the stack, and avoids nasty re-allocations to larger array sizes.

·        For the same reason, it behaves nicer if the number of elements in the stack changes drastically and quickly.

 

Q8.

 

Some possibilities are: n^2, n^2*log n, and n^2.5 (each is Ω(n^2) and o(n^3), and no two are big-Theta of each other); there are many others.

For the second part, take your solutions from the first part, except for n^2, and multiply each by n: n^3*log n and n^3.5 work (n*n^2 = n^3 doesn't, since it is O(n^3)).

 

 

Q9.

 

Θ(n).  The outer loop is linear (3n+1 iterations), the inner loop is constant (5 iterations), and x is irrelevant.

 

Q10.

 

Θ(m*log n).  It's linear in m.  3.141… is more than 1, so x increases exponentially, and the number of inner-loop iterations is logarithmic in n.

 

Q11.

 

T(n)=n+T(n-1)

 

Say we picked some c.

 

We want: T(n) <= c*n^2 for all n >= 1.

For the base case, we figure T(1) is a constant, so say it's 1.  Then, our claim is true as long as c >= 1.

For the inductive step, assume T(n-1) <= c*(n-1)^2.  Then

T(n) = n + T(n-1) <= n + c*(n-1)^2 = c*n^2 - (2c - 1)*n + c

Now, to complete the inductive step, we show that this is at most c*n^2, as required:

c*n^2 - (2c - 1)*n + c <= c*n^2   exactly when   (2c - 1)*n >= c,

which holds if c > 1/2 and n >= c/(2c-1).  Based on our induction step, we're only interested in n > 1, and for any c > 1 we have c/(2c-1) < 1.  So, we can pick any c > 1, and we're done (remember that the base case required c >= 1).

 

Q12.

 

Answerless.

 

Q13.

 

Saul.

 

Q14.

 

Stack.

Operations:

·        Make

·        Push

·        Pop

·        (get) Top

 

Implementations:

·        Linked list

·        Array

·        Other wacky things (e.g. linked-list of small arrays)

 

Data requirements:

·        Data can be copied around.
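For instance, the array implementation is only a few lines in Python (a sketch; Python's built-in list plays the role of a growable array, and the class name is my own):

```python
class ArrayStack:
    """Minimal array-backed stack."""
    def __init__(self):          # Make
        self._items = []
    def push(self, x):           # Push
        self._items.append(x)
    def pop(self):               # Pop
        return self._items.pop()
    def top(self):               # (get) Top
        return self._items[-1]
    def is_empty(self):
        return not self._items
```

A linked-list version would replace the list with nodes holding a value and a pointer to the node below.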

 

Queue.

Operations:

·        Make

·        Enter (aka Enqueue)

·        Leave (aka Dequeue)

 

Data requirements:

·        Data can be copied around.

 

 

Implementations:

·        Linked list

·        Array

·        Other wacky things

 

Priority queue:

Operations:

·        Insert

·        DeleteMin

 

Implementations:

·        Heaps

·        d-Heaps

·        Leftist Heaps, Skew Heaps and Binomial Heaps (why not)

·        Search trees (it’d work)

·        Linked lists/arrays (simple, but slow)

·        Other wacky things.

 

 

Data requirements:

·        Data can be copied around.

·        Data elements can be compared according to some ordering

 

 

Priority queue with Merge:

Operations:

·        Insert

·        DeleteMin

·        Merge

 

Implementations:

·        Leftist Heaps, Skew Heaps and Binomial Heaps (why not)

·        Search trees (it’d work)

·        Linked lists/arrays (simple, but slow)

·        Other wacky things.

 

 

Data requirements:

·        Data can be copied around.

·        Data elements can be compared according to some ordering

 

Dictionary ADT.

Operations:

·        create

·        insert

·        find

·        delete

 

Implementations:

·        Binary Search Tree

·        Linked list/array (slow, but simple)

·        B-trees

·        Self-balancing search trees, like AVL trees, 2-3 trees, red-black trees (we’ll get to at least one of these)

·        Hash tables

 

Data requirements:

·        Data can be copied around.

·        Data elements can be tested for equality (for the search tree-based implementations, they’d also have to be compared according to some ordering)

 

Q15.

 

Ideas:

·        Use a table to lookup partial results (of simpler problems)

·        Express the answer in one cell of the table in terms of other cells

·        Fill in the table (top-down, or bottom-up)

·        See what the answer is.

·        Hopefully, if you decomposed the problem into partial results well, your dynamic programming algorithm will be faster than the naive recursive solution.
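As a tiny concrete instance of these steps (Fibonacci numbers are just an illustrative choice here, not necessarily the course's example):

```python
def fib(n):
    """Bottom-up dynamic programming: table[i] holds the i-th Fibonacci number."""
    table = [0, 1] + [0] * max(0, n - 1)        # the table of partial results
    for i in range(2, n + 1):                   # fill it in bottom-up
        table[i] = table[i - 1] + table[i - 2]  # each cell from earlier cells
    return table[n]                             # read off the answer
```

The naive recursion recomputes the same subproblems exponentially often; the table makes each subproblem cost O(1) after its predecessors are filled in.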

 

(Note: obviously this question was kind of vague, so if you said something different, it wasn’t necessarily a bad answer.  But, you should know the general idea of dynamic programming)

 

Q16.

 

What is the parent of node (a)? no parent – it’s a root

What are the children of (d)?  (j) and (k)

What is the sibling(s) of node (g)?  (f), (h), (i)

What are the 2nd cousins, once removed, of node (m)? :-)  (d),(e), I think

List the leaves of this tree.  (j),(k),(l),(m),(n),(g),(h),(i)

Assuming this is a Binary Search Tree, what is the output of an in-order traversal?  (j),(d),(k),(b),(e),(l),(a),(m),(f),(n),(c),(g)

What is the maximum branching factor of the tree? 4, for node (c)

What is the height and depth of node (h)?  height: 0 (it's a leaf), depth: 2