CSE 326: Summer 2001

Midterm Worksheet

 

 

  1. Abstract Data Types (ADTs) vs. Data Structures

  2. Things you already know: arrays, linked lists, trees, queues, stacks

  3. Priority Queue ADT Implementations
    1. Lists, arrays, trees – not very efficient
    2. Binary and d-Heaps
        i.   Heap order and structure properties
        ii.  Array vs. linked list implementation
        iii. Time complexity of various operations
    3. Support for fast merge: Leftist and Skew Heaps
        i.   Heap order and structure properties
        ii.  Null path length
        iii. Time complexity of various operations
    4. Still better average-case performance: Binomial Queues

  4. Asymptotic Complexity
    1. Big-Oh, Big-Theta, etc.
    2. Upper/lower bounds, worst/average/best-case bounds, loose/tight bounds
    3. Analyzing code

  5. Dictionary ADT: Binary Search Tree implementation
    1. Insertion
    2. Deletion
    3. Lazy deletion

  6. Balanced BST: AVL tree
    1. Single and double rotations

  7. Balanced on average BST: Splay Tree
    1. Splay operation
    2. Easy implementation, hard analysis

 

 

 

Q1.      List three things that are wrong with the following binary heap.

 

 

 

 

 

 

 

Q2.      Consider the following (legal) 3-Heap.  What happens when you insert 4?  What
if, instead, you called deleteMin?

 

 

 

 

 


Q3.      Suppose you wanted a priority queue that sometimes needs to merge.  List some factors

that might lead you to pick a d-Heap with a high value of d.  List

factors that might make you pick a Leftist heap. 

 

 

 

 

 

 

 

 

 

 

 

 

 

Q4.      Merge the following two Leftist heaps.  Now merge them as if they were Skew

Heaps.

 

 

 

 

 

 

 

 

Q5.      Write a recurrence formula representing the running time of the following

pseudocode.  Solve the recurrence.

 

procedure Yorrick (int N)

     if N=1 then

          return

     end if

     for i=1 to N

          print “Alas, poor function.”

     end for

     Yorrick(N/2)

 

 

 

 

 

 

Q6.      We know that Floyd’s algorithm for buildHeap takes linear worst-case time for

binary and d-Heaps. How do you implement buildHeap for Leftist and Skew

Heaps? How long does that take?

 

 

 

 

 

Q7.      What are the advantages of implementing a Stack using an array?  How about implementing it using a linked list?

 

 

 

 

 

 

 

 

Q8.      a) Find 3 functions f(n), g(n), h(n) that are Ω(n²) and also o(n³), but which are not
    in big-Theta of each other.

b) Find 2 functions f(n), g(n) that are o(n⁴) but are not O(n³), but which are not in
    big-Theta of each other.

 

 

 

 

 

Q9.      What is the running time of the following function, in big Theta terms (tight asymptotic bound)?

 

procedure Mendacious (int n)

          for i from n to 4*n do

              for j from n to 4+n do

                   x=x*2

 

 

 

 

 

Q10.    What is the running time of the following program in big-Oh notation?

 

procedure Spilliatory (int m, int n)

     for i from 1 to m do

          x = 1

          while x < n do

              x = x*3.1415926535897

 

 

 

 

 

Q11.    For the following ADTs, list the operations they support, implementations of

them that you know, and also list any requirements they make of the data.

 

·        List


·        Stack


·        Queue


·        Priority Queue


·        Priority Queue with Merge


·        Dictionary

 

 

 

 

Q12.    Which operation is supported better by Binomial Queues than by the other Heap

implementations of Priority Queues we have seen? What is the running time of

this operation in various Heap structures?


Q13.    Consider the following tree:

 

 

 

Q14.    What is the structure property of AVL trees? How is it maintained through

rotations?

 

Q15.    Give two advantages of Splay trees compared to AVL trees.

 

 

 

 

 


Answers

 

 

A1.      a) The tree is not binary; (2) has three children.

            b) The tree is not complete; (9) should be the left child of (7).

            c) (2) is less than its parent (3).

 

 

A2.
Insert 4:      (4) needs to switch places with its parent (5).

 

 

DeleteMin:     (5) is moved to the root and then percolated down twice
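
For concreteness, here is a minimal C++ sketch of the percolation described above, written for an array-based d-heap; the names (DHeap, heap, d) are illustrative, not from the course code.

#include <vector>
#include <algorithm>

// Minimal d-heap sketch (1-indexed array, min-heap order).
// Children of node i live at d*(i-1)+2 .. d*(i-1)+d+1, so the parent of i is (i-2)/d + 1.
struct DHeap {
    int d;
    std::vector<int> heap;                          // heap[0] is unused
    explicit DHeap(int d) : d(d), heap(1) {}

    void insert(int x) {                            // percolate up, as in A2's insert of 4
        heap.push_back(x);
        int i = (int)heap.size() - 1;
        while (i > 1 && heap[(i - 2) / d + 1] > heap[i]) {
            std::swap(heap[i], heap[(i - 2) / d + 1]);
            i = (i - 2) / d + 1;
        }
    }

    int deleteMin() {                               // assumes a non-empty heap:
        int min = heap[1];                          // move the last item to the root,
        heap[1] = heap.back();                      // then percolate it down
        heap.pop_back();
        int i = 1;
        while (true) {
            int first = d * (i - 1) + 2, best = i;
            for (int c = first; c < first + d && c < (int)heap.size(); ++c)
                if (heap[c] < heap[best]) best = c;
            if (best == i) break;
            std::swap(heap[i], heap[best]);
            i = best;
        }
        return min;
    }
};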

 

A3.  d-Heap:

·        Don’t need merge, or need it very infrequently.

·        Your heap will be so large that much of it will need to be on disk; choosing a high
 value of d is good for cache performance.

·        The number of items in the heap stays more or less the same, and minimizing
 memory is vital (d-Heaps don’t require the extra pointers that Leftist heaps do)

·        DeleteMin needs to be fast – Leftist heaps make it a bit slower.

 

Leftist heap:

·        Merge has got to be O(log n) for the application, or else you’re in big trouble.

·        The heap is small enough to be in memory, so access to random parts of RAM is
 not too big a deal.

·        The number of items in the heap fluctuates up and down very frequently, and you
 can’t afford the time to keep re-allocating an array or wasting RAM with useless
 parts of arrays (and overtaxed memory allocators).

·        DeleteMin and Insert are important, but not as important; it’s merge that’s
 the killer.

·        You are a member of the Communist Party of America.

·        You are a nationalistic fascist, and sometimes get your Left and Right confused.

 

 

A4.

Leftist Heaps.

Step 1: Result from merging down (with Null-Path Lengths shown).  Remember that NPL(NULL) = -1, so some nodes violate the leftist property because of a missing left child (e.g. 6).

 

Final: Result from going up, and fixing Null-Path Length property.

 

 

Skew Heaps (always flip left and right children, ignoring actual null path lengths)
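
For reference, here is a hedged C++ sketch of the leftist-heap merge described above, assuming a simple node struct with an npl field (the names are illustrative): merge down the right spines keeping the smaller root on top, then fix the leftist property on the way back up.

#include <algorithm>

// Leftist-heap merge sketch (min-heap order).
struct Node {
    int key, npl;                                  // npl = null path length; npl(NULL) is -1
    Node *left, *right;
    Node(int k) : key(k), npl(0), left(nullptr), right(nullptr) {}
};

int npl(Node* t) { return t ? t->npl : -1; }

Node* merge(Node* a, Node* b) {
    if (!a) return b;
    if (!b) return a;
    if (b->key < a->key) std::swap(a, b);          // keep the smaller root on top
    a->right = merge(a->right, b);                 // "merge down" the right spines
    if (npl(a->left) < npl(a->right))              // fix the leftist property going back up
        std::swap(a->left, a->right);
    a->npl = npl(a->right) + 1;
    return a;
}

// A Skew-heap merge is the same walk, except the children are swapped
// unconditionally and no npl field is kept.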

 

 

 

 

A5.      Let T(n) be the running time of Yorrick for parameter n.

 

Base case:                                T(1) = 1

Recurrence from the code:        T(n) = n + T(n/2)

Solving the recurrence:

T(n) = n + T(n/2)

        = n + n/2 + T(n/4)

        = n + n/2 + n/4 + T(n/8)

        = …

        = n * (1 + ½ + ¼ + 1/8 + 1/16 + … + 2/n ) + T(1)

        < n * (1 + ½ + ¼ + 1/8 + 1/16 + …) + T(1)

        = 2n + 1                        (geometric series with ratio ½)

Hence, T(n) < 2n + 1, i.e. T(n) = Θ(n).
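
As a sanity check on the recurrence, a direct C++ translation of the pseudocode (only the function name and message come from the worksheet; the rest is a sketch):

#include <cstdio>

// Counting the print statements for N a power of two gives
// N + N/2 + ... + 2 = 2N - 2, i.e. Theta(N), matching T(N) = N + T(N/2).
void yorrick(int n) {
    if (n == 1) return;
    for (int i = 1; i <= n; ++i)
        std::printf("Alas, poor function.\n");
    yorrick(n / 2);
}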

 

 

A6.      A binary heap, being complete and balanced, is also a valid Leftist as well as

Skew Heap (at every node, the left subtree’s null path length is at least the right’s). Hence, we can simply use Floyd’s algorithm for Leftist and Skew Heaps as well. Running time: linear worst-case.
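
A minimal sketch of Floyd’s buildHeap on a 1-indexed array, for reference; since the resulting complete binary heap is also a valid Leftist/Skew heap, the same routine covers all three (function names are illustrative).

#include <vector>
#include <algorithm>

// Percolate the item at index i down into a valid position (binary min-heap).
void percolateDown(std::vector<int>& a, int i, int n) {
    while (2 * i <= n) {
        int child = 2 * i;
        if (child + 1 <= n && a[child + 1] < a[child]) ++child;
        if (a[i] <= a[child]) break;
        std::swap(a[i], a[child]);
        i = child;
    }
}

// Floyd's buildHeap: percolate down every non-leaf node, last one first.  O(n) total.
void buildHeap(std::vector<int>& a) {              // items are in a[1..n], a[0] unused
    int n = (int)a.size() - 1;
    for (int i = n / 2; i >= 1; --i)
        percolateDown(a, i, n);
}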

 

 

A7.

Stack as array:

·        Avoid the extra memory overhead of pointers in the linked list.

·        Don’t have to dereference pointers, which may save time.

·        (more advanced) Better cache locality.  When you have an array stack, you tend to access memory you’ve used recently.  This is true of a linked list implementation too, except that the memory isn’t nicely bunched together (in the same part of the cache) when it’s in linked list nodes.

·        Probably easier to implement.

 

Stack as linked list:

·        Doesn’t limit the number of things you can put in the stack, and avoids nasty re-allocations to larger array sizes.

·        For the same reason, it behaves more gracefully if the number of elements in the stack changes drastically and quickly.
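
To make the trade-offs concrete, a minimal array-backed stack sketch; it leans on std::vector for the re-allocation mentioned above, and the class name is made up for illustration.

#include <vector>

template <typename T>
class ArrayStack {
    std::vector<T> data;                           // contiguous storage: good locality
public:
    bool isEmpty() const { return data.empty(); }
    void push(const T& x) { data.push_back(x); }   // may re-allocate when full
    T    top() const      { return data.back(); }  // assumes non-empty
    void pop()            { data.pop_back(); }     // assumes non-empty
};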

 

 

 

A8.      a) Some possibilities are: n², n² log n, and n²√n.  Each is Ω(n²) and o(n³), and no two of them are big-Theta of each other.

b) Take the solutions from part a) except for n² (multiplying it by n gives n³, which is O(n³)) and multiply each by n, giving n³ log n and n³√n.

 

 

A9.      Θ(n).  The outer loop is linear, the inner loop is constant (5 iterations), and x is irrelevant.
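
A C++ rendering of Mendacious for reference (x is widened to unsigned so the repeated doubling is harmless; its value does not affect the analysis):

// Outer loop: 3n+1 iterations; inner loop: a constant 5 iterations (j from n to n+4).
void mendacious(int n) {
    unsigned long long x = 1;
    for (int i = n; i <= 4 * n; ++i)
        for (int j = n; j <= 4 + n; ++j)
            x = x * 2;
}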

 

 

A10.    Θ(m log n).  It’s clearly linear in m.  For dependence on n, note that 3.141… is
bigger than 1, so x will increase exponentially with each iteration of the while loop.  Hence the number of times the inner loop is executed is logarithmic in n.
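
A C++ rendering of Spilliatory, for reference; the inner loop multiplies x by a constant greater than 1, so it runs about log n times per outer iteration.

// Outer loop runs m times; the while loop runs roughly log base 3.14 of n times.
void spilliatory(int m, int n) {
    for (int i = 1; i <= m; ++i) {
        double x = 1;
        while (x < n)
            x = x * 3.1415926535897;
    }
}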

 

 

A11.

  1. Stack
    Operations:             Make, Push, Pop, (get) Top, isEmpty
    Implementations:    Linked list, Array, Other wacky ways (e.g. linked-list of
                                    small arrays)
    Data requirements: None

  2. Queue
    Operations:             Make, Enqueue, Dequeue, (get) First, isEmpty
    Implementations:    Linked list, Array, Other wacky ways
    Data requirements: None

  3. Priority queue
    Operations:             Insert, DeleteMin, FindMin, isEmpty
    Implementations:    Heaps, d-Heaps, Leftist Heaps, Skew Heaps, Binomial Heaps,
                                    Search trees (not as fast), Linked lists/Arrays (simple, slow)
    Data requirements: Data is comparable

  4. Priority queue with Merge
    Operations:             Insert, DeleteMin, FindMin, Merge, isEmpty
    Implementations:    Leftist Heaps, Skew Heaps, Binomial Heaps, other slow ways
    Data requirements: Data is comparable

  5. Dictionary
    Operations:             Create, Insert, Find, Delete, isEmpty
    Implementations:    Binary Search Trees, Linked list/Array (simple, slow),
                                    B-trees, self-balancing search trees (such as AVL trees,
                                    Splay Trees, 2-3 trees, red-black trees), Hash Tables
    Data requirements: Data can be tested for equality; also comparable for some
                                    implementations

 

 

A12.    Insert is O(1) average-case and O(log n) worst-case for Binomial Queues. For

Binary, d-, Leftist and Skew Heaps, the best we can show is O(log n) worst-case.

 

 

 

A13.

·        Height, depth and npl of (e):  height 1, depth 2, npl 0

 

 

 

A14.    The structure property of AVL trees is that the tree is binary and, for each node,
the heights of its left and right subtrees differ by at most 1.  Whenever the tree is
altered (through insert or delete), we find the deepest node, if any, that is
unbalanced and apply a single or double rotation, depending on the situation, to
rebalance it.  After an insertion, this one rotation restores the balance property of
the whole tree; after a deletion, rotations may be needed at several ancestors.
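
A sketch of the single rotation mentioned above (the left-left case), assuming each node stores its height; the names are illustrative, not tied to any particular course code.

#include <algorithm>

struct AvlNode {
    int key, height;                               // height of an empty tree is -1
    AvlNode *left, *right;
};

int height(AvlNode* t) { return t ? t->height : -1; }

// Single rotation: the left child comes up, the old root becomes its right child.
// Returns the new root of this subtree.
AvlNode* rotateWithLeftChild(AvlNode* k2) {
    AvlNode* k1 = k2->left;
    k2->left = k1->right;
    k1->right = k2;
    k2->height = std::max(height(k2->left), height(k2->right)) + 1;
    k1->height = std::max(height(k1->left), height(k1->right)) + 1;
    return k1;
}

// A double rotation is two single rotations: first rotate k2->left with its
// right child (the mirror-image single rotation), then apply the one above.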

 

 

 

A15.    Splay trees are much easier to implement.  They do not require auxiliary

information such as height to be stored and maintained in each node.