1. Algorithm Analysis
a. Formal definition: T(N) = O(f(N)) if there are positive constants c and n_0 such that T(N) <= cf(N) when N >= n_0.
b. Formal definition: T(N) = Omega(f(N)) if there are positive constants c and n_0 such that T(N) >= cf(N) when N >= n_0.
c. Formal definition: T(N) = Theta(f(N)) if and only if T(N) = O(f(N)) and T(N) = Omega(f(N)).
d. Informal definition: O(f(N)) sets an upper bound on T(N).
e. Informal definition: Omega(f(N)) sets a lower bound on T(N).
f. Informal definition: Theta(f(N)) is a tight bound: T(N) grows at the same rate as f(N) (it is both an upper and a lower bound).
g. Typical growth rates: c < log N < log^2 N < N < N log N < N^2 < N^3 < 2^N.
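A worked example of definition (a), as an illustration (the numbers are my own, not from the notes): take T(N) = 3N + 5. Choosing c = 4 and n_0 = 5 gives 3N + 5 <= 4N whenever N >= 5, so T(N) = O(N). Since 3N + 5 >= 3N for all N >= 1, T(N) is also Omega(N), and therefore Theta(N).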
2. Recurrence Relations
a. Pages 276 and 287 of the book provide recurrence relation analyses.
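A standard worked example (my own, not reproduced from those pages): merge sort satisfies T(N) = 2T(N/2) + cN with T(1) = c. Expanding k times gives T(N) = 2^k T(N/2^k) + kcN, and at k = log N this becomes T(N) = cN + cN log N, so T(N) = O(N log N).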
3. Sorting Algorithms
a. Heap sort: O(N log N), Theta(N log N), Omega(N log N). Build a min heap, then repeatedly pull out the minimum value and place it at the end of the output list. Each heap insertion or deletion is O(log N).
b. Insertion sort: O(N^2), Theta(N^2), Omega(N). Remove an element from the input list and insert it into its correct position in a sorted list.
c. Selection sort: O(N^2), Theta(N^2), Omega(N^2). Find the minimum value in the list and place it in the lowest position; keep doing this, moving up one position each time.
d. Bubble sort: O(N^2), Theta(N^2), Omega(N) with an early exit when a pass makes no swaps. Shares quicksort's O(N^2) worst case but has a slower average run time. Compare adjacent elements, swapping them if they are out of order, and continue up the list; repeat.
e. Merge sort: O(N log N), Theta(N log N), Omega(N log N). Split the list in half, call mergesort on each half recursively, and then merge the two sorted halves back together in order.
f. Quick sort: O(N^2), Theta(N log N), Omega(N log N). Shares bubble sort's O(N^2) worst case but has a much faster average run time. Pick a pivot element and partition the list into two parts, one with elements greater than the pivot and the other with elements below it; recursively sort each partition (see the C++ sketch after this list).
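To make (b) and (f) concrete, here is a minimal C++ sketch of insertion sort and quicksort; the function names and the last-element (Lomuto-style) partition are my own choices, not from the notes.

#include <iostream>
#include <utility>
#include <vector>

// Insertion sort (item b): take each element and insert it into the
// already-sorted prefix to its left.
void insertionSort(std::vector<int>& a) {
    for (std::size_t i = 1; i < a.size(); ++i) {
        int key = a[i];
        std::size_t j = i;
        while (j > 0 && a[j - 1] > key) {
            a[j] = a[j - 1];   // shift larger elements one slot right
            --j;
        }
        a[j] = key;            // drop the element into its correct slot
    }
}

// Quicksort (item f): pick a pivot, partition around it, recurse on both parts.
void quickSort(std::vector<int>& a, int lo, int hi) {
    if (lo >= hi) return;
    int pivot = a[hi];                  // last element as the pivot
    int i = lo;
    for (int j = lo; j < hi; ++j) {
        if (a[j] < pivot) {
            std::swap(a[i], a[j]);      // move smaller elements to the front
            ++i;
        }
    }
    std::swap(a[i], a[hi]);             // pivot lands in its final position
    quickSort(a, lo, i - 1);
    quickSort(a, i + 1, hi);
}

int main() {
    std::vector<int> v{5, 2, 9, 1, 7};
    insertionSort(v);
    for (int x : v) std::cout << x << ' ';              // prints 1 2 5 7 9
    std::cout << '\n';

    std::vector<int> w{4, 8, 3, 6, 0};
    quickSort(w, 0, static_cast<int>(w.size()) - 1);
    for (int x : w) std::cout << x << ' ';              // prints 0 3 4 6 8
    std::cout << '\n';
}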
4. Shortest Path
a. Greedy algorithm: at each stage, pick the best option available at the time.
b. Read pp. 354-356 in the book. Basically you can make a quick table (as illustrated in the book), work through each node until the table is complete, and then find the shortest route from A to B by reading back through the table.
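The table method in (b) is Dijkstra's algorithm. Below is a minimal C++ priority-queue sketch, assuming non-negative edge weights; the adjacency-list layout and names are my own.

#include <functional>
#include <iostream>
#include <limits>
#include <queue>
#include <utility>
#include <vector>

// Dijkstra's greedy shortest path: repeatedly settle the unvisited vertex with
// the smallest known distance, then relax its outgoing edges.
std::vector<int> dijkstra(const std::vector<std::vector<std::pair<int,int>>>& adj,
                          int source) {
    const int INF = std::numeric_limits<int>::max();
    std::vector<int> dist(adj.size(), INF);
    // Min-heap of (distance, vertex) pairs.
    std::priority_queue<std::pair<int,int>,
                        std::vector<std::pair<int,int>>,
                        std::greater<std::pair<int,int>>> pq;
    dist[source] = 0;
    pq.push({0, source});
    while (!pq.empty()) {
        auto [d, u] = pq.top();
        pq.pop();
        if (d > dist[u]) continue;            // stale queue entry, skip it
        for (auto [v, w] : adj[u]) {          // edge u -> v with weight w
            if (dist[u] + w < dist[v]) {      // found a shorter route to v
                dist[v] = dist[u] + w;
                pq.push({dist[v], v});
            }
        }
    }
    return dist;
}

int main() {
    // Four vertices; adj[u] lists (neighbor, weight) pairs.
    std::vector<std::vector<std::pair<int,int>>> adj(4);
    adj[0] = {{1, 2}, {2, 5}};
    adj[1] = {{2, 1}, {3, 4}};
    adj[2] = {{3, 1}};
    std::vector<int> dist = dijkstra(adj, 0);
    for (std::size_t v = 0; v < dist.size(); ++v)
        std::cout << "dist to " << v << " = " << dist[v] << '\n';   // 0, 2, 3, 4
}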
5. C++
a. Read chapter 1 in the book if you feel you need a review of C++.
6. AVL Trees
a. In-order: left, root, right.
b. Pre-order: root, left, right.
c. Post-order: left, right, root.
d. Building: when building an AVL tree from an input list, place the first number at the root. Following numbers are inserted in the same manner as in a binary search tree, i.e. smaller values move down the left side of the tree and greater values move down the right side. After inserting the entire list, search the tree for imbalances and rotate appropriately. Alternatively, you can balance the tree at each insertion.
e. Examining for imbalances: the balance of a node is b = height(right subtree) - height(left subtree); every node in an AVL tree must have |b| <= 1.
f. Rotating: know how to rotate AVL trees in pictorial form (a single right rotation is sketched in code below).
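A minimal C++ sketch of the three traversal orders and a single right rotation; the Node struct and function names are my own, and this is not a full AVL implementation (no height bookkeeping or insertion logic).

#include <iostream>

// Minimal binary tree node used to illustrate the traversal orders (a)-(c)
// and a single rotation from (f).
struct Node {
    int key;
    Node* left = nullptr;
    Node* right = nullptr;
    explicit Node(int k) : key(k) {}
};

// In-order: left, root, right.
void inOrder(const Node* n) {
    if (!n) return;
    inOrder(n->left);
    std::cout << n->key << ' ';
    inOrder(n->right);
}

// Pre-order: root, left, right.
void preOrder(const Node* n) {
    if (!n) return;
    std::cout << n->key << ' ';
    preOrder(n->left);
    preOrder(n->right);
}

// Post-order: left, right, root.
void postOrder(const Node* n) {
    if (!n) return;
    postOrder(n->left);
    postOrder(n->right);
    std::cout << n->key << ' ';
}

// Single right rotation: fixes a left-left imbalance by lifting the left
// child into the root's place. Returns the new root of the subtree.
Node* rotateRight(Node* root) {
    Node* newRoot = root->left;
    root->left = newRoot->right;
    newRoot->right = root;
    return newRoot;
}

int main() {
    // A left-left heavy chain 3 -> 2 -> 1; its root has balance -2.
    Node one(1), two(2), three(3);
    three.left = &two;
    two.left = &one;

    Node* root = rotateRight(&three);    // node 2 becomes the new root
    inOrder(root);   std::cout << '\n';  // prints 1 2 3
    preOrder(root);  std::cout << '\n';  // prints 2 1 3
    postOrder(root); std::cout << '\n';  // prints 1 3 2
}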