# Assessment 11

## Stacks
A stack is a data structure that stores data in sequential order using the "Last In, First Out" (LIFO) method. All the standard stack operations (push, pop, isEmpty, size) have a worst-case run-time complexity of O(1). Some of their applications are string reversal functions, undo and redo operations in programs, and navigating history in browsers.
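A minimal sketch of a stack backed by a Python list (pushing and popping at the end are both O(1)), with the string-reversal application mentioned above; the class and function names here are illustrative:

```python
class Stack:
    """LIFO stack backed by a Python list."""
    def __init__(self):
        self._items = []

    def push(self, item):      # O(1): append at the end
        self._items.append(item)

    def pop(self):             # O(1): remove from the end
        return self._items.pop()

    def is_empty(self):        # O(1)
        return not self._items

    def size(self):            # O(1)
        return len(self._items)

def reverse_string(s):
    """String reversal via a stack: the last character pushed is the first popped."""
    stack = Stack()
    for ch in s:
        stack.push(ch)
    out = []
    while not stack.is_empty():
        out.append(stack.pop())
    return "".join(out)
```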

## Queues
A queue is also a data structure that stores data in sequential order, but it uses the "First In, First Out" (FIFO) method. The standard queue operations (enqueue, dequeue, size) run in O(1) time. One of their applications is handling simultaneous server requests from multiple users, such as three people trying to buy the last plane ticket at almost the same time.
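A minimal queue sketch using `collections.deque` (a plain list would make dequeue O(n), since removing from the front shifts every element), applied to a hypothetical version of the ticket scenario above; the buyer names are made up:

```python
from collections import deque

class Queue:
    """FIFO queue backed by collections.deque; enqueue and dequeue are O(1)."""
    def __init__(self):
        self._items = deque()

    def enqueue(self, item):   # O(1): add at the back
        self._items.append(item)

    def dequeue(self):         # O(1): remove from the front
        return self._items.popleft()

    def size(self):
        return len(self._items)

# Hypothetical scenario: three buyers request the last ticket almost at once.
requests = Queue()
for buyer in ["Ana", "Ben", "Cam"]:
    requests.enqueue(buyer)
winner = requests.dequeue()  # the first request in is the first served
```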

## Deque
A deque (double-ended queue) is also a data structure that stores data in sequential order, but it allows adding and removing items at either end, combining the capabilities of stacks and queues. The standard deque operations (insertFront, insertLast, deleteFront, deleteLast) run in O(1) time. Some of their applications are storing a web browser's history or a text-message history: recent entries are added to the front of the deque, and old entries at the back are removed after some specified limit or time.
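The browser-history application above can be sketched with Python's built-in `collections.deque`; the page names and the history limit of 3 are illustrative assumptions:

```python
from collections import deque

# Hypothetical browser history capped at 3 entries: new pages go on the
# front, and once the deque is full, the oldest entry at the back is dropped.
history = deque(maxlen=3)
for page in ["a.com", "b.com", "c.com", "d.com"]:
    history.appendleft(page)  # O(1) insert at the front
# "a.com" fell off the back when the fourth page arrived.
```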

## Linked lists
A linked list is also a linear data structure, but its elements are not stored at contiguous memory locations; instead, they are linked using pointers. Its advantages are its dynamic size and ease of insertion and deletion, while searching takes O(n) time in the worst case. Some of their applications are an image viewer (with "previous" and "next" image), previous and next page in a web browser, and a music player like Spotify (with "previous" and "next" song).
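A minimal singly linked list sketch, assuming a playlist like the music-player example above; insertion at the head only repoints the head pointer (O(1)), while searching must walk the chain (O(n)). The song names are made up:

```python
class Node:
    def __init__(self, value):
        self.value = value
        self.next = None  # pointer to the next node

class LinkedList:
    """Singly linked list: nodes can live anywhere in memory, tied together by pointers."""
    def __init__(self):
        self.head = None

    def insert_front(self, value):   # O(1): no shifting, just repoint head
        node = Node(value)
        node.next = self.head
        self.head = node

    def contains(self, value):       # O(n): must walk the chain
        current = self.head
        while current is not None:
            if current.value == value:
                return True
            current = current.next
        return False

playlist = LinkedList()
for song in ["Song C", "Song B", "Song A"]:
    playlist.insert_front(song)
```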

## Hash Table
A hash table is a data structure that stores key-value pairs and allows them to be accessed and modified quickly. It's a helpful structure when dealing with large amounts of unorganized data. Its operations (insert, retrieve, delete) run in O(1) average time. Examples of hash tables in practice are objects in JavaScript and dictionaries in Python.
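Since the document already names Python dictionaries as a hash table, the three core operations can be shown directly on a `dict`; the keys and values here are illustrative:

```python
# Python's built-in dict is a hash table: insert, retrieve, and delete
# are all O(1) on average.
ages = {}
ages["alice"] = 30      # insert a key-value pair
ages["bob"] = 25
age = ages["alice"]     # retrieve by key
del ages["bob"]         # delete by key
```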

## Trees
Trees are data structures that represent data in a top-down arrangement, similar to a hierarchy. Some of their applications are memory allocation in computers (binary search trees), autocomplete in search engines (tries), and finding the closest driver to a rider in ride-share apps (heaps).
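A minimal sketch of one tree variety named above, the binary search tree: smaller values go into the left subtree, larger into the right, so lookups can discard half the remaining tree at each step on a balanced tree. The node values are arbitrary:

```python
class TreeNode:
    def __init__(self, value):
        self.value = value
        self.left = None
        self.right = None

def insert(root, value):
    """Insert into a binary search tree: smaller values go left, larger go right."""
    if root is None:
        return TreeNode(value)
    if value < root.value:
        root.left = insert(root.left, value)
    else:
        root.right = insert(root.right, value)
    return root

def contains(root, value):
    """Search by comparing at each node and descending into one subtree."""
    if root is None:
        return False
    if value == root.value:
        return True
    if value < root.value:
        return contains(root.left, value)
    return contains(root.right, value)

root = None
for v in [8, 3, 10, 1, 6]:
    root = insert(root, v)
```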

## Linear Search
An algorithm used to find an element in an array sequentially, iterating across it from left to right. The worst-case run-time complexity is O(n). Its advantage is that it works on both sorted and unsorted lists. Linear search is fine when the data set is small; on large data sets it performs slowly.
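The left-to-right scan described above is a few lines in Python; the function name is illustrative:

```python
def linear_search(items, target):
    """Scan left to right; works on sorted or unsorted data. Worst case O(n)."""
    for index, value in enumerate(items):
        if value == target:
            return index
    return -1  # target not present
```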

## Binary Search
An algorithm used to find an element in a sorted array by halving the search area at each step until the target number is found. The worst-case run-time complexity is O(log n). Its advantage over linear search is speed, but the array must be sorted first.
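A sketch of the halving described above, using the common two-pointer (low/high) formulation; it assumes the input list is already sorted:

```python
def binary_search(sorted_items, target):
    """Halve the search range each step; requires sorted input. Worst case O(log n)."""
    low, high = 0, len(sorted_items) - 1
    while low <= high:
        mid = (low + high) // 2
        if sorted_items[mid] == target:
            return mid
        if sorted_items[mid] < target:
            low = mid + 1    # target is in the upper half
        else:
            high = mid - 1   # target is in the lower half
    return -1  # target not present
```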

## Depth First Search
An algorithm used to find an element in a tree or graph structure. DFS goes deep into the node structure before going wide. Its run-time complexity is O(|V| + |E|), where V is the set of vertices and E the set of edges traversed. Some of its applications are recursive backtracking algorithms and algorithms to create mazes.
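A minimal recursive DFS sketch over a graph stored as an adjacency list (a dictionary mapping each vertex to its neighbors); the graph below is a made-up example:

```python
def dfs(graph, start, visited=None):
    """Go deep before wide: recurse into each unvisited neighbor. O(|V| + |E|)."""
    if visited is None:
        visited = []
    visited.append(start)
    for neighbor in graph[start]:
        if neighbor not in visited:
            dfs(graph, neighbor, visited)
    return visited

# Hypothetical graph as an adjacency list.
graph = {"A": ["B", "C"], "B": ["D"], "C": [], "D": []}
```

Starting from "A", DFS follows A's first neighbor all the way down (A, B, D) before backtracking to visit C.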

## Breadth First Search
Another algorithm used to find an element in a tree or graph structure. BFS goes wide through the node structure before going deep. Its run-time complexity is also O(|V| + |E|), where V is the set of vertices and E the set of edges traversed. Possible applications of BFS include similarity checking, such as comparing DNA sequences or connections in a network.
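A minimal BFS sketch using a queue to visit all of a node's neighbors before descending a level, on the same style of made-up adjacency list:

```python
from collections import deque

def bfs(graph, start):
    """Go wide before deep: visit a whole level before the next. O(|V| + |E|)."""
    visited = [start]
    queue = deque([start])
    while queue:
        node = queue.popleft()           # FIFO order drives the level-by-level visit
        for neighbor in graph[node]:
            if neighbor not in visited:
                visited.append(neighbor)
                queue.append(neighbor)
    return visited

# Hypothetical graph as an adjacency list.
graph = {"A": ["B", "C"], "B": ["D"], "C": [], "D": []}
```

From "A", BFS visits both of A's neighbors (B, C) before reaching D, the opposite order from DFS.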

## Compare and contrast a few sorting algorithms

**Bubble sort**: Compare consecutive pairs of elements and swap them so that the smaller one comes first. When we reach the end of the list, start over, and repeat until no swaps are needed.
````
for i from 1 to n
    for j from 0 to n-i-1
        if a[j] > a[j+1]
            swap(a[j], a[j+1])
````

**Selection sort**: Find the smallest element and swap it with the element at index 0. In the remaining unsorted sublist, find the smallest element and swap it with the element at index 1, and so on.
````
for (j = 0; j < n-1; j++)
    iMin = j
    for (i = j+1; i < n; i++)
        if (a[i] < a[iMin])
            iMin = i
    if (iMin != j)
        swap(a[j], a[iMin])
````

**Insertion sort**: Removes one element from the input data, finds the location where it belongs within the sorted sublist, and inserts it there. It repeats until no input elements remain.
````
for i from 1 to length(A)-1
    j = i
    while (j > 0 and A[j-1] > A[j])
        swap(A[j], A[j-1])
        j = j-1
````
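
The three pseudocode routines above can also be sketched as runnable Python, which makes the comparison concrete: all three are O(n^2) in the worst case, but insertion sort does very little work on nearly sorted input. Each function copies its input rather than sorting in place, a choice made here only to keep the examples easy to test:

```python
def bubble_sort(items):
    """Repeatedly swap adjacent out-of-order pairs; largest values bubble to the end."""
    a = list(items)
    n = len(a)
    for i in range(n):
        for j in range(n - 1 - i):   # the last i elements are already in place
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return a

def selection_sort(items):
    """Repeatedly select the smallest remaining element and swap it into place."""
    a = list(items)
    n = len(a)
    for j in range(n - 1):
        i_min = j
        for i in range(j + 1, n):
            if a[i] < a[i_min]:
                i_min = i
        if i_min != j:
            a[j], a[i_min] = a[i_min], a[j]
    return a

def insertion_sort(items):
    """Shift each element left into its place within the sorted prefix."""
    a = list(items)
    for i in range(1, len(a)):
        j = i
        while j > 0 and a[j - 1] > a[j]:
            a[j], a[j - 1] = a[j - 1], a[j]
            j -= 1
    return a
```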