Computer Science GRE Study Guide, Exact and asymptotic analysis of specific algorithms

Table of Contents

III. Theory and Mathematical Background – 40%

A.    Algorithms and complexity

1.      Exact and asymptotic analysis of specific algorithms

a.       Definition: When evaluating an algorithm, we might want to determine exactly how long it will take to run (exact analysis), but we usually just approximate the running time as the input size grows (asymptotic analysis). For example, an exact count of 3n^2 + 5n + 2 steps is simply O(n^2) asymptotically, keeping only the dominant term.

b.      The worst-case analysis of an algorithm is not the same as an upper bound. An upper bound is a property of the function that describes the algorithm's cost; worst-case analysis finds the worst-case input and expresses the time or space needed to handle it. Even so, the concepts are so closely related that we often use the terms interchangeably.

c.       Sorting methods

i.        http://www.cs.ubc.ca/spider/harrison/Java/sorting-demo.html

ii.      Insertion sort

•      Simple to code

•      Good for a small number of elements

•      Starts with one element, then inserts each subsequent element in the correct position by scanning backward through the sorted portion until a smaller element (or the start of the list) is found – see the sketch after this list

•      O(n^2)

•      If the input is already sorted, it runs in O(n)
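
A minimal Python sketch of insertion sort as described above (the function and variable names are just illustrative):

    def insertion_sort(items):
        for i in range(1, len(items)):
            current = items[i]
            j = i - 1
            # Shift larger elements one slot right until the correct position is found
            while j >= 0 and items[j] > current:
                items[j + 1] = items[j]
                j -= 1
            items[j + 1] = current
        return items

On already-sorted input the inner while loop never executes, which is why the running time drops to O(n).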

iii.    Shellsort

•      Named after Donald Shell

•      Also known as diminishing gap sort

•      Simple to code

•      Sorts elements that are not next to each other – this is what makes it possible to perform better than O(n^2)

•      Elements are sorted only relative to items that are k positions away. So in a 6-sort, the elements that are 6 apart form subsequences, and each subsequence is sorted using insertion sort

•      The rate and method for diminishing the gap (the k of the k-sort) is important to the performance of the sort. Empirically, it appears that the best way to shrink the gap is to divide it by 2.2 – see the sketch after this list

•      With the best known gap sequences, running time is roughly O(n^(5/4))
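
A Python sketch of Shellsort using the divide-by-2.2 rule mentioned above (treating a gap of 2 as 1 so the final pass is a plain insertion sort):

    def shellsort(items):
        gap = len(items) // 2
        while gap > 0:
            # Insertion sort over the elements that are `gap` positions apart
            for i in range(gap, len(items)):
                current = items[i]
                j = i
                while j >= gap and items[j - gap] > current:
                    items[j] = items[j - gap]
                    j -= gap
                items[j] = current
            # Shrink the gap; dividing by 2.2 is the empirical choice noted above
            gap = 1 if gap == 2 else int(gap / 2.2)
        return items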

iv.    Merge sort

•      Divide-and-conquer

•      O(n log n): the recursion has O(log n) levels, and the merging at each level costs O(n)

•      The array is divided into two subarrays, each is sorted recursively, and the results are merged: compare the front element of each subarray, copy the smaller one into a new third array, and repeat until every element has been copied – see the sketch after this list

•      Rarely used in practice because of the extra memory and copying the merge step requires
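
A Python sketch of merge sort; this version returns a new list, which makes the extra memory and copying visible:

    def merge_sort(items):
        if len(items) <= 1:
            return items
        mid = len(items) // 2
        left = merge_sort(items[:mid])
        right = merge_sort(items[mid:])
        # Merge: repeatedly copy the smaller front element into the third array
        merged = []
        i = j = 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                merged.append(left[i])
                i += 1
            else:
                merged.append(right[j])
                j += 1
        merged.extend(left[i:])   # copy whatever remains in either half
        merged.extend(right[j:])
        return merged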

v.      Quicksort

•      Divide-and-conquer

•      Often the fastest general-purpose sorting algorithm known in practice

•      Average case O(n log n)

•      Worst case O(n^2) (e.g., if the smallest element is chosen as the pivot every time)

•      Uses recursion – see the sketch after this list

1.      If the number of elements in the set is 0 or 1, then return

2.      Choose any element in the set to be a pivot

3.      Partition the remaining elements into two groups: elements smaller than the pivot go left, elements larger go right

4.      Perform the recursion on the two groups, left and right

•      The choice of pivot is important to the running time – try to choose an element close to the median (the median of the first, middle, and last elements is a common choice)

•      Not great for small arrays, where insertion sort is usually faster
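
A Python sketch following the steps above; the random pivot here is one possible choice (it makes the O(n^2) worst case unlikely), not the only one:

    import random

    def quicksort(items):
        # Step 1: a set of 0 or 1 elements is already sorted
        if len(items) <= 1:
            return items
        # Step 2: choose a pivot
        pivot = random.choice(items)
        # Step 3: partition into smaller and larger groups
        left = [x for x in items if x < pivot]
        middle = [x for x in items if x == pivot]
        right = [x for x in items if x > pivot]
        # Step 4: recurse on each group
        return quicksort(left) + middle + quicksort(right)

This version copies elements into new lists for clarity; production quicksorts usually partition the array in place.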

vi.    Selection sort

•      Repeatedly finds the smallest remaining item and puts it in the correct place – see the sketch after this list

•      Not efficient, because every pass must scan all of the remaining items

•      O(n^2)
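
A minimal Python sketch of selection sort:

    def selection_sort(items):
        for i in range(len(items)):
            smallest = i
            # Scan all remaining items for the smallest -- the source of the O(n^2) cost
            for j in range(i + 1, len(items)):
                if items[j] < items[smallest]:
                    smallest = j
            items[i], items[smallest] = items[smallest], items[i]
        return items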

vii.  Heapsort

•      O(n log n)

•      Place all of the items into a heap

•      Repeatedly remove the extreme item (the largest, for a max-heap) and put it in place – see the sketch after this list

•      Time efficient; a naive version that copies the items into a separate heap needs O(n) extra space, although the heap can also be built in place inside the array
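
A short Python sketch using the standard-library heapq module; note this is the naive extra-space version, and heapq is a min-heap, so it repeatedly removes the smallest item rather than the largest:

    import heapq

    def heapsort(items):
        heap = list(items)       # the O(n) extra space mentioned above
        heapq.heapify(heap)      # build the heap in O(n)
        # n removals at O(log n) each gives the O(n log n) total
        return [heapq.heappop(heap) for _ in range(len(heap))]

An in-place heapsort instead builds a max-heap inside the array itself, avoiding the extra space.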

viii. Bubble sort

•      Compare each pair of adjacent items and swap them if they are out of order; repeat passes through the list until no swaps are needed – see the sketch after this list

•      O(n^2)

•      But if the list is already sorted (and the implementation stops after a pass with no swaps), it runs in O(n)
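
A Python sketch of bubble sort with the early-exit check that produces the O(n) best case:

    def bubble_sort(items):
        for end in range(len(items) - 1, 0, -1):
            swapped = False
            for i in range(end):
                # Compare adjacent items and swap if they are out of order
                if items[i] > items[i + 1]:
                    items[i], items[i + 1] = items[i + 1], items[i]
                    swapped = True
            if not swapped:  # no swaps means the list is already sorted
                break
        return items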

