algorithms



Analysis

An algorithm's efficiency is measured in two ways: time complexity and space complexity.

Time Complexity

The amount of time it takes for the statements in an algorithm to complete. Here, speed does not refer to wall-clock time but to how the number of operations grows; running time is analyzed against the size of the input, usually with the worst case in mind.

input size refers to the number of items in the input
running time is the number of primitive operations executed

For example, counting the count++ operations in the nested loops below gives a running time of O(n^2 log n):

void fun(int n) {
    int i, j, k, count = 0;
    for (i = n/2; i <= n; i++)            // outer loop runs about n/2 times
        for (j = 1; j + n/2 <= n; j++)    // middle loop runs about n/2 times
            for (k = 1; k <= n; k = k*2)  // k doubles, so about log2(n) iterations
                count++;                  // total: O(n^2 log n) primitive operations
}

Space Complexity

The amount of memory a program needs to execute is represented by its space complexity. The absolute amount depends mainly on the hardware, so the analysis focuses on how memory use grows with the input.

Space Complexity = Auxiliary Space + space used by the input
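As a rough illustration (my own sketch, not from these notes), the two functions below read the same input but differ in auxiliary space: the first keeps only a single accumulator, O(1), while the second builds a list proportional to the input, O(n).

def running_total(numbers):
    # O(1) auxiliary space: one accumulator besides the input itself
    total = 0
    for x in numbers:
        total += x
    return total

def prefix_sums(numbers):
    # O(n) auxiliary space: builds a new list as large as the input
    sums, total = [], 0
    for x in numbers:
        total += x
        sums.append(total)
    return sums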

Recursion

A technique used in many algorithms. Problems solved with loops can often be solved using recursion, but there is usually no significant performance gain. Every recursive function has two parts:

  1. base case
    • when the function stops calling itself
  2. recursive case
    • when the function calls itself

def look_for_key(box):
    for item in box:
        if item.is_a_box():
            look_for_key(item)           # recursive case: search the nested box
        elif item.is_a_key():
            print("found the key!")      # base case: nothing left to recurse into

Types

Gale-Shapley

Traveling Salesman

Worst Case: O(n!)
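A minimal brute-force sketch (my own, with a made-up distance matrix) that tries every ordering of the cities, which is where the O(n!) growth comes from:

from itertools import permutations

# hypothetical symmetric distance matrix for 4 cities (made-up numbers)
distances = [
    [0, 10, 15, 20],
    [10, 0, 35, 25],
    [15, 35, 0, 30],
    [20, 25, 30, 0],
]

def tsp_brute_force(distances):
    n = len(distances)
    best_route, best_cost = None, float("inf")
    # fix city 0 as the start and try every ordering of the rest: (n-1)! routes
    for route in permutations(range(1, n)):
        full_route = (0,) + route + (0,)
        cost = sum(distances[a][b] for a, b in zip(full_route, full_route[1:]))
        if cost < best_cost:
            best_route, best_cost = full_route, cost
    return best_route, best_cost

print(tsp_brute_force(distances))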

Binary Search

  1. pick the middle index and test the element there against the key
  2. terminate if the middle element matches the key; otherwise repeat on the left or right half of the remaining search space (see the sketch below)

Worst Case: O(log n), sublinear
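A minimal sketch of those two steps, assuming the input list is already sorted (function and variable names are mine):

def binary_search(sorted_list, key):
    low, high = 0, len(sorted_list) - 1
    while low <= high:
        mid = (low + high) // 2          # pick the middle index
        if sorted_list[mid] == key:
            return mid                   # found the key
        elif sorted_list[mid] < key:
            low = mid + 1                # search the right half
        else:
            high = mid - 1               # search the left half
    return None                          # key is not in the list

print(binary_search([1, 3, 5, 7, 9, 11], 9))  # prints 4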

Quicksort

A pivot is chosen, the array is partitioned around it, and the partition step is then applied recursively to both sides of the pivot.

Signs of an ideal pivot:

  1. it ends up in its correct position in the final sorted array
  2. items to its left are smaller
  3. items to its right are larger

def quicksort(array):
    if len(array) < 2:
        return array                     # base case: 0 or 1 element is already sorted
    else:
        pivot = array[0]
        less = [i for i in array[1:] if i <= pivot]    # elements <= pivot
        greater = [i for i in array[1:] if i > pivot]  # elements > pivot
        return quicksort(less) + [pivot] + quicksort(greater)

print(quicksort([10, 5, 2, 3]))

Worst Case: O(n^2) if the pivot is poorly chosen
Best Case: O(n log n)

Mergesort

Selection Sort

Two nested loops are used to sort the list (sketched after the steps):

  1. traverse the unsorted portion of the list and find its minimum
  2. swap the minimum with the first element of the unsorted portion
  3. repeat until the end of the list
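A sketch of those steps (my own naming), swapping the minimum of the unsorted portion into place on each pass:

def selection_sort(items):
    for i in range(len(items)):
        # find the index of the minimum in the unsorted portion items[i:]
        min_index = i
        for j in range(i + 1, len(items)):
            if items[j] < items[min_index]:
                min_index = j
        # swap the minimum into position i
        items[i], items[min_index] = items[min_index], items[i]
    return items

print(selection_sort([10, 5, 2, 3]))  # prints [2, 3, 5, 10]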

Insertion Sort

def insertion_sort(A):
    for j in range(1, len(A)):
        key = A[j]
        # insert A[j] into the sorted sequence A[0..j-1]
        i = j - 1
        while i >= 0 and A[i] > key:
            A[i + 1] = A[i]   # shift larger elements one slot to the right
            i = i - 1
        A[i + 1] = key        # drop the key into its place

Worst Case: O(n^2)

Bubble Sort

  1. traverse from the left and compare two adjacent elements
  2. if they are out of order, move the bigger element to the right
  3. continue across the entire list, repeating passes until it is sorted (see the sketch below)
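A sketch of those steps (my own naming), with an early exit once a full pass makes no swaps:

def bubble_sort(items):
    n = len(items)
    for i in range(n - 1):
        swapped = False
        for j in range(n - 1 - i):
            if items[j] > items[j + 1]:  # compare adjacent elements
                items[j], items[j + 1] = items[j + 1], items[j]  # move the bigger one right
                swapped = True
        if not swapped:                  # no swaps means the list is already sorted
            break
    return items

print(bubble_sort([10, 5, 2, 3]))  # prints [2, 3, 5, 10]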

Breadth-First Search

A graph search algorithm. It will find the shortest path (fewest edges) if one exists. Running time is O(n + m), where n is the number of vertices and m is the number of edges.

from collections import deque

# assumes a graph dict of neighbor lists and a person_is_seller() check exist elsewhere
def search(name):
    search_queue = deque()
    search_queue += graph[name]          # start with the person's immediate neighbors
    searched = []                        # people already checked
    while search_queue:
        person = search_queue.popleft()  # FIFO: check people in the order they were added
        if person not in searched:
            if person_is_seller(person):
                print(person + " is a mango seller")
                return True
            else:
                search_queue += graph[person]  # queue this person's neighbors
                searched.append(person)
    return False

For searching a sorted list, a binary search is superior to a simple linear search.