Posts

CST 370 - Module 8

Week 8. This is the final week of the Design and Analysis of Algorithms course. Only one topic is covered, to give us time to prepare for our final exam, but it is still a useful one. We were introduced to an algorithm that finds the shortest paths from a single source, called Dijkstra's Algorithm. In short, Dijkstra's algorithm uses the greedy technique to find the shortest route from a source vertex to all the other vertices. For example, given a graph with weights on each edge between vertices and a starting point, we calculate the distances to each vertex adjacent to the starting point. We then select the vertex with the smallest tentative distance, and expand our search from it to determine what other vertices can be reached and what the shortest distances to them are. By the end, we know the shortest route from the starting node to all other nodes. What is particularly interesting about this algorithm is its wide range of applications. For instance, we can use it for GPS navigation, which in turn...
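The steps above can be sketched in Python (a minimal version for illustration, assuming the graph is an adjacency list of (neighbor, weight) pairs; the function name and representation are my own, not the course's code):

```python
import heapq

def dijkstra(graph, source):
    """Greedy single-source shortest paths (non-negative edge weights)."""
    dist = {source: 0}
    pq = [(0, source)]  # min-heap of (tentative distance, vertex)
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue  # stale entry: a shorter path to u was already settled
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd  # found a shorter route to v; record it
                heapq.heappush(pq, (nd, v))
    return dist

graph = {'A': [('B', 4), ('C', 1)], 'B': [('D', 5)],
         'C': [('B', 2), ('D', 8)], 'D': []}
print(dijkstra(graph, 'A'))  # {'A': 0, 'B': 3, 'C': 1, 'D': 8}
```

Note how vertex B is first reached at distance 4 directly from A, then improved to 3 once C (the smallest tentative distance) is expanded.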

CST 370 - Module 7

Week 7. This module's material covers different algorithms for solving problems with graphs and sorting. For instance, we were introduced to non-comparison sorting such as Counting Sort (also called Distribution Counting Sort), where we calculate the frequency of each value in a given list and determine distribution values via the sums of those frequencies. What is interesting about Counting Sort is the fact that it is stable: numbers with equal values stay in the same relative order as in the given list. Another technique introduced was Dynamic Programming. One problem with a brute-force technique is that it often relies on a recursive function which, depending on the problem we want to solve, can produce far too many overlapping recursive calls. Dynamic Programming allows us to avoid this by using an array to record the solutions of sub-problems. Once the solutions to the sub-problems are recorded in the array, we solve the original problem using that array. The next two algorithms ...
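Both ideas can be sketched briefly in Python (rough illustrations I wrote, not the course's own code; the value range k and the Fibonacci example are my assumptions):

```python
def counting_sort(a, k):
    """Stable distribution counting sort for integers in 0..k."""
    count = [0] * (k + 1)
    for x in a:
        count[x] += 1              # frequency of each value
    for i in range(1, k + 1):
        count[i] += count[i - 1]   # distribution values: prefix sums of frequencies
    out = [0] * len(a)
    for x in reversed(a):          # right-to-left pass keeps equal values in order (stability)
        count[x] -= 1
        out[count[x]] = x
    return out

def fib(n, memo=None):
    """Dynamic programming: record sub-problem answers to avoid overlapping calls."""
    if memo is None:
        memo = {0: 0, 1: 1}
    if n not in memo:
        memo[n] = fib(n - 1, memo) + fib(n - 2, memo)
    return memo[n]

print(counting_sort([4, 1, 3, 4, 3], 4))  # [1, 3, 3, 4, 4]
print(fib(10))                            # 55
```

Without the memo table, the naive recursive `fib` recomputes the same sub-problems exponentially many times; with it, each value is computed once.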

CST 370 - Module 6

Week 6. This week's module covered more material on tree structures. We previously talked about binary search trees; a problem with them, however, is that they can become unbalanced, making operations take linear time. To solve this problem, we make sure we keep the binary search tree balanced. One way to do this is with an AVL Tree, where we keep track of the height of the subtree below each node. In short (and probably with some lack of detail), we keep the heights of each node's two subtrees as equal as possible, within a difference of 1. If any node becomes unbalanced, we rotate the node with the first violation, either left or right, together with its children. Another way of keeping a tree balanced is by using a 2-3 tree. A 2-3 tree is different from your usual tree in that each node can contain either 1 or 2 values, and have either 2 or 3 children. You insert values in a binary-search fashion. However, if you insert a third value into a node, the middle value moves to the pare...
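The AVL balance condition can be illustrated with a small check in Python (a sketch only: real AVL code also performs the rotations, and the tuple representation (value, left, right) is my own):

```python
def height(node):
    """Height of a subtree; an empty tree has height -1."""
    if node is None:
        return -1
    _, left, right = node
    return 1 + max(height(left), height(right))

def is_avl_balanced(node):
    """True when every node's two subtree heights differ by at most 1."""
    if node is None:
        return True
    _, left, right = node
    if abs(height(left) - height(right)) > 1:
        return False  # first violation: an AVL tree would rotate at this node
    return is_avl_balanced(left) and is_avl_balanced(right)

balanced = (2, (1, None, None), (3, None, None))
chain = (1, None, (2, None, (3, None, None)))  # right-leaning "linked list" shape
print(is_avl_balanced(balanced), is_avl_balanced(chain))  # True False
```

The `chain` tree is exactly the degenerate case mentioned above: every lookup walks the whole list, so operations take linear time.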

CST 370 - Module 5

Week 5. This module's material covered some more algorithms that use divide and conquer. One of these is the quicksort algorithm. The goal of quicksort is to partition the array into two subarrays around a pivot. From there, we recursively sort the two subarrays, leaving the whole array sorted in place. The average-case running time of quicksort is O(n*log(n)), which is the same as merge sort; however, when dealing with large sorting operations, quicksort tends to beat merge sort by a small margin. Next, we talk about a divide-and-conquer-ready data structure called a binary search tree. Data is inserted to the left or the right of the root node, typically with smaller values going to the left of a node and greater values to the right. This continues further and further down the tree, creating what could be described as... well... a tree. The main question with this data structure is how to traverse the binary tree. In short (got to leave some room for you to research), we talk about three methods, p...
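A quicksort sketch in Python (I am assuming a Lomuto-style last-element pivot here; the course may well use a different partition scheme):

```python
def partition(a, lo, hi):
    """Lomuto partition: place a[hi] (the pivot) at its final sorted position."""
    pivot = a[hi]
    i = lo
    for j in range(lo, hi):
        if a[j] < pivot:
            a[i], a[j] = a[j], a[i]  # move smaller elements to the left side
            i += 1
    a[i], a[hi] = a[hi], a[i]        # drop the pivot between the two subarrays
    return i

def quicksort(a, lo=0, hi=None):
    """Sort a in place: partition around a pivot, then recurse on each side."""
    if hi is None:
        hi = len(a) - 1
    if lo < hi:
        p = partition(a, lo, hi)
        quicksort(a, lo, p - 1)
        quicksort(a, p + 1, hi)
    return a

print(quicksort([5, 3, 1, 9, 8, 2, 4, 7]))  # [1, 2, 3, 4, 5, 7, 8, 9]
```

Unlike merge sort, there is no combine step: once the pivot is in place and both sides are sorted, the array is already sorted.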

CST 370 - Module 4

Week 4. This week doesn't cover a lot of new material, to give us time to study and review our previous assignments and quizzes. This lets us truly understand how we analyze algorithms, and review how we may have missed some questions on the previous quizzes. Besides being given time to review the material from the past three weeks, we were introduced to the Merge Sort algorithm. Though some of us are already familiar with Merge Sort, the material was still informative, as we got to analyze it more thoroughly and re-familiarize ourselves with the concept. In this case, we implement Merge Sort with recursion, which follows the Divide and Conquer technique. Starting with the full-size array, we split the array in half and make two recursive calls, one on each half. Once the array is eventually separated into its individual values, we merge the values back while sorting them, eventually producing a fully sorted array. As previously mentioned, this algorithm foll...
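The split-then-merge idea can be sketched in Python (a minimal version I wrote for illustration, not the assignment's code):

```python
def merge_sort(a):
    """Divide: split in half; conquer: sort each half; combine: merge."""
    if len(a) <= 1:
        return a  # a single value (or empty list) is already sorted
    mid = len(a) // 2
    return merge(merge_sort(a[:mid]), merge_sort(a[mid:]))

def merge(left, right):
    """Merge two sorted lists into one sorted list."""
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])   # one side may still have leftovers
    out.extend(right[j:])
    return out

print(merge_sort([38, 27, 43, 3, 9, 82, 10]))  # [3, 9, 10, 27, 38, 43, 82]
```

Each level of recursion does O(n) merging work across O(log n) levels, which is where the O(n*log(n)) bound comes from.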

CST 370 - Module 3

Week 3. This week's module started off with the continuation of Brute Force, this time applying Brute Force to string matching problems. In short: given a string of text and a pattern, find the pattern within the text. Using brute force, you compare the strings character by character, and if there is a mismatch, shift the pattern over by one character and re-compare. The best case can be fast; the worst case, however, takes far longer than you would initially think. The next topic was Exhaustive Search, a technique that generates all possible candidate solutions to a problem, compares each of them, and selects the lowest-cost result. For example, the Traveling Salesman Problem requires you to find the shortest path that visits all nodes exactly once and returns to the starting node. Using the Exhaustive Search approach, you would first find all the different combinations of paths to take when starting from the first node, find out how much each path costs, then choo...
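Both ideas can be sketched in Python (my own minimal versions; the distance matrix in the Traveling Salesman example is made up for illustration):

```python
from itertools import permutations

def brute_force_match(text, pattern):
    """Slide the pattern one position at a time; return the first match index or -1."""
    n, m = len(text), len(pattern)
    for i in range(n - m + 1):
        j = 0
        while j < m and text[i + j] == pattern[j]:
            j += 1           # characters agree so far
        if j == m:
            return i         # whole pattern matched at shift i
    return -1

def tsp_exhaustive(dist):
    """Exhaustive search: cost every tour that starts and ends at vertex 0."""
    n = len(dist)
    best_cost, best_tour = float("inf"), None
    for perm in permutations(range(1, n)):
        tour = (0,) + perm + (0,)
        cost = sum(dist[tour[k]][tour[k + 1]] for k in range(n))
        if cost < best_cost:
            best_cost, best_tour = cost, tour
    return best_cost, best_tour

print(brute_force_match("NOBODY_NOTICED_HIM", "NOT"))  # 7
dist = [[0, 10, 15, 20], [10, 0, 35, 25], [15, 35, 0, 30], [20, 25, 30, 0]]
print(tsp_exhaustive(dist))  # (80, (0, 1, 3, 2, 0))
```

The exhaustive approach checks (n-1)! tours, which is why it only works for very small instances.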

CST 370 - Module 2

Week 2. This week's material focuses on the basics of analyzing non-recursive and recursive algorithms. Starting off, we were introduced to asymptotic notation for describing the time-efficiency category of an algorithm. Using the efficiency categories from last week (i.e. n, n*log(n), n^2), we can describe an algorithm in three ways. First, Big Oh notation ( O(f(n)) ), which contains all functions with a lower or the same order of growth as f(n). For instance, you can say n is in O(n), or even in O(n^2). Second, Big Theta ( Θ(f(n)) ), which contains all functions with the same order of growth as f(n). For example, you can say n is in Θ(n), but not in Θ(n^2) or Θ(log(n)). Finally, Big Omega ( Ω(f(n)) ), which contains all functions with the same or a higher order of growth as f(n). An example of this would be that n^2 is in Ω(n), but not in Ω(n^3). As mentioned previously, we also learned about analyzing...
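The three notations have standard formal definitions (textbook statements, not from the module itself), which make the examples above precise:

```latex
% f(n) \in O(g(n)): f grows no faster than g
O(g(n)) = \{\, f(n) : \exists\, c > 0,\ n_0 \ge 1 \ \text{such that}\ f(n) \le c\, g(n) \ \text{for all}\ n \ge n_0 \,\}

% f(n) \in \Omega(g(n)): f grows at least as fast as g
\Omega(g(n)) = \{\, f(n) : \exists\, c > 0,\ n_0 \ge 1 \ \text{such that}\ f(n) \ge c\, g(n) \ \text{for all}\ n \ge n_0 \,\}

% f(n) \in \Theta(g(n)): f and g grow at the same rate
\Theta(g(n)) = O(g(n)) \cap \Theta \text{-wise:}\ O(g(n)) \cap \Omega(g(n))
```

For instance, n ≤ 1·n² for all n ≥ 1, which is exactly why n is in O(n²); but no constant c makes n² ≤ c·n hold for all large n, so n² is not in O(n).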