CST 370 - Module 2

Week 2

This week's material focuses on the basics of analyzing non-recursive and recursive algorithms. To start, we were introduced to asymptotic notation for describing an algorithm's time-efficiency category. Using the efficiency categories from last week (e.g., n, n*log(n), n^2), we can describe an algorithm in three ways. First, Big Oh notation (O(f(n))) contains all functions with the same or lower order of growth than f(n); for instance, n is in O(n) and even in O(n^2). Second, Big Theta (Θ(f(n))) contains all functions with the same order of growth as f(n); for example, n is in Θ(n), but not in Θ(n^2) or Θ(log(n)). Finally, Big Omega (Ω(f(n))) contains all functions with the same or higher order of growth than f(n); for example, n^2 is in Ω(n), but not in Ω(n^3).
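One rough way to sanity-check these claims numerically is to look at the ratio f(n)/g(n) as n grows: if the ratio stays bounded, f is in O(g); if it also stays bounded away from zero, f is in Θ(g). This is just my own illustrative sketch (the `ratio` helper is not from the course material), not a formal proof:

```python
# Sketch: comparing orders of growth by ratio. The helper name `ratio`
# is my own; the limits it hints at are what the formal definitions use.
def ratio(f, g, ns):
    """Return f(n)/g(n) for each n, to eyeball the limiting behavior."""
    return [f(n) / g(n) for n in ns]

ns = [10, 100, 1000, 10000]

# n / n^2 shrinks toward 0: n is in O(n^2) but not in Theta(n^2) or Omega(n^2).
print(ratio(lambda n: n, lambda n: n * n, ns))

# n / n is constantly 1: n is in Theta(n) (and hence in O(n) and Omega(n)).
print(ratio(lambda n: n, lambda n: n, ns))
```

The first ratio heading to zero mirrors the statement above that n can be expressed in O(n^2) but not Θ(n^2).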

As mentioned previously, we also learned about analyzing non-recursive and recursive algorithms. For now, I will only cover non-recursive analysis. Given a data structure (here, an array of n integers) and an algorithm that finds the maximum value in the array, we must first locate the basic operation. This is typically an operation in the innermost loop of the function. We count how many times the basic operation executes as the algorithm completes its task, which in turn tells us the general time-complexity category the algorithm lies in. In this example, the algorithm uses one while loop that goes through the whole array once, so the basic operation is the comparison (or addition, etc.) inside that loop. Since the basic operation lies in a single loop that runs n times, we can safely say the time complexity of this algorithm is Θ(n).

I would recommend watching a video on this as it is easier to visualize how to analyze an algorithm as opposed to reading it from a short paragraph.

Skipping the analysis of recursive algorithms, we were also introduced to brute force. To summarize, brute force means using a straightforward approach to solve a problem. It usually isn't the most efficient way, but it is a good starting point. For example, to sort an array of numbers, we can simply keep looping through the array and swapping elements until everything is sorted. One example is selection sort, where we scan the array to find the smallest element and swap it with the first element, then repeat, moving one position forward after each swap. Though this algorithm finishes the task, its time efficiency is the same in the best case and the worst case, since we always scan through the remaining portion of the array on every pass.
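A sketch of selection sort with a comparison counter makes that last point concrete: the comparison count comes out to n(n-1)/2 whether the input is already sorted or in reverse order (the counter is my own addition for illustration):

```python
def selection_sort(a):
    """Selection sort a copy of `a`; return (sorted list, comparison count).

    The comparison in the inner loop is the basic operation.
    """
    a = list(a)
    comparisons = 0
    for i in range(len(a) - 1):
        smallest = i
        for j in range(i + 1, len(a)):     # scan the unsorted portion
            comparisons += 1               # basic operation
            if a[j] < a[smallest]:
                smallest = j
        a[i], a[smallest] = a[smallest], a[i]  # swap min into position i

    return a, comparisons

# Same count for a sorted input and a reversed input: n(n-1)/2 = 10 for n = 5.
print(selection_sort([1, 2, 3, 4, 5]))   # ([1, 2, 3, 4, 5], 10)
print(selection_sort([5, 4, 3, 2, 1]))   # ([1, 2, 3, 4, 5], 10)
```

That constant n(n-1)/2 count is why selection sort is Θ(n^2) in every case, with no best-case/worst-case gap.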
