Saturday, 22 March 2014

Complexity and O(n) Notation

This week we went through calculating the complexity of a given function and finding its Big-Oh notation. This is the standard approach for estimating the efficiency of an algorithm. Since computers are quite fast, efficiency really only matters for large inputs, so Big-Oh notation describes how an algorithm behaves as the number of items grows. In calculus terms, you can think of this as taking the limit of a function as our parameter, called n from now on, approaches infinity. We essentially analyze and compare the growth rates of algorithms. A great example is actually my two implementations of the Fibonacci sequence from a previous post: for the plain recursive solution, a single call to fibonacci(5) resulted in 15 calls of the function in total.

As n gets pretty big, we can see that the recursive Fibonacci algorithm grows exponentially. Its complexity is O(((1+sqrt(5))/2)^n) to be precise, and that base is actually the golden ratio! The other implementation I showed in my previous post used memoization to store previously calculated values, which brought the number of calls down to 8; we nearly cut it in half!

As you can see from the illustration above, the memoization method gets rid of the redundant part of the problem: we only have to calculate and store the leftmost branch of the call tree. Hope you guys like the illustration I drew! Since only that one branch needs to be computed, the complexity becomes just O(n). Much better, eh?
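The memoized version can be sketched like this (my own reconstruction of the idea from the earlier post, using a dictionary as the cache):

```python
# Fibonacci with memoization: each value is calculated at most once,
# then reused from the cache, so the call tree stays linear in n.
memo = {}

def fibonacci(n):
    """Return the n-th Fibonacci number, reusing cached results."""
    if n in memo:
        return memo[n]
    result = n if n < 2 else fibonacci(n - 1) + fibonacci(n - 2)
    memo[n] = result
    return result

print(fibonacci(5))   # -> 5
print(fibonacci(50))  # fast even for big n, unlike the plain recursion
```

The right child of every node in the call tree hits the cache immediately, which is why only the leftmost branch actually does work.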

Monday, 17 March 2014

Binary Search Trees

This week we analyzed a specific kind of binary tree called the binary search tree. It is quite similar to an ordinary binary tree, with the addition of a simple rule: at every node, everything in the left subtree must have a smaller value than the node's value, and everything in the right subtree a bigger value. So a simple binary search tree would look like this.

This special version of the binary tree has some fascinating properties. For example, to find the smallest or the largest value, you just have to walk to the leftmost or the rightmost node without even comparing any values! It is also terrific at finding a given value in the tree, hence its name. To find a value, you compare it to a node starting from the root, then recursively search the left subtree if the value is smaller than the node's, or scour the right subtree if it is greater!
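Those operations can be sketched with a bare-bones node class (the names here are my own, not from lecture):

```python
class BSTNode:
    """A node in a binary search tree."""
    def __init__(self, value):
        self.value = value
        self.left = None
        self.right = None

def insert(node, value):
    """Insert value, keeping smaller values left and bigger ones right."""
    if node is None:
        return BSTNode(value)
    if value < node.value:
        node.left = insert(node.left, value)
    else:
        node.right = insert(node.right, value)
    return node

def find_min(node):
    """Smallest value: just walk to the leftmost node, no comparisons."""
    while node.left is not None:
        node = node.left
    return node.value

def search(node, value):
    """Return True iff value is in the tree, comparing from the root down."""
    if node is None:
        return False
    if value == node.value:
        return True
    if value < node.value:
        return search(node.left, value)
    return search(node.right, value)

root = None
for v in [8, 3, 10, 1, 6]:
    root = insert(root, v)
print(find_min(root))   # -> 1
print(search(root, 6))  # -> True
print(search(root, 7))  # -> False
```

Each comparison throws away one whole subtree, which is what makes searching a reasonably balanced BST so fast.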