
Big Oh notation and Algorithmic Complexity analysis

Introduction

When solving a particular task with an algorithm, one question that crosses every problem solver's mind is: which algorithm solves the problem quickly and efficiently? But what exactly do "fast" and "efficient" mean? Are they measures of real processing time? No, they are not measures of wall-clock time in seconds or minutes, because an old computer with a Pentium processor may take much longer to run the same algorithm than a new Intel i3 processor, and a bad algorithm written in Assembly may still execute faster than a good algorithm written in Python. So the run time of a program cannot be used to measure algorithmic efficiency. Hence, to measure algorithmic efficiency, the concept of the asymptotic complexity of a program was introduced, along with a notation for describing it called Big Oh. Big Oh gives an upper bound and is usually quoted for the worst case, while Big Omega gives a lower bound (best case) and Big Theta a tight bound (often used for the average case). Worst case means we are maximally unlucky for that problem domain, i.e. the preconditions of the task work against us and the algorithm is forced to do the most work possible.
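
To make the best-case/worst-case distinction concrete, here is a minimal sketch in Python (the function name, the sample list, and the chosen targets are my own illustration, not from the original post): a linear search whose best case inspects a single element and whose worst case inspects all n of them.

    # Linear search: best case O(1) (target is the first element),
    # worst case O(n) (target is last in the list or absent).
    def linear_search(items, target):
        for index, value in enumerate(items):
            if value == target:
                return index      # found: we can stop early
        return -1                 # not found: every element was examined

    if __name__ == "__main__":
        data = [7, 3, 9, 1, 5]
        print(linear_search(data, 7))   # best case: 1 comparison, prints 0
        print(linear_search(data, 4))   # worst case: 5 comparisons, prints -1

Note that the number of comparisons depends only on how the input is arranged, not on the machine or the language, which is exactly why asymptotic complexity is the preferred measure of efficiency.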