Friday 15 June 2012

Asymptotic Notation

We can solve a single computational problem in many ways; different minds come up with different solutions. But which solution we should choose when implementing it depends on the cost of the algorithm. This judgement is based on the two most important factors:

  1. Space Complexity : The minimum space requirement (in memory) for running the algorithm.
  2. Time Complexity : The minimum time required by the algorithm to output a solution. This may be measured in seconds, minutes, hours, days, months, years, centuries, and so on.
The above two factors are together known as the asymptotic complexity of the algorithm, which is a way of expressing the main components of the cost of an algorithm using idealized units of computational work.

Counting steps lets us compare the time complexity of two programs that compute the same function, and also predict the growth in running time as the instance characteristics change. However, determining the exact step count is difficult, and it is also not necessary.
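As a rough illustration (a minimal Python sketch of my own, not from the original post), we can instrument a simple summing loop with an explicit step counter and watch the count grow with the input size:

    def sum_list(a):
        steps = 0
        total = 0
        steps += 1              # count the initialization
        for x in a:             # the loop body runs len(a) times
            total += x
            steps += 1          # count each addition
        return total, steps

    print(sum_list([1, 2, 3, 4]))   # (10, 5): the step count grows linearly with n

Doubling the length of the list roughly doubles the step count, and it is this growth behaviour, rather than the exact count, that asymptotic notation captures.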

Common asymptotic notations (O, Ω, Θ, o, and ω) are described below.



Efficiency of an Algorithm:

1. Worst Case

The worst-case efficiency of an algorithm is its efficiency for the worst-case input of
size n, which is an input (or inputs) of size n for which the algorithm runs the longest
among all possible inputs of that size.


2. Average Case
  • It yields information about an algorithm's behavior on a "typical" or "random" input.
  • To analyze the algorithm's average-case efficiency, we must make some assumptions about possible inputs of size n.
  • The investigation of the average-case efficiency is considerably more difficult than the investigation of the worst-case and best-case efficiency.


3. Best Case
The best-case efficiency of an algorithm is its efficiency for the best-case input of size
n, which is an input (or inputs) of size n for which the algorithm runs the fastest among
all possible inputs of that size.
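To make the three cases concrete, here is a small Python sketch (the function and test data are illustrative, not from the original post) using linear search on a list of n elements:

    def linear_search(a, key):
        # Best case: key is the first element (1 comparison).
        # Worst case: key is last or absent (n comparisons).
        # Average case: about n/2 comparisons for a randomly placed key.
        for i, x in enumerate(a):
            if x == key:
                return i
        return -1

    data = [7, 3, 9, 1, 5]
    print(linear_search(data, 7))   # best case: found at index 0
    print(linear_search(data, 8))   # worst case: absent, returns -1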


Big 'Oh' Notation (O)

O(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ f(n) ≤ cg(n) for all n ≥ n0 }
It is an upper bound on a function. Hence it denotes the worst-case complexity of an algorithm.
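For instance (an illustrative example of my own choosing, not from the original post), f(n) = 3n + 2 is O(n), since taking c = 4 and n0 = 2 gives 3n + 2 ≤ 4n for all n ≥ 2. A quick numerical check in Python:

    f = lambda n: 3 * n + 2     # f(n) = 3n + 2
    g = lambda n: n             # g(n) = n
    c, n0 = 4, 2                # constants witnessing f(n) = O(g(n))
    print(all(f(n) <= c * g(n) for n in range(n0, 1000)))   # True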




Omega Notation (Ω)

Ω(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ cg(n) ≤ f(n) for all n ≥ n0 }
It is a lower bound on a function. Hence it denotes the best-case complexity of an algorithm.
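Similarly (again an illustrative example, not from the original post), the same f(n) = 3n + 2 is Ω(n), since with c = 3 and n0 = 1 we have 3n ≤ 3n + 2 for all n ≥ 1:

    f = lambda n: 3 * n + 2
    g = lambda n: n
    c, n0 = 3, 1                # constants witnessing f(n) = Ω(g(n))
    print(all(c * g(n) <= f(n) for n in range(n0, 1000)))   # True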





Theta Notation (Θ)

Θ(g(n)) = { f(n) : there exist positive constants c1, c2 and n0 such that c1g(n) ≤ f(n) ≤ c2g(n) for all n ≥ n0 }
If f(n) = Θ(g(n)), then for all values of n to the right of n0, f(n) lies on or above c1g(n) and on or below c2g(n).
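Combining the two bounds above (still the same illustrative example), f(n) = 3n + 2 is Θ(n), with c1 = 3, c2 = 4 and n0 = 2 sandwiching f(n) between c1·n and c2·n:

    f = lambda n: 3 * n + 2
    g = lambda n: n
    c1, c2, n0 = 3, 4, 2        # constants witnessing f(n) = Θ(g(n))
    print(all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 1000)))   # True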



Little-O Notation

For non-negative functions f(n) and g(n), f(n) is little-o of g(n) if and only if f(n) = O(g(n)) but f(n) ≠ Θ(g(n)). This is denoted as "f(n) = o(g(n))".

This is a loose bounding version of Big O: g(n) bounds f(n) from above, but does not bound it from below.

Little Omega Notation

For non-negative functions f(n) and g(n), f(n) is little-omega of g(n) if and only if f(n) = Ω(g(n)) but f(n) ≠ Θ(g(n)). This is denoted as "f(n) = ω(g(n))".

Much like little-o, this is the loose-bound counterpart of Big Omega: g(n) is a loose lower bound on f(n); it bounds from below, but not from above.
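As a hedged illustration of these strict bounds (the functions are my own examples, not from the original post): n = o(n²) because the ratio n/n² tends to 0, and conversely n² = ω(n) because n²/n grows without bound. Watching the ratio shrink makes the loose upper bound visible:

    f = lambda n: n
    g = lambda n: n * n
    for n in (10, 100, 1000, 10000):
        print(n, f(n) / g(n))   # ratio tends to 0, so f(n) = o(g(n)) and g(n) = ω(f(n))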
