**Example-1:**

Let f(n) = n^{2} + n + 5. Then f(n) is O(n^{2}), and f(n) is O(n^{3}), but f(n) is not O(n).
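
A quick numeric check of this claim (the witness constants c = 7 and n_{0} = 1 are our own choice for illustration, not from the article):

```python
# Big-O witness check: n^2 + n + 5 <= c*n^2 for c = 7 and all n >= n0 = 1.
# Many other (c, n0) pairs also work; these are just one valid choice.
def f(n):
    return n**2 + n + 5

c, n0 = 7, 1
assert all(f(n) <= c * n**2 for n in range(n0, 10_000))
print("f(n) = n^2 + n + 5 is O(n^2) with c=7, n0=1 (checked up to n=9999)")
```

No such constant works for c·n, which is why f(n) is not O(n).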

**Example-2:**

Let f(n) = 3^{n}. Then f(n) is O(4^{n}), but f(n) is not O(2^{n}).

**Note:** *O*(1) refers to constant time; *O*(*n*) indicates linear time; *O*(*n*^{k}) (k fixed) refers to polynomial time; *O*(log *n*) is called logarithmic time; *O*(2^{n}) refers to exponential time, etc.

*O*(1) < *O*(log *n*) < *O*(*n*) < *O*(*n*^{2}) < *O*(2^{n})
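
The hierarchy can be seen by tabulating each growth rate for a few input sizes (a small illustrative script, not part of the original article):

```python
import math

# Tabulate the growth-rate hierarchy O(1) < O(log n) < O(n) < O(n^2) < O(2^n)
# for a few sample input sizes.
print("n    1   log2(n)   n     n^2        2^n")
for n in (10, 20, 30):
    print(n, 1, round(math.log2(n), 1), n, n**2, 2**n)

# For every n in this range, the ordering of the columns holds:
assert all(1 < math.log2(n) < n < n**2 < 2**n for n in range(5, 100))
```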

**Omega (Ω): **

- If we write f(n) = Ω(g(n)), then there exist a positive constant c and a positive integer *n*_{0} such that f(n) ≥ c·g(n) for all *n* ≥ *n*_{0}. **Or**
- f(n) = Ω(g(n)) means that g(n) is an asymptotic lower bound for f(n).

**Example-1:** Let *f*(*n*) = 2*n*^{2} + 4*n* + 10. Then *f*(*n*) is Ω(*n*^{2}).

**Theta (θ): **

- If we write f(n) = θ(g(n)), then there exist positive constants c_{1}, c_{2} and a positive integer *n*_{0} such that c_{1}·g(n) ≤ f(n) ≤ c_{2}·g(n) for all *n* ≥ *n*_{0}. **Or**
- f(n) = θ(g(n)) means that g(n) is an asymptotically tight bound for f(n).

**Example-1:**

Let *f*(*n*) = 2*n*^{2} + 4*n* + 10. Then *f*(*n*) is θ(*n*^{2}).
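
This tight bound can be checked numerically (the witness constants c_{1} = 2, c_{2} = 16, n_{0} = 1 are our own illustrative choice):

```python
# Theta witness check: c1*n^2 <= 2n^2 + 4n + 10 <= c2*n^2 for all n >= n0.
def f(n):
    return 2 * n**2 + 4 * n + 10

c1, c2, n0 = 2, 16, 1  # one valid choice of constants; many others work
assert all(c1 * n**2 <= f(n) <= c2 * n**2 for n in range(n0, 10_000))
print("2n^2 + 4n + 10 is Theta(n^2) with c1=2, c2=16, n0=1")
```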

**Small Oh (o): **

- If we write f(n) = o(g(n)), then for every positive constant c there exists a positive integer *n*_{0} such that f(n) < c·g(n) for all *n* ≥ *n*_{0}. **Or**
- f(n) = o(g(n)) means that g(n) is a strict (not asymptotically tight) upper bound of f(n).
- Example: n^{1.99} = o(n^{2})

**Small Omega (ω): **

- If we write f(n) = ω(g(n)), then for every positive constant c there exists a positive integer *n*_{0} such that f(n) > c·g(n) for all *n* ≥ *n*_{0}. **Or**
- f(n) = ω(g(n)) means that g(n) is a strict (not asymptotically tight) lower bound of f(n).
- Example: n^{2.00001} = ω(n^{2}), but n^{2} ≠ ω(n^{2})
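
An equivalent way to see both examples is the standard ratio test: f(n) = o(g(n)) when f(n)/g(n) → 0, and f(n) = ω(g(n)) when f(n)/g(n) → ∞. A short script (ours, not the article's) shows the two ratios heading in opposite directions, although convergence is very slow for exponents this close to 2:

```python
# Ratio test for little-o and little-omega:
#   n^1.99 / n^2   = n^(-0.01)   -> 0        (so n^1.99   = o(n^2))
#   n^2.00001/n^2  = n^(0.00001) -> infinity (so n^2.00001 = w(n^2))
for n in (10**3, 10**6, 10**9):
    print(n, n**1.99 / n**2, n**2.00001 / n**2)

# The o-ratio shrinks and the w-ratio grows as n increases:
assert (10**9)**1.99 / (10**9)**2 < (10**3)**1.99 / (10**3)**2
assert (10**9)**2.00001 / (10**9)**2 > (10**3)**2.00001 / (10**3)**2
```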

## II. The relationship between asymptotic notations

**III. Properties of Asymptotic notations**

**1. Reflexive Property:** f(n) = O(f(n)); likewise, f(n) = Ω(f(n)) and f(n) = θ(f(n)).

**4. Mixed asymptotic transitive properties:** For example, if f(n) = O(g(n)) and g(n) = o(h(n)), then f(n) = o(h(n)).

**IV. Analysis of Algorithms**

The *complexity* of an algorithm is a function describing the efficiency of the algorithm in terms of the amount of data the algorithm must process. Usually there are natural units for the domain and range of this function.

- Algorithms can be classified by the amount of time they need to complete, compared to their input size.
- The analysis of an algorithm focuses on its complexity, which depends on time or space.

**1. Time Complexity: **

**Worst case time complexity:** It is the function defined by the maximum amount of time needed by an algorithm for an input of size n.

**Average case time complexity:** The average-case running time of an algorithm is an estimate of the running time for an "average" input. Computing it entails knowing all possible input sequences, the probability distribution of occurrence of these sequences, and the running times for the individual sequences.

**Amortized running time:** It is the time required to perform a sequence of operations, averaged over all the operations performed; it guarantees the average performance of each operation in the worst case.

**Best case time complexity:** It is the minimum amount of time that an algorithm requires for an input of size n.
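
The best/worst case distinction is easy to see with linear search, a standard textbook illustration (not taken from the article): the best case finds the target immediately (O(1)), the worst case scans the whole array (O(n)).

```python
# Linear search, instrumented to count comparisons.
def linear_search(arr, target):
    """Return (index, comparisons); index is -1 if target is absent."""
    for i, x in enumerate(arr):
        if x == target:
            return i, i + 1
    return -1, len(arr)

data = list(range(100))
_, best = linear_search(data, 0)    # target at the front: 1 comparison
_, worst = linear_search(data, -1)  # target absent: n comparisons
print(best, worst)  # 1 100
```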

**2. Space Complexity:** It is the amount of memory an algorithm needs to run to completion, expressed as a function of the input size n.

**V. What are Recurrence Relations?**

A recurrence relation is an equation that defines the running time T(n) of a recursive algorithm in terms of its running time on smaller inputs.

**Methods to Solve the Recurrence Relation**

There are three methods to solve a recurrence relation: the Master Method, the Iterative Substitution Method, and the Recursion Tree Method.

**1. Master Method: **

The master method gives us a quick way to find solutions to recurrence relations of the form T(n) = aT(n/b) + f(n), where a and b are constants, a ≥ 1 and b > 1.

T(n) = aT(n/b) + f(n), where f(n) ∈ Θ(n^{d}), d ≥ 0

- Case-1: If a < b^{d}, then T(n) ∈ Θ(n^{d})
- Case-2: If a = b^{d}, then T(n) ∈ Θ(n^{d} log n)
- Case-3: If a > b^{d}, then T(n) ∈ Θ(n^{log_{b} a})

Examples:

- T(n) = 4T(n/2) + n ⇒ T(n) ∈ Θ(n^{2})
- T(n) = 4T(n/2) + n^{2} ⇒ T(n) ∈ Θ(n^{2} log n)
- T(n) = 4T(n/2) + n^{3} ⇒ T(n) ∈ Θ(n^{3})
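
The three cases can be sketched as a small classifier (a hypothetical helper written here for illustration, not part of the article):

```python
import math

# Case classifier for the simplified master theorem quoted above:
#   T(n) = a*T(n/b) + Theta(n^d),  with a >= 1, b > 1, d >= 0.
def master(a, b, d):
    if a < b**d:
        return f"Theta(n^{d})"
    if a == b**d:
        return f"Theta(n^{d} log n)"
    return f"Theta(n^{math.log(a, b):.2f})"  # exponent is log_b(a)

print(master(4, 2, 1))  # T(n) = 4T(n/2) + n    -> Theta(n^2.00)
print(master(4, 2, 2))  # T(n) = 4T(n/2) + n^2  -> Theta(n^2 log n)
print(master(4, 2, 3))  # T(n) = 4T(n/2) + n^3  -> Theta(n^3)
```

Note that the first example falls under Case-3 (a = 4 > b^{d} = 2), which is why its answer is Θ(n^{2}) rather than Θ(n).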

**2. Iterative Substitution Method: **

The recurrence equation is substituted into itself repeatedly to find the final generalized form of the recurrence.

T(N) = 2T(N/2) + bN. Here T(N/2) is substituted with 2T((N/2)/2) + b(N/2):

T(N) = 4T(N/4) + 2bN (similarly substitute for T(N/4))

T(N) = 8T(N/8) + 3bN

After (i−1) substitutions: T(N) = 2^{i}T(N/2^{i}) + ibN

When i = log(N), N/2^{i} = 1 and we have the base case.

T(N) = 2^{log(N)}T(N/2^{log(N)}) + log(N)bN

T(N) = N·T(1) + log(N)bN

T(N) = Nb + log(N)bN

Therefore, T(N) is O(N log N).
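
The derivation above can be checked numerically. Taking b = 1 and the base case T(1) = 1 (an assumption; the article does not state a base case), the unrolled form gives exactly T(N) = N + N·log₂(N) for N a power of two:

```python
# Verify the substitution-method result T(N) = N + N*log2(N)
# for T(N) = 2T(N/2) + N with assumed base case T(1) = 1.
def T(n):
    return 1 if n == 1 else 2 * T(n // 2) + n

for k in range(11):          # N = 1, 2, 4, ..., 1024
    n = 2**k
    assert T(n) == n + n * k  # k = log2(N)
print("closed form N + N*log2(N) verified for N up to 1024")
```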

**3. Recursion Tree Method:**

Using the recursion tree method, an n-element problem is divided into two or more subproblems. In the figure (omitted here), the given problem with n elements is divided into two equal subproblems of size n/2. Because the tree splits so evenly, all nodes on a given level have the same size, and the total number of elements at each level is N. The maximum depth of the tree is log N (the number of levels).

The recurrence relation T(n) = 2T(n/2) + O(n) therefore solves to Θ(N log N).
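
The per-level bookkeeping described above can be tabulated directly (our own illustrative script): level i has 2^{i} nodes of size N/2^{i}, so every level does exactly N work, and there are log₂(N) + 1 levels.

```python
# Per-level work in the recursion tree for T(n) = 2T(n/2) + n, n = 1024.
n = 1024
levels = []
for i in range(11):              # i = 0 .. log2(n)
    nodes, size = 2**i, n // 2**i
    levels.append(nodes * size)  # total work at level i

assert all(work == n for work in levels)
print(f"{len(levels)} levels, each doing {n} work -> ~n log n total")
```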

**Example-1:** **Find the Time complexity for T(N) = 4T(N/2)+N.**

Let T(N) = aT(N/b)+f(N).

Then a=4, b=2, and f(N)=N

N^{log_{b} a} = N^{log_{2} 4} = N^{2}.

f(N) = N is polynomially smaller than N^{log_{b} a}, so case 1 of the master theorem applies with ε = 1. (These examples use the general ε-form of the master theorem, whose case numbering differs from the simplified b^{d} form given earlier.)

T(N) = Θ(N^{log_{b} a}) = Θ(N^{2}).
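
As a sanity check (with the assumed base case T(1) = 1, which the article does not specify), this recurrence has the exact closed form T(N) = 2N^{2} − N for N a power of two, confirming the Θ(N^{2}) answer:

```python
# Verify T(N) = 4T(N/2) + N  ==>  T(N) = 2N^2 - N  (base case T(1) = 1 assumed).
def T(n):
    return 1 if n == 1 else 4 * T(n // 2) + n

for k in range(11):  # N = 1, 2, 4, ..., 1024
    n = 2**k
    assert T(n) == 2 * n**2 - n
print("T(N) = 2N^2 - N verified for N up to 1024")
```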

**Example-2**: **Find the time complexity for T(N) = 2T(N/2) + N log(N)**

a = 2, b = 2, and f(N) = N log(N)

N^{log_{b} a} = N, and f(N) = Θ(N log N), so the extended case 2 of the master theorem applies with k = 1.

T(N) = Θ(N(log N)^{2}).

**Example-3**: **Find the Time complexity for T(N) = T(N/4) + 2N**

a=1, b=4, and f(N)=2N

log_{b} a = log_{4} 1 = 0

N^{log_{b} a} = N^{0} = 1

Using case 3: f(N) is Ω(N^{0+ε}) for ε = 1, and the regularity condition holds: a·f(N/b) = (1)·2(N/4) = N/2 = (1/4)·f(N).

Therefore T(N) = Θ(N)

**Example-4**: **Find the time complexity for T(N) = 9T(N/3) + N^{2.5}**

a = 9, b = 3, and f(N) = N^{2.5}

log_{b} a = log_{3} 9 = 2, so N^{log_{b} a} = N^{2}

Using case 3, since f(N) is Ω(N^{2+ε}) for ε = 0.5, and the regularity condition holds: a·f(N/b) = 9·f(N/3) = 9(N/3)^{2.5} = (1/3)^{0.5}·f(N).

Therefore T(N) = Θ(N^{2.5}).
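
An empirical check of this answer (with an assumed base case T(1) = 1, not stated in the article): if T(N) = Θ(N^{2.5}), then for N = 3^{k} the ratio T(3N)/T(N) should approach 3^{2.5} ≈ 15.59.

```python
from functools import lru_cache

# T(N) = 9T(N/3) + N^2.5, memoized so large powers of 3 stay cheap.
@lru_cache(maxsize=None)
def T(n):
    return 1 if n == 1 else 9 * T(n // 3) + n**2.5

ratio = T(3**12) / T(3**11)
print(round(ratio, 3), "vs", round(3**2.5, 3))
assert abs(ratio - 3**2.5) < 0.1
```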

