GROWTH FUNCTIONS
Growth functions describe how an algorithm's running time and memory requirements grow with the size of its input; this measure is known as algorithm complexity.
ASYMPTOTIC NOTATIONS
- Asymptotic notations are the mathematical notations used to describe the running time of an algorithm when the input tends towards a particular value or a limiting value.
- Using asymptotic analysis, we can determine the best-case, average-case, and worst-case behavior of an algorithm.
- Types of Asymptotic Notation
- Big-O Notation (Ο) – Big-O notation specifically describes the worst-case scenario.
- Omega Notation (Ω) – Omega (Ω) notation specifically describes the best-case scenario.
- Theta Notation (θ) – This notation represents the average complexity of an algorithm.
Big-O Notation (Ο):
Big-O notation specifically describes the worst-case scenario. It represents an upper bound on the running time of an algorithm. Often, an upper bound can be found simply by inspecting the algorithm.
O(g(n)) = { f(n) : there exist positive constants c and
            n0 such that 0 ≤ f(n) ≤ c ∗ g(n)
            for all n ≥ n0 }
Example:
f(n) = 3n + 2
g(n) = n
can we write f(n) = O(g(n))?
Condition to be satisfied: 0 ≤ f(n) ≤ c ∗ g(n)
0 ≤ 3n + 2 ≤ c ∗ n
0 ≤ 3n + 2 ≤ 4 ∗ n for all n ≥ 2 (taking c = 4, n0 = 2)
So, f(n) = O(g(n))
3n + 2 = O(n)
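The witness constants above (c = 4, n0 = 2) can be checked numerically. The sketch below is a finite check, not a proof — it only tests the inequality over a bounded range of n, but it illustrates how the constants make the bound hold:

```python
# Check the Big-O witness from the worked example:
# 0 <= f(n) <= c * g(n) for all n >= n0, with f(n) = 3n + 2, g(n) = n.
def f(n):
    return 3 * n + 2

def g(n):
    return n

c, n0 = 4, 2  # constants chosen in the example above

# Verify the defining inequality over a finite range of n.
assert all(0 <= f(n) <= c * g(n) for n in range(n0, 10_000))
print("0 <= 3n + 2 <= 4n holds for all tested n >= 2")
```

Note that the inequality fails for n = 1 (3·1 + 2 = 5 > 4·1 = 4), which is exactly why n0 = 2 is needed.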
Big-Omega Notation (Ω):
Omega notation represents the lower bound of the running time of an algorithm. Thus, it provides the best case complexity of an algorithm.
Ω(g(n)) = { f(n): there exist positive constants c and n0
such that 0 ≤ cg(n) ≤ f(n)
for all n ≥ n0 }
f(n) = Ω(g(n))
Example:
f(n) = 3n + 2
g(n) = n
can we write f(n) = Ω(g(n))?
Solution:
Condition to be satisfied: 0 ≤ c ∗ g(n) ≤ f(n)
0 ≤ c ∗ n ≤ 3n + 2
0 ≤ n ≤ 3n + 2 for all n ≥ 1 (taking c = 1, n0 = 1)
So, f(n) = Ω(g(n))
3n + 2 = Ω(n)
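The same kind of finite check works for the Omega witness (c = 1, n0 = 1 from the example above); again a sketch over a bounded range, not a proof:

```python
# Check the Big-Omega witness from the worked example:
# 0 <= c * g(n) <= f(n) for all n >= n0, with f(n) = 3n + 2, g(n) = n.
def f(n):
    return 3 * n + 2

def g(n):
    return n

c, n0 = 1, 1  # constants chosen in the example above

# Verify the defining inequality over a finite range of n.
assert all(0 <= c * g(n) <= f(n) for n in range(n0, 10_000))
print("0 <= n <= 3n + 2 holds for all tested n >= 1")
```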
Theta Notation (θ):
Theta notation describes both an upper bound and a lower bound on an algorithm's running time, so it defines the exact asymptotic behavior. In practice, an algorithm does not always run on its best or worst case; the average running time lies between the two and can be represented by theta notation. Thus, it provides the average-case complexity of an algorithm.
Θ(g(n)) = { f(n): there exist positive constants c1, c2 and n0
such that 0 ≤ c1g(n) ≤ f(n) ≤ c2g(n)
for all n ≥ n0 }
f(n) = θ(g(n))
Example:
f(n) = 3n + 2
g(n) = n
can we write f(n) = θ(g(n))?
Solution:
Condition to be satisfied: 0 ≤ c1 ∗ g(n) ≤ f(n) ≤ c2 ∗ g(n)
c1 ∗ n ≤ 3n + 2 ≤ c2 ∗ n
n ≤ 3n + 2 ≤ 5n for all n ≥ 1 (taking c1 = 1, c2 = 5, n0 = 1)
So, f(n) = θ(g(n))
3n + 2 = θ(n)
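Finally, the Theta witness combines both bounds (c1 = 1, c2 = 5, n0 = 1 from the example above). The finite check below sandwiches f(n) between the two scaled copies of g(n); as before, it is an illustration over a bounded range, not a proof:

```python
# Check the Theta witness from the worked example:
# 0 <= c1 * g(n) <= f(n) <= c2 * g(n) for all n >= n0,
# with f(n) = 3n + 2, g(n) = n.
def f(n):
    return 3 * n + 2

def g(n):
    return n

c1, c2, n0 = 1, 5, 1  # constants chosen in the example above

# Verify both sides of the sandwich over a finite range of n.
assert all(0 <= c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 10_000))
print("n <= 3n + 2 <= 5n holds for all tested n >= 1")
```

At n = 1 the upper bound is tight (3·1 + 2 = 5 = 5·1), so c2 cannot be made any smaller with n0 = 1.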
IMPORTANCE OF GROWTH FUNCTIONS
- SPACE COMPLEXITY
- TIME COMPLEXITY