The efficiency of an algorithm depends on the amount of time, storage, and other resources required to execute it. Efficiency is measured with the help of asymptotic notations.

An algorithm may not perform the same way on different types of inputs, and its performance typically changes as the input size grows.

The study of how an algorithm's performance changes with the order of the input size is called asymptotic analysis.

There are five types of asymptotic notation: O (Big O), Ω (Big Omega), Θ (Big Theta), o (little o), and ω (little omega).

Big-O Notation

Big O notation represents the upper bound or worst-case scenario of the growth rate of a function. It describes an algorithm’s maximum time or space complexity.

Let $f(n)$ and $g(n)$ be functions that map positive integers to positive real numbers. We say that $f(n)$ is $O(g(n))$ (or $f(n) \in O(g(n))$) if there exists a real constant $c \gt 0$ and there exists an integer constant $n_0 \geq 1$ such that $f(n) \leq c \cdot g(n)$ for every integer $n \geq n_0$.

Example

Let $f(n) = 7n + 8$ and $g(n) = n$. Is $f(n) \in O(g(n))$?

Solution

To determine whether $f(n) = 7n + 8$ belongs to $O(g(n))$ where $g(n) = n$, we need to check if $f(n)$ is asymptotically bounded above by $g(n)$. Specifically, we want to see if there exist positive constants $c$ and $n_0$ such that:

$$f(n) \leq c \cdot g(n) \qquad \text{for all } n \geq n_0$$

Substituting the given functions:

$$7n + 8 \leq c \cdot n \qquad \text{for all } n \geq n_0$$

Subtract $7n$ from both sides:

$$8 \leq (c - 7)n$$

To satisfy this inequality for large $n$, we need $c - 7 \gt 0$, that is, $c \gt 7$. The inequality then becomes:

$$n \geq \frac{8}{c - 7}$$

Let's pick $c = 8$ (any $c$ greater than $7$ will work):

$$n \geq \frac{8}{8 - 7} = 8$$

So, for $c = 8$ and $n_0 = 8$, the inequality holds for all $n \geq n_0$. Therefore, we can conclude that $f(n) \in O(g(n))$.
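The witness pair $c = 8$, $n_0 = 8$ can be spot-checked numerically. A minimal Python sketch (a finite check like this can refute a bad witness pair, but cannot by itself prove the bound for all $n$):

```python
# Spot-check the witness pair c = 8, n0 = 8 for 7n + 8 in O(n).
def f(n):
    return 7 * n + 8

c, n0 = 8, 8
ok = all(f(n) <= c * n for n in range(n0, 10_001))
print(ok)             # True: 7n + 8 <= 8n holds for every n in [8, 10000]
print(f(7) <= c * 7)  # False: n0 = 8 is tight for this choice of c
```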

Example

Prove that the running time $f(n) = n^3 + 20n + 1$ is $O(n^3)$

Solution

To prove that the running time $f(n) = n^3 + 20n + 1$ is $O(n^3)$, we need to show that there exist constants $c \gt 0$ and $n_0 \geq 1$ such that:

$$f(n) \leq c \cdot g(n) \qquad \text{for all } n \geq n_0$$

Substituting the given functions:

$$n^3 + 20n + 1 \leq c \cdot n^3 \qquad \text{for all } n \geq n_0$$

Divide both sides by $n^3$:

$$1 + \frac{20}{n^2} + \frac{1}{n^3} \leq c$$

As $n$ increases, the terms $\frac{20}{n^2}$ and $\frac{1}{n^3}$ decrease. Specifically, for $n \geq 1$:

$$\frac{20}{n^2} \leq 20 \qquad \text{and} \qquad \frac{1}{n^3} \leq 1$$

Thus, for $n \geq 1$, we have:

$$1 + \frac{20}{n^2} + \frac{1}{n^3} \leq c$$ $$1 + 20 + 1 \leq c$$ $$22 \leq c$$

So, for $c = 22$ and $n_0 = 1$, the inequality holds for all $n \geq n_0$. Therefore, we can conclude that $f(n)$ is $O(n^3)$.
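As with the previous example, the witness pair can be spot-checked over a finite range in Python (this refutes bad constants but does not replace the proof):

```python
# Spot-check the witness pair c = 22, n0 = 1 for n^3 + 20n + 1 in O(n^3).
def f(n):
    return n**3 + 20 * n + 1

c, n0 = 22, 1
ok = all(f(n) <= c * n**3 for n in range(n0, 10_001))
print(ok)  # True; equality holds exactly at n = 1 (22 <= 22)
```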

Example

Prove that the running time $f(n) = n^3 + 20n + 1$ is not $O(n^2)$

Solution

To prove that the running time $f(n) = n^3 + 20n + 1$ is not $O(n^2)$, we need to show that there do not exist constants $c \gt 0$ and $n_0 \geq 1$ such that:

$$f(n) \leq c \cdot g(n) \qquad \text{for all } n \geq n_0$$

Assume, for contradiction, that $f(n)$ is $O(n^2)$. Then there exist constants $c \gt 0$ and $n_0 \geq 1$ such that for all $n \geq n_0$:

$$n^3 + 20n + 1 \leq c \cdot n^2$$

For large $n$, the $n^3$ term dominates the left-hand side. In particular, since $n^3 \leq n^3 + 20n + 1$, the assumption implies:

$$n^3 \leq c \cdot n^2$$

Dividing both sides by $n^2$ (assuming $n \gt 0$):

$$n \leq c$$

This implies that $n$ is bounded above by the constant $c$. However, this contradicts the fact that $n$ can grow arbitrarily large.

Since assuming that $f(n) = n^3 + 20n + 1$ is $O(n^2)$ leads to a contradiction, we conclude that $f(n)$ is not $O(n^2)$.
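The contradiction can also be seen numerically: the ratio $f(n) / n^2$ grows roughly like $n$, so no single constant can dominate it. A short Python sketch:

```python
# f(n) = n^3 + 20n + 1: the ratio f(n) / n^2 grows without bound,
# so no constant c can satisfy f(n) <= c * n^2 for all large n.
def f(n):
    return n**3 + 20 * n + 1

ratios = [f(n) / n**2 for n in (10, 100, 1000, 10_000)]
print(ratios)  # strictly increasing, tracking n itself
```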

Example

Prove that the running time $f(n) = n^3 + 20n + 1$ is $O(n^4)$

Solution

To prove that the running time $f(n) = n^3 + 20n + 1$ is $O(n^4)$, we need to show that there exist constants $c \gt 0$ and $n_0 \geq 1$ such that:

$$f(n) \leq c \cdot g(n) \qquad \text{for all } n \geq n_0$$

Substituting the given functions:

$$n^3 + 20n + 1 \leq c \cdot n^4 \qquad \text{for all } n \geq n_0$$

Divide both sides by $n^4$:

$$\frac{1}{n} + \frac{20}{n^3} + \frac{1}{n^4} \leq c$$

As $n$ increases, the terms $\frac{1}{n}$, $\frac{20}{n^3}$, and $\frac{1}{n^4}$ all approach $0$. Specifically, for $n \geq 1$, we have:

$$\frac{1}{n} \leq 1, \qquad \frac{20}{n^3} \leq 20, \qquad \frac{1}{n^4} \leq 1$$

Thus, for $n \geq 1$:

$$\frac{1}{n} + \frac{20}{n^3} + \frac{1}{n^4} \leq c$$ $$1 + 20 + 1 \leq c$$ $$22 \leq c$$

So, for $c = 22$ and $n_0 = 1$, the inequality holds for all $n \geq n_0$. Therefore, we can conclude that $f(n)$ is $O(n^4)$.

Example

Prove that the running time $f(n) = 2^{n+1}$ is $O(2^n)$

Solution

To prove that the running time $f(n) = 2^{n+1}$ is $O(2^n)$, we need to show that there exist constants $c \gt 0$ and $n_0 \geq 1$ such that:

$$f(n) \leq c \cdot g(n) \qquad \text{for all } n \geq n_0$$

Substituting the given functions:

$$2^{n+1} \leq c \cdot 2^n \qquad \text{for all } n \geq n_0$$

Divide both sides by $2^n$ (which is positive for all $n$):

$$2 \leq c$$

So, for $c = 2$ and $n_0 = 1$, the inequality holds for all $n \geq n_0$. Therefore, we can conclude that $f(n)$ is $O(2^n)$.
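This one is easy to verify directly, since $2^{n+1} = 2 \cdot 2^n$ exactly:

```python
# 2^(n+1) equals 2 * 2^n exactly, so c = 2, n0 = 1 witnesses
# 2^(n+1) in O(2^n), with equality at every n.
ok = all(2 ** (n + 1) <= 2 * 2 ** n for n in range(1, 200))
print(ok)  # True
```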

Example

Prove that the running time $f(n) = 10^{80}$ is $O(1)$

Solution

To prove that the running time $f(n) = 10^{80}$ is $O(1)$, we need to show that there exist constants $c \gt 0$ and $n_0 \geq 1$ such that:

$$f(n) \leq c \cdot g(n) \qquad \text{for all } n \geq n_0$$

Substituting the given functions:

$$10^{80} \leq c \cdot 1 \qquad \text{for all } n \geq n_0$$

Since the left-hand side is the constant $10^{80}$, the inequality does not depend on $n$ and reduces to:

$$10^{80} \leq c$$

Here, we can choose $c = 10^{80}$ and $n_0 = 1$ (any $n_0$ works, because $10^{80}$ is a constant that does not depend on $n$). Therefore, we can conclude that $f(n)$ is $O(1)$.

Example

Prove that the running time $f(n) = \log_{\ln 5}\left((\log n)^{\log 100}\right)$ is $O(\log(\log n))$

Solution

The function $f(n) = \log_{\ln 5}\left((\log n)^{\log 100}\right)$ can be rewritten using the change-of-base formula:

$$f(n) = \log_{\ln 5}\left((\log n)^{\log 100}\right) = \frac{\log\left((\log n)^{\log 100}\right)}{\log(\ln 5)}$$

Next, simplify the numerator using the power rule of logarithms:

$$\log\left((\log n)^{\log 100}\right) = \log 100 \cdot \log(\log n)$$

Substituting this back into $f(n)$, we get:

$$f(n) = \frac{\log 100 \cdot \log(\log n)}{\log(\ln 5)}$$

To prove that the running time $f(n) = \log_{\ln 5}\left((\log n)^{\log 100}\right)$ is $O(\log(\log n))$, we need to show that there exist constants $c \gt 0$ and $n_0 \geq 1$ such that:

$$f(n) \leq c \cdot g(n) \qquad \text{for all } n \geq n_0$$

Substituting the given functions:

$$\log_{\ln 5}\left((\log n)^{\log 100}\right) \leq c \cdot \log(\log n) \qquad \text{for all } n \geq n_0$$ $$\frac{\log 100 \cdot \log(\log n)}{\log(\ln 5)} \leq c \cdot \log(\log n)$$

Divide both sides by $\log(\log n)$ (positive for all sufficiently large $n$):

$$\frac{\log 100}{\log (\ln 5)} \leq c$$

Here, we can choose $c = \frac{\log 100}{\log(\ln 5)}$ and an $n_0$ large enough that $\log(\log n) \gt 0$; the inequality then holds for all $n \geq n_0$. Therefore, we can conclude that $f(n)$ is $O(\log(\log n))$.
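A numerical spot-check confirms this. Using natural logarithms throughout (the choice of base only rescales the constant), the ratio $f(n) / \log(\log n)$ equals the fixed constant $\frac{\log 100}{\log(\ln 5)}$ for every $n$:

```python
import math

# f(n) = log base ln(5) of (log n)^(log 100), with log = natural log.
def f(n):
    return math.log(math.log(n) ** math.log(100)) / math.log(math.log(5))

# The ratio f(n) / log(log n) should be the constant log(100) / log(ln 5).
c = math.log(100) / math.log(math.log(5))
ok = all(abs(f(n) / math.log(math.log(n)) - c) < 1e-6 for n in (100, 10_000, 10**6))
print(ok)  # True
```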

Example

Prove that the running time $f(n) = 5^{\log 3}\, n^3 + 10^{80}\, n^2 + \log(3)\, n^{3.1} + 6006$ is $O(n^{3.1})$.

Solution

To prove that the running time $f(n) = 5^{\log 3}\, n^3 + 10^{80}\, n^2 + \log(3)\, n^{3.1} + 6006$ is $O(n^{3.1})$, we need to show that there exist constants $c \gt 0$ and $n_0 \geq 1$ such that:

$$f(n) \leq c \cdot g(n) \qquad \text{for all } n \geq n_0$$

Substituting the given functions:

$$5^{\log 3}\, n^3 + 10^{80}\, n^2 + \log(3)\, n^{3.1} + 6006 \leq c \cdot n^{3.1} \qquad \text{for all } n \geq n_0$$

Divide both sides by $n^{3.1}$ (assuming $n \geq 1$):

$$\frac{5^{\log 3}}{n^{0.1}} + \frac{10^{80}}{n^{1.1}} + \log(3) + \frac{6006}{n^{3.1}} \leq c$$

As $n$ increases, the terms $\frac{5^{\log 3}}{n^{0.1}}$, $\frac{10^{80}}{n^{1.1}}$, and $\frac{6006}{n^{3.1}}$ all approach $0$. Specifically, for $n \geq 1$, we have:

$$\frac{5^{\log 3}}{n^{0.1}} \leq 5^{\log 3}, \qquad \frac{10^{80}}{n^{1.1}} \leq 10^{80}, \qquad \frac{6006}{n^{3.1}} \leq 6006$$

Thus, for $n \geq 1$:

$$\frac{5^{\log 3}}{n^{0.1}} + \frac{10^{80}}{n^{1.1}} + \log(3) + \frac{6006}{n^{3.1}} \leq c$$ $$5^{\log 3} + 10^{80} + \log(3) + 6006 \leq c$$

Here, we can choose $c = 5^{\log 3} + 10^{80} + \log(3) + 6006$ and $n_0 = 1$; the inequality holds for all $n \geq n_0$. Therefore, we can conclude that $f(n)$ is $O(n^{3.1})$.
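The sum-of-coefficients witness can be spot-checked numerically. A Python sketch, taking log as the natural logarithm (the base is not specified in the statement, but any base gives the same argument):

```python
import math

# f(n) = 5^log(3) * n^3 + 10^80 * n^2 + log(3) * n^3.1 + 6006,
# with log taken as the natural logarithm.
def f(n):
    return 5 ** math.log(3) * n**3 + 10**80 * n**2 + math.log(3) * n**3.1 + 6006

# Witness: c is the sum of all coefficients, n0 = 1.
c = 5 ** math.log(3) + 10**80 + math.log(3) + 6006
ok = all(f(n) <= c * n**3.1 for n in range(1, 1001))
print(ok)  # True; equality holds at n = 1
```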

Example

Prove that the running time $f(n) = \binom{n}{n/2}$ is $O(\frac{2^n}{\sqrt{n}})$.

Solution

The binomial coefficient $\binom{n}{n/2}$ can be approximated using Stirling's approximation for factorials, which is useful for large $n$. Stirling's approximation states:

$$n! \approx \sqrt{2\pi n} \left(\frac{n}{e}\right)^n$$

The binomial coefficient can be expressed as:

$$\binom{n}{n/2} = \frac{n!}{(\frac{n}{2})!(\frac{n}{2})!}$$

Applying Stirling's approximation:

$$n! \approx \sqrt{2\pi n} \left(\frac{n}{e}\right)^n$$ $$\left(\frac{n}{2}\right)! \approx \sqrt{\pi n} \left(\frac{n/2}{e}\right)^{n/2}$$

Thus:

$$\binom{n}{n/2} \approx \frac{\sqrt{2\pi n} \left(\frac{n}{e}\right)^n}{\left(\sqrt{\pi n} \left(\frac{n/2}{e}\right)^{n/2}\right)^2}$$

Simplifying this expression:

$$\binom{n}{n/2} \approx \frac{\sqrt{2\pi n} \left(\frac{n}{e}\right)^n}{\pi n \left(\frac{n/2}{e}\right)^n} = \frac{2^n \sqrt{2\pi n}}{\pi n}$$

Simplify further:

$$\binom{n}{n/2} \approx \sqrt{\frac{2}{\pi n}}\; 2^n$$

To prove that the running time $f(n) = \binom{n}{n/2}$ is $O(\frac{2^n}{\sqrt{n}})$, we need to show that there exist constants $c \gt 0$ and $n_0 \geq 1$ such that:

$$f(n) \leq c \cdot g(n) \qquad \text{for all } n \geq n_0$$

Substituting the given functions:

$$\binom{n}{n/2} \leq c \cdot \frac{2^n}{\sqrt{n}} \qquad \text{for all } n \geq n_0$$

Divide both sides by $\frac{2^n}{\sqrt{n}}$ (positive for $n \geq 1$):

$$\sqrt{\frac{2}{\pi n}}\; 2^n \leq c \cdot \frac{2^n}{\sqrt{n}}$$ $$\sqrt{\frac{2}{\pi}} \leq c$$

Here, we can choose $c = \sqrt{\frac{2}{\pi}}$ and $n_0 = 1$ (in fact $c = 1$ works, since $\sqrt{2/\pi} \lt 1$); the inequality holds for all $n \geq n_0$. Therefore, we can conclude that $f(n)$ is $O(\frac{2^n}{\sqrt{n}})$.
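The bound can also be spot-checked with exact binomial coefficients, bypassing Stirling's approximation entirely. A small Python sketch over even $n$ (the range is finite, so this is evidence, not a proof):

```python
import math

# Spot-check binom(n, n/2) <= c * 2^n / sqrt(n) with c = 1 for even n.
# Stirling predicts binom(n, n/2) ~ sqrt(2/pi) * 2^n / sqrt(n), and
# sqrt(2/pi) is about 0.8, so c = 1 leaves comfortable slack.
ok = all(
    math.comb(n, n // 2) <= 2 ** n / math.sqrt(n)
    for n in range(2, 201, 2)
)
print(ok)  # True
```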

Little-o Notation

Let $f(n)$ and $g(n)$ be functions that map positive integers to positive real numbers. We say that $f(n)$ is $o(g(n))$ (or $f(n) \in o(g(n))$) if for any real constant $c \gt 0$, there exists an integer constant $n_0 \geq 1$ such that $f(n) \lt c \cdot g(n)$ for every integer $n \geq n_0$.

Big–Omega Notation

Big Omega notation represents the lower bound or best-case scenario of the growth rate of a function. It describes an algorithm’s minimum time or space complexity.

Let $f(n)$ and $g(n)$ be functions that map positive integers to positive real numbers. We say that $f(n)$ is $\Omega(g(n))$ (or $f(n) \in \Omega(g(n))$) if there exists a real constant $c \gt 0$ and there exists an integer constant $n_0 \geq 1$ such that $f(n) \geq c \cdot g(n)$ for every integer $n \geq n_0$.

Little–Omega Notation

Let $f(n)$ and $g(n)$ be functions that map positive integers to positive real numbers. We say that $f(n)$ is $\omega(g(n))$ (or $f(n) \in \omega(g(n))$) if for any real constant $c \gt 0$, there exists an integer constant $n_0 \geq 1$ such that $f(n) \gt c \cdot g(n)$ for every integer $n \geq n_0$.

Big–Theta Notation

Big Theta notation represents both the upper and lower bounds of the growth rate of a function. It provides a tight bound on the time or space complexity of an algorithm.

Let $f(n)$ and $g(n)$ be functions that map positive integers to positive real numbers. We say that $f(n)$ is $\Theta(g(n))$ (or $f(n) \in \Theta(g(n))$) if and only if $f(n) \in O(g(n))$ and $f(n) \in \Omega(g(n))$.
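As an illustration of a tight bound (the function and witness constants here are chosen for this sketch, not taken from the examples above): $f(n) = 3n^2 + 10n$ is $\Theta(n^2)$, since $3n^2 \leq f(n) \leq 4n^2$ for all $n \geq 10$. A quick numeric spot-check:

```python
# f(n) = 3n^2 + 10n is Theta(n^2): sandwich it between 3n^2 (the Omega
# witness) and 4n^2 (the Big-O witness) for n >= 10.
# 3n^2 + 10n <= 4n^2 holds exactly when 10n <= n^2, i.e. n >= 10.
ok = all(3 * n**2 <= 3 * n**2 + 10 * n <= 4 * n**2 for n in range(10, 10_001))
print(ok)  # True
```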