The efficiency of an algorithm depends on the amount of time, storage and other resources required to execute the algorithm. The efficiency is measured with the help of asymptotic notations.
An algorithm may not have the same performance for different types of inputs. With the increase in the input size, the performance will change.
The study of change in performance of the algorithm with the change in the order of the input size is defined as asymptotic analysis.
There are five types of asymptotic notation: \(O\) (Big O), \(\Omega\) (Big Omega), \(\Theta\) (Big Theta), \(o\) (Little o) and \(\omega\) (Little Omega).
Big-O Notation
Big O notation represents the upper bound or worst-case scenario of the growth rate of a function. It describes an algorithm’s maximum time or space complexity.
Let \(f(n)\) and \(g(n)\) be functions that map positive integers to positive real numbers. We say that \(f(n)\) is \(O(g(n))\) (or \(f(n) \in O(g(n))\)) if there exists a real constant \(c \gt 0\) and there exists an integer constant \(n_0 \geq 1\) such that \(f(n) \leq c \cdot g(n)\) for every integer \(n \geq n_0\).
Example
Let \(f(n) = 7n + 8\) and \(g(n) = n\). Is \(f(n) \in O(g(n))\)?
Solution
To determine whether \(f(n) = 7n + 8\) belongs to \(O(g(n))\) where \(g(n) = n\), we need to check if \(f(n)\) grows at most as fast as \(g(n)\) asymptotically. A function \(f(n)\) is in \(O(g(n))\) if there exists a real constant \(c \gt 0\) and there exists an integer constant \(n_0 \geq 1\) such that:
We need to determine if there exist constants \(c\) and \(n_0\) such that:
Subtract \(c \cdot n\) from both sides:
For the inequality to hold for sufficiently large \(n\), the dominant term \((7 - c)n\) must be negative. Therefore:
This means that for the inequality to hold for large enough \(n\), \(c\) must be greater than \(7\).
Choose \(c = 8\):
Therefore, we can pick \(c = 8\) and \(n_0 = 8\):
Hence, we have shown that there exist constants \(c = 8\) and \(n_0 = 8\) such that \(f(n) \leq c \cdot g(n)\) for all \(n \geq n_0\). Thus, \(f(n) \in O(g(n))\).
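As a quick illustration (not part of the proof), the short Python sketch below spot-checks the chosen witnesses \(c = 8\) and \(n_0 = 8\) over a finite range of \(n\); the helper names and the range are arbitrary choices.

```python
# Finite-range spot check of the witnesses c = 8 and n0 = 8 chosen above.
# This only samples a range of n; it illustrates the bound, it does not prove it.

def f(n):
    return 7 * n + 8

def g(n):
    return n

c, n0 = 8, 8
assert all(f(n) <= c * g(n) for n in range(n0, 10_001))
print("7n + 8 <= 8n holds for every checked n >= 8")
```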
Example
Let \(f(n) = 7n^{2} + 8\) and \(g(n) = n\). Is \(f(n) \in O(g(n))\)?
Solution
To determine whether \(f(n) = 7n^{2} + 8\) belongs to \(O(g(n))\) where \(g(n) = n\), we need to check if \(f(n)\) grows at most as fast as \(g(n)\) asymptotically. A function \(f(n)\) is in \(O(g(n))\) if there exists a real constant \(c \gt 0\) and there exists an integer constant \(n_0 \geq 1\) such that:
We need to determine if there exist constants \(c\) and \(n_0\) such that:
Subtract \(c \cdot n\) from both sides:
For large \(n\), the term \(7n^{2}\) will grow faster than \(c \cdot n\). Hence, the inequality \(7n^{2} + 8 \leq c \cdot n\) will not hold for any large \(n\).
Hence, we have shown that there do not exist constants \(c \gt 0\) and \(n_0\) such that \(7n^{2} + 8 \leq c \cdot n\) for all \(n \geq n_0\). Thus, \(f(n) \notin O(g(n))\).
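To see concretely why no constant can work, the sketch below (an illustration, not part of the argument) searches for the first \(n\) at which \(7n^{2} + 8 \gt c \cdot n\) for a few sample values of \(c\); the sample values and the search limit are arbitrary.

```python
# For any fixed c, the inequality 7n^2 + 8 <= c * n eventually fails.
# The values of c below are arbitrary samples used only for demonstration.

def first_violation(c, n_max=10**7):
    """Return the smallest n >= 1 with 7*n**2 + 8 > c*n, or None if not found."""
    for n in range(1, n_max + 1):
        if 7 * n ** 2 + 8 > c * n:
            return n
    return None

for c in (1, 10, 100, 1000):
    print(f"c = {c}: the bound first fails at n = {first_violation(c)}")
```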
Example
Prove that running time \(f(n) = n^{3} + 20n + 1\) is \(O(n^{3})\)
Solution
To determine whether \(f(n) = n^{3} + 20n + 1\) belongs to \(O(g(n))\) where \(g(n) = n^{3}\), we need to check if \(f(n)\) grows at most as fast as \(g(n)\) asymptotically. A function \(f(n)\) is in \(O(g(n))\) if there exists a real constant \(c \gt 0\) and there exists an integer constant \(n_0 \geq 1\) such that:
We need to determine if there exist constants \(c\) and \(n_0\) such that:
Move all terms to the right side of the inequality and combine like terms:
For the inequality to hold for sufficiently large \(n\), the dominant term \((c - 1)n^{3}\) must be positive. Therefore:
This means that for the inequality to hold for large enough \(n\), \(c\) must be greater than \(1\).
Choose \(c = 2\):
We want to find \(n_0\) such that:
To estimate \(n\) for which \(n^{3} - 20n - 1 \geq 0\) holds, we need to find the point where the positive terms (\(n^{3}\)) become dominant over the negative terms (\(20n\) and \(1\)). We can compare terms based on their degree and the magnitude of their coefficients.
\(n^{3}\) is the highest-degree positive term and \(20n\) is the highest-degree negative term. We compare \(n^{3}\) with \(20n\) to estimate \(n\).
So, \(n^{3}\) starts to dominate \(20n\) when \(n\) is \(5\) or larger.
Let's approach it by testing positive values for \(n\) such that \(n^{3} - 20n - 1 \geq 0\) holds.
For \(n = 5\):
Hence, we have shown that there exist constants \(c = 2\) and \(n_0 = 5\) such that \(f(n) \leq c \cdot g(n)\) for all \(n \geq n_0\). Thus, \(f(n) \in O(g(n))\).
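As an informal check (assuming a Python environment; the range is an arbitrary choice), the sketch below verifies the witnesses \(c = 2\) and \(n_0 = 5\) on a finite range and confirms that the bound does not yet hold at \(n = 4\).

```python
# Finite-range spot check of c = 2 and n0 = 5 for n^3 + 20n + 1 <= 2n^3.

def f(n):
    return n ** 3 + 20 * n + 1

assert f(4) > 2 * 4 ** 3                                  # still fails at n = 4
assert all(f(n) <= 2 * n ** 3 for n in range(5, 10_001))  # holds from n = 5 on
print("witnesses c = 2, n0 = 5 verified on the sample range")
```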
Example
Prove that running time \(f(n) = n^{3} + 20n + 1\) is not \(O(n^{2})\)
Solution
To determine whether \(f(n) = n^{3} + 20n + 1\) belongs to \(O(g(n))\) where \(g(n) = n^{2}\), we need to check if \(f(n)\) grows at most as fast as \(g(n)\) asymptotically. A function \(f(n)\) is in \(O(g(n))\) if there exists a real constant \(c \gt 0\) and there exists an integer constant \(n_0 \geq 1\) such that:
Assume, for contradiction, that \(f(n) = n^{3} + 20n + 1\) is \(O(n^{2})\). Then there exist constants \(c \gt 0\) and \(n_0 \geq 1\) such that for all \(n \geq n_0\):
Subtract \(c \cdot n^{2}\) from both sides:
For large \(n\), the term \(n^{3}\) will grow faster than \(c \cdot n^{2}\). Hence, the inequality \(n^{3} + 20n + 1 \leq c \cdot n^{2}\) will not hold for any large \(n\).
Hence, we have shown that there do not exist constants \(c \gt 0\) and \(n_0\) such that \(n^{3} + 20n + 1 \leq c \cdot n^{2}\) for all \(n \geq n_0\). Thus, \(f(n) \notin O(g(n))\).
Example
Prove that running time \(f(n) = 100n^{3} + 20n + 1\) is \(O(n^{4})\)
Solution
To determine whether \(f(n) = 100n^{3} + 20n + 1\) belongs to \(O(g(n))\) where \(g(n) = n^{4}\), we need to check if \(f(n)\) grows at most as fast as \(g(n)\) asymptotically. A function \(f(n)\) is in \(O(g(n))\) if there exists a real constant \(c \gt 0\) and there exists an integer constant \(n_0 \geq 1\) such that:
We need to determine if there exist constants \(c\) and \(n_0\) such that:
Move all terms to the right side of the inequality and combine like terms:
For the inequality to hold for sufficiently large \(n\), the dominant term \(c \cdot n^{4}\) must be positive. Therefore:
This means that for the inequality to hold for large enough \(n\), \(c\) must be greater than \(0\).
Choose \(c = 1\):
We want to find \(n_0\) such that:
To estimate \(n\) for which \(n^{4} - 100n^{3} - 20n - 1 \geq 0\) holds, we need to find the point where the positive terms (\(n^{4}\)) become dominant over the negative terms (\(100n^{3}\), \(20n\) and \(1\)). We can compare terms based on their degree and the magnitude of their coefficients.
\(n^{4}\) is the highest-degree positive term and \(100n^{3}\) is the highest-degree negative term. We compare \(n^{4}\) with \(100n^{3}\) to estimate \(n\).
So, \(n^{4}\) starts to dominate \(100n^{3}\) when \(n\) is larger than \(100\).
Let's approach it by testing positive values for \(n\) such that \(n^{4} - 100n^{3} - 20n - 1 \geq 0\) holds.
For \(n = 101\):
For \(n = 100\):
Hence, we have shown that there exist constants \(c = 1\) and \(n_0 = 101\) such that \(f(n) \leq c \cdot g(n)\) for all \(n \geq n_0\). Thus, \(f(n) \in O(g(n))\).
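The same kind of finite-range spot check (illustrative only; the upper limit is arbitrary) confirms the witnesses \(c = 1\) and \(n_0 = 101\), and that the bound still fails at \(n = 100\).

```python
# Spot check of c = 1 and n0 = 101 for 100n^3 + 20n + 1 <= n^4.

def f(n):
    return 100 * n ** 3 + 20 * n + 1

assert f(100) > 100 ** 4                               # fails just below n0
assert all(f(n) <= n ** 4 for n in range(101, 2_001))  # holds from n = 101 on
print("witnesses c = 1, n0 = 101 verified on the sample range")
```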
Example
Prove that running time \(f(n) = 2^{n+1}\) is \(O(2^{n})\)
Solution
To determine whether \(f(n) = 2^{n+1}\) belongs to \(O(g(n))\) where \(g(n) = 2^{n}\), we need to check if \(f(n)\) grows at most as fast as \(g(n)\) asymptotically. A function \(f(n)\) is in \(O(g(n))\) if there exists a real constant \(c \gt 0\) and there exists an integer constant \(n_0 \geq 1\) such that:
We need to determine if there exist constants \(c\) and \(n_0\) such that:
Divide both sides by \(2^{n}\):
This means that for large enough \(n\), the inequality will hold as long as \(c \geq 2\).
Choose \(c = 2\):
Therefore, we can pick \(c = 2\) and \(n_0 = 1\):
Hence, we have shown that there exist constants \(c = 2\) and \(n_0 = 1\) such that \(f(n) \leq c \cdot g(n)\) for all \(n \geq n_0\). Thus, \(f(n) \in O(g(n))\).
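A one-line check (illustrative only) confirms the witnesses \(c = 2\) and \(n_0 = 1\): since \(2^{n+1} = 2 \cdot 2^{n}\), the bound holds with equality for every \(n\).

```python
# 2^(n+1) equals 2 * 2^n exactly, so f(n) <= 2 * g(n) for every checked n.
assert all(2 ** (n + 1) <= 2 * 2 ** n for n in range(1, 201))
print("2^(n+1) <= 2 * 2^n holds for every checked n >= 1")
```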
Example
Prove that running time \(f(n) = 10^{80}\) is \(O(1)\)
Solution
To determine whether \(f(n) = 10^{80}\) belongs to \(O(g(n))\) where \(g(n) = 1\), we need to check if \(f(n)\) grows at most as fast as \(g(n)\) asymptotically. A function \(f(n)\) is in \(O(g(n))\) if there exists a real constant \(c \gt 0\) and there exists an integer constant \(n_0 \geq 1\) such that:
The function \(f(n) = 10^{80}\) is a constant value, meaning that it does not depend on \(n\). It remains \(10^{80}\) regardless of the input \(n\).
We need to find a constant \(c\) such that:
This inequality is true for any constant \(c \geq 10^{80}\). Therefore, we can choose \(c = 10^{80}\).
Thus, we can conclude that \(f(n) \in O(g(n))\) with \(c = 10^{80}\) and \(n_0 = 1\).
Example
Prove that running time \(f(n) = \log_{\ln(5)}(\log^{\log(100)}(n))\) is \(O(\log(\log(n)))\)
Solution
The function \(f(n) = \log_{\ln(5)}(\log^{\log(100)}(n))\) can be rewritten using the change of base formula:
Next, simplify the expression:
To determine whether \(f(n) = \log_{\ln(5)}(\log^{\log(100)}(n))\) belongs to \(O(g(n))\) where \(g(n) = \log(\log(n))\), we need to check if \(f(n)\) grows at most as fast as \(g(n)\) asymptotically. A function \(f(n)\) is in \(O(g(n))\) if there exists a real constant \(c \gt 0\) and there exists an integer constant \(n_0 \geq 1\) such that:
We need to determine if there exist constants \(c\) and \(n_0\) such that:
Simplify the function \(f(n)\):
Divide both sides by \(\log(\log(n))\):
This means that for large enough \(n\), the inequality will hold as long as \(c \geq \frac{\log(100)}{\log (\ln(5))}\).
Thus, we can conclude that \(f(n) \in O(g(n))\) with \(c = \frac{\log(100)}{\log(\ln(5))}\) and \(n_0 = 1\).
Example
Prove that running time \(f(n) = 5^{log(3)} n^{3} + 10^{80} n^{2} + log(3) n^{3.1} + 6006\) is \(O(n^{3.1})\).
Solution
To determine whether \(f(n) = 5^{log(3)} n^{3} + 10^{80} n^{2} + log(3) n^{3.1} + 6006\) belongs to \(O(g(n))\) where \(g(n) = n^{3.1}\), we need to check if \(f(n)\) grows at most as fast as \(g(n)\) asymptotically. A function \(f(n)\) is in \(O(g(n))\) if there exists a real constant \(c \gt 0\) and there exists an integer constant \(n_0 \geq 1\) such that:
We need to determine if there exist constants \(c\) and \(n_0\) such that:
Move all terms to the right side of the inequality and combine like terms:
For the inequality to hold for sufficiently large \(n\), the dominant term \((c - log(3))n^{3.1}\) must be positive. Therefore:
This means that for the inequality to hold for large enough \(n\), \(c\) must be greater than \(0.4771\).
Choose \(c = 1\):
To estimate \(n\) for which \((1 - log(3))n^{3.1} - 5^{log(3)}n^{3} - 10^{80}n^{2} - 6006 \geq 0\) holds, we need to find the point where the positive terms (\((1 - log(3))n^{3.1}\)) become dominant over the negative terms (\(5^{log(3)}n^{3}\), \(10^{80}n^{2}\) and \(6006\)). We can compare terms based on their degree and the magnitude of their coefficients.
We compare \((1 - log(3))n^{3.1}\) which is the highest-degree positive term with \(5^{log(3)}n^{3}\) which is the highest-degree negative term to estimate \(n\).
Divide both sides by \((1 - log(3))n^{3}\):
Raise both sides to the power of \(10\):
Next, we compare \((1 - log(3))n^{3.1}\) which is the highest-degree positive term with \(10^{80}n^{2}\) which is the term with the largest constant magnitude to estimate \(n\).
Divide both sides of the inequality by \(n^{2}\):
Divide both sides by \(1−log(3)\):
Now, take the \(1.1\)-th root of both sides to isolate \(n\):
This is a much larger threshold than \(4.94 \cdot 10^{9}\).
Let's approach it by testing positive values for \(n\) such that \((1 - log(3))n^{3.1} - 5^{log(3)}n^{3} - 10^{80}n^{2} - 6006 \geq 0\) holds.
For \(n = 10^{73}\):
For \(n = 10^{74}\):
Hence, we have shown that there exist constants \(c = 1\) and \(n_0 = 10^{74}\) such that \(f(n) \leq c \cdot g(n)\) for all \(n \geq n_0\). Thus, \(f(n) \in O(g(n))\).
Example
Prove that running time \(f(n) = \binom{n}{\frac{n}{2}}\) is \(O(\frac{2^{n}}{\sqrt{n}})\).
Solution
The binomial coefficient \(\binom{n}{\frac{n}{2}}\) can be approximated using Stirling's approximation for factorials. Stirling's approximation states:
The binomial coefficient can be expressed as:
Applying Stirling's approximation:
Thus:
Simplify the expression:
Simplify further:
To determine whether \(f(n) = \binom{n}{\frac{n}{2}}\) belongs to \(O(g(n))\) where \(g(n) = \frac{2^{n}}{\sqrt{n}}\), we need to check if \(f(n)\) grows at most as fast as \(g(n)\) asymptotically. A function \(f(n)\) is in \(O(g(n))\) if there exists a real constant \(c \gt 0\) and there exists an integer constant \(n_0 \geq 1\) such that:
We need to determine if there exist constants \(c\) and \(n_0\) such that:
From the above approximation:
Divide both sides by \(\frac{2^{n}}{\sqrt{n}}\):
This means that for large enough \(n\), the inequality will hold as long as \(c \geq \frac{\sqrt{2}}{\sqrt{\pi}}\).
Thus, we can conclude that \(f(n) \in O(g(n))\) with \(c = \frac{\sqrt{2}}{\sqrt{\pi}}\) and \(n_0 = 1\).
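Because Stirling's formula is only an approximation, a numerical sanity check can be reassuring. The sketch below (an illustration over even \(n\) only; the range is arbitrary) compares the exact central binomial coefficient with the bound \(\sqrt{\frac{2}{\pi}} \cdot \frac{2^{n}}{\sqrt{n}}\).

```python
# Compare the exact central binomial coefficient with the Stirling-based bound
# sqrt(2/pi) * 2^n / sqrt(n) for even n on a finite range (illustrative only).

import math

c = math.sqrt(2 / math.pi)
for n in range(2, 1001, 2):
    assert math.comb(n, n // 2) <= c * 2 ** n / math.sqrt(n)
print("C(n, n/2) <= sqrt(2/pi) * 2^n / sqrt(n) for every checked even n")
```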
Example
Prove that running time \(f(n) = \frac{4n^{3} − 28n^{2} + 56n − 35}{2n^{2} − 11n + 12}\) is \(O(n)\)
Solution
To determine whether \(f(n) = \frac{4n^{3} − 28n^{2} + 56n − 35}{2n^{2} − 11n + 12}\) belongs to \(O(g(n))\) where \(g(n) = n\), we need to check if \(f(n)\) grows at most as fast as \(g(n)\) asymptotically. A function \(f(n)\) is in \(O(g(n))\) if there exists a real constant \(c \gt 0\) and there exists an integer constant \(n_0 \geq 1\) such that:
We need to determine if there exist constants \(c\) and \(n_0\) such that:
Simplify the inequality:
Expand the expression:
Move all terms to the right side of the inequality and combine like terms:
For the inequality to hold for sufficiently large \(n\), the dominant term \((2c - 4)n^{3}\) must be positive or zero. If the dominant term \((2c - 4)n^{3}\) becomes zero, the term \((28 - 11c)n^{2}\) can still satisfy the inequality for large \(n\). Therefore:
This means that for the inequality to hold for large enough \(n\), \(c\) must be greater than or equal to \(2\).
Choose \(c = 2\):
To estimate \(n\) for which \(6n^{2} - 32n + 35 \geq 0\) holds, we need to find the point where the positive terms (\(6n^{2}\) and \(35\)) become dominant over the negative terms (\(32n\)). We can compare terms based on their degree and the magnitude of their coefficients.
We compare \(6n^{2}\) which is the highest-degree positive term with \(32n\) which is the highest-degree negative term to estimate \(n\).
Divide both sides by \(6n\):
So, \(6n^{2}\) starts to dominate \(32n\) when \(n\) is \(6\) or larger.
Let's approach it by testing positive values for \(n\) such that \(6n^{2} - 32n + 35 \geq 0\) holds.
For \(n = 6\):
For \(n = 5\):
For \(n = 4\):
For \(n = 3\):
Hence, we have shown that there exist constants \(c = 2\) and \(n_0 = 4\) such that \(f(n) \leq c \cdot g(n)\) for all \(n \geq n_0\). Thus, \(f(n) \in O(g(n))\).
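The reduced inequality can also be spot-checked numerically (illustrative only; the range is arbitrary): \(6n^{2} - 32n + 35\) is negative at \(n = 3\) and non-negative for every checked \(n \geq 4\).

```python
# Spot check of the reduced inequality 6n^2 - 32n + 35 >= 0 used above.

p = lambda n: 6 * n ** 2 - 32 * n + 35
assert p(3) < 0                                  # fails at n = 3
assert all(p(n) >= 0 for n in range(4, 10_001))  # holds from n = 4 on
print("6n^2 - 32n + 35 >= 0 for every checked n >= 4")
```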
Example
Prove that running time \(f(n) = \frac{5n^{2} − 2n − 35}{2n^{4} − 11n^{3} + 12n^{2} + 4n + 45}\) is \(O(\frac{1}{n^{2}})\)
Solution
To determine whether \(f(n) = \frac{5n^{2} − 2n − 35}{2n^{4} − 11n^{3} + 12n^{2} + 4n + 45}\) belongs to \(O(g(n))\) where \(g(n) = \frac{1}{n^{2}}\), we need to check if \(f(n)\) grows at most as fast as \(g(n)\) asymptotically. A function \(f(n)\) is in \(O(g(n))\) if there exists a real constant \(c \gt 0\) and there exists an integer constant \(n_0 \geq 1\) such that:
We need to determine if there exist constants \(c\) and \(n_0\) such that:
Simplify the inequality:
Expand the expression:
Move all terms to the right side of the inequality and combine like terms:
For the inequality to hold for sufficiently large \(n\), the dominant term \((2c - 5)n^{4}\) must be positive. Therefore:
This means that for the inequality to hold for large enough \(n\), \(c\) must be greater than \(2.5\).
Choose \(c = 2.6\):
Multiply both sides by \(5\):
To estimate \(n\) for which \(n^{4} - 133n^{3} + 331n^{2} + 52n + 585 \geq 0\) holds, we need to find the point where the positive terms (\(n^{4}\), \(331n^{2}\), \(52n\) and \(585\)) become dominant over the negative terms (\(133n^{3}\)). We can compare terms based on their degree and the magnitude of their coefficients.
We compare \(n^{4}\) which is the highest-degree positive term with \(133n^{3}\) which is the highest-degree negative term to estimate \(n\).
Solving for \(n\):
Let's approach it by testing positive values for \(n\) such that \(n^{4} - 133n^{3} + 331n^{2} + 52n + 585 \geq 0\) holds.
For \(n = 133\):
For \(n = 132\):
For \(n = 131\):
For \(n = 130\):
Hence, we have shown that there exist constants \(c = 2.6\) and \(n_0 = 131\) such that \(f(n) \leq c \cdot g(n)\) for all \(n \geq n_0\). Thus, \(f(n) \in O(g(n))\).
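A numerical spot check of the reduced inequality (illustrative only) confirms the crossover found by testing: the quartic is negative at \(n = 130\) and non-negative for every checked \(n \geq 131\).

```python
# Spot check of the reduced inequality n^4 - 133n^3 + 331n^2 + 52n + 585 >= 0.

p = lambda n: n ** 4 - 133 * n ** 3 + 331 * n ** 2 + 52 * n + 585
assert p(130) < 0                                  # fails at n = 130
assert all(p(n) >= 0 for n in range(131, 10_001))  # holds from n = 131 on
print("witnesses c = 2.6, n0 = 131 verified on the sample range")
```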
Example
Prove that running time \(f(n) = \log(n!)\) is \(O(n \log(n))\).
Solution
The factorial function can be approximated using Stirling's approximation. Stirling's approximation states:
Taking the logarithm of both sides:
Using the properties of logarithms, we can expand the expression:
Simplify each term:
To determine whether \(f(n) = \log(n!)\) belongs to \(O(g(n))\) where \(g(n) = n \log(n)\), we need to check if \(f(n)\) grows at most as fast as \(g(n)\) asymptotically. A function \(f(n)\) is in \(O(g(n))\) if there exists a real constant \(c \gt 0\) and there exists an integer constant \(n_0 \geq 1\) such that:
We need to determine if there exist constants \(c\) and \(n_0\) such that:
From the approximation above:
Move all terms to the right side of the inequality and combine like terms:
For the inequality to hold for sufficiently large \(n\), the dominant term \((c - 1)n \log(n)\) must be positive or zero. If the dominant term \((c - 1)n \log(n)\) becomes zero, the term \(n \log(e)\) can still satisfy the inequality for large \(n\). Therefore:
This means that for the inequality to hold for large enough \(n\), \(c\) must be greater than or equal to \(1\).
Choose \(c = 1\):
To estimate \(n\) for which \(n \log(e) - \frac{1}{2} \log(n) - \frac{1}{2} \log(2\pi) \geq 0\) holds, we need to find the point where the positive terms (\(n \log(e)\)) become dominant over the negative terms (\(\frac{1}{2} \log(n)\) and \(\frac{1}{2} \log(2\pi)\)). We can compare terms based on their degree and the magnitude of their coefficients.
We compare \(n \log(e)\) which is the highest-degree positive term with \(\frac{1}{2} \log(n)\) which is the highest-degree negative term to estimate \(n\).
The inequality \(n \log(e) \gt \frac{1}{2} \log(n)\) holds for all positive \(n\)
Let's approach it by testing positive values for \(n\) such that \(n \log(e) - \frac{1}{2} \log(n) - \frac{1}{2} \log(2\pi) \geq 0\) holds.
For \(n = 1\):
Hence, we have shown that there exist constants \(c = 1\) and \(n_0 = 1\) such that \(f(n) \leq c \cdot g(n)\) for all \(n \geq n_0\). Thus, \(f(n) \in O(g(n))\).
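The inequality \(\log(n!) \leq n \log(n)\) can also be checked directly with the exact log-factorial. The sketch below uses Python's math.lgamma (natural logarithms; the inequality is base-independent) over an arbitrary finite range, purely as an illustration.

```python
# Spot check of log(n!) <= n * log(n) using the exact log-factorial lgamma(n+1).

import math

for n in range(1, 10_001):
    # lgamma(n + 1) = ln(n!); a tiny tolerance absorbs floating-point rounding.
    assert math.lgamma(n + 1) <= n * math.log(n) + 1e-9
print("log(n!) <= n log(n) for every checked n >= 1")
```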
Little-o Notation
Let \(f(n)\) and \(g(n)\) be functions that map positive integers to positive real numbers. We say that \(f(n)\) is \(o(g(n))\) (or \(f(n) \in o(g(n))\)) if for any real constant \(c \gt 0\), there exists an integer constant \(n_0 \geq 1\) such that \(f(n) \lt c \cdot g(n)\) for every integer \(n \geq n_0\).
Example
Let \(f(n) = 7n + 8\) and \(g(n) = n\). Is \(f(n) \in o(g(n))\)?
Solution 1
To determine whether \(f(n) = 7n + 8\) belongs to \(o(g(n))\) where \(g(n) = n\), we need to check if \(f(n)\) grows strictly slower than \(g(n)\) asymptotically. A function \(f(n)\) is in \(o(g(n))\) if for any given positive real constant \(c\), there exists an integer constant \(n_0 \geq 1\) such that:
Substituting the given functions:
Subtract \(c \cdot n\) from both sides:
For the inequality to hold for sufficiently large \(n\), the dominant term \((7 - c)n\) must be negative. Therefore:
This means that for the inequality to hold for large enough \(n\), \(c\) must be greater than \(7\). However, \(o(g(n))\) requires the inequality to hold for any positive \(c\), no matter how small.
Since \(c \gt 7\) requires \(c\) to be greater than \(7\), \(f(n)\) cannot be less than \(c \cdot n\) for arbitrarily small values of \(c\). Therefore, \(f(n)\) does not satisfy the condition for \(o(g(n))\) because there exists a lower bound on \(c\) (namely \(7\)).
Hence, \(f(n) = 7n + 8 \notin o(g(n))\).
Solution 2
To determine whether \(f(n) = 7n + 8\) is in \(o(g(n))\) where \(g(n) = n\) using limits, we need to evaluate the following limit:
If the limit is \(0\), then \(f(n) \in o(g(n))\). Otherwise, \(f(n) \notin o(g(n))\).
Given \(f(n) = 7n + 8\) and \(g(n) = n\), we need to evaluate:
Simplify the expression:
As \(n\) approaches infinity, the term \(\frac{8}{n}\) approaches \(0\). Thus, the limit becomes:
Since the limit is \(7\) (and not \(0\)), the function \(f(n) = 7n + 8\) does not grow strictly slower than \(g(n) = n\). Therefore, \(f(n) \notin o(g(n))\).
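The limit in this solution can also be evaluated symbolically. The sketch below assumes the sympy library is available; it is only a convenience check, not part of the argument.

```python
# Symbolic evaluation of lim_{n -> oo} (7n + 8) / n with sympy.

from sympy import symbols, limit, oo

n = symbols("n", positive=True)
print(limit((7 * n + 8) / n, n, oo))  # prints 7, which is nonzero, so not little-o
```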
Example
Let \(f(n) = 7n + 8\) and \(g(n) = n^{2}\). Is \(f(n) \in o(g(n))\)?
Solution 1
To determine whether \(f(n) = 7n + 8\) belongs to \(o(g(n))\) where \(g(n) = n^{2}\), we need to check if \(f(n)\) grows strictly slower than \(g(n)\) asymptotically. A function \(f(n)\) is in \(o(g(n))\) if for any given positive real constant \(c\), there exists an integer constant \(n_0 \geq 1\) such that:
Substituting the given functions:
Subtract \(c \cdot n^{2}\) from both sides:
For the inequality to hold for sufficiently large \(n\), the dominant term \((-c)n^{2}\) must be negative. Therefore:
This means that for the inequality to hold for large enough \(n\), \(c\) must be greater than \(0\).
Since the inequality holds for any positive \(c\), \(f(n)\) satisfies the condition for \(o(g(n))\). Therefore, \(f(n) = 7n + 8 \in o(g(n) = n^{2})\).
Solution 2
To determine whether \(f(n) = 7n + 8\) is in \(o(g(n))\) where \(g(n) = n^{2}\) using limits, we need to evaluate the following limit:
If the limit is \(0\), then \(f(n) \in o(g(n))\). Otherwise, \(f(n) \notin o(g(n))\).
Given \(f(n) = 7n + 8\) and \(g(n) = n^{2}\), we need to evaluate:
Simplify the expression:
As \(n\) approaches infinity, the terms \(\frac{7}{n}\) and \(\frac{8}{n^{2}}\) approach \(0\). Thus, the limit becomes:
Since the limit is \(0\), then \(f(n) = 7n + 8 \in o(g(n) = n^{2})\).
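The same symbolic check (again assuming sympy is available) confirms that the limit is \(0\) in this case.

```python
# Symbolic evaluation of lim_{n -> oo} (7n + 8) / n^2 with sympy.

from sympy import symbols, limit, oo

n = symbols("n", positive=True)
print(limit((7 * n + 8) / n ** 2, n, oo))  # prints 0, so f(n) is in o(n^2)
```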
Example
Let \(f(n) = 2^{n+1}\) and \(g(n) = 2^{n}\). Is \(f(n) \in o(g(n))\)?
Solution 1
To determine whether \(f(n) = 2^{n+1}\) belongs to \(o(g(n))\) where \(g(n) = 2^{n}\), we need to see if for any given positive value of \(c\), we can find a point \(n_0\) beyond which the function \(f(n) = 2^{n+1}\) grows strictly slower than \(c\) times \(g(n) = 2^{n}\):
Substituting the given functions:
Divide both sides by \(2^{n}\):
This means that for the inequality to hold for large enough \(n\), \(c\) must be greater than \(2\). However, \(o(g(n))\) requires the inequality to hold for any positive \(c\), no matter how small.
Since \(c \gt 2\) requires \(c\) to be greater than \(2\), \(f(n)\) cannot be less than \(c \cdot 2^{n}\) for arbitrarily small values of \(c\). Therefore, \(f(n)\) does not satisfy the condition for \(o(g(n))\) because there exists a lower bound on \(c\) (namely \(2\)).
Hence, \(f(n) = 2^{n+1} \notin o(g(n) = 2^{n})\).
Solution 2
To determine whether \(f(n) = 2^{n+1}\) belongs to \(o(g(n))\) where \(g(n) = 2^{n}\) using limits, we need to evaluate the following limit:
If the limit is \(0\), then \(f(n) \in o(g(n))\). Otherwise, \(f(n) \notin o(g(n))\).
Given \(f(n) = 2^{n+1}\) and \(g(n) = 2^{n}\), we need to evaluate:
Simplify the expression and evaluate the limit:
Since the limit is \(2\) (and not \(0\)), the function \(f(n) = 2^{n+1}\) does not grow strictly slower than \(g(n) = 2^{n}\). Therefore, \(f(n) \notin o(g(n))\).
Example
Let \(f(n) = \log_{\ln(5)}(\log^{\log(100)}(n))\) and \(g(n) = \log(\log(n))\). Is \(f(n) \in o(g(n))\)?
Solution 1
Let's simplify the expression \(\log_{\ln(5)}(\log^{\log(100)}(n))\).
The change of base formula states:
Applying this to the expression:
The expression inside the logarithm can be simplified using the logarithm power rule, \(\log (x^{y}) = y \cdot \log (x)\):
So, the original expression becomes:
To determine whether \(f(n) = \log_{\ln(5)}(\log^{\log(100)}(n))\) belongs to \(o(g(n))\) where \(g(n) = \log(\log(n))\), we need to check if \(f(n)\) grows strictly slower than \(g(n)\) asymptotically. A function \(f(n)\) is in \(o(g(n))\) if for any given positive real constant \(c\), there exists an integer constant \(n_0 \geq 1\) such that:
Substituting the given functions:
Simplify the expression \(\log_{\ln(5)}(\log^{\log(100)}(n))\):
Divide both sides by \(\log(\log(n))\) (which is positive for all \(n\)):
This means that for the inequality to hold for large enough \(n\), \(c\) must be greater than \(\frac{\log(100)}{\log(\ln(5))}\). However, \(o(g(n))\) requires the inequality to hold for any positive \(c\), no matter how small.
Since \(c \gt \frac{\log(100)}{\log(\ln(5))}\) requires \(c\) to be greater than \(\frac{\log(100)}{\log(\ln(5))}\), \(f(n)\) cannot be less than \(c \cdot \log(\log(n))\) for arbitrarily small values of \(c\). Therefore, \(f(n)\) does not satisfy the condition for \(o(g(n))\) because there exists a lower bound on \(c\) (namely \(\frac{\log(100)}{\log(\ln(5))}\)).
Hence, \(f(n) = \log_{\ln(5)}(\log^{\log(100)}(n)) \notin o(g(n) = \log(\log(n)))\).
Solution 2
Let's simplify the expression \(\log_{\ln(5)}(\log^{\log(100)}(n))\).
The change of base formula states:
Applying this to the expression:
The expression inside the logarithm can be simplified using the logarithm power rule, \(\log (x^{y}) = y \cdot \log (x)\):
So, the original expression becomes:
To determine whether \(f(n) = \log_{\ln(5)}(\log^{\log(100)}(n))\) belongs to \(o(g(n))\) where \(g(n) = \log(\log(n))\) using limits, we need to evaluate the following limit:
If the limit is \(0\), then \(f(n) \in o(g(n))\). Otherwise, \(f(n) \notin o(g(n))\).
Given \(f(n) = \log_{\ln(5)}(\log^{\log(100)}(n))\) and \(g(n) = \log(\log(n))\), we need to evaluate:
Simplify the expression and evaluate the limit:
Since the limit is \(\frac{\log(100)}{\log(\ln(5))}\) (and not \(0\)), the function \(f(n) = \log_{\ln(5)}(\log^{\log(100)}(n))\) does not grow strictly slower than \(g(n) = \log(\log(n))\). Therefore, \(f(n) \notin o(g(n))\).
Example
Let \(f(n) = \binom{n}{\frac{n}{2}}\) and \(g(n) = \frac{2^{n}}{\sqrt{n}}\). Is \(f(n)\) in \(o(g(n))\)?
Solution 1
The binomial coefficient \(\binom{n}{\frac{n}{2}}\) can be approximated using Stirling's approximation for factorials. Stirling's approximation states:
The binomial coefficient can be expressed as:
Applying Stirling's approximation:
Thus:
Simplifying this expression:
Simplify further:
To determine whether \(f(n) = \binom{n}{\frac{n}{2}}\) belongs to \(o(g(n))\) where \(g(n) = \frac{2^{n}}{\sqrt{n}}\), we need to check if \(f(n)\) grows strictly slower than \(g(n)\) asymptotically. A function \(f(n)\) is in \(o(g(n))\) if for any given positive real constant \(c\), there exists an integer constant \(n_0 \geq 1\) such that:
Substituting the given functions:
The binomial coefficient \(\binom{n}{\frac{n}{2}}\) can be approximated using Stirling's approximation:
Divide both sides by \(\frac{2^{n}}{\sqrt{n}}\) (which is positive for all \(n\)):
This means that for the inequality to hold for large enough \(n\), \(c\) must be greater than \(\sqrt{\frac{2}{\pi}}\). However, \(o(g(n))\) requires the inequality to hold for any positive \(c\), no matter how small.
Since \(c \gt \sqrt{\frac{2}{\pi}}\) requires \(c\) to be greater than \(\sqrt{\frac{2}{\pi}}\), \(f(n)\) cannot be less than \(c \cdot \frac{2^{n}}{\sqrt{n}}\) for arbitrarily small values of \(c\). Therefore, \(f(n)\) does not satisfy the condition for \(o(g(n))\) because there exists a lower bound on \(c\) (namely \(\sqrt{\frac{2}{\pi}}\)).
Hence, \(f(n) = \binom{n}{\frac{n}{2}} \notin o(g(n) = \frac{2^{n}}{\sqrt{n}})\).
Solution 2
The binomial coefficient \(\binom{n}{\frac{n}{2}}\) can be approximated using Stirling's approximation for factorials. Stirling's approximation states:
The binomial coefficient can be expressed as:
Applying Stirling's approximation:
Thus:
Simplifying this expression:
Simplify further:
To determine whether \(f(n) = \binom{n}{\frac{n}{2}}\) belongs to \(o(g(n))\) where \(g(n) = \frac{2^{n}}{\sqrt{n}}\) using limits, we need to evaluate the following limit:
If the limit is \(0\), then \(f(n) \in o(g(n))\). Otherwise, \(f(n) \notin o(g(n))\).
Given \(f(n) = \binom{n}{\frac{n}{2}}\) and \(g(n) = \frac{2^{n}}{\sqrt{n}}\), we need to evaluate:
The binomial coefficient \(\binom{n}{\frac{n}{2}}\) can be approximated using Stirling's approximation and be simplified to:
Since the limit is \(\sqrt{\frac{2}{\pi}}\) (and not \(0\)), the function \(f(n) = \binom{n}{\frac{n}{2}}\) does not grow strictly slower than \(g(n) = \frac{2^{n}}{\sqrt{n}}\). Therefore, \(f(n) \notin o(g(n))\).
Example
Let \(f(n) = \log(n!)\) and \(g(n) = n \log(n)\). Is \(f(n)\) in \(o(g(n))\)?
Solution 1
The factorial function can be approximated using Stirling's approximation. Stirling's approximation states:
Taking the logarithm of both sides:
Using the properties of logarithms, we can expand the expression:
Simplify each term:
To determine whether \(f(n) = \log(n!)\) belongs to \(o(g(n))\) where \(g(n) = n \log(n)\), we need to check if \(f(n)\) grows strictly slower than \(g(n)\) asymptotically. A function \(f(n)\) is in \(o(g(n))\) if for any given positive real constant \(c\), there exists an integer constant \(n_0 \geq 1\) such that:
Substituting the given functions:
The factorial \(n!\) can be approximated using Stirling's approximation and be simplified to:
Move all terms to the right side of the inequality and combine like terms:
For the inequality to hold for sufficiently large \(n\), the dominant term \((c - 1)n \log(n)\) must be positive or zero. If the dominant term \((c - 1)n \log(n)\) becomes zero, the term \(n \log(e)\) can still satisfy the inequality for large \(n\). Therefore:
This means that for the inequality to hold for large enough \(n\), \(c\) must be greater than or equal to \(1\). However, \(o(g(n))\) requires the inequality to hold for any positive \(c\), no matter how small.
Since \(c \geq 1\) requires \(c\) to be at least \(1\), \(f(n)\) cannot be less than \(c \cdot n \log(n)\) for arbitrarily small values of \(c\). Therefore, \(f(n)\) does not satisfy the condition for \(o(g(n))\) because there exists a lower bound on \(c\) (namely \(1\)).
Hence, \(f(n) = \log(n!) \notin o(g(n) = n \log(n))\).
Solution 2
The factorial function can be approximated using Stirling's approximation. Stirling's approximation states:
Taking the logarithm of both sides:
Using the properties of logarithms, we can expand the expression:
Simplify each term:
To determine whether \(f(n) = \log(n!)\) is in \(o(g(n))\) where \(g(n) = n \log(n)\) using limits, we need to evaluate the following limit:
If the limit is \(0\), then \(f(n) \in o(g(n))\). Otherwise, \(f(n) \notin o(g(n))\).
Given \(f(n) = \log(n!)\) and \(g(n) = n \log(n)\), we need to evaluate:
The factorial \(n!\) can be approximated using Stirling's approximation and be simplified to:
As \(n\) approaches infinity, the terms \(\frac{\log(2\pi)}{2n \log(n)}\), \(\frac{1}{2n}\) and \(\frac{\log(e)}{\log(n)}\) approach \(0\). Thus, the limit becomes:
Since the limit is \(1\) (and not \(0\)), the function \(f(n) = \log(n!)\) does not grow strictly slower than \(g(n) = n \log(n)\). Therefore, \(f(n) \notin o(g(n))\).
Example
Let \(f(n) = \frac{4n^{3} − 28n^{2} + 56n − 35}{2n^{2} − 11n + 12}\) and \(g(n) = n\). Is \(f(n)\) in \(o(g(n))\)?
Solution 1
To determine whether \(f(n) = \frac{4n^{3} − 28n^{2} + 56n − 35}{2n^{2} − 11n + 12}\) belongs to \(o(g(n))\) where \(g(n) = n\), we need to check if \(f(n)\) grows strictly slower than \(g(n)\) asymptotically. A function \(f(n)\) is in \(o(g(n))\) if for any given positive real constant \(c\), there exists an integer constant \(n_0 \geq 1\) such that:
Substituting the given functions:
Multiply both sides by the denominator \(2n^{2} − 11n + 12\):
Expand the right-hand side, the inequality becomes:
Move all terms to the right side of the inequality and combine like terms:
For the inequality to hold for sufficiently large \(n\), the dominant term \((2c - 4)n^{3}\) must be positive or zero. If the dominant term \((2c - 4)n^{3}\) becomes zero, the term \((28 - 11c)n^{2}\) can still satisfy the inequality for large \(n\). Therefore:
This means that for the inequality to hold for large enough \(n\), \(c\) must be greater than or equal to \(2\). However, \(o(g(n))\) requires the inequality to hold for any positive \(c\), no matter how small.
Since \(c \geq 2\) requires \(c\) to be at least \(2\), \(f(n)\) cannot be less than \(c \cdot n\) for arbitrarily small values of \(c\). Therefore, \(f(n)\) does not satisfy the condition for \(o(g(n))\) because there exists a lower bound on \(c\) (namely \(2\)).
Hence, \(f(n) = \frac{4n^{3} − 28n^{2} + 56n − 35}{2n^{2} − 11n + 12} \notin o(g(n) = n)\).
Solution 2
To determine whether \(f(n) = \frac{4n^{3} − 28n^{2} + 56n − 35}{2n^{2} − 11n + 12}\) belongs to \(o(g(n))\) where \(g(n) = n\) using limits, we need to evaluate the following limit:
If the limit is \(0\), then \(f(n) \in o(g(n))\). Otherwise, \(f(n) \notin o(g(n))\).
Given \(f(n) = \frac{4n^{3} − 28n^{2} + 56n − 35}{2n^{2} − 11n + 12}\) and \(g(n) = n\), we need to evaluate:
Simplify the fraction:
Applying L'Hôpital's rule:
Applying L'Hôpital's rule again:
Applying L'Hôpital's rule again:
Since the limit is \(2\) (and not \(0\)), the function \(f(n) = \frac{4n^{3} − 28n^{2} + 56n − 35}{2n^{2} − 11n + 12}\) does not grow strictly slower than \(g(n) = n\). Therefore, \(f(n) \notin o(g(n))\).
Example
Let \(f(n) = \frac{5n^{2} − 2n − 35}{2n^{4} − 11n^{3} + 12n^{2} + 4n + 45}\) and \(g(n) = \frac{1}{n^{2}}\). Is \(f(n)\) in \(o(g(n))\)?
Solution 1
To determine whether \(f(n) = \frac{5n^{2} − 2n − 35}{2n^{4} − 11n^{3} + 12n^{2} + 4n + 45}\) belongs to \(o(g(n))\) where \(g(n) = \frac{1}{n^{2}}\), we need to check if \(f(n)\) grows strictly slower than \(g(n)\) asymptotically. A function \(f(n)\) is in \(o(g(n))\) if for any given positive real constant \(c\), there exists an integer constant \(n_0 \geq 1\) such that:
Substituting the given functions:
Multiply both sides by the denominators \(2n^{4} − 11n^{3} + 12n^{2} + 4n + 45\) and \(n^{2}\):
Expand the expression, the inequality becomes:
Move all terms to the right side of the inequality and combine like terms:
For the inequality to hold for sufficiently large \(n\), the dominant term \((2c - 5)n^{4}\) must be positive. Therefore:
This means that for the inequality to hold for large enough \(n\), \(c\) must be greater than \(2.5\). However, \(o(g(n))\) requires the inequality to hold for any positive \(c\), no matter how small.
Since \(c \gt 2.5\) requires \(c\) to be greater than \(2.5\), \(f(n)\) cannot be less than \(c \cdot \frac{1}{n^{2}}\) for arbitrarily small values of \(c\). Therefore, \(f(n)\) does not satisfy the condition for \(o(g(n))\) because there exists a lower bound on \(c\) (namely \(2.5\)).
Hence, \(f(n) = \frac{5n^{2} − 2n − 35}{2n^{4} − 11n^{3} + 12n^{2} + 4n + 45} \notin o(g(n) = \frac{1}{n^{2}})\).
Solution 2
To determine whether \(f(n) = \frac{5n^{2} − 2n − 35}{2n^{4} − 11n^{3} + 12n^{2} + 4n + 45}\) belongs to \(o(g(n))\) where \(g(n) = \frac{1}{n^{2}}\) using limits, we need to evaluate the following limit:
If the limit is \(0\), then \(f(n) \in o(g(n))\). Otherwise, \(f(n) \notin o(g(n))\).
Given \(f(n) = \frac{5n^{2} − 2n − 35}{2n^{4} − 11n^{3} + 12n^{2} + 4n + 45}\) and \(g(n) = \frac{1}{n^{2}}\), we need to evaluate:
Simplify the fraction:
Applying L'Hôpital's rule:
Applying L'Hôpital's rule again:
Applying L'Hôpital's rule again:
Applying L'Hôpital's rule again:
Since the limit is \(2.5\) (and not \(0\)), the function \(f(n) = \frac{5n^{2} − 2n − 35}{2n^{4} − 11n^{3} + 12n^{2} + 4n + 45}\) does not grow strictly slower than \(g(n) = \frac{1}{n^{2}}\). Therefore, \(f(n) \notin o(g(n))\).
Big-Omega Notation
Big Omega notation represents the lower bound or best-case scenario of the growth rate of a function. It describes an algorithm’s minimum time or space complexity.
Let \(f(n)\) and \(g(n)\) be functions that map positive integers to positive real numbers. We say that \(f(n)\) is \(\Omega(g(n))\) (or \(f(n) \in \Omega(g(n))\)) if there exists a real constant \(c \gt 0\) and there exists an integer constant \(n_0 \geq 1\) such that \(f(n) \geq c \cdot g(n)\) for every integer \(n \geq n_0\).
Example
Let \(f(n) = 7n + 8\) and \(g(n) = n\). Is \(f(n) \in \Omega(g(n))\)?
Solution
To determine whether \(f(n) = 7n + 8\) belongs to \(\Omega(g(n))\) where \(g(n) = n\), we need to check if \(f(n)\) grows at least as fast as \(g(n)\) asymptotically. A function \(f(n)\) is in \(\Omega(g(n))\) if there exists a real constant \(c \gt 0\) and there exists an integer constant \(n_0 \geq 1\) such that:
We need to find \(c\) and \(n_0\) such that:
Subtract \(c \cdot n\) from both sides:
For the inequality to hold for sufficiently large \(n\), the dominant term \((7 - c)n\) must be positive or zero. If the dominant term \((7 - c)n\) becomes zero, the term \(8\) can still satisfy the inequality for large \(n\). Therefore:
We can choose \(c = 7\). Then, the inequality becomes:
Subtract \(7n\) from both sides:
This is true for all \(n \geq 1\) (or any positive \(n_0\)).
We have shown that for \(c = 7\) and \(n_0 = 1\), the inequality holds. Hence, \(f(n) = 7n + 8 \in \Omega(g(n) = n)\).
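A trivial finite-range check (illustrative only) confirms the Omega witnesses \(c = 7\) and \(n_0 = 1\).

```python
# 7n + 8 >= 7n holds for every n; checked here on an arbitrary finite range.
assert all(7 * n + 8 >= 7 * n for n in range(1, 10_001))
print("7n + 8 >= 7n for every checked n >= 1")
```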
Example
Let \(f(n) = n^{3} + 20n + 1\) and \(g(n) = n^{3}\). Is \(f(n) \in \Omega(g(n))\)?
Solution
To determine whether \(f(n) = n^{3} + 20n + 1\) belongs to \(\Omega(g(n))\) where \(g(n) = n^{3}\), we need to check if \(f(n)\) grows at least as fast as \(g(n)\) asymptotically. A function \(f(n)\) is in \(\Omega(g(n))\) if there exists a real constant \(c \gt 0\) and there exists an integer constant \(n_0 \geq 1\) such that:
We need to find \(c\) and \(n_0\) such that:
Subtract \(c \cdot n^{3}\) from both sides:
For the inequality to hold for sufficiently large \(n\), the dominant term \((1 - c)n^{3}\) must be positive or zero. If the dominant term \((1 - c)n^{3}\) becomes zero, the term \(20n\) can still satisfy the inequality for large \(n\). Therefore:
We can choose \(c = 1\). Then, the inequality becomes:
This is true for all \(n \geq 1\), so we can set \(n_0 = 1\).
We have shown that for \(c = 1\) and \(n_0 = 1\), the inequality holds. Hence, \(f(n) = n^{3} + 20n + 1 \in \Omega(g(n) = n^{3})\).
Example
Prove that running time \(f(n) = 2^{n+1}\) is \(\Omega(2^{n})\)
Solution
To determine whether \(f(n) = 2^{n+1}\) belongs to \(\Omega(g(n))\) where \(g(n) = 2^{n}\), we need to check if \(f(n)\) grows at least as fast as \(g(n)\) asymptotically. A function \(f(n)\) is in \(\Omega(g(n))\) if there exists a real constant \(c \gt 0\) and there exists an integer constant \(n_0 \geq 1\) such that:
We need to find \(c\) and \(n_0\) such that:
Cancel \(2^{n}\) from both sides of the inequality:
We can choose \(c = 2\). Then, the inequality becomes:
Divide both sides by \(2^{n+1}\):
This is true for all \(n \geq 1\), so we can set \(n_0 = 1\).
We have shown that for \(c = 2\) and \(n_0 = 1\), the inequality holds. Hence, \(f(n) = 2^{n+1} \in \Omega(g(n) = 2^{n})\).
Example
Prove that running time \(f(n) = 10^{80}\) is \(\Omega(1)\)
Solution
To determine whether \(f(n) = 10^{80}\) belongs to \(\Omega(g(n))\) where \(g(n) = 1\), we need to check if \(f(n)\) grows at least as fast as \(g(n)\) asymptotically. A function \(f(n)\) is in \(\Omega(g(n))\) if there exists a real constant \(c \gt 0\) and there exists an integer constant \(n_0 \geq 1\) such that:
We need to find \(c\) and \(n_0\) such that:
We can choose \(c = 10^{80}\). Then, the inequality becomes:
This is true for all \(n \geq 1\), so we can set \(n_0 = 1\).
We have shown that for \(c = 10^{80}\) and \(n_0 = 1\), the inequality holds. Hence, \(f(n) = 10^{80} \in \Omega(g(n) = 1)\).
Example
Prove that running time \(f(n) = \log_{\ln(5)}(\log^{\log(100)}(n))\) is \(\Omega(\log(\log(n)))\)
Solution
The function \(f(n) = \log_{\ln(5)}(\log^{\log(100)}(n))\) can be rewritten using the change of base formula:
Next, simplify the expression \(\log (\log^{\log(100)}(n))\):
Substituting this back into \(f(n)\), we get:
To determine whether \(f(n) = \log_{\ln(5)}(\log^{\log(100)}(n))\) belongs to \(\Omega(g(n))\) where \(g(n) = \log(\log(n))\), we need to check if \(f(n)\) grows at least as fast as \(g(n)\) asymptotically. A function \(f(n)\) is in \(\Omega(g(n))\) if there exists a real constant \(c \gt 0\) and there exists an integer constant \(n_0 \geq 1\) such that:
We need to find \(c\) and \(n_0\) such that:
From the simplified form of \(f(n)\):
Cancel \(\log(\log(n))\) from both sides of the inequality:
We can choose \(c = \frac{\log(100)}{\log (\ln(5))}\). Then, the inequality becomes:
Divide both sides by \(\frac{\log(100)}{\log (\ln(5))} \cdot \log(\log(n))\):
This is true for all \(n \geq 1\), so we can set \(n_0 = 1\).
We have shown that for \(c = \frac{\log(100)}{\log (\ln(5))}\) and \(n_0 = 1\), the inequality holds. Hence, \(f(n) = \log_{\ln(5)}(\log^{\log(100)}(n)) \in \Omega(g(n) = \log(\log(n)))\).
Example
Prove that running time \(f(n) = \binom{n}{\frac{n}{2}}\) is \(\Omega(\frac{2^{n}}{\sqrt{n}})\).
Solution
The binomial coefficient \(\binom{n}{\frac{n}{2}}\) can be approximated using Stirling's approximation for factorials. Stirling's approximation states:
The binomial coefficient can be expressed as:
Applying Stirling's approximation:
Thus:
Simplifying this expression:
Simplify further:
To determine whether \(f(n) = \binom{n}{\frac{n}{2}}\) belongs to \(\Omega(g(n))\) where \(g(n) = \frac{2^{n}}{\sqrt{n}}\), we need to check if \(f(n)\) grows at least as fast as \(g(n)\) asymptotically. A function \(f(n)\) is in \(\Omega(g(n))\) if there exists a real constant \(c \gt 0\) and there exists an integer constant \(n_0 \geq 1\) such that:
We need to find \(c\) and \(n_0\) such that:
From the simplified form of \(f(n)\):
Cancel \(\frac{2^{n}}{\sqrt{n}}\) from both sides of the inequality:
We can choose \(c = \frac{\sqrt{2}}{\sqrt{\pi}}\). Then, the inequality becomes:
This is true for all \(n \geq 1\), so we can set \(n_0 = 1\).
We have shown that for \(c = \frac{\sqrt{2}}{\sqrt{\pi}}\) and \(n_0 = 1\), the inequality holds. Hence, \(f(n) = \binom{n}{\frac{n}{2}} \in \Omega(g(n) = \frac{2^{n}}{\sqrt{n}})\).
Example
Prove that running time \(f(n) = \frac{4n^{3} − 28n^{2} + 56n − 35}{2n^{2} − 11n + 12}\) is \(\Omega(n)\)
Solution
To determine whether \(f(n) = \frac{4n^{3} − 28n^{2} + 56n − 35}{2n^{2} − 11n + 12}\) belongs to \(\Omega(g(n))\) where \(g(n) = n\), we need to check if \(f(n)\) grows at least as fast as \(g(n)\) asymptotically. A function \(f(n)\) is in \(\Omega(g(n))\) if there exists a real constant \(c \gt 0\) and there exists an integer constant \(n_0 \geq 1\) such that:
We need to find \(c\) and \(n_0\) such that:
Multiply both sides by the denominator \(2n^{2} − 11n + 12\):
Expand the right-hand side, the inequality becomes:
Move all terms to the left side of the inequality and combine like terms:
For the inequality to hold for sufficiently large \(n\), the dominant term \((4 - 2c)n^{3}\) must be positive. Therefore:
Choose \(c = 1.9\):
Multiply both sides by \(10\):
To estimate \(n\) for which \(2n^{3} - 71n^{2} + 332n - 350 \geq 0\) holds, we need to find the point where the positive terms (\(2n^{3}\) and \(332n\)) become dominant over the negative terms (\(71n^{2}\) and \(350\)). We can compare terms based on their degree and the magnitude of their coefficients.
We compare \(2n^{3}\) which is the highest-degree positive term with \(71n^{2}\) which is the highest-degree negative term to estimate \(n\).
Divide both sides by \(2n^{2}\):
Let's approach it by testing positive values for \(n\) such that \(2n^{3} - 71n^{2} + 332n - 350 \geq 0\) holds.
For \(n = 31\):
For \(n = 30\):
Hence, we have shown that there exist constants \(c = 1.9\) and \(n_0 = 31\) such that \(f(n) \geq c \cdot g(n)\) for all \(n \geq n_0\). Thus, \(f(n) \in \Omega(g(n))\).
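A numerical spot check of the reduced inequality (illustrative only) confirms the crossover: the cubic is negative at \(n = 30\) and non-negative for every checked \(n \geq 31\).

```python
# Spot check of the reduced inequality 2n^3 - 71n^2 + 332n - 350 >= 0.

p = lambda n: 2 * n ** 3 - 71 * n ** 2 + 332 * n - 350
assert p(30) < 0                                  # fails at n = 30
assert all(p(n) >= 0 for n in range(31, 10_001))  # holds from n = 31 on
print("witnesses c = 1.9, n0 = 31 verified on the sample range")
```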
Example
Prove that running time \(f(n) = \frac{5n^{2} − 2n − 35}{2n^{4} − 11n^{3} + 12n^{2} + 4n + 45}\) is \(\Omega(\frac{1}{n^{2}})\)
Solution
To determine whether \(f(n) = \frac{5n^{2} − 2n − 35}{2n^{4} − 11n^{3} + 12n^{2} + 4n + 45}\) belongs to \(\Omega(g(n))\) where \(g(n) = \frac{1}{n^{2}}\), we need to check if \(f(n)\) grows at least as fast as \(g(n)\) asymptotically. A function \(f(n)\) is in \(\Omega(g(n))\) if there exists a real constant \(c \gt 0\) and there exists an integer constant \(n_0 \geq 1\) such that:
We need to find \(c\) and \(n_0\) such that:
Multiply both sides by the denominator \(2n^{4} − 11n^{3} + 12n^{2} + 4n + 45\) and by \(n^{2}\):
Expand the expression, the inequality becomes:
Move all terms to the left side of the inequality and combine like terms:
For the inequality to hold for sufficiently large \(n\), the dominant term \((5 - 2c)n^{4}\) must be positive or zero. If the dominant term \((5 - 2c)n^{4}\) becomes zero, the term \((11c - 2)n^{3}\) can still satisfy the inequality for large \(n\). Therefore:
Choose \(c = 2.5\):
Calculate each coefficient:
To estimate \(n\) for which \(51n^{3} - 130n^{2} - 20n - 225 \geq 0\) holds, we need to find the point where the positive terms (\(51n^{3}\)) become dominant over the negative terms (\(130n^{2}\), \(20n\) and \(225\)). We can compare terms based on their degree and the magnitude of their coefficients.
We compare \(51n^{3}\) which is the highest-degree positive term with \(130n^{2}\) which is the highest-degree negative term to estimate \(n\).
Divide both sides by \(51n^{2}\):
Let's approach it by testing positive values for \(n\) such that \(51n^{3} - 130n^{2} - 20n - 225 \geq 0\) holds.
For \(n = 3\):
For \(n = 4\):
Hence, we have shown that there exist constants \(c = 2.5\) and \(n_0 = 4\) such that \(f(n) \geq c \cdot g(n)\) for all \(n \geq n_0\). Thus, \(f(n) \in \Omega(g(n))\).
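Again, a finite-range spot check (illustrative only) matches the values tested above: the cubic is negative at \(n = 3\) and non-negative for every checked \(n \geq 4\).

```python
# Spot check of the reduced inequality 51n^3 - 130n^2 - 20n - 225 >= 0.

p = lambda n: 51 * n ** 3 - 130 * n ** 2 - 20 * n - 225
assert p(3) < 0                                  # fails at n = 3
assert all(p(n) >= 0 for n in range(4, 10_001))  # holds from n = 4 on
print("witnesses c = 2.5, n0 = 4 verified on the sample range")
```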
Example
Let \(f(n) = \log(n!)\) and \(g(n) = n \log(n)\). Is \(f(n)\) in \(\Omega(g(n))\)?
Solution
Stirling's approximation for \(n!\) is given by:
Taking the logarithm of both sides:
Using the properties of logarithms, we can expand the expression:
Simplify each term:
To determine whether \(f(n) = \log(n!)\) belongs to \(\Omega(g(n))\) where \(g(n) = n \log(n)\), we need to check if \(f(n)\) grows at least as fast as \(g(n)\) asymptotically. A function \(f(n)\) is in \(\Omega(g(n))\) if there exists a real constant \(c \gt 0\) and there exists an integer constant \(n_0 \geq 1\) such that:
We need to find \(c\) and \(n_0\) such that:
From the simplified form of \(f(n)\):
Move all terms to the left side of the inequality and combine like terms:
For the inequality to hold for sufficiently large \(n\), the dominant term \((1 - c)n \log(n)\) must be positive. Therefore:
Choose \(c = 0.9\):
Multiply both sides by \(10\):
To estimate \(n\) for which \(n \log(n) - 10n \log (e) + 5 \log (n) + 5 \log (2\pi) \geq 0\) holds, we need to find the point where the positive terms (\(n \log(n)\), \(5 \log (n)\), and \(5 \log (2\pi)\)) become dominant over the negative terms (\(10n \log (e)\)). We can compare terms based on their degree and the magnitude of their coefficients.
We compare \(n \log(n)\) which is the highest-degree positive term with \(10n \log (e)\) which is the highest-degree negative term to estimate \(n\).
Divide both sides by \(n\):
Rewrite it in its exponential form:
Let's approach it by testing positive values for \(n\) such that \(n \log(n) - 10n \log (e) + 5 \log (n) + 5 \log (2\pi) \geq 0\) holds.
For \(n = 10^{5}\):
Hence, we have shown that there exist constants \(c = 0.9\) and \(n_0 = 66029\) such that \(f(n) \geq c \cdot g(n)\) for all \(n \geq n_0\). Thus, \(f(n) \in \Omega(g(n))\).
Little-Omega Notation
Let \(f(n)\) and \(g(n)\) be functions that map positive integers to positive real numbers. We say that \(f(n)\) is \(\omega(g(n))\) (or \(f(n) \in \omega(g(n))\)) if for any real constant \(c \gt 0\), there exists an integer constant \(n_0 \geq 1\) such that \(f(n) \gt c \cdot g(n)\) for every integer \(n \geq n_0\).
Example
Let \(f(n) = 7n + 8\) and \(g(n) = n\). Is \(f(n)\) in \(\omega(g(n))\)?
Solution 1
To determine whether \(f(n) = 7n + 8\) belongs to \(\omega(g(n))\) where \(g(n) = n\), we need to check if \(f(n)\) grows strictly faster than \(g(n)\) asymptotically. A function \(f(n)\) is in \(\omega(g(n))\) if for any given positive constant \(c\), there exists an integer constant \(n_0 \geq 1\) such that:
Substituting the given functions:
Subtract \(c \cdot n\) from both sides:
For the inequality to hold for sufficiently large \(n\), the dominant term \((7 - c)n\) must be positive or zero. If the dominant term \((7 - c)n\) becomes zero, the term 8 can still satisfy the inequality for large \(n\). Therefore:
This means that for the inequality to hold for large enough \(n\), \(c\) must be less than or equal to \(7\). However, \(\omega(g(n))\) requires the inequality to hold for any positive \(c\), no matter how large.
Since \(c \leq 7\) requires \(c\) to be at most \(7\), \(f(n)\) cannot grow faster than \(c \cdot n\) for arbitrarily large values of \(c\). Therefore, \(f(n)\) does not satisfy the condition for \(\omega(g(n))\) because there exists an upper bound on \(c\) (namely \(7\)).
Hence, \(f(n) = 7n + 8 \notin \omega(g(n) = n)\).
Solution 2
To determine whether \(f(n) = 7n + 8\) is in \(\omega(g(n))\) where \(g(n) = n\) using limits, we need to evaluate the following limit:
If the limit is \(\infty\), then \(f(n) \in \omega(g(n))\). Otherwise, \(f(n) \notin \omega(g(n))\).
Given \(f(n) = 7n + 8\) and \(g(n) = n\), we need to evaluate:
Simplify the expression:
As \(n\) approaches infinity, the term \(\frac{8}{n}\) approaches \(0\). Thus, the limit becomes:
Since the limit is \(7\) (and not \(\infty\)), the function \(f(n) = 7n + 8\) does not grow faster than \(g(n) = n\). Therefore, \(f(n) \notin \omega(g(n))\).
Example
Let \(f(n) = 7n^{2} + 8\) and \(g(n) = n\). Is \(f(n)\) in \(\omega(g(n))\)?
Solution 1
To determine whether \(f(n) = 7n^{2} + 8\) belongs to \(\omega(g(n))\) where \(g(n) = n\), we need to check if \(f(n)\) grows strictly faster than \(g(n)\) asymptotically. A function \(f(n)\) is in \(\omega(g(n))\) if for any given positive constant \(c\), there exists an integer constant \(n_0 \geq 1\) such that:
Substituting the given functions:
Subtract \(c \cdot n\) from both sides:
For the inequality to hold for sufficiently large \(n\), the coefficient of the dominant term \(n^{2}\) must be positive, and it does not depend on \(c\). Therefore:
This means that for the inequality to hold for large enough \(n\), \(c\) must be greater than \(0\).
Since the inequality holds for any positive \(c\), \(f(n)\) satisfies the condition for \(\omega(g(n))\). Therefore, \(f(n) = 7n^{2} + 8 \in \omega(g(n) = n)\).
Solution 2
To determine whether \(f(n) = 7n^{2} + 8\) is in \(\omega(g(n))\) where \(g(n) = n\) using limits, we need to evaluate the following limit:
If the limit is \(\infty\), then \(f(n) \in \omega(g(n))\). Otherwise, \(f(n) \notin \omega(g(n))\).
Given \(f(n) = 7n^{2} + 8\) and \(g(n) = n\), we need to evaluate:
Simplify the expression:
As \(n\) approaches infinity, the term \(\frac{8}{n}\) approaches \(0\). Thus, the limit becomes:
Since the limit is \(\infty\), the function \(f(n) = 7n^{2} + 8\) always grows faster than \(g(n) = n\). Therefore, \(f(n) \in \omega(g(n))\).
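The limit can also be evaluated symbolically (assuming sympy is available), as a convenience check.

```python
# Symbolic evaluation of lim_{n -> oo} (7n^2 + 8) / n with sympy.

from sympy import symbols, limit, oo

n = symbols("n", positive=True)
print(limit((7 * n ** 2 + 8) / n, n, oo))  # prints oo, so f(n) is in omega(n)
```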
Example
Let \(f(n) = 2^{n+1}\) and \(g(n) = 2^{n}\). Is \(f(n)\) in \(\omega(g(n))\)?
Solution 1
To determine whether \(f(n) = 2^{n+1}\) belongs to \(\omega(g(n))\) where \(g(n) = 2^{n}\), we need to check if \(f(n)\) grows strictly faster than \(g(n)\) asymptotically. A function \(f(n)\) is in \(\omega(g(n))\) if for any given positive constant \(c\), there exists an integer constant \(n_0 \geq 1\) such that:
Substituting the given functions:
Divide both sides by \(2^{n}\):
This means that for the inequality to hold for large enough \(n\), \(c\) must be less than \(2\). However, \(\omega(g(n))\) requires the inequality to hold for any positive \(c\), no matter how large.
Since \(c \lt 2\) requires \(c\) to be less than \(2\), \(f(n)\) cannot grow faster than \(c \cdot 2^{n}\) for arbitrarily large values of \(c\). Therefore, \(f(n)\) does not satisfy the condition for \(\omega(g(n))\) because there exists an upper bound on \(c\) (namely \(2\)).
Hence, \(f(n) = 2^{n+1} \notin \omega(g(n) = 2^{n})\).
Solution 2
To determine whether \(f(n) = 2^{n+1}\) belongs to \(\omega(g(n))\) where \(g(n) = 2^{n}\) using limits, we need to evaluate the following limit:
If the limit is \(\infty\), then \(f(n) \in \omega(g(n))\). Otherwise, \(f(n) \notin \omega(g(n))\).
Given \(f(n) = 2^{n+1}\) and \(g(n) = 2^{n}\), we need to evaluate:
Simplify the expression and evaluate the limit:
Since the limit is \(2\) (and not \(\infty\)), the function \(f(n) = 2^{n+1}\) does not grow faster than \(g(n) = 2^{n}\). Therefore, \(f(n) \notin \omega(g(n))\).
Example
Let \(f(n) = \log_{\ln(5)}(\log^{\log(100)}(n))\) and \(g(n) = \log(\log(n))\). Is \(f(n)\) in \(\omega(g(n))\)?
Solution 1
Simplify the expression using the change of base formula, \(\log_{a}(x) = \frac{\log_{b}(x)}{\log_{b}(a)}\):
The expression inside the logarithm can be simplified using the logarithm power rule, \(\log (x^{y}) = y \cdot \log(x)\):
So, the original expression becomes:
To determine whether \(f(n) = \log_{\ln(5)}(\log^{\log(100)}(n))\) belongs to \(\omega(g(n))\) where \(g(n) = \log(\log(n))\), we need to check if \(f(n)\) grows strictly faster than \(g(n)\) asymptotically. A function \(f(n)\) is in \(\omega(g(n))\) if for any given positive constant \(c\), there exists an integer constant \(n_0 \geq 1\) such that:
Substituting the given functions:
Simplify the expression \(\log_{\ln(5)}(\log^{\log(100)}(n))\):
Divide both sides by \(\log(\log(n))\):
This means that for the inequality to hold for large enough \(n\), \(c\) must be less than \(\frac{\log(100)}{\log(\ln(5))}\). However, \(\omega(g(n))\) requires the inequality to hold for any positive \(c\), no matter how large.
Since \(c \lt \frac{\log(100)}{\log(\ln(5))}\) requires \(c\) to be less than \(\frac{\log(100)}{\log(\ln(5))}\), \(f(n)\) cannot grow faster than \(c \cdot \log(\log(n))\) for arbitrarily large values of \(c\). Therefore, \(f(n)\) does not satisfy the condition for \(\omega(g(n))\) because there exists an upper bound on \(c\) (namely \(\frac{\log(100)}{\log(\ln(5))}\)).
Hence, \(f(n) = \log_{\ln(5)}(\log^{\log(100)}(n)) \notin \omega(g(n) = \log(\log(n)))\).
Solution 2
Let's simplify the expression \(\log_{\ln(5)}(\log^{\log(100)}(n))\).
The change of base formula states:
Applying this to the expression:
The expression inside the logarithm can be simplified using the logarithm power rule, \(\log (x^{y}) = y \cdot \log (x)\):
So, the original expression becomes:
To determine whether \(f(n) = \log_{\ln(5)}(\log^{\log(100)}(n))\) belongs to \(\omega(g(n))\) where \(g(n) = \log(\log(n))\) using limits, we need to evaluate the following limit:
If the limit is \(\infty\), then \(f(n) \in \omega(g(n))\). Otherwise, \(f(n) \notin \omega(g(n))\).
Given \(f(n) = \log_{\ln(5)}(\log^{\log(100)}(n))\) and \(g(n) = \log(\log(n))\), we need to evaluate:
Simplify the expression and evaluate the limit:
Since the limit is \(\frac{\log(100)}{\log(\ln(5))}\) (and not \(\infty\)), the function \(f(n) = \log_{\ln(5)}(\log^{\log(100)}(n))\) does not grow faster than \(g(n) = \log(\log(n))\). Therefore, \(f(n) \notin \omega(g(n))\).
Example
Let \(f(n) = \binom{n}{\frac{n}{2}}\) and \(g(n) = \frac{2^{n}}{\sqrt{n}}\). Is \(f(n)\) in \(\omega(g(n))\)?
Solution 1
The binomial coefficient \(\binom{n}{\frac{n}{2}}\) can be approximated using Stirling's approximation for factorials. Stirling's approximation states:
The binomial coefficient can be expressed as:
Applying Stirling's approximation:
Thus:
Simplifying this expression:
Simplify further:
To determine whether \(f(n) = \binom{n}{\frac{n}{2}}\) belongs to \(\omega(g(n))\) where \(g(n) = \frac{2^{n}}{\sqrt{n}}\), we need to check if \(f(n)\) grows strictly faster than \(g(n)\) asymptotically. A function \(f(n)\) is in \(\omega(g(n))\) if for any given positive constant \(c\), there exists an integer constant \(n_0 \geq 1\) such that:
Substituting the given functions:
The binomial coefficient \(\binom{n}{\frac{n}{2}}\) can be approximated using Stirling's approximation:
Divide both sides by \(\frac{2^{n}}{\sqrt{n}}\):
This means that for the inequality to hold for large enough \(n\), \(c\) must be smaller than \(\sqrt{\frac{2}{\pi}}\). However, \(\omega(g(n))\) requires the inequality to hold for any positive \(c\), no matter how large.
Since \(c \lt \sqrt{\frac{2}{\pi}}\) requires \(c\) to be smaller than \(\sqrt{\frac{2}{\pi}}\), \(f(n)\) cannot grow faster than \(c \cdot \frac{2^{n}}{\sqrt{n}}\) for arbitrarily large values of \(c\). Therefore, \(f(n)\) does not satisfy the condition for \(\omega(g(n))\) because there exists an upper bound on \(c\) (namely \(\sqrt{\frac{2}{\pi}}\)).
Hence, \(f(n) = \binom{n}{\frac{n}{2}} \notin \omega(g(n) = \frac{2^{n}}{\sqrt{n}})\).
Solution 2
The binomial coefficient \(\binom{n}{\frac{n}{2}}\) can be approximated using Stirling's approximation for factorials. Stirling's approximation states:
The binomial coefficient can be expressed as:
Applying Stirling's approximation:
Thus:
Simplifying this expression:
Simplify further:
To determine whether \(f(n) = \binom{n}{\frac{n}{2}}\) belongs to \(\omega(g(n))\) where \(g(n) = \frac{2^{n}}{\sqrt{n}}\) using limits, we need to evaluate the following limit:
If the limit is \(\infty\), then \(f(n) \in \omega(g(n))\). Otherwise, \(f(n) \notin \omega(g(n))\).
Given \(f(n) = \binom{n}{\frac{n}{2}}\) and \(g(n) = \frac{2^{n}}{\sqrt{n}}\), we need to evaluate:
The binomial coefficient \(\binom{n}{\frac{n}{2}}\) can be approximated using Stirling's approximation and be simplified to:
Since the limit is \(\sqrt{\frac{2}{\pi}}\) (and not \(\infty\)), the function \(f(n) = \binom{n}{\frac{n}{2}}\) does not grow faster than \(g(n) = \frac{2^{n}}{\sqrt{n}}\). Therefore, \(f(n) \notin \omega(g(n))\).
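As an illustrative numerical check (not part of the formal argument), the ratio \(\binom{n}{n/2} / \frac{2^{n}}{\sqrt{n}}\) can be evaluated for growing even \(n\); it levels off near \(\sqrt{2/\pi} \approx 0.7979\) instead of diverging, which is consistent with \(f(n) \notin \omega(g(n))\). A minimal sketch in Python:

```python
import math

# Ratio binom(n, n/2) / (2^n / sqrt(n)) for increasing even n.
# If f were in omega(g), this ratio would grow without bound;
# instead it settles near sqrt(2/pi) ~= 0.7979.
for n in (10, 100, 500, 1000):
    ratio = math.comb(n, n // 2) / (2 ** n / math.sqrt(n))
    print(n, ratio)

print("sqrt(2/pi) =", math.sqrt(2 / math.pi))
```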
Example
Let \(f(n) = \log(n!)\) and \(g(n) = n \log(n)\). Is \(f(n)\) in \(\omega(g(n))\)?
Solution 1
Stirling's approximation for \(n!\) is given by:
Taking the logarithm of both sides:
Using the properties of logarithms, we can expand the expression:
Simplify each term:
To determine whether \(f(n) = \log(n!)\) belongs to \(\omega(g(n))\) where \(g(n) = n \log(n)\), we need to check if \(f(n)\) grows strictly faster than \(g(n)\) asymptotically. A function \(f(n)\) is in \(\omega(g(n))\) if, for any given positive constant \(c\), there exists an integer constant \(n_0 \geq 1\) such that:
Substituting the given functions:
The factorial \(n!\) can be approximated using Stirling's approximation and be simplified to:
Move all terms to left side of the inequality and combine like terms:
For the inequality to hold for sufficiently large \(n\), the dominant term \((1 - c)n \log(n)\) must be positive. Therefore:
This means that for the inequality to hold for large enough \(n\), \(c\) must be smaller than \(1\). However, \(\omega(g(n))\) requires the inequality to hold for any positive \(c\), no matter how large.
Since the inequality only holds when \(c \lt 1\), \(f(n)\) cannot exceed \(c \cdot g(n)\) for arbitrarily large values of \(c\). Therefore, \(f(n)\) does not satisfy the condition for \(\omega(g(n))\) because there exists an upper bound on \(c\) (namely \(1\)).
Hence, \(f(n) = \log(n!) \notin \omega(g(n) = n \log(n))\).
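For reference, the expansion sketched in the steps above, assuming Stirling's form \(n! \approx \sqrt{2\pi n}\left(\frac{n}{e}\right)^{n}\), works out to:
\[
\log(n!) \approx \log\!\left(\sqrt{2\pi n}\left(\frac{n}{e}\right)^{n}\right)
= n \log(n) - n \log(e) + \frac{1}{2}\log(n) + \frac{1}{2}\log(2\pi).
\]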
Solution 2
Stirling's approximation for \(n!\) is given by:
Taking the logarithm of both sides:
Using the properties of logarithms, we can expand the expression:
Simplify each term:
To determine whether \(f(n) = \log(n!)\) is in \(\omega(g(n))\) where \(g(n) = n \log(n)\) using limits, we need to evaluate the following limit:
If the limit is \(\infty\), then \(f(n) \in \omega(g(n))\). Otherwise, \(f(n) \notin \omega(g(n))\).
Given \(f(n) = \log(n!)\) and \(g(n) = n \log(n)\), we need to evaluate:
The factorial \(n!\) can be approximated using Stirling's approximation and be simplified to:
As \(n\) approaches infinity, the terms \(\frac{\log(2\pi)}{2n \log(n)}\), \(\frac{1}{2n}\), and \(\frac{\log(e)}{\log(n)}\) all approach \(0\). Thus, the limit becomes:
Since the limit is \(1\) (and not \(\infty\)), the function \(f(n) = \log(n!)\) does not grow faster than \(g(n) = n \log(n)\). Therefore, \(f(n) \notin \omega(g(n))\).
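As an illustrative numerical check (not part of the proof), the ratio \(\frac{\log(n!)}{n \log(n)}\) can be computed directly; it creeps up toward \(1\) rather than \(\infty\), consistent with \(f(n) \notin \omega(g(n))\). A minimal sketch in Python, using math.lgamma to obtain \(\ln(n!)\) without overflow (the ratio is independent of the logarithm base):

```python
import math

# lgamma(n + 1) = ln(n!), so the ratio below equals log(n!) / (n log n)
# in any base.
for n in (10, 100, 10_000, 1_000_000):
    print(n, math.lgamma(n + 1) / (n * math.log(n)))
# The printed ratios approach 1, not infinity.
```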
Example
Let \(f(n) = \frac{4n^{3} − 28n^{2} + 56n − 35}{2n^{2} − 11n + 12}\) and \(g(n) = n\). Is \(f(n)\) in \(\omega(g(n))\)?
Solution 1
To determine whether \(f(n) = \frac{4n^{3} − 28n^{2} + 56n − 35}{2n^{2} − 11n + 12}\) belongs to \(\omega(g(n))\) where \(g(n) = n\), we need to check if \(f(n)\) grows strictly faster than \(g(n)\) asymptotically. A function \(f(n)\) is in \(\omega(g(n))\) if, for any given positive constant \(c\), there exists an integer constant \(n_0 \geq 1\) such that:
Substituting the given functions:
Multiply both sides by the denominator \(2n^{2} − 11n + 12\):
Expand the right-hand side, the inequality becomes:
Move all terms to left side of the inequality and combine like terms:
For the inequality to hold for sufficiently large \(n\), the dominant term \((4 - 2c)n^{3}\) must be positive. Therefore:
This means that for the inequality to hold for large enough \(n\), \(c\) must be smaller than \(2\). However, \(\omega(g(n))\) requires the inequality to hold for any positive \(c\), no matter how large.
Since the inequality only holds when \(c \lt 2\), \(f(n)\) cannot exceed \(c \cdot n\) for arbitrarily large values of \(c\). Therefore, \(f(n)\) does not satisfy the condition for \(\omega(g(n))\) because there exists an upper bound on \(c\) (namely \(2\)).
Hence, \(f(n) = \frac{4n^{3} − 28n^{2} + 56n − 35}{2n^{2} − 11n + 12} \notin \omega(g(n) = n)\).
Solution 2
To determine whether \(f(n) = \frac{4n^{3} − 28n^{2} + 56n − 35}{2n^{2} − 11n + 12}\) belongs to \(\omega(g(n))\) where \(g(n) = n\) using limits, we need to evaluate the following limit:
If the limit is \(\infty\), then \(f(n) \in \omega(g(n))\). Otherwise, \(f(n) \notin \omega(g(n))\).
Given \(f(n) = \frac{4n^{3} − 28n^{2} + 56n − 35}{2n^{2} − 11n + 12}\) and \(g(n) = n\), we need to evaluate:
Simplify the fraction:
Applying L'Hôpital's rule:
Applying L'Hôpital's rule again:
Applying L'Hôpital's rule again:
Since the limit is \(2\) (and not \(\infty\)), the function \(f(n) = \frac{4n^{3} − 28n^{2} + 56n − 35}{2n^{2} − 11n + 12}\) does not grow faster than \(g(n) = n\). Therefore, \(f(n) \notin \omega(g(n))\).
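As a quick numerical illustration (a sketch, assuming the same \(f\) and \(g\)), evaluating \(\frac{f(n)}{n}\) for growing \(n\) shows the ratio settling near \(2\) rather than diverging:

```python
def f(n: float) -> float:
    # f(n) = (4n^3 - 28n^2 + 56n - 35) / (2n^2 - 11n + 12)
    return (4 * n**3 - 28 * n**2 + 56 * n - 35) / (2 * n**2 - 11 * n + 12)

# If f were in omega(n), f(n)/n would grow without bound; it approaches 2.
for n in (10, 100, 10_000, 1_000_000):
    print(n, f(n) / n)
```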
Example
Let \(f(n) = \frac{5n^{2} − 2n − 35}{2n^{4} − 11n^{3} + 12n^{2} + 4n + 45}\) and \(g(n) = \frac{1}{n^{2}}\). Is \(f(n)\) in \(\omega(g(n))\)?
Solution 1
To determine whether \(f(n) = \frac{5n^{2} − 2n − 35}{2n^{4} − 11n^{3} + 12n^{2} + 4n + 45}\) belongs to \(\omega(g(n))\) where \(g(n) = \frac{1}{n^{2}}\), we need to check if \(f(n)\) grows strictly faster than \(g(n)\) asymptotically. A function \(f(n)\) is in \(\omega(g(n))\) if, for any given positive constant \(c\), there exists an integer constant \(n_0 \geq 1\) such that:
Substituting the given functions:
Multiply both sides by the denominators \(2n^{4} − 11n^{3} + 12n^{2} + 4n + 45\) and \(n^{2}\):
Expand the expression, the inequality becomes:
Move all terms to left side of the inequality and combine like terms:
For the inequality to hold for sufficiently large \(n\), the dominant term \((5 - 2c)n^{4}\) must be positive or zero. If the dominant term \((5 - 2c)n^{4}\) becomes zero, the term \((11c − 2)n^{3}\) can still satisfy the inequality for large \(n\). Therefore:
This means that for the inequality to hold for large enough \(n\), \(c\) must be less than or equal to \(2.5\). However, \(\omega(g(n))\) requires the inequality to hold for any positive \(c\), no matter how large.
Since the inequality only holds when \(c \leq 2.5\), \(f(n)\) cannot exceed \(c \cdot g(n)\) for arbitrarily large values of \(c\). Therefore, \(f(n)\) does not satisfy the condition for \(\omega(g(n))\) because there exists an upper bound on \(c\) (namely \(2.5\)).
Hence, \(f(n) = \frac{5n^{2} − 2n − 35}{2n^{4} − 11n^{3} + 12n^{2} + 4n + 45} \notin \omega(g(n) = \frac{1}{n^{2}})\).
Solution 2
To determine whether \(f(n) = \frac{5n^{2} − 2n − 35}{2n^{4} − 11n^{3} + 12n^{2} + 4n + 45}\) belongs to \(\omega(g(n))\) where \(g(n) = \frac{1}{n^{2}}\) using limits, we need to evaluate the following limit:
If the limit is \(\infty\), then \(f(n) \in \omega(g(n))\). Otherwise, \(f(n) \notin \omega(g(n))\).
Given \(f(n) = \frac{5n^{2} − 2n − 35}{2n^{4} − 11n^{3} + 12n^{2} + 4n + 45}\) and \(g(n) = \frac{1}{n^{2}}\), we need to evaluate:
Simplify the fraction:
Applying L'Hôpital's rule:
Applying L'Hôpital's rule again:
Applying L'Hôpital's rule again:
Applying L'Hôpital's rule again:
Since the limit is \(\frac{5}{2}\) (and not \(\infty\)), the function \(f(n) = \frac{5n^{2} − 2n − 35}{2n^{4} − 11n^{3} + 12n^{2} + 4n + 45}\) does not grow faster than \(g(n) = \frac{1}{n^{2}}\). Therefore, \(f(n) \notin \omega(g(n))\).
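The same limit can also be confirmed symbolically; a minimal sketch using SymPy (assuming the library is available):

```python
import sympy as sp

n = sp.symbols('n', positive=True)
f = (5 * n**2 - 2 * n - 35) / (2 * n**4 - 11 * n**3 + 12 * n**2 + 4 * n + 45)
g = 1 / n**2

# A finite limit of f(n)/g(n) as n -> infinity rules out f in omega(g).
print(sp.limit(f / g, n, sp.oo))  # 5/2
```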
Big-Theta Notation
Big Theta notation represents both the upper and lower bounds of the growth rate of a function. It provides a tight bound on the time or space complexity of an algorithm.
Let \(f(n)\) and \(g(n)\) be functions that map positive integers to positive real numbers. We say that \(f(n)\) is \(\Theta(g(n))\) (or \(f(n) \in \Theta(g(n))\)) if and only if \(f(n) \in O(g(n))\) and \(f(n) \in \Omega(g(n))\).
Example
Let \(f(n) = 7n + 8\) and \(g(n) = n\). Is \(f(n)\) in \(\Theta(g(n))\)?
Solution
To determine whether \(f(n) = 7n + 8\) is in \(\Theta(g(n))\) where \(g(n) = n\), we need to check if \(f(n)\) grows at the same rate as \(g(n)\) asymptotically. A function \(f(n)\) is in \(\Theta(g(n))\) if there exist positive real constants \(c_1 \gt 0\), \(c_2 \gt 0\) and there exists an integer constant \(n_0 \geq 1\) such that:
We need to find \(c_1\), \(c_2\) such that:
To show \(f(n)\) is in \(O(g(n))\), we need to find \(c_2\) such that:
Move all terms to right side of the inequality and combine like terms:
For the inequality to hold for sufficiently large \(n\), the dominant term \((c_2 - 7)n\) must be positive. Therefore:
We can choose \(c_2 = 8\). Then, the inequality becomes:
Solve for \(n\):
Thus, we can choose \(c_2 = 8\) and it holds for \(n \geq 8\).
To show \(f(n)\) is in \(\Omega(g(n))\), we need to find \(c_1\) such that:
Move all terms to left side of the inequality and combine like terms:
For the inequality to hold for sufficiently large \(n\), the dominant term \((7 - c_1)n\) must be positive or zero. If the dominant term \((7 - c_1)n\) becomes zero, the term \(8\) can still satisfy the inequality for large \(n\). Therefore:
We can choose \(c_1 = 7\). Then, the inequality becomes:
Simplify the inequality:
Thus, we can choose \(c_1 = 7\) and it holds for all \(n \geq 1\).
To satisfy both conditions for \(n \geq n_0\), we can choose \(c_1 = 7\), \(c_2 = 8\), \(n_0 = 8\).
Hence, \(f(n) = 7n + 8 \in \Theta(g(n) = n)\).
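As a quick sanity check of the chosen constants (a sketch, not part of the proof), a short loop can confirm \(7n \leq 7n + 8 \leq 8n\) over a range of \(n \geq 8\):

```python
# Verify c1*n <= f(n) <= c2*n with c1 = 7, c2 = 8, n0 = 8.
assert all(7 * n <= 7 * n + 8 <= 8 * n for n in range(8, 10_001))
print("bounds hold for n in [8, 10000]")
```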
Example
Let \(f(n) = n^{3} + 20n + 1\) and \(g(n) = n^{3}\). Is \(f(n)\) in \(\Theta(g(n))\)?
Solution
To determine whether \(f(n) = n^{3} + 20n + 1\) is in \(\Theta(g(n))\) where \(g(n) = n^{3}\), we need to check if \(f(n)\) grows at the same rate as \(g(n)\) asymptotically. A function \(f(n)\) is in \(\Theta(g(n))\) if there exist positive real constants \(c_1 \gt 0\), \(c_2 \gt 0\) and there exists an integer constant \(n_0 \geq 1\) such that:
We need to find \(c_1\), \(c_2\) such that:
To show \(f(n)\) is in \(O(g(n))\), we need to find \(c_2\) such that:
Move all terms to right side of the inequality and combine like terms:
For the inequality to hold for sufficiently large \(n\), the dominant term \((c_2 - 1)n^{3}\) must be positive. Therefore:
We can choose \(c_2 = 2\). Then, the inequality becomes:
Simplify the inequality:
To estimate \(n\) for which \(n^{3} - 20n - 1 \geq 0\) holds, we need to find the point where the positive terms (\(n^{3}\)) become dominant over the negative terms (\(20n\) and \(1\)). We can compare terms based on their degree and the magnitude of their coefficients.
\(n^{3}\) is the highest-degree positive term and \(20n\) is the highest-degree negative term. We compare \(n^{3}\) with \(20n\) to estimate \(n\).
So, \(n^{3}\) starts to dominate \(20n\) when \(n\) is \(5\) or larger.
Let's approach it by testing positive values for \(n\) such that \(n^{3} - 20n - 1 \geq 0\) holds.
For \(n = 5\):
Thus, we can choose \(c_2 = 2\) and it holds for \(n \geq 5\).
To show \(f(n)\) is in \(\Omega(g(n))\), we need to find \(c_1\) such that:
Move all terms to left side of the inequality and combine like terms:
For the inequality to hold for sufficiently large \(n\), the dominant term \((1 - c_1)n^{3}\) must be positive or zero. If the dominant term \((1 - c_1)n^{3}\) becomes zero, the term \(20n\) can still satisfy the inequality for large \(n\). Therefore:
We can choose \(c_1 = 1\). Then, the inequality becomes:
Simplify the inequality:
The inequality holds for all \(n \geq 1\).
Thus, we can choose \(c_1 = 1\) and it holds for all \(n \geq 1\).
To satisfy both conditions, we can choose \(c_1 = 1\), \(c_2 = 2\), and \(n_0 = 5\).
Hence, \(f(n) = n^{3} + 20n + 1 \in \Theta(g(n) = n^{3})\).
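The chosen constants can be spot-checked numerically in the same spirit (a sketch, not part of the proof):

```python
# Verify c1*n^3 <= f(n) <= c2*n^3 with c1 = 1, c2 = 2, n0 = 5.
def f(n: int) -> int:
    return n**3 + 20 * n + 1

assert all(n**3 <= f(n) <= 2 * n**3 for n in range(5, 10_001))
print("bounds hold for n in [5, 10000]")
```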
Example
Prove that running time \(f(n) = 2^{n+1}\) is \(\Theta(2^{n})\)
Solution
To determine whether \(f(n) = 2^{n+1}\) is in \(\Theta(g(n))\) where \(g(n) = 2^{n}\), we need to check if \(f(n)\) grows at the same rate as \(g(n)\) asymptotically. A function \(f(n)\) is in \(\Theta(g(n))\) if there exist positive real constants \(c_1 \gt 0\), \(c_2 \gt 0\) and there exists an integer constant \(n_0 \geq 1\) such that:
We need to find \(c_1\), \(c_2\) such that:
To show \(f(n)\) is in \(O(g(n))\), we need to find \(c_2\) such that:
Divide both sides by \(2^{n}\):
The inequality holds for all \(n \geq 1\).
We can choose \(c_2 = 2\) and it holds for all \(n \geq 1\).
To show \(f(n)\) is in \(\Omega(g(n))\), we need to find \(c_1\) such that:
Divide both sides by \(2^{n}\):
The inequality holds for all \(n \geq 1\).
We can choose \(c_1 = 2\) and it holds for all \(n \geq 1\).
To satisfy both conditions, we can choose \(c_1 = 2\), \(c_2 = 2\), \(n_0 = 1\).
Hence, \(f(n) = 2^{n+1} \in \Theta(g(n) = 2^{n})\).
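In compact form, the observation underlying both bounds is:
\[
2^{n+1} = 2 \cdot 2^{n}, \qquad \text{so} \qquad 2 \cdot 2^{n} \leq 2^{n+1} \leq 2 \cdot 2^{n} \quad \text{for all } n \geq 1.
\]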
Example
Prove that running time \(f(n) = 10^{80}\) is \(\Theta(1)\)
Solution
To determine whether \(f(n) = 10^{80}\) is in \(\Theta(g(n))\) where \(g(n) = 1\), we need to check if \(f(n)\) grows at the same rate as \(g(n)\) asymptotically. A function \(f(n)\) is in \(\Theta(g(n))\) if there exist positive real constants \(c_1 \gt 0\), \(c_2 \gt 0\) and there exists an integer constant \(n_0 \geq 1\) such that:
We need to find \(c_1\), \(c_2\) such that:
To show \(f(n)\) is in \(O(g(n))\), we need to find \(c_2\) such that:
Simplify the inequality:
The inequality holds for all \(n \geq 1\).
We can choose \(c_2 = 10^{80}\) and it holds for all \(n \geq 1\).
To show \(f(n)\) is in \(\Omega(g(n))\), we need to find \(c_1\) such that:
Simplify the inequality:
The inequality holds for all \(n \geq 1\).
We can choose \(c_1 = 10^{80}\) and it holds for all \(n \geq 1\).
To satisfy both conditions, we can choose \(c_1 = 10^{80}\), \(c_2 = 10^{80}\), and \(n_0 = 1\).
Hence, \(f(n) = 10^{80} \in \Theta(g(n) = 1)\).
Example
Prove that running time \(f(n) = \log_{\ln(5)}(\log^{\log(100)}(n))\) is \(\Theta(\log(\log(n)))\)
Solution
Simplify the expression using the change of base formula, \(\log_{a}(x) = \frac{\log_{b}(x)}{\log_{b}(a)}\):
The expression inside the logarithm can be simplified using the logarithm power rule, \(\log (x^{y}) = y \cdot \log(x)\):
So, the original expression becomes:
To determine whether \(f(n) = \log_{\ln(5)}(\log^{\log(100)}(n))\) is in \(\Theta(g(n))\) where \(g(n) = \log(\log(n))\), we need to check if \(f(n)\) grows at the same rate as \(g(n)\) asymptotically. A function \(f(n)\) is in \(\Theta(g(n))\) if there exist positive real constants \(c_1 \gt 0\), \(c_2 \gt 0\) and there exists an integer constant \(n_0 \geq 1\) such that:
We need to find \(c_1\), \(c_2\) such that:
To show \(f(n)\) is in \(O(g(n))\), we need to find \(c_2\) such that:
Simplify the expression \(\log_{\ln(5)}(\log^{\log(100)}(n))\):
Divide both sides by \(\log(\log(n))\):
The inequality holds for all \(n \geq 1\).
We can choose \(c_2 = \frac{\log(100)}{\ln(5)}\) and it holds for all \(n \geq 1\).
To show \(f(n)\) is in \(\Omega(g(n))\), we need to find \(c_1\) such that:
Simplify the expression \(\log_{\ln(5)}(\log^{\log(100)}(n))\):
Divide both sides by \(\log(\log(n))\):
The inequality holds for all \(n \geq 1\).
We can choose \(c_1 = \frac{\log(100)}{\ln(5)}\) and it holds for all \(n \geq 1\).
To satisfy both conditions, we can choose \(c_1 = \frac{\log(100)}{\ln(5)}\), \(c_2 = \frac{\log(100)}{\ln(5)}\), and \(n_0 = 1\).
Hence, \(f(n) = \log_{\ln(5)}(\log^{\log(100)}(n)) \in \Theta(g(n) = \log(\log(n)))\).
Example
Prove that running time \(f(n) = \binom{n}{\frac{n}{2}}\) is \(\Theta(\frac{2^{n}}{\sqrt{n}})\).
Solution
Stirling's approximation states:
The binomial coefficient can be expressed as:
Applying Stirling's approximation:
Thus:
Simplifying this expression:
Simplify further:
To determine whether \(f(n) = \binom{n}{\frac{n}{2}}\) is in \(\Theta(g(n))\) where \(g(n) = \frac{2^{n}}{\sqrt{n}}\), we need to check if \(f(n)\) grows at the same rate as \(g(n)\) asymptotically. A function \(f(n)\) is in \(\Theta(g(n))\) if there exist positive real constants \(c_1 \gt 0\), \(c_2 \gt 0\) and there exists an integer constant \(n_0 \geq 1\) such that:
We need to find \(c_1\), \(c_2\) such that:
To show \(f(n)\) is in \(O(g(n))\), we need to find \(c_2\) such that:
Simplify the expression \(\binom{n}{\frac{n}{2}}\) using Stirling's approximation:
Divide both sides by \(\frac{2^{n}}{\sqrt{n}}\):
The inequality holds for all \(n \geq 1\).
We can choose \(c_2 = \frac{\sqrt{2}}{\sqrt{\pi}}\) and it holds for all \(n \geq 1\).
To show \(f(n)\) is in \(\Omega(g(n))\), we need to find \(c_1\) such that:
Simplify the expression \(\binom{n}{\frac{n}{2}}\) using Stirling's approximation:
Divide both sides by \(\frac{2^{n}}{\sqrt{n}}\):
The inequality holds for all \(n \geq 1\).
We can choose \(c_1 = \frac{\sqrt{2}}{\sqrt{\pi}}\) and it holds for all \(n \geq 1\).
To satisfy both conditions, we can choose \(c_1 = \frac{\sqrt{2}}{\sqrt{\pi}}\), \(c_2 = \frac{\sqrt{2}}{\sqrt{\pi}}\), and \(n_0 = 1\).
Hence, \(f(n) = \binom{n}{\frac{n}{2}} \in \Theta(g(n) = \frac{2^{n}}{\sqrt{n}})\).
Example
Prove that running time \(f(n) = \frac{4n^{3} − 28n^{2} + 56n − 35}{2n^{2} − 11n + 12}\) is \(\Theta(n)\)
Solution
To determine whether \(f(n) = \frac{4n^{3} − 28n^{2} + 56n − 35}{2n^{2} − 11n + 12}\) is in \(\Theta(g(n))\) where \(g(n) = n\), we need to check if \(f(n)\) grows at the same rate as \(g(n)\) asymptotically. A function \(f(n)\) is in \(\Theta(g(n))\) if there exist positive real constants \(c_1 \gt 0\), \(c_2 \gt 0\) and there exists an integer constant \(n_0 \geq 1\) such that:
We need to find \(c_1\), \(c_2\) such that:
To show \(f(n)\) is in \(O(g(n))\), we need to find \(c_2\) such that:
Multiply both sides by the denominator \(2n^{2} − 11n + 12\):
Expand the right-hand side, the inequality becomes:
Move all terms to right side of the inequality and combine like terms:
For the inequality to hold for sufficiently large n, the dominant term \((2c_2 - 4)n^{3}\) must be positive or zero. If the dominant term \((2c_2 - 4)n^{3}\) becomes zero, the term \((28 - 11c_2)n^{2}\) can still satisfy the inequality for large \(n\). Therefore:
We can choose \(c_2 = 2\). Then, the inequality becomes:
Simplify the inequality:
To estimate \(n\) for which \(6n^{2} - 32n + 35 \geq 0\) holds, we need to find the point where the positive terms (\(6n^{2}\) and \(35\)) become dominant over the negative terms (\(32n\)). We can compare terms based on their degree and the magnitude of their coefficients.
\(6n^{2}\) is the highest-degree positive term and \(32n\) is the highest-degree negative term. We compare \(6n^{2}\) with \(32n\) to estimate \(n\).
So, \(6n^{2}\) starts to dominate \(32n\) when \(n\) is \(6\) or larger.
Let's approach it by testing positive values for \(n\) such that \(6n^{2} - 32n + 35 \geq 0\) holds.
For \(n = 6\):
For \(n = 5\):
For \(n = 4\):
For \(n = 3\):
Thus, we can choose \(c_2 = 2\) and \(n \geq 4\).
To show \(f(n)\) is in \(\Omega(g(n))\), we need to find \(c_1\) such that:
Multiply both sides by the denominator \(2n^{2} − 11n + 12\):
Expand the left-hand side, the inequality becomes:
Move all terms to left side of the inequality and combine like terms:
For the inequality to hold for sufficiently large \(n\), the dominant term \((4 - 2c_1)n^{3}\) must be positive. Therefore:
We can choose \(c_1 = 1\). Then, the inequality becomes:
Simplify the inequality:
To estimate \(n\) for which \(2n^{3} - 17n^{2} + 44n - 35 \geq 0\) holds, we need to find the point where the positive terms (\(2n^{3}\) and \(44n\)) become dominant over the negative terms (\(17n^{2}\) and \(35\)). We can compare terms based on their degree and the magnitude of their coefficients.
\(2n^{3}\) is the highest-degree positive term and \(17n^{2}\) is the highest-degree negative term. We compare \(2n^{3}\) with \(17n^{2}\) to estimate \(n\).
So, \(2n^{3}\) starts to dominate \(17n^{2}\) when \(n\) is \(9\) or larger.
Let's approach it by testing positive values for \(n\) such that \(2n^{3} - 17n^{2} + 44n - 35 \geq 0\) holds.
For \(n = 9\):
For \(n = 8\):
For \(n = 6\):
For \(n = 4\):
For \(n = 5\):
Thus, we can choose \(c_1 = 1\) and \(n \geq 5\).
To satisfy both conditions, we can choose \(c_1 = 1\), \(c_2 = 2\), and \(n_0 = 5\).
Hence, \(f(n) = \frac{4n^{3} − 28n^{2} + 56n − 35}{2n^{2} − 11n + 12} \in \Theta(g(n) = n)\).
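As a quick numerical sanity check of the chosen constants (a sketch, not part of the proof; note the denominator \(2n^{2} − 11n + 12\) is positive for \(n \geq 5\)):

```python
def f(n: float) -> float:
    return (4 * n**3 - 28 * n**2 + 56 * n - 35) / (2 * n**2 - 11 * n + 12)

# Verify c1*n <= f(n) <= c2*n with c1 = 1, c2 = 2, n0 = 5.
assert all(n <= f(n) <= 2 * n for n in range(5, 10_001))
print("bounds hold for n in [5, 10000]")
```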
Example
Prove that running time \(f(n) = \frac{5n^{2} − 2n − 35}{2n^{4} − 11n^{3} + 12n^{2} + 4n + 45}\) is \(\Theta(\frac{1}{n^{2}})\)
Solution
To determine whether \(f(n) = \frac{5n^{2} − 2n − 35}{2n^{4} − 11n^{3} + 12n^{2} + 4n + 45}\) is in \(\Theta(g(n))\) where \(g(n) = \frac{1}{n^{2}}\), we need to check if \(f(n)\) grows at the same rate as \(g(n)\) asymptotically. A function \(f(n)\) is in \(\Theta(g(n))\) if there exist positive real constants \(c_1 \gt 0\), \(c_2 \gt 0\) and there exists an integer constant \(n_0 \geq 1\) such that:
We need to find \(c_1\), \(c_2\) such that:
To show \(f(n)\) is in \(O(g(n))\), we need to find \(c_2\) such that:
Multiply both sides by the denominator \(2n^{4} − 11n^{3} + 12n^{2} + 4n + 45\) and \(n^{2}\):
Expand the expression, the inequality becomes:
Move all terms to right side of the inequality and combine like terms:
For the inequality to hold for sufficiently large \(n\), the dominant term \((2c_2 - 5)n^{4}\) must be positive. Therefore:
We can choose \(c_2 = 3\). Then, the inequality becomes:
Simplify the inequality:
To estimate \(n\) for which \(n^{4} − 31n^{3} + 71n^{2} + 12n + 135 \geq 0\) holds, we need to find the point where the positive terms (\(n^{4}\), \(71n^{2}\), \(12n\) and \(135\)) become dominant over the negative terms (\(31n^{3}\)). We can compare terms based on their degree and the magnitude of their coefficients.
\(n^{4}\) is the highest-degree positive term and \(31n^{3}\) is the highest-degree negative term. We compare \(n^{4}\) with \(31n^{3}\) to estimate \(n\).
So, \(n^{4}\) starts to dominate \(31n^{3}\) when \(n\) is \(31\) or larger.
Let's approach it by testing positive values for \(n\) such that \(n^{4} − 31n^{3} + 71n^{2} + 12n + 135 \geq 0\) holds.
For \(n = 31\):
For \(n = 29\):
For \(n = 27\):
For \(n = 28\):
Thus, we can choose \(c_2 = 3\) and \(n \geq 29\).
To show \(f(n)\) is in \(\Omega(g(n))\), we need to find \(c_1\) such that:
Multiply both sides by the denominator \(2n^{4} − 11n^{3} + 12n^{2} + 4n + 45\) and \(n^{2}\):
Expand the expression, the inequality becomes:
Move all terms to left side of the inequality and combine like terms:
For the inequality to hold for sufficiently large \(n\), the dominant term \((5 - 2c_1)n^{4}\) must be positive or zero. If the dominant term \((5 - 2c_1)n^{4}\) becomes zero, the term \((11c_1 - 2)n^{3}\) can still satisfy the inequality for large \(n\). Therefore:
We can choose \(c_1 = 2\). Then, the inequality becomes:
Simplify the inequality:
To estimate \(n\) for which \(n^{4} + 20n^{3} - 59n^{2} - 8n - 90 \geq 0\) holds, we need to find the point where the positive terms (\(n^{4}\) and \(20n^{3}\)) become dominant over the negative terms (\(59n^{2}\), \(8n\) and \(90\)). We can compare terms based on their degree and the magnitude of their coefficients.
\(n^{4}\) is the highest-degree positive term and \(59n^{2}\) is the highest-degree negative term. We compare \(n^{4}\) with \(59n^{2}\) to estimate \(n\).
So, \(n^{4}\) starts to dominate \(59n^{2}\) when \(n\) is \(8\) or larger.
Let's approach it by testing positive values for \(n\) such that \(n^{4} + 20n^{3} - 59n^{2} - 8n - 90 \geq 0\) holds.
For \(n = 8\):
For \(n = 6\):
For \(n = 4\):
For \(n = 3\):
Thus, we can choose \(c_1 = 2\) and \(n \geq 4\).
To satisfy both conditions, we can choose \(c_1 = 2\), \(c_2 = 3\), and \(n_0 = 29\).
Hence, \(f(n) = \frac{5n^{2} − 2n − 35}{2n^{4} − 11n^{3} + 12n^{2} + 4n + 45} \in \Theta(g(n) = \frac{1}{n^{2}})\).
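The same kind of numerical spot check works here (a sketch, not part of the proof):

```python
def f(n: float) -> float:
    return (5 * n**2 - 2 * n - 35) / (2 * n**4 - 11 * n**3 + 12 * n**2 + 4 * n + 45)

# Verify c1/n^2 <= f(n) <= c2/n^2 with c1 = 2, c2 = 3, n0 = 29.
assert all(2 / n**2 <= f(n) <= 3 / n**2 for n in range(29, 10_001))
print("bounds hold for n in [29, 10000]")
```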
Example
Let \(f(n) = \log(n!)\) and \(g(n) = n \log(n)\). Is \(f(n)\) in \(\Theta(g(n))\)?
Solution
Stirling's approximation for \(n!\) is given by:
Taking the logarithm of both sides:
Using the properties of logarithms, we can expand the expression:
Simplify each term:
To determine whether \(f(n) = \log(n!)\) is in \(\Theta(g(n))\) where \(g(n) = n \log(n)\), we need to check if \(f(n)\) grows at the same rate as \(g(n)\) asymptotically. A function \(f(n)\) is in \(\Theta(g(n))\) if there exist positive real constants \(c_1 \gt 0\), \(c_2 \gt 0\) and there exists an integer constant \(n_0 \geq 1\) such that:
We need to find \(c_1\), \(c_2\) such that:
To show \(f(n)\) is in \(O(g(n))\), we need to find \(c_2\) such that:
Simplify the expression \(\log(n!)\) using Stirling's approximation:
Move all terms to right side of the inequality and combine like terms:
For the inequality to hold for sufficiently large \(n\), the dominant term \((c_2 - 1)n \log(n)\) must be positive or zero. If the dominant term \((c_2 - 1)n \log(n)\) becomes zero, the term \(n \log(e)\) can still satisfy the inequality for large \(n\). Therefore:
We can choose \(c_2 = 1\). Then, the inequality becomes:
Simplify the inequality:
To estimate \(n\) for which \(0.4343n - 0.5 \log(n) - 0.39905 \geq 0\) holds, we need to find the point where the positive terms (\(0.4343n\)) become dominant over the negative terms (\(0.5 \log(n)\) and \(0.39905\)). We can compare terms based on their degree and the magnitude of their coefficients.
\(0.4343n\) is the highest-degree positive term and \(0.5 \log(n)\) is the highest-degree negative term. We compare \(0.4343n\) with \(0.5 \log(n)\) to estimate \(n\).
Divide both sides by \(0.5\):
The inequality holds for all \(n \geq 1\). Thus, we can choose \(c_2 = 1\) and it holds for all \(n \geq 1\).
To show \(f(n)\) is in \(\Omega(g(n))\), we need to find \(c_1\) such that:
Simplify the expression \(\log(n!)\) using Stirling's approximation:
Move all terms to left side of the inequality and combine like terms:
For the inequality to hold for sufficiently large \(n\), the dominant term \((1 - c_1)n \log(n)\) must be positive. Therefore:
We can choose \(c_1 = 0.5\). Then, the inequality becomes:
Simplify the inequality:
To estimate \(n\) for which \(0.5n \log(n) - 0.4343n + 0.5 \log(n) + 0.39905 \geq 0\) holds, we need to find the point where the positive terms (\(0.5n \log(n)\), \(0.5 \log(n)\), and \(0.39905\)) become dominant over the negative terms (\(0.4343n\)). We can compare terms based on their degree and the magnitude of their coefficients.
\(0.5n \log(n)\) is the highest-degree positive term and \(0.4343n\) is the highest-degree negative term. We compare \(0.5n \log(n)\) with \(0.4343n\) to estimate \(n\).
Divide both sides by \(0.5n\):
Exponentiate both sides to remove the logarithm:
So, \(0.5n \log(n)\) starts to dominate \(0.4343n\) when \(n\) is \(8\) or larger.
Let's approach it by testing positive values for \(n\) such that \(0.5n \log(n) - 0.4343n + 0.5 \log(n) + 0.39905 \geq 0\) holds.
For \(n = 8\):
For \(n = 7\):
For \(n = 5\):
For \(n = 3\):
For \(n = 2\):
To satisfy both conditions, we can choose \(c_1 = 0.5\), \(c_2 = 1\), and \(n_0 = 2\).
Hence, \(f(n) = \log(n!) \in \Theta(g(n) = n \log(n))\).
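As an illustrative check of the chosen constants (a sketch, not part of the proof), Python's math.lgamma gives \(\ln(n!)\) directly, and the bounds are independent of the logarithm base. The case \(n = 2\) holds with exact equality on the lower side (\(\log(2!) = 0.5 \cdot 2 \cdot \log(2)\)), so the loop starts at \(n = 3\) to avoid a floating-point tie:

```python
import math

# Verify c1 * n * log(n) <= log(n!) <= c2 * n * log(n)
# with c1 = 0.5, c2 = 1, using natural logarithms.
assert all(
    0.5 * n * math.log(n) <= math.lgamma(n + 1) <= n * math.log(n)
    for n in range(3, 100_001)
)
print("bounds hold for n in [3, 100000]")
```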