
Section 8.6 Growth of Functions

How long will it take to process the company payroll once we complete our planned merger? Should I buy a new payroll program from vendor X or vendor Y? If a particular program is slow, is it badly implemented or is it solving a hard problem?

Questions like these ask us to consider the difficulty of a problem, or the relative efficiency of two or more approaches to solving a problem.

This section and the next introduce the motivation, basic notation, and fundamental techniques of algorithm analysis. In this section we focus on Big O notation, a mathematical notation that describes the limiting, or asymptotic, behavior of a function as its input values become large. In the next section we will use Big O notation to analyze the running times of different algorithms, a technique known as asymptotic algorithm analysis.

Subsection 8.6.1 Big O Notation

Big O is the most frequently used asymptotic notation. It is often used to give an upper bound on the growth of a function, such as the running time of an algorithm. Big O notation characterizes functions according to their growth rates: different functions with the same growth rate may be represented using the same Big O notation. The letter O is used because the growth rate of a function is also referred to as the order of the function.

Definition 8.6.1. Big O.

Given functions \(f(x),g(x): \mathbb{R} \rightarrow \mathbb{R}\) with \(g\) non-negative, we say \(f(x)\) is Big O \(g(x)\) or:

\begin{equation*} f(x) = O(g(x)) \end{equation*}

if and only if there exist constants \(c \geq 0\) and \(k\) such that for all \(x \geq k\text{:}\)

\begin{equation*} \lvert f(x) \rvert \leq c g(x) \end{equation*}

This definition is rather complicated, but the idea is simple: \(f(x) = O(g(x))\) means \(f(x)\) is less than or equal to \(g(x)\text{,}\) except that we’re willing to ignore a constant factor, namely \(c\text{,}\) and to allow exceptions for small \(x\text{,}\) namely, for \(x < k\text{.}\)
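To make the definition concrete, here is a minimal Sage sketch of a numeric spot-check for a candidate \(c\) and \(k\text{.}\) The helper name check_big_o and the sample range are our own, and a passing check is only evidence, not a proof:

x = var('x')

def check_big_o(f, g, c, k, samples=srange(0, 50)):
    # Spot-check |f(t)| <= c*g(t) at sampled points t >= k.
    # Numeric evidence only: a real proof must cover ALL x >= k.
    return all(abs(f(x=t)) <= c*g(x=t) for t in samples if t >= k)

check_big_o(x^2 + 1, x^2, c=2, k=1)   # True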

If we choose \(c = 10\) and \(k = 0\text{,}\) then \(10x^2 = O(x^2) \) holds because for all \(x \geq 0,\)

\begin{equation*} \lvert 10x^2 \rvert \leq 10x^2 \text{.} \end{equation*}

In the following SageMath cell, we plot \(f(x) = 10x^2\text{,}\) \(g(x) = x^2\text{,}\) and \(c\,*\,g(x)\) with \(c = 10\text{.}\) The resulting plot shows that \(f(x)\) (red) never exceeds \(c \,*\, g(x)\) (black dots). That is what we want to show for Big O.
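A minimal Sage sketch of one way to draw this plot (the plotting range and styling are assumptions):

x = var('x')
f = 10*x^2                 # the function we are bounding
g = x^2                    # the comparison function
c = 10                     # the constant from Definition 8.6.1
p = plot(f, (x, 0, 5), color='red')
p += plot(g, (x, 0, 5), color='blue')
p += plot(c*g, (x, 0, 5), color='black', linestyle=':')   # c*g(x), dotted
p.show()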

If we take the polynomial \(x^2 + 100x + 10\) and replace each term by its coefficient multiplied by \(x^2\text{,}\) we have:

\begin{equation*} x^2 + 100x^2 + 10x^2 = 111x^2. \end{equation*}

The original polynomial can be no larger than this new polynomial:

\begin{equation*} x^2 + 100x + 10 \leq x^2 + 100x^2 + 10x^2, \text{ with } x\geq 1\text{.} \end{equation*}

Therefore, if we choose \(c = 111\) and \(k = 1\) we have \(x^2 + 100x + 10 = O(x^2)\text{.}\)

In the following SageMath cell, we plot \(f(x) = x^2 + 100x + 10\text{,}\) \(g(x) = x^2\text{,}\) and \(c\,*\,g(x)\) with \(c = 111\text{.}\) The resulting plot shows that \(f(x)\) (red) never exceeds \(c \,*\, g(x)\) (black dots) after \(x = 1\text{.}\) That is what we want to show for Big O.
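A minimal Sage sketch of this plot (range and styling are assumptions):

x = var('x')
f = x^2 + 100*x + 10
g = x^2
c, k = 111, 1
p = plot(f, (x, 0, 3), color='red')
p += plot(g, (x, 0, 3), color='blue')
p += plot(c*g, (x, 0, 3), color='black', linestyle=':')       # c*g(x), dotted
p += line([(k, 0), (k, c*9)], color='green', linestyle='--')  # mark x = k
p.show()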

Theorem 8.6.4.

Given a polynomial:

\begin{equation*} a_k x^k + a_{k-1}x^{k-1} + \dots + a_1x + a_0 \end{equation*}

if we replace each lower-order power of \(x\) with the highest power \(x^k\text{,}\) then for \(x \geq 1\) and non-negative coefficients \(a_i\) we have:

\begin{equation*} \begin{array}{c c} a_k x^k + a_{k-1}x^{k-1} + \dots + a_1x + a_0 & \leq a_k x^k + a_{k-1}x^k + \dots + a_1x^k + a_0x^k\\ & = (a_k + a_{k-1} + \dots + a_1 + a_0)x^k \end{array} \end{equation*}

Therefore,

\begin{equation*} a_k x^k + a_{k-1}x^{k-1} + \dots + a_1x + a_0 = O(x^k) \end{equation*}

for \(c = (a_k + a_{k-1} + \dots + a_1 + a_0)\) and all \(x \geq 1\text{.}\)
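A quick numeric sanity check of this bound in Sage, using one sample polynomial (the polynomial and test range are our own choices):

x = var('x')
f = 4*x^3 + 2*x^2 + 7*x + 1        # sample polynomial of degree 3
c = 4 + 2 + 7 + 1                  # sum of the coefficients
all(abs(f(x=t)) <= c*t^3 for t in srange(1, 100))   # True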

If we have negative coefficients in a polynomial, Theorem 8.6.4 still applies because Big O as defined in Definition 8.6.1 is an upper bound on the absolute value of \(f(x)\) and \(g(x)\) must be non-negative.

Here we take the absolute value of our polynomial, \(\lvert 6x^4 - 2x^3 + 5 \rvert\text{.}\) Multiplying the absolute value of each coefficient by \(x^4\text{,}\) we have, for \(x \geq 1\text{:}\)

\begin{equation*} \lvert 6x^4 - 2x^3 + 5 \rvert \leq 6x^4 + \lvert -2 \rvert x^4 + 5x^4 = 13x^4. \end{equation*}

Therefore, we have

\begin{equation*} \begin{array}{c c} 6x^4 - 2x^3 + 5 & \leq 13x^4 \\ & = O(x^4) \end{array} \end{equation*}

with \(c = 13\) and \(k = 1\text{.}\)

We can actually get a closer bound if we notice that the \(-2x^3\) term only makes the polynomial smaller:

\begin{equation*} \lvert 6x^4 - 2x^3 + 5 \rvert \leq 6x^4 + 5 \leq 6x^4 + 5x^4 = 11x^4, \text{ for } x \geq 1\text{.} \end{equation*}

Therefore, we can also say that \(6x^4 - 2x^3 + 5 = O(x^4)\) for \(c = 11\) and \(k = 1\text{.}\)

We only need to show that one \(c\) and one \(k\) exist to prove Big O. Once they are found, the bound is also true for any larger values of \(c\) and \(k\text{.}\)

In the following SageMath cell, we plot \(f(x) = 6x^4 - 2x^3 + 5\) (red), \(g(x) = x^4\) (blue), and \(c\,*\,g(x)\) with \(c = 13\) (black dots). The resulting plot shows that \(f(x)\) never exceeds \(c \, g(x)\) after \(x = 1\text{.}\) That is what we want to show for Big O.
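A minimal Sage sketch of this plot (range and styling are assumptions):

x = var('x')
f = 6*x^4 - 2*x^3 + 5
g = x^4
c = 13
p = plot(f, (x, 0, 3), color='red')
p += plot(g, (x, 0, 3), color='blue')
p += plot(c*g, (x, 0, 3), color='black', linestyle=':')   # c*g(x), dotted
p.show()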

\(n! = n * (n-1) * \dots * 2 * 1\) for \(n \geq 1\text{.}\) There are \(n\) factors in total, and since each factor is at most \(n\text{,}\) it must be true that:

\begin{equation*} n * (n-1) * \dots * 2 * 1 \leq n * n * \dots *n * n = n^n \end{equation*}

Therefore, \(n! = O(n^n)\) with \(c = 1\) and \(k = 1\text{.}\)
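A quick Sage spot-check of this bound (the test range is our own choice):

all(factorial(n) <= n^n for n in srange(1, 50))   # True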

Subsection 8.6.2 Big O For Common Functional Forms

In Section 9.3 we will be using several functions to analyze the running time of algorithms. The most commonly encountered functions are shown in Figure 8.6.7 below.

Figure 8.6.7. Growth of functions commonly used for algorithm analysis. Image by Cmglee, CC BY-SA 4.0

In addition to polynomials, we have the following Big O estimates:

  1. Any power of \(n\) is Big O any larger power of \(n\text{,}\) for powers greater than 1:
    \begin{equation*} n^c = O(n^d), 1 < c < d. \end{equation*}
  2. Any power of a logarithm of \(n\) is Big O any positive power of \(n\text{,}\) for log bases greater than 1:
    \begin{equation*} (\log_b n)^c = O(n^d), 0 < c, d. \end{equation*}
  3. Any power of \(n\) is Big O an exponential in \(n\text{:}\)
    \begin{equation*} n^c = O(b^n), 1 < b, 0 < c. \end{equation*}
  4. Any exponential is Big O an exponential of a larger constant:
    \begin{equation*} b^n = O(c^n), 1 < b < c. \end{equation*}
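These estimates describe what happens eventually, which can be surprisingly far out. For instance, \((\log_2 n)^3\) stays above \(n\) for a long stretch before \(n\) overtakes it for good, as this small Sage experiment suggests (the search range is our own choice):

# first n past the initial dip where n >= (log_2 n)^3; prints roughly 982
min(t for t in srange(10, 5000) if log(t, 2).n()^3 <= t)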
Notation | Name | Example
\(O(1)\) | Constant time | No looping; determining if a binary number is even or odd; calculating \((-1)^n\)
\(O(\log n)\) | Logarithmic time | Loops where \(n\) is divided by a constant each iteration; finding an item in a sorted array with a BinarySearch
\(O(n)\) | Linear time | One loop over all \(n\text{;}\) finding an item in an unsorted list or in an unsorted array
\(O(n\log n)=O(\log n!)\) | Linearithmic time | Algorithms that divide \(n\) by a constant but then need to reassemble all \(n\) items in some way; the fastest possible comparison sort; HeapSort and MergeSort
\(O(n^2)\) | Quadratic time | Two nested loops, both over \(n\text{;}\) multiplying two \(n\)-digit numbers by a simple algorithm; simple sorting algorithms such as BubbleSort, SelectionSort, and InsertionSort
\(O(n^c)\) | Polynomial time | \(c\) nested loops over \(n\text{;}\) matrix multiplication using the standard algorithm
\(O(c^n)\text{,}\) \(c>1\) | Exponential time | Recursive algorithms that make two or more recursive calls each time; determining if two logical statements are equivalent using a truth table (\(2^n\) rows)
\(O(n!)\) | Factorial time | Solving the travelling salesman problem via brute-force search; enumerating all partitions of a set
\(O(n^n)\) | \(n\) to the \(n\) | Upper bound for \(n!\) (see the factorial example above)
Table 8.6.8. Orders of Common Functions. Here is a list of orders (or classes) of functions that are commonly encountered when analyzing algorithms. In each case, \(c\) is a positive constant and \(n\) increases without bound. Functions are listed from slowest growing at the top to fastest growing at the bottom.
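To get a feel for how differently these orders grow, the following Sage sketch tabulates a few of them at sample sizes of our own choosing:

for n in srange(10, 40, 10):   # n = 10, 20, 30
    print([n, log(n, 2).n(digits=4), (n*log(n, 2)).n(digits=4), n^2, 2^n, factorial(n)])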

Subsection 8.6.3 Big O For Combinations of Functions

The Big O of multiple functions can be combined. If we have a complex function, we can break it down into simpler parts, find the Big O of each part, and combine them.

  • If two functions are added together, their Big O is the Big O of the larger one.

    • If \(f_1(x) = O(g_1(x))\) and \(f_2(x) = O(g_2(x))\text{:}\)

      \begin{equation*} (f_1+f_2)(x) = O(\max(g_1(x),g_2(x))) \end{equation*}
  • If two functions are multiplied together, their Big O is the product of the Big O of both.

    • If \(f_1(x) = O(g_1(x))\) and \(f_2(x) = O(g_2(x))\text{:}\)

      \begin{equation*} (f_1*f_2)(x) = O(g_1(x)*g_2(x)) \end{equation*}

Here we have two functions multiplied together: \(f(x) = 7x^3\) and \(g(x) = \log_{2}x\text{.}\) We can calculate the Big O of each separately and combine them.

For the first, Theorem 8.6.4 applies, so we have \(f(x) = O(x^3)\text{.}\) For \(g(x)\text{,}\) there are no powers or constants, so the Big O is just \(\log_2 x\text{.}\)

Multiplying the two estimates: \(7x^3\log_{2}x = O(x^3\log_{2}x)\text{.}\)
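To see the product rule at work on a slightly less tidy pair, here is a Sage spot-check with functions and constants of our own choosing: \(f_1(x) = x^2 + 3x = O(x^2)\) and \(f_2(x) = 5\log_2 x + 1 = O(\log_2 x)\text{,}\) so their product should be \(O(x^2 \log_2 x)\text{:}\)

# f1 <= 4x^2 for x >= 1 and f2 <= 6*log_2(x) for x >= 2, so c = 24, k = 2
all((t^2 + 3*t)*(5*log(t, 2).n() + 1) <= 24 * t^2 * log(t, 2).n()
    for t in srange(2, 200))   # True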

Here we have two functions added together: \(f(x) = 9x^{2}\) and \(g(x) = 3^x\text{.}\) We must calculate the Big O of each separately and decide which is larger.

For the first, Theorem 8.6.4 applies, so we have \(f(x) = O(x^2)\text{.}\) For \(g(x)\text{,}\) this is an exponential in \(x\text{,}\) so the Big O is just the function itself: \(3^x\text{.}\) Looking at Figure 8.6.7 we can see that \(3^x\) grows much faster than the polynomial \(x^2\text{,}\) therefore \(9x^{2} + 3^x = O(3^x)\text{.}\)
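A numeric spot-check of this sum bound in Sage, with \(c = 10\) and \(k = 1\) (the constants and test range are our own choices):

all(9*t^2 + 3^t <= 10 * 3^t for t in srange(1, 50))   # True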

Subsection 8.6.4 Big Omega

Big \(\Omega\) (Omega) has almost the same definition as Big O, except that it gives a lower bound on the growth of a function or the running time of an algorithm. It is used to describe the best case running time for an algorithm on a given data size.

Definition 8.6.11. Big \(\Omega\).

Given functions \(f(x),g(x): \mathbb{R} \rightarrow \mathbb{R}\) with \(f\) non-negative, we say \(f(x)\) is Big \(\Omega\) (Omega) \(g(x)\) or:

\begin{equation*} f(x) = \Omega (g(x)) \end{equation*}

to mean

\begin{equation*} g(x) = O(f(x)). \end{equation*}

\(f(x) = \Omega (g(x))\) if and only if there exist constants \(c \geq 0\) and \(k\) such that for all \(x \geq k\text{:}\)

\begin{equation*} c f(x) \geq \lvert g(x) \rvert \end{equation*}

To prove \(f(x) = \Omega(g(x))\text{,}\) we just flip the statement around into Big O form, \(g(x) = O(f(x))\text{,}\) and find a \(c\) and \(k\) for which \(c\,f(x)\) is greater than or equal to \(\lvert g(x) \rvert\text{.}\) For example, \(x^3 = \Omega(x^2)\) because \(x^2 = O(x^3)\text{:}\) we have \(x^2 \leq x^3\) for all \(x \geq 1\text{,}\) so \(c = 1\) and \(k = 1\) work.

Subsection 8.6.5 Big Theta

Big \(\Theta\) (Theta) combines Big O and Big \(\Omega\) to specify that a function (or the running time of an algorithm) grows at the same rate as some other function, to within constant factors. We take the definitions of both Big O and Big \(\Omega\) to give both an upper and a lower bound. It is used to describe both the best and worst case running times for an algorithm on a given data size.

Definition 8.6.12. Big \(\Theta\).

Given functions \(f(x),g(x): \mathbb{R} \rightarrow \mathbb{R}\) with \(f,g\) non-negative, we say \(f(x)\) is Big \(\Theta\) (Theta) \(g(x)\) or:

\begin{equation*} f(x) = \Theta (g(x)) \end{equation*}

if and only if:

\begin{equation*} f(x) = O(g(x)) \end{equation*}

and

\begin{equation*} g(x) = O(f(x)). \end{equation*}

\(f(x) = \Theta (g(x))\) means \(f(x)\) and \(g(x)\) are equal to within a constant factor.

To prove: \(f(x) = \Theta(g(x))\) we must show \(f(x) = O(g(x))\) for some \(c_1\) and \(k_1\) and then flip the equation around and show: \(g(x) = O(f(x))\) for some \(c_2\) and \(k_2\text{.}\)

In the following SageMath cell, we plot \(f(x) = x^4 + 2^x\) (red). This is a sum of two functions, the larger being the exponential \(2^x\text{,}\) so it should be \(\Theta(2^x)\text{.}\) We show this by using \(g(x) = 2^x\) and plotting both \(c_1\,*\,g(x)\) with \(c_1 = 20\text{,}\) to show \(f(x) = O(2^x)\text{,}\) and \(c_2\,*\,g(x)\) with \(c_2 = 1\text{,}\) to show \(f(x) = \Omega(2^x)\) (black dots). The resulting plot shows that \(f(x)\) is bounded above by \(c_1 \,*\, g(x)\) after \(x \approx 7\) and is always bounded below by \(c_2 \,*\, g(x)\text{.}\) That is what we want to show for Big \(\Theta\text{.}\)
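A minimal Sage sketch of this plot (the range, styling, and log-scaled \(y\)-axis are assumptions):

x = var('x')
f = x^4 + 2^x
g = 2^x
c1, c2 = 20, 1
p = plot(f, (x, 0, 16), color='red')
p += plot(c1*g, (x, 0, 16), color='black', linestyle=':')   # upper bound c1*g(x)
p += plot(c2*g, (x, 0, 16), color='black', linestyle=':')   # lower bound c2*g(x)
p.show(scale='semilogy')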

Exercises 8.6.6 Exercises for Section 8.6

1.

Show that \(f(x) = 8x^3 - 5x^2 + 7x + 13 \) is \(\Theta(x^3)\text{.}\) Find \(c_1, c_2, k_1, k_2\text{.}\)

Answer

For all \(x \geq 1\text{:}\)

\begin{equation*} \begin{array}{c c} 8x^3 - 5x^2 + 7x + 13 & \leq 8x^3 + 7x + 13\\ & \leq 8x^3 + 7x^3 + 13x^3\\ & = 28x^3 \end{array} \end{equation*}

So \(8x^3 - 5x^2 + 7x + 13\) is \(O(x^3)\) with \(c_1 = 28, k_1 = 1\text{.}\)

\begin{equation*} \begin{array}{c c} 8x^3 - 5x^2 + 7x + 13 & \geq 8x^3 - 5x^2\\ & \geq 8x^3 - x^3, \text{ since } 5x^2 \leq x^3 \text{ for } x > 5\\ & = 7x^3 \end{array} \end{equation*}

So \(8x^3 - 5x^2 + 7x + 13\) is \(\Omega(x^3)\) with \(c_2 = 7, k_2 = 5\text{.}\) Therefore \(8x^3 - 5x^2 + 7x + 13\) is \(\Theta(x^3)\text{.}\)
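A quick Sage spot-check of the constants found above (the test ranges are our own choices):

print(all(8*t^3 - 5*t^2 + 7*t + 13 <= 28*t^3 for t in srange(1, 100)))   # True: c1 = 28, k1 = 1
print(all(8*t^3 - 5*t^2 + 7*t + 13 >= 7*t^3 for t in srange(6, 100)))    # True: c2 = 7, k2 = 5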

2.

Show that \(f(x) = 10x^2 + x - 22 \) is \(\Theta(x^2)\text{.}\) Find \(c_1, c_2, k_1, k_2\text{.}\)

3.

Show that \(f(x) = x^5 \neq O(x^4)\text{.}\)

Answer
Suppose \(x^5 = O(x^4)\text{.}\) Then there are constants \(c\) and \(k\) such that for all \(x \geq k\text{:}\)

\begin{equation*} \begin{array}{c c} x^5 & \leq c\,x^4\\ \frac{x^5}{x^4} & \leq \frac{c\,x^4}{x^4}\\ x & \leq c\\ \end{array} \end{equation*}

This is false, because by the definition of Big O the inequality must hold for all \(x \geq k\text{,}\) yet \(x\) grows without bound while \(c\) is a fixed constant. Therefore \(x^5 \neq O(x^4)\text{.}\)

4.

For each pair of functions below, determine whether \(f(x) = O(g(x))\) and whether \(g(x) = O(f(x))\text{.}\) If one function is Big O of the other, give values for \(c\) and \(k\text{.}\)

  1. \(f(x) = x^2, g(x) = 3x.\)
  2. \(f(x) = \frac{3x - 7}{x+4}, g(x) = 4.\)
  3. \(f(x) = 2^x, g(x) = 2^{\frac{x}{2}}.\)
  4. \(f(x) = \log (n!), g(x) = \log (n^n).\)
  5. \(f(x) = x^{10}, g(x) = 10^x.\)
5.

Determine whether each of the functions: \(\log(3n + 2)\) and \(\log(n^3 + 1)\) is \(O(\log n)\) by finding a valid \(c\) and \(k\) for each, or showing they do not exist.

Answer
\begin{equation*} \begin{array}{c c} \log(3n + 2) & \leq \log(3n + n), \forall n > 2\\ & = \log(4n)\\ & = \log 4 + \log n\\ & \leq \log n + \log n, \text{ if } n > 4.\\ & = 2 \log n. \end{array} \end{equation*}

Therefore \(\log(3n + 2)\) is \(O(\log n)\) with \(c = 2, k = 4\text{.}\)

\begin{equation*} \begin{array}{c c} \log(n^3 + 1) & \leq \log(n^3 + n^3), \forall n > 1\\ & = \log(2n^3)\\ & = \log 2 + \log n^3\\ & = \log 2 + 3\log n\\ & \leq 4 \log n, \text{ if } n > 2. \end{array} \end{equation*}

Therefore \(\log(n^3 + 1)\) is \(O(\log n)\) with \(c = 4, k = 2\text{.}\)
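A quick Sage spot-check of both bounds, using the natural log (any fixed base works; the test ranges are our own choices):

print(all(log(3*t + 2).n() <= 2*log(t).n() for t in srange(5, 1000)))   # True: c = 2, k = 4
print(all(log(t^3 + 1).n() <= 4*log(t).n() for t in srange(3, 1000)))   # True: c = 4, k = 2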

6.

Arrange these functions into a sequence from smallest to largest Big O:

  1. \(n\log n\)
  2. \(2^{100} n\)
  3. \(n!\)
  4. \(2^{2^{100}}\)
  5. \(2^{2^{n}}\)
  6. \(2^{n}\)
  7. \(3^n\)
  8. \(n2^{n}\)
  9. \(2n\)
  10. \(3n\)
  11. \(n^2\)
  12. \(n^3\)
  13. \(\log (n!)\)