---
category: Algorithms & Data Structures
name: Asymptotic Notation
contributors:
    - ["Jake Prather", "http://github.com/JakeHP"]
    - ["Divay Prakash", "http://github.com/divayprakash"]
---

# Asymptotic Notations

## What are they?

Asymptotic Notations are languages that allow us to analyze an algorithm's
running time by identifying its behavior as the input size for the algorithm
increases. This is also known as an algorithm's growth rate. Does the
algorithm suddenly become incredibly slow when the input size grows? Does it
mostly maintain its quick run time as the input size increases? Asymptotic
Notation gives us the ability to answer these questions.

## Are there alternatives to answering these questions?

One way would be to count the number of primitive operations at different
input sizes. Though this is a valid solution, the amount of work it takes,
even for simple algorithms, does not justify its use.
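
As a rough illustration, here is a minimal Python sketch (a hypothetical
example, not part of the original text) that counts the primitive operations
of a simple linear search at a few input sizes:

```python
def linear_search_ops(arr, target):
    """Linear search that also counts its primitive operations (comparisons)."""
    ops = 0
    for item in arr:
        ops += 1                 # one comparison per element examined
        if item == target:
            return ops
    return ops

# Count operations at different input sizes; in the worst case (target absent)
# the count grows in direct proportion to n.
for n in [10, 100, 1000]:
    print(n, linear_search_ops(list(range(n)), -1))
```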

Another way is to physically measure the amount of time an algorithm takes to
complete given different input sizes. However, the accuracy and relativity of
this method (times obtained would only be relative to the machine they were
computed on) are bound to environmental variables such as computer hardware
specifications, processing power, etc.
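
For comparison, here is a minimal timing sketch in Python (hypothetical code;
the numbers it prints depend entirely on the machine it runs on, which is
exactly the drawback described above):

```python
import time

def measure(algorithm, n):
    """Return the wall-clock seconds the algorithm takes on an input of size n."""
    data = list(range(n))
    start = time.perf_counter()
    algorithm(data)
    return time.perf_counter() - start

# Time a built-in sort at a few input sizes; hardware, load, and interpreter
# all affect the results.
for n in [1_000, 10_000, 100_000]:
    print(n, measure(sorted, n))
```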

## Types of Asymptotic Notation

In the first section of this doc, we described how an Asymptotic Notation
identifies the behavior of an algorithm as the input size changes. Let us
imagine an algorithm as a function f, n as the input size, and f(n) being
the running time. So for a given algorithm f, with input size n, you get
some resultant run time f(n). This results in a graph where the Y-axis is
the runtime, the X-axis is the input size, and the plotted points are the
run times for each input size.
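
As a small sketch of that idea (hypothetical, using an operation count as a
stand-in for run time), the plot points could be generated like this:

```python
# Treat a quadratic algorithm as the function f(n) = n^2 and generate the
# (input size, run time) plot points described above.
def f(n):
    return n ** 2

points = [(n, f(n)) for n in range(1, 11)]
print(points)  # x = input size, y = "run time"
```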

You can label a function, or algorithm, with an Asymptotic Notation in many
different ways. For example, you can describe an algorithm by its best
case, worst case, or average case. The most common is to analyze an algorithm
by its worst case. You typically don’t evaluate by best case because those
conditions aren’t what you’re planning for. An excellent example of this is
sorting algorithms; particularly, adding elements to a tree structure. The
best case for most algorithms could be as low as a single operation. However,
in most cases, the element you’re adding needs to be sorted appropriately
through the tree, which could mean examining an entire branch. This is
the worst case, and this is what we plan for.
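
To make the tree example concrete, here is a minimal sketch (hypothetical
code, assuming a plain unbalanced binary search tree) in which the best case
insert costs a single comparison while the worst case walks an entire branch:

```python
class Node:
    def __init__(self, value):
        self.value = value
        self.left = None
        self.right = None

def insert(node, value):
    """Insert value into an unbalanced BST, returning the number of comparisons."""
    comparisons = 0
    while True:
        comparisons += 1
        if value < node.value:
            if node.left is None:
                node.left = Node(value)
                return comparisons
            node = node.left
        else:
            if node.right is None:
                node.right = Node(value)
                return comparisons
            node = node.right

root = Node(1)
print(insert(root, 2))   # best case here: a single comparison
for v in range(3, 11):   # inserting sorted values degenerates the tree into
    insert(root, v)      #   one long right-leaning branch
print(insert(root, 11))  # worst case: examines the entire branch (10 comparisons)
```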

### Types of functions, limits, and simplification

```
Logarithmic Function - log n
Linear Function - an + b
Quadratic Function - an^2 + bn + c
Polynomial Function - an^z + . . . + an^2 + an^1 + an^0, where z is some
constant
Exponential Function - a^n, where a is some constant
```
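
To get a feel for these growth classes, here is a small Python sketch
(illustrative only; the constants are all set to 1, and z and a are chosen
arbitrarily) that evaluates one representative of each class at a few input
sizes:

```python
import math

# One representative of each growth class, with the constants set to 1.
growth_classes = {
    "logarithmic": lambda n: math.log2(n),
    "linear":      lambda n: n,
    "quadratic":   lambda n: n ** 2,
    "polynomial":  lambda n: n ** 3,  # z = 3 chosen arbitrarily
    "exponential": lambda n: 2 ** n,  # a = 2 chosen arbitrarily
}

for n in [10, 20, 40]:
    for name, f in growth_classes.items():
        print(f"n={n:>2}  {name:<12} {f(n):,.0f}")
```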

These are some fundamental function growth classifications used in
various notations. The list starts at the slowest growing function
(logarithmic, fastest execution time) and goes on to the fastest
growing (exponential, slowest execution time). Notice that as ‘n’,
the input, increases in each of those functions, the result increases
much more quickly in the quadratic, polynomial, and exponential
functions than in the logarithmic and linear ones.

It is worth noting that for the notations about to be discussed,
you should do your best to use the simplest terms. This means
disregarding constants and lower order terms, because as the input
size (or n in our f(n) example) increases to infinity (mathematical
limits), the lower order terms and constants are of little to no
importance. That being said, if you have constants that are 2^9001,
or some other ridiculous, unimaginable amount, realize that
simplifying will skew your notation accuracy.
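
A quick numeric illustration of why this simplification works (a hypothetical
example with arbitrary constants): the ratio of the full expression to its
highest order term settles toward a constant as n grows.

```python
# f(n) keeps its constants and lower order terms; g(n) is the simplified form.
def f(n):
    return 3 * n ** 2 + 500 * n + 10_000

def g(n):
    return n ** 2

# As n grows, f(n) / g(n) approaches the constant 3, so n^2 alone captures the
# growth rate and the rest can be disregarded.
for n in [10, 100, 1_000, 100_000]:
    print(n, f(n) / g(n))
```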

Since we want the simplest form, let's modify our table a bit...

```
Logarithmic - log n
Linear - n
Quadratic - n^2
Polynomial - n^z, where z is some constant
Exponential - a^n, where a is some constant
```

### Big-O

Big-O, commonly written as **O**, is an Asymptotic Notation for the worst
case, or ceiling of growth, for a given function. It provides us with an
_**asymptotic upper bound**_ for the growth rate of the runtime of an algorithm.

Say `f(n)` is your algorithm runtime, and `g(n)` is an arbitrary time
complexity you are trying to relate to your algorithm. `f(n)` is O(g(n)), if
for some real constants c (c > 0) and n<sub>0</sub>, `f(n)` <= `c g(n)` for every input size
n (n > n<sub>0</sub>).
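
The definition can also be sanity-checked numerically. The sketch below is
hypothetical code: sampling a finite range of n can refute a candidate pair
c, n<sub>0</sub>, but it can never prove the asymptotic claim.

```python
import math

def satisfies_big_o(f, g, c, n0, n_max=10_000):
    """Empirically check f(n) <= c * g(n) for every sampled n with n0 < n <= n_max."""
    return all(f(n) <= c * g(n) for n in range(n0 + 1, n_max + 1))

# Example 1 below: f(n) = 3 log n + 100 against g(n) = log n, with c = 150, n0 = 2.
print(satisfies_big_o(lambda n: 3 * math.log(n) + 100,
                      lambda n: math.log(n),
                      c=150, n0=2))          # True

# Example 2 below: f(n) = 3 * n^2 against g(n) = n; even a huge c fails once
# n is large enough.
print(satisfies_big_o(lambda n: 3 * n ** 2,
                      lambda n: n,
                      c=1_000, n0=1))        # False
```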

*Example 1*

```
f(n) = 3 log n + 100
g(n) = log n
```

Is `f(n)` O(g(n))?
Is `3 log n + 100` O(log n)?
Let's look at the definition of Big-O.

```
3 log n + 100 <= c * log n
```

Is there some pair of constants c, n<sub>0</sub> that satisfies this for all n > n<sub>0</sub>?

```
3 log n + 100 <= 150 * log n, n > 2 (undefined at n = 1)
```

Yes! The definition of Big-O has been met, therefore `f(n)` is O(g(n)).

*Example 2*

```
f(n) = 3 * n^2
g(n) = n
```

Is `f(n)` O(g(n))?
Is `3 * n^2` O(n)?
Let's look at the definition of Big-O.

```
3 * n^2 <= c * n
```

Is there some pair of constants c, n<sub>0</sub> that satisfies this for all n > n<sub>0</sub>?

No, there isn't. Whatever constant c you pick, 3 * n^2 exceeds c * n as soon
as n > c / 3, so `f(n)` is NOT O(g(n)).

### Big-Omega

Big-Omega, commonly written as **Ω**, is an Asymptotic Notation for the best
case, or a floor growth rate, for a given function. It provides us with an
_**asymptotic lower bound**_ for the growth rate of the runtime of an algorithm.

`f(n)` is Ω(g(n)), if for some real constants c (c > 0) and n<sub>0</sub> (n<sub>0</sub> > 0), `f(n)` is >= `c g(n)`
for every input size n (n > n<sub>0</sub>).
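
For instance (a small example added here to mirror the Big-O examples above),
the function from Example 2 that failed the O(n) test easily meets the Ω(n)
bound:

```
3 * n^2 >= 1 * n, for every n > 1
```

So `3 * n^2` is Ω(n).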

### Note

The asymptotic growth rates provided by big-O and big-omega notation may or
may not be asymptotically tight. Thus we use small-o and small-omega notation
to denote bounds that are not asymptotically tight.

### Small-o

Small-o, commonly written as **o**, is an Asymptotic Notation to denote the
upper bound (that is not asymptotically tight) on the growth rate of runtime
of an algorithm.

`f(n)` is o(g(n)), if for every real constant c (c > 0) there is some n<sub>0</sub> (n<sub>0</sub> > 0) such that `f(n)` < `c g(n)`
for every input size n (n > n<sub>0</sub>).

The definitions of O-notation and o-notation are similar. The main difference
is that in f(n) = O(g(n)), the bound f(n) <= c g(n) holds for _**some**_
constant c > 0, but in f(n) = o(g(n)), the bound f(n) < c g(n) holds for
_**all**_ constants c > 0.
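
A quick example (added for illustration, in the same style as the examples
above):

```
2n is o(n^2): for any c > 0, 2n < c * n^2 whenever n > 2 / c
2n is not o(n): with c = 1, 2n < 1 * n never holds
```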

### Small-omega

Small-omega, commonly written as **ω**, is an Asymptotic Notation to denote
the lower bound (that is not asymptotically tight) on the growth rate of
runtime of an algorithm.

`f(n)` is ω(g(n)), if for every real constant c (c > 0) there is some n<sub>0</sub> (n<sub>0</sub> > 0) such that `f(n)` > `c g(n)`
for every input size n (n > n<sub>0</sub>).

The definitions of Ω-notation and ω-notation are similar. The main difference
is that in f(n) = Ω(g(n)), the bound f(n) >= c g(n) holds for _**some**_
constant c > 0, but in f(n) = ω(g(n)), the bound f(n) > c g(n) holds for
_**all**_ constants c > 0.

### Theta

Theta, commonly written as **Θ**, is an Asymptotic Notation to denote the
_**asymptotically tight bound**_ on the growth rate of runtime of an algorithm.

`f(n)` is Θ(g(n)), if for some real constants c1, c2 and n<sub>0</sub> (c1 > 0, c2 > 0, n<sub>0</sub> > 0),
`c1 g(n)` < `f(n)` < `c2 g(n)` for every input size n (n > n<sub>0</sub>).

∴ `f(n)` is Θ(g(n)) implies `f(n)` is O(g(n)) as well as `f(n)` is Ω(g(n)).
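
For example (a small worked example added for illustration), `3n^2 + 5n` is
Θ(n^2):

```
3 * n^2 < 3 * n^2 + 5n < 4 * n^2, for every n > 5  (c1 = 3, c2 = 4, n0 = 5)
```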

Feel free to head over to the additional resources for examples of this. Big-O
is the primary notation used for general algorithm time complexity.

### Endnotes

It's hard to keep this kind of topic short, and you should go
through the books and online resources listed. They go into much greater depth
with definitions and examples. More where x='Algorithms & Data Structures' is
on its way; we'll have a doc up on analyzing actual code examples soon.

## Books

* [Algorithms](http://www.amazon.com/Algorithms-4th-Robert-Sedgewick/dp/032157351X)
* [Algorithm Design](http://www.amazon.com/Algorithm-Design-Foundations-Analysis-Internet/dp/0471383651)

## Online Resources

* [MIT](http://web.mit.edu/16.070/www/lecture/big_o.pdf)
* [KhanAcademy](https://www.khanacademy.org/computing/computer-science/algorithms/asymptotic-notation/a/asymptotic-notation)
* [Big-O Cheatsheet](http://bigocheatsheet.com/) - common structures, operations, and algorithms, ranked by complexity.