Big-O and three other related notations were introduced by Paul Bachmann and Edmund Landau. Hence, this family of notations bears the name “Bachmann–Landau notations”, or alternatively “asymptotic notations”.
Big-O is used to write an equation like this: f(x) = O( g(x) ). Specifically, f(x) = O( g(x) ) as x → ∞ if there exist positive constants C and M such that f(x) ≤ C g(x) for all x > M.
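As a numerical illustration (not a proof), the sketch below checks the definition for the hypothetical example f(x) = 3x² + 5x + 7 and g(x) = x²: picking C = 4 and M = 10, the inequality f(x) ≤ C g(x) holds at every sampled point beyond M.

```python
# Numerical check (not a proof) that f(x) = 3x^2 + 5x + 7 is O(x^2):
# with C = 4 and M = 10, f(x) <= C * g(x) at every sampled x > M.
def f(x):
    return 3 * x**2 + 5 * x + 7

def g(x):
    return x**2

C, M = 4, 10
assert all(f(x) <= C * g(x) for x in range(M + 1, 10_000))
print("f(x) <= 4 * g(x) for all sampled x > 10")
```

Of course, a finite sample can only suggest the bound; the actual claim requires the algebraic observation that 4x² − (3x² + 5x + 7) = x² − 5x − 7 ≥ 0 once x is large enough.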

The second notation is Ω. Taking x → ∞ again, we write f(x) = Ω( g(x) ) as x → ∞ if there exist positive constants C and M such that f(x) ≥ C g(x) for all x > M.
The third notation is Θ. When we write f(x) = Θ( g(x) ) it means that both f(x) = O( g(x) ) and f(x) = Ω( g(x) ) hold.
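A Θ bound can be illustrated the same way, here with the hypothetical example f(x) = 2x² + x and g(x) = x²: the constants c = 2 below and C = 3 above sandwich f for all sampled x > 1, which is exactly the combination of the Ω and O conditions.

```python
# Numerical check (not a proof) that f(x) = 2x^2 + x is Theta(x^2):
# c * g(x) <= f(x) <= C * g(x) with c = 2, C = 3, for sampled x > M = 1.
def f(x):
    return 2 * x**2 + x

def g(x):
    return x**2

c, C, M = 2, 3, 1
assert all(c * g(x) <= f(x) <= C * g(x) for x in range(M + 1, 10_000))
print("2 * g(x) <= f(x) <= 3 * g(x) for all sampled x > 1")
```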
The fourth notation is ο (Greek omicron), also called "little-o" notation. Little-o is used like this: f(x) = ο( g(x) ) if the limit as x → ∞ of f(x)/g(x) is 0.
Finally we have the notation ω (lower-case Greek omega) which is used like this: f(x) = ω( g(x) ) if the limit as x → ∞ of f(x)/g(x) is ∞.
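Because both little-o and ω are defined through the limit of the ratio f(x)/g(x), they are easy to illustrate numerically. The sketch below (an illustration, not a proof) uses the hypothetical pair f(x) = x, g(x) = x²: the ratio x/x² shrinks toward 0 (so x = ο(x²)), while the reversed ratio x²/x grows without bound (so x² = ω(x)).

```python
# Illustration (not a proof) of little-o and little-omega via ratios:
# x / x^2 -> 0 as x grows, so x = o(x^2);
# x^2 / x -> infinity as x grows, so x^2 = omega(x).
xs = [10, 100, 1_000, 10_000]
ratios_o = [x / x**2 for x in xs]   # strictly decreasing toward 0
ratios_w = [x**2 / x for x in xs]   # strictly increasing without bound

assert all(later < earlier for earlier, later in zip(ratios_o, ratios_o[1:]))
assert all(later > earlier for earlier, later in zip(ratios_w, ratios_w[1:]))
print("ratios for o:", ratios_o)
print("ratios for omega:", ratios_w)
```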
When we do not have x → ∞ technical details of the definitions differ slightly:
- f(x) = O( g(x) ) as x → k if there exist positive constants δ and C such that f(x) ≤ C g(x) for all x such that |x - k| < δ.
- f(x) = Ω( g(x) ) as x → k if there exist positive constants δ and C such that f(x) ≥ C g(x) for all x such that |x - k| < δ.
- f(x) = ο( g(x) ) as x → k if the limit as x → k of f(x)/g(x) is 0.
- f(x) = ω( g(x) ) as x → k if the limit as x → k of f(x)/g(x) is ∞.
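A classic finite-limit example is sin(x) = Θ(x) as x → 0: near 0 the ratio sin(x)/x approaches 1, so it is bounded above and below by positive constants. The sketch below checks this numerically (an illustration, not a proof).

```python
import math

# Illustration (not a proof) that sin(x) = Theta(x) as x -> 0:
# the ratio sin(x)/x tends to 1, hence stays within fixed positive bounds
# (here 0.9 and 1.0) for all sampled x close to 0.
for x in [0.1, 0.01, 0.001]:
    ratio = math.sin(x) / x
    assert 0.9 < ratio <= 1.0
print("0.9 < sin(x)/x <= 1.0 for all sampled x near 0")
```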
Often you will see statements such as f(x) = O( g(x) ) without reference to an explicit limit and will have to determine from context what limit is implied.
Big-O is the most commonly used of these notations in computer science, and the Ω and Θ notations are used more often in computer science than in mathematics. The little-o and ω notations are rarely used in either field.
[Figure: Visualisation of the Quick Sort algorithm]
In the theoretical analysis of algorithms it is common to estimate their complexity in the asymptotic sense, i.e., to estimate the complexity as the input size tends to infinity. Asymptotic estimates are used because different implementations of the same algorithm may differ in efficiency. However, the efficiencies of any two "reasonable" implementations of a given algorithm are related by a constant multiplicative factor called a “hidden constant”.
For example, the so-called “quick sort” algorithm runs, on average, in time proportional to the length n of the list being sorted multiplied by the logarithm of that length, as n tends to infinity. In Big-O notation we write this as O(n log n).
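A minimal sketch of quick sort makes the structure behind the O(n log n) estimate visible: each call partitions the list around a pivot (linear work) and the recursion is, on average, about log n levels deep. This list-comprehension version is chosen for clarity, not efficiency; the names `quick_sort`, `less`, `equal`, and `greater` are illustrative choices, not part of any standard library.

```python
import random

# A minimal quick sort sketch: average case O(n log n), worst case O(n^2)
# (e.g. when the chosen pivot is always the smallest or largest element).
def quick_sort(xs):
    if len(xs) <= 1:
        return xs
    pivot = xs[len(xs) // 2]
    less = [x for x in xs if x < pivot]       # elements below the pivot
    equal = [x for x in xs if x == pivot]     # elements equal to the pivot
    greater = [x for x in xs if x > pivot]    # elements above the pivot
    return quick_sort(less) + equal + quick_sort(greater)

data = random.sample(range(1_000), 100)
assert quick_sort(data) == sorted(data)
print("quick_sort agrees with sorted() on a random sample")
```

The hidden constant discussed above corresponds here to the per-element cost of the partitioning step, which varies between implementations but does not change the O(n log n) growth rate.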