Please help me with O-notation. As far as I understand:

  • O(h(n)) means that the algorithm behaves in the worst case as c * h(n), starting from some value n0, where c is a constant.
  • Ω(h(n)) means that the algorithm behaves in the best case as c * h(n), starting from some value n0, where c is a constant.
  • o(h(n)) means that the algorithm behaves in the worst case "asymptotically better" than c * h(n), starting from some value n0, where c is a constant.
  • ω(h(n)) means that the algorithm behaves in the best case "asymptotically worse" than c * h(n), starting from some value n0, where c is a constant.

I cannot understand what Θ(h(n)) means; please explain. And if I made a mistake in the definitions above, please correct me.


Let's see. Let f(n) be the number of steps the algorithm takes.

The complexity O(h(n)) means that the algorithm requires a number of steps not greater than h(n) multiplied by some constant. That is, the limit of f(n)/h(n) as n grows is finite.

So we say that f(n) = Θ(h(n)) if f(n) = O(h(n)) and at the same time h(n) = O(f(n)). This means that the functions f(n) and h(n) have the same order of growth.
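As a quick numerical illustration (not from the original answer), the two-sided bound behind Θ can be checked directly. Here f(n) = 2n^2 + 30n is a made-up step count, and the constants c1 = 2, c2 = 3 and the threshold n0 = 30 are picked by hand:

```python
def f(n):
    return 2*n*n + 30*n   # hypothetical step count

def h(n):
    return n*n            # candidate order of growth

# f(n) = Theta(n^2): c1*h(n) <= f(n) <= c2*h(n) for all n >= n0
c1, c2, n0 = 2, 3, 30
assert all(c1*h(n) <= f(n) <= c2*h(n) for n in range(n0, 10_000))
```

Any constants 0 < c1 <= 2 and c2 >= 3 would work here as well; Θ only requires that *some* such pair exists.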


Example: if the real complexity of the algorithm is f(n) = 2n^2 + 30n + log(n^4), then f(n) = O(n^2): (2n^2 + 30n + log(n^4)) / n^2 = 2 + 30/n + 4 log(n)/n^2 → 2 as n grows without bound.

But at the same time f(n) = O(n^100) as well. Indeed, (2n^2 + 30n + log(n^4)) / n^100 = 2/n^98 + 30/n^99 + 4 log(n)/n^100 → 0 as n grows without bound. See? That is, O(h(n)) is only an upper bound on the complexity of the algorithm, and it can be arbitrarily crude.
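Both limits can be observed numerically. A minimal sketch (natural log is used for log(n); the base only changes the constant, not the order of growth):

```python
import math

def f(n):
    # the "real" complexity from the example: log(n^4) = 4*log(n)
    return 2*n*n + 30*n + 4*math.log(n)

for n in (10, 100, 1000):
    # first ratio approaches 2, second ratio approaches 0
    print(n, f(n) / n**2, f(n) / n**100)
```

At n = 1000 the first ratio is already about 2.03, while the second is astronomically small.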

Further: f(n) = Θ(n^2). Indeed, f(n)/h(n) → 2 (as we have already found out), and h(n)/f(n) obviously tends to 0.5.

On the other hand, f(n) ≠ Θ(n^100), because n^100 / (2n^2 + 30n + log(n^4)) = n^98 * n^2 / (2n^2 + 30n + log(n^4)) = n^98 * 1 / (2 + 30/n + 4 log(n)/n^2), and this diverges: the second factor is bounded (it tends to 1/2), while n^98 tends to infinity.


Thus, Θ is a more accurate estimate of the complexity of the algorithm.


Note for purist mathematicians: yes, the limit does not necessarily exist; strictly speaking, one needs lim sup.

  • So it turns out that the difference between O(h(n)) and o(h(n)) is that o(h(n)) is always a strictly loose "upper bound", while O(h(n)) can coincide with the exact order? - NEvOl
  • @SergeyNikolaev: That's right, yes. - VladD