The question is about algorithmic complexity: knowing the big-O of an algorithm and its running time for some N, is it possible to estimate the time for another N?

By the definition of O-notation, the bound only starts to hold asymptotically, after some N. Possibly a large one.

And can it theoretically happen that for small N we have some "bad" dependence (say, N^2 or even N!), which for large N turns into a "good" one (like N or N*log(N))?

And if it can, is there a practical example? Not a contrived one, but from real life?
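For concreteness, a minimal sketch of the kind of estimate meant here (the helper function and the numbers are mine, purely for illustration): assuming the hidden constant factor is the same at both sizes, the time scales as the ratio of the complexity functions. That assumption is exactly what can break down for small N.

```cpp
// Hypothetical sketch: extrapolating a measured time using a known complexity class.
// Assumes the constant factor is identical at both sizes, which is what fails for small N.
#include <cmath>
#include <cstdio>

// Estimated time at n2, given a measured time t1 at n1, for an O(N log N) algorithm.
double estimate_nlogn(double t1, double n1, double n2) {
    return t1 * (n2 * std::log(n2)) / (n1 * std::log(n1));
}

int main() {
    // Example: sorting 1e6 elements took 120 ms; estimate the time for 1e7 elements.
    std::printf("~%.0f ms\n", estimate_nlogn(120.0, 1e6, 1e7));
}
```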

  • [which for large N turns into a "good" one] By definition, O(something) is an asymptotic bound for LARGE "n". - pepsicoca1
  • And not only theoretically; in practice it exists too. - Alexander Chernin
  • if (N < 42) N! else N. But the complexity of such an algorithm will still be O(N!), since complexity is considered over all input data, and O(N!) includes O(N). Again I refer to the definition. - cpp questions
  • @cppquestions, no, the complexity of this algorithm is O(n), since 42! is a constant (a sketch of this construction follows these comments). - Qwertiy
  • There was something like that involving the search for prime numbers. - Qwertiy
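To make the construction from these comments concrete, here is a hedged sketch (the threshold of 42 comes from the comment; the step-count model and the function name are mine). It counts the "elementary steps" rather than performing them, which is enough to see why the N! branch is bounded by the constant 41! and the whole thing is O(N):

```cpp
// Step-count model of the algorithm discussed in the comments above:
// factorial work below the threshold 42, linear work above it.
// Below the threshold the step count is n!, but that is bounded by the
// fixed constant 41!, so asymptotically the whole function is O(n).
#include <cstdint>
#include <cstdio>

uint64_t step_count(uint64_t n) {
    if (n < 42) {
        uint64_t f = 1;
        for (uint64_t i = 2; i <= n; ++i)
            f *= i;  // n! (overflows uint64_t past n = 20, irrelevant for the argument)
        return f;
    }
    return n;
}

int main() {
    std::printf("%llu steps for n = 10\n", (unsigned long long)step_count(10));
    std::printf("%llu steps for n = 1000\n", (unsigned long long)step_count(1000));
}
```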

1 Answer

As I understand it, you are interested specifically in the case where, for small N, it is not so much that the dependence is bad as that the time is worse than for large N?

This is unlikely, since it would mean that for small N a genuinely bad algorithm is used, which makes no sense, except to construct an algorithm specifically to satisfy your curiosity :)

But an O(N^2) algorithm that beats an O(N*log N) one in speed for small N, something like that seems to be used in the sorting routine of the standard C++ library: for large N it was quicksort, which, once the array turned out to be almost ordered, switched to insertion sort, which under those conditions turned out to be very fast.

Formally, insertion sort is O(N^2) (although under such conditions, an almost sorted array, it tends toward O(N)).
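A minimal sketch of that hybrid idea, just to make it concrete (the threshold of 16 and the Lomuto partition are my choices for illustration; real std::sort implementations use introsort and are considerably more elaborate):

```cpp
// Sketch of a hybrid sort: quicksort for large ranges, insertion sort once a
// range gets small. The threshold and partition scheme are illustrative only.
#include <cstddef>
#include <utility>
#include <vector>

const std::size_t kThreshold = 16;

// Plain insertion sort on the half-open range [lo, hi).
void insertion_sort(std::vector<int>& a, std::size_t lo, std::size_t hi) {
    for (std::size_t i = lo + 1; i < hi; ++i) {
        int x = a[i];
        std::size_t j = i;
        while (j > lo && a[j - 1] > x) { a[j] = a[j - 1]; --j; }
        a[j] = x;
    }
}

void hybrid_sort(std::vector<int>& a, std::size_t lo, std::size_t hi) {
    while (hi - lo > kThreshold) {
        // Partition around the last element (Lomuto scheme).
        int pivot = a[hi - 1];
        std::size_t store = lo;
        for (std::size_t i = lo; i + 1 < hi; ++i)
            if (a[i] < pivot) std::swap(a[i], a[store++]);
        std::swap(a[store], a[hi - 1]);
        // Recurse into the smaller half, loop on the larger one.
        if (store - lo < hi - store - 1) {
            hybrid_sort(a, lo, store);
            lo = store + 1;
        } else {
            hybrid_sort(a, store + 1, hi);
            hi = store;
        }
    }
    insertion_sort(a, lo, hi);  // O(N^2) in general, but fast on tiny subranges
}
```

Call it as `hybrid_sort(v, 0, v.size())`; the point is only that the asymptotically worse O(N^2) routine wins on the small subranges that the O(N*log N) routine produces.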

"I think so" (c) Pooh