My question is about algorithmic complexity: knowing the big-O of an algorithm and its running time for some N, can one estimate its running time for another N?
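For concreteness, here is the kind of back-of-the-envelope extrapolation I have in mind (a minimal sketch; the function name and the numbers are made up):

```python
import math

def estimate_time(t1, n1, n2):
    """Extrapolate runtime assuming cost ~ c * N * log(N); the constant c cancels."""
    return t1 * (n2 * math.log(n2)) / (n1 * math.log(n1))

# Hypothetical measurement: 2.0 s at N = 1_000_000. Estimate for N = 10_000_000:
print(estimate_time(2.0, 1_000_000, 10_000_000))  # ~23.3 s, i.e. more than 10x
```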
By the definition of O-notation, the bound only holds asymptotically, beyond some threshold N₀, and that threshold may be large.
And can it theoretically happen that for small N an algorithm shows some "bad" dependence (say, N² or even N!), which for large N turns into a "good" one (like N or N·log(N))?
And if so, is there a practical example? Not a contrived one, but something from real life.
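To show the shape I mean, here is a toy sketch (my own illustration, not any particular library's implementation): a sort that deliberately runs an O(N²) method below some threshold and an O(N·log N) merge sort above it, so measurements on small inputs look quadratic while large inputs scale as N·log N.

```python
import random

THRESHOLD = 32  # hypothetical cutoff; a real implementation would tune this

def insertion_sort(a):
    """O(N^2) worst case, but very fast on tiny inputs thanks to low constants."""
    for i in range(1, len(a)):
        x = a[i]
        j = i - 1
        while j >= 0 and a[j] > x:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = x
    return a

def hybrid_sort(a):
    """Quadratic method below THRESHOLD, O(N log N) merge sort above it."""
    if len(a) <= THRESHOLD:
        return insertion_sort(a)
    mid = len(a) // 2
    left, right = hybrid_sort(a[:mid]), hybrid_sort(a[mid:])
    # Merge the two sorted halves.
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]

data = [random.randint(0, 999) for _ in range(100)]
assert hybrid_sort(data) == sorted(data)
```

Is this kind of threshold switch the sort of thing that actually shows up in production code?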