There is a sorting routine that, while running, dynamically allocates blocks of memory of size sqrt(N), where N is the number of elements in the array being sorted. (IMHO the block size has no bearing on the essence of the question, but I mention it anyway...)
The input is random data obtained from rand() (Linux, RAND_MAX = 2^31 - 1).
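For concreteness, here is roughly what the setup looks like. This is only my minimal sketch: the real sort is omitted, the names n, block_size and data are mine, and whether the sqrt(N) block size is counted in elements or bytes is my assumption.

#include <stdio.h>
#include <stdlib.h>
#include <math.h>

int main(void)
{
    size_t n = 1000000;                           /* N, number of elements */
    size_t block_size = (size_t)sqrt((double)n); /* each block ~sqrt(N) elements (assumption) */

    int *data = malloc(n * sizeof *data);
    if (!data)
        return 1;

    /* random input, as described: rand() on Linux, RAND_MAX = 2^31 - 1 */
    for (size_t i = 0; i < n; i++)
        data[i] = rand();

    /* ... the sort runs here, requesting K blocks of block_size elements
       on demand; K as a function of N is what the question is about ... */

    printf("N = %zu, block size = %zu\n", n, block_size);
    free(data);
    return 0;
}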
I need to determine how K (the number of blocks) depends on N (the size of the array).
My feeling is that the dependence is roughly logarithmic (at least over certain ranges of N), but the fit does not work out. Across much of the data, K doubles when N is quadrupled. How is that written mathematically?
        N    K
     1000    3
     2000    4
     4000    5
     8000    6
    12000    7
    16000    9
    24000   10
    32000   11
    48000   12
    50000   12
    56000   13
    64000   15
    70000   15
    75000   16
   100000   16
   150000   21
   200000   24
   280000   29
   300000   31
   400000   33
   500000   37
   600000   41
   800000   48
  1000000   54
  1200000   57
  1600000   68
  2000000   78
  2400000   81
  3200000   95
  4000000  105
  6400000  131
 12800000  189
 16000000  211
 64000000  420

If you have questions (including about K for values of N not tabulated above), don't hesitate to ask; computing them is not a problem, except perhaps for large N (N > 10,000,000).
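To make the "K doubles when N quadruples" observation checkable, here is a small throwaway program (my own sketch, using a few rows from the table above). If the dependence were a power law K ≈ c * N^p, then any two rows would give p = log(K2/K1) / log(N2/N1); a logarithmic law K ≈ a * log N would instead show K growing by a roughly constant amount each time N is multiplied by a fixed factor.

#include <stdio.h>
#include <math.h>

int main(void)
{
    /* a few (N, K) pairs taken from the table above */
    static const double n[] = { 1000, 4000, 16000, 64000, 1000000, 4000000, 16000000, 64000000 };
    static const double k[] = {    3,    5,     9,    15,      54,     105,      211,      420 };
    size_t count = sizeof n / sizeof n[0];

    /* local exponent estimate between consecutive rows:
       p = log(K2/K1) / log(N2/N1) */
    for (size_t i = 1; i < count; i++) {
        double p = log(k[i] / k[i - 1]) / log(n[i] / n[i - 1]);
        printf("N: %8.0f -> %8.0f   local exponent p = %.3f\n",
               n[i - 1], n[i], p);
    }
    return 0;
}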
I know about the mathematicians' forum, but I'm afraid we would speak different languages there: things that are obvious to mathematicians would not be very clear to me.

