I've just started getting to grips with recursion, and a question came up almost immediately.
In the video course and the texts I've managed to study, the stated advantage of recursion is more concise code. But in the factorial example, for instance, the loop version is barely longer, and it is guaranteed to be understood by the majority of programmers who read it. Whereas to understand recursion, you must first understand recursion.
On the other hand, my timing measurements showed no difference when computing the factorial (a small one, admittedly).
But if you implement the summation of, say, a collection of integers, the difference is huge, and not in favor of recursion.
I admit my implementation may be clumsy, but so far I haven't come up with anything better. So, there is a class SomeNums that implements IEnumerable<int>; it has a nums list that stores the set itself.
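Stripped to the essentials, the class looks roughly like this (the nums field and a constructor taking an IEnumerable<int> match how I use it in Main; the rest is boilerplate filled in so it compiles):

using System.Collections;
using System.Collections.Generic;
using System.Linq;

public class SomeNums : IEnumerable<int>
{
    // The list that stores the set itself.
    private readonly List<int> nums;

    public SomeNums(IEnumerable<int> source)
    {
        nums = source.ToList();
    }

    public IEnumerator<int> GetEnumerator() => nums.GetEnumerator();
    IEnumerator IEnumerable.GetEnumerator() => GetEnumerator();

    // CycSum and RecSum below are members of this class.
}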
Here is the loop:
public int CycSum()
{
    int acc = 0;
    foreach (var num in nums)
        acc += num;
    return acc;
}

Here is the recursion:
public int RecSum(int step)
{
    // Sums the elements nums[0..step].
    if (step == 0)
        return nums[0];
    return nums[step] + RecSum(step - 1);
}

And here is what I do somewhere in Main:
var n = new SomeNums(Enumerable.Range(1, 1000));

Then I time a hundred runs of the summation with each method and take the average (a sketch of the harness follows the numbers). Quite expectedly, on the whole, I get:
- Average time, recursion: 0.034468
- Average time, loop: 0.006387
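The timing harness is roughly along these lines (a minimal Stopwatch-based sketch; it reports milliseconds per call, and 999 is simply the last index of the 1000 elements):

using System;
using System.Diagnostics;
using System.Linq;

class Program
{
    static void Main()
    {
        var n = new SomeNums(Enumerable.Range(1, 1000));
        const int runs = 100;

        // Time a hundred recursive summations and average them.
        var sw = Stopwatch.StartNew();
        for (int i = 0; i < runs; i++)
            n.RecSum(999);
        sw.Stop();
        Console.WriteLine($"Average time, recursion: {sw.Elapsed.TotalMilliseconds / runs}");

        // Same for the loop version.
        sw.Restart();
        for (int i = 0; i < runs; i++)
            n.CycSum();
        sw.Stop();
        Console.WriteLine($"Average time, loop: {sw.Elapsed.TotalMilliseconds / runs}");
    }
}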
And, of course, if I increase the number of elements in the set far enough, I get a stack overflow.
I understand that the way I'm using recursion here is wrong. But are there right ways? In general, is there any point in using recursion in C#?
And yes, I know that LINQ has Aggregate, but this post isn't about that.

