There is a legacy project built on the SQL Server Integration Services (SSIS) platform. It loads data from a third-party source into the analytic layer of the database. The protocol is organized so that after an unsuccessful load, the same data is offered again on the next attempt. Because of an error in the connection string, the load had not been working for a year, and now there is not enough memory to hold the entire accumulated backlog in the Script Component's output buffers.
The question is: can the data prepared inside the Script Component be emitted in several chunks, without accumulating gigabytes of buffered rows in memory? Simplified, the component code currently looks like this:
```csharp
public override void CreateNewOutputRows()
{
    // For illustration; in reality this data arrives over WCF.
    var datasource = Enumerable.Repeat(new { a = 5, b = 42 }, 2000000);
    foreach (var obj in datasource)
    {
        SomeNamedOutputBuffer.AddRow(); // Memory runs out here on one of the iterations.
        SomeNamedOutputBuffer.A = obj.a;
        SomeNamedOutputBuffer.B = obj.b;
    }
    SomeNamedOutputBuffer.SetEndOfRowset();
}
```
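For context, here is a minimal sketch of what "emitting in chunks" might look like, assuming the memory pressure comes from materializing the whole WCF response at once rather than from the SSIS pipeline buffers themselves. The `FetchPage` method is hypothetical: it stands in for a paged call to the service, which would have to support offset/limit semantics for this to work.

```csharp
// Hedged sketch, not a tested implementation: pull the source data in
// pages so that only one page is held in memory at a time, handing each
// page to the pipeline before requesting the next one.
public override void CreateNewOutputRows()
{
    const int pageSize = 100000;
    int pageIndex = 0;
    while (true)
    {
        // FetchPage is a placeholder for a paged WCF call
        // (e.g. passing an offset and a row limit to the service).
        var page = FetchPage(pageIndex++, pageSize);
        if (page.Count == 0)
            break; // no more data on the source side
        foreach (var obj in page)
        {
            SomeNamedOutputBuffer.AddRow();
            SomeNamedOutputBuffer.A = obj.a;
            SomeNamedOutputBuffer.B = obj.b;
        }
    }
    SomeNamedOutputBuffer.SetEndOfRowset();
}
```

Whether this actually helps depends on where the memory is going: the SSIS data flow normally hands off filled buffers downstream as the component produces rows, so if the exhaustion happens inside `AddRow()`, the paging above only addresses the source side of the problem.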