I have a C# project using EF6 that loads a large amount of data — currently about 120k records — and performance has become a serious problem. With every 1000 records processed, the processing speed drops, and it drops very significantly. In the end, processing all 120k records takes 2-3 days. I tried `Configuration.AutoDetectChangesEnabled = false;`

It made little difference. The context is saved (`SaveChanges`) after almost every logical chain of operations.

How can I optimize this?

  • Please show your code first. - Sergey Rufanov 2:21 pm

2 answers

The context in EF is (among other things) an implementation of the Unit of Work pattern. It should not be long-lived: the context's lifetime should be one business operation.

Making one context per application, or keeping a context alive for several hours, is obviously a bad idea.

Do not keep the context alive for so long. Create a new context for each "logical chain".

If your context lives for a long time only for the sake of an outer `foreach (var obj in context.Objects)`, rewrite it as a loop over a collection of object ids read into memory up front, or as a loop that fetches the first unprocessed record at the start of each iteration. Anything works, as long as there is no outer `using (var context = new ...)` wrapping an inner `using (var context = new ...)`.
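Such a rewrite might look like the following sketch. `AppContext`, `Objects`, and `Process` are placeholders for your own `DbContext`, entity set, and processing logic; the batch size is an assumption to tune:

```csharp
// Read only the ids up front, with a short-lived context.
List<int> ids;
using (var context = new AppContext())
{
    ids = context.Objects.Select(o => o.Id).ToList();
}

// Process in batches, creating a fresh context per batch so the
// change tracker never accumulates thousands of entities.
const int batchSize = 500;
for (int i = 0; i < ids.Count; i += batchSize)
{
    var batch = ids.Skip(i).Take(batchSize).ToList();
    using (var context = new AppContext())
    {
        context.Configuration.AutoDetectChangesEnabled = false;

        foreach (var id in batch)
        {
            var obj = context.Objects.Find(id);
            Process(obj); // your "logical chain"
        }

        context.SaveChanges();
    }
}
```

Because each context is disposed after at most 500 tracked entities, the per-record cost stays flat instead of growing with every processed record.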

    Let's be clear: 120 thousand records is not "a large amount of data." I would even say it is a very small volume. Unless each of your rows is enormous, this should be nothing at all. Since you don't provide any concrete details or describe your database or model, it is hard to say anything specific. Still, 2-3 days for such a small number of records is extreme slowness that should not happen even with an ORM, Entity included. Almost certainly you are using Entity incorrectly and/or performing some very heavy logic on that data — under normal conditions Entity will not take that long to process 120 thousand records.

    However, if you plan to move to genuinely big data (or even medium volumes), it may make sense to abandon Entity in favor of plain ADO.NET.
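    With plain ADO.NET the idea is to stream rows through a reader instead of materializing tracked entities. A minimal sketch, assuming a SQL Server database; the connection string and the `Objects` table with its columns are placeholders:

    ```csharp
    using System.Data.SqlClient;

    using (var connection = new SqlConnection(connectionString))
    {
        connection.Open();
        using (var command = new SqlCommand(
            "SELECT Id, Payload FROM Objects", connection))
        using (var reader = command.ExecuteReader())
        {
            while (reader.Read())
            {
                int id = reader.GetInt32(0);
                string payload = reader.GetString(1);
                // Process the row here: no change tracker, no
                // per-entity overhead, constant memory use.
            }
        }
    }
    ```

    For bulk writes, `SqlBulkCopy` is the usual companion to this approach and is orders of magnitude faster than inserting through a tracked context.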

    • If the data is coming over a slow network, then anything is possible ;-) - cpp_user pm
    • @cpp_user Sure, this and many other reasons can be imagined, but here it makes sense to apply Occam's razor. - DreamChild