The program pulls user data from the VK API (20-50 threads on average) and writes records to the database. Each user has, on average, 20 groups, 500 photos, and 200 friends, and I store friends, groups, users, and photos in separate tables. In total, saving one user takes about 721 INSERT statements. At 200-300 users per minute that is roughly 216,000 INSERTs per minute, and as a result a call to context.SaveChanges() takes about 6-10 minutes.
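To make the setup concrete, here is roughly what the model and the save path look like. This is a simplified sketch, not my actual code: the entity names, VkContext, and the connection string are all illustrative.

```csharp
using System.Collections.Generic;
using Microsoft.EntityFrameworkCore;

// Hypothetical entities mirroring the four tables described above.
public class User
{
    public long Id { get; set; }
    public List<Group> Groups { get; set; } = new();   // ~20 per user
    public List<Photo> Photos { get; set; } = new();   // ~500 per user
    public List<Friend> Friends { get; set; } = new(); // ~200 per user
}
public class Group  { public long Id { get; set; } public long UserId { get; set; } }
public class Photo  { public long Id { get; set; } public long UserId { get; set; } }
public class Friend { public long Id { get; set; } public long UserId { get; set; } }

public class VkContext : DbContext
{
    public DbSet<User> Users => Set<User>();
    protected override void OnConfiguring(DbContextOptionsBuilder options)
        => options.UseSqlServer("<connection string>");
}

public static class Saver
{
    // Naive save path: every row is a tracked entity, so one user produces
    // ~721 INSERTs (1 user + 20 groups + 500 photos + 200 friends).
    public static void SaveUser(User user)
    {
        using var context = new VkContext();
        context.Users.Add(user); // children are picked up via the navigations
        context.SaveChanges();   // the call that adds up to 6-10 minutes
    }
}
```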
I tried context pooling and bulk insert; the average time came down to 4-6 minutes.
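To show what the bulk-insert path looks like, here is a minimal sketch using SqlBulkCopy, which is one common way to do it; my actual code may differ in details, and the table and column names are illustrative:

```csharp
using System.Collections.Generic;
using System.Data;
using Microsoft.Data.SqlClient;

public static class BulkSaver
{
    // Buffers child rows into a DataTable and writes them to the server in
    // one round trip instead of one INSERT per row.
    public static void BulkInsertPhotos(string connectionString,
                                        IEnumerable<Photo> photos)
    {
        var table = new DataTable();
        table.Columns.Add("Id", typeof(long));
        table.Columns.Add("UserId", typeof(long));
        foreach (var p in photos)
            table.Rows.Add(p.Id, p.UserId);

        using var bulk = new SqlBulkCopy(connectionString)
        {
            DestinationTableName = "Photos", // illustrative table name
            BatchSize = 10_000
        };
        bulk.WriteToServer(table);
    }
}
```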
Setting AutoDetectChangesEnabled = false or using context.AddRange() gives roughly the same result.
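Concretely, by those tweaks I mean the following (reusing the VkContext sketch above):

```csharp
using System.Collections.Generic;

public static class BatchSaver
{
    // Disables automatic change detection and registers the whole batch with
    // one AddRange call; this still lands in the same 4-6 minute range.
    public static void SaveBatch(List<User> usersBatch)
    {
        using var context = new VkContext();
        context.ChangeTracker.AutoDetectChangesEnabled = false;
        context.Users.AddRange(usersBatch);
        context.SaveChanges();
    }
}
```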
The only fast solution I found was to binary-serialize the user's data and store it as byte[], so that saving a user takes only 4 INSERTs. That cut the context.SaveChanges() call down to about 1.2 seconds. But it creates the obvious problem: to change anything at all for a user, all of their data has to be deserialized and then re-serialized.
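A sketch of that layout is below. My real code uses binary serialization; the sketch uses System.Text.Json only to stay dependency-free, and the UserBlob type and helper names are illustrative:

```csharp
using System;
using System.Collections.Generic;
using System.Text.Json;

// One row per user: the related collections live in byte[] columns, so a
// user is saved with a handful of INSERTs instead of ~721.
public class UserBlob
{
    public long Id { get; set; }
    public byte[] Groups { get; set; } = Array.Empty<byte>();
    public byte[] Photos { get; set; } = Array.Empty<byte>();
    public byte[] Friends { get; set; } = Array.Empty<byte>();
}

public static class BlobCodec
{
    public static byte[] Pack(List<Photo> photos)
        => JsonSerializer.SerializeToUtf8Bytes(photos);

    // The downside: any change means deserialize, modify, re-serialize.
    public static byte[] AddPhoto(byte[] blob, Photo newPhoto)
    {
        var photos = JsonSerializer.Deserialize<List<Photo>>(blob)!;
        photos.Add(newPhoto);
        return JsonSerializer.SerializeToUtf8Bytes(photos);
    }
}
```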
What approaches to storing large amounts of data would work in this case without resorting to serialization?