There is a list of URLs that changes with each program launch:

 List<FileInfo> URLs = new List<FileInfo>();

 public class FileInfo
 {
     public string FileURL { get; set; }
     public string FilePath { get; set; }
     public string Hash { get; set; }
 }

We iterate over the list with foreach and check each file's availability and integrity; if a file does not match the expected hash, we download it.

Is it possible to process every item of the list in parallel, so that all files are hashed and downloaded asynchronously, and program execution continues only after all of them have finished? P.S. I suspect this can be done with TaskFactory, but I could not implement it.
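A minimal sketch of what the question asks for, using Task.Run plus Task.WhenAll instead of TaskFactory; the Updater class and the empty ProcessURL body are my assumptions, standing in for the questioner's own hash-and-download routine:

```csharp
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;

public class FileInfo
{
    public string FileURL { get; set; }
    public string FilePath { get; set; }
    public string Hash { get; set; }
}

public static class Updater
{
    // Start one task per list item; execution continues
    // only after every task has completed.
    public static async Task ProcessAllAsync(List<FileInfo> urls)
    {
        IEnumerable<Task> tasks = urls.Select(
            f => Task.Run(() => ProcessURL(f.FileURL, f.FilePath, f.Hash)));
        await Task.WhenAll(tasks);
    }

    // Placeholder for the hash-check-and-download routine
    // described in the question.
    static void ProcessURL(string url, string path, string hash) { }
}
```

`await Task.WhenAll(...)` keeps the calling thread free while waiting, which is usually preferable to the blocking `Task.WaitAll` in UI or server code.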

  • I understand you are building some kind of auto-update system. When I implemented something similar, I started with plain tasks, then was advised to use Dataflow; after switching to it, the download process became many times faster and smarter. So I recommend it ;-) - EvgeniyZ
  • @EvgeniyZ I don't know; I looked at the documentation ( docs.microsoft.com/ru-ru/dotnet/standard/parallel-programming/… ) and I get the feeling that in the example shown there you could simply cut Dataflow out and nothing would change. I could not come up with an implementation; if you offer one, I will be grateful. - SKProCH
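For reference, the Dataflow approach mentioned in the comment might look like this sketch. It assumes the question's URLs list and ProcessURL method, and the degree of parallelism is an arbitrary choice; the ActionBlock type comes from the System.Threading.Tasks.Dataflow NuGet package:

```csharp
using System.Threading.Tasks.Dataflow;

var block = new ActionBlock<FileInfo>(
    file => ProcessURL(file.FileURL, file.FilePath, file.Hash),
    // Cap how many files are hashed/downloaded at once.
    new ExecutionDataflowBlockOptions { MaxDegreeOfParallelism = 4 });

foreach (var file in URLs)
    block.Post(file);

block.Complete();        // signal that no more items will be posted
await block.Completion;  // resumes only after every file is processed
```

Compared with an array of tasks, the block throttles concurrency for you and can keep accepting new items while earlier ones are still in flight.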

1 answer

The only solution I came up with is something like this:

 List<FileInfo> URLs = new List<FileInfo>();
 Task[] LibrariesDownloadTasks = new Task[URLs.Count];
 for (int i = 0; i < URLs.Count; i++)
 {
     // Copy the element into a local variable: the lambda captures the
     // variable i itself, not its current value, so using URLs[i] inside
     // the lambda could read a changed (even out-of-range) index after
     // the loop has moved on.
     var file = URLs[i];
     LibrariesDownloadTasks[i] = Task.Run(
         () => ProcessURL(file.FileURL, file.FilePath, file.Hash));
 }
 Task.WaitAll(LibrariesDownloadTasks);

All the processing happens in private void ProcessURL(string FileURL, string FilePath, string Hash).
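The answer does not show the body of ProcessURL, so here is one possible sketch of it under the question's description (check the local file's hash, download on mismatch); the SHA-256 choice and the WebClient download are assumptions:

```csharp
using System;
using System.IO;
using System.Net;
using System.Security.Cryptography;

public static class FileProcessor
{
    public static void ProcessURL(string fileUrl, string filePath, string hash)
    {
        // Skip the download when the local file already matches the hash.
        if (File.Exists(filePath) &&
            string.Equals(ComputeSha256(filePath), hash,
                          StringComparison.OrdinalIgnoreCase))
            return;

        using (var client = new WebClient())
            client.DownloadFile(fileUrl, filePath);
    }

    public static string ComputeSha256(string path)
    {
        using (var sha = SHA256.Create())
        using (var stream = File.OpenRead(path))
            return BitConverter.ToString(sha.ComputeHash(stream))
                               .Replace("-", "");
    }
}
```

If the question's hash field uses a different algorithm (MD5, SHA-1), swap the `SHA256.Create()` call accordingly.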