There is a list lst containing URLs. I make GET requests over it with Parallel.For and process the data. As I understand it, the items are taken from the list in an unpredictable order, so after a failure I cannot simply resume where I left off. Is there a way to make Parallel.For take the items one by one, or some other way to save the loop position shared by all threads?

  • They are taken in order, but they can complete out of order. What does "continue working normally" mean? How important is the sequence? Or would it be enough to simply remember which list items were processed successfully? - slippyk
  • Just remembering the successfully processed ones is enough - Rajab

3 answers

You can also do it with a single collection:

  private void button2_Click(object sender, EventArgs e)
  {
      var cd = new ConcurrentDictionary<Uri, EProgress>();
      cd.TryAdd(new Uri("http://1.com"), EProgress.Waiting);
      cd.TryAdd(new Uri("http://2.com"), EProgress.Waiting);
      cd.TryAdd(new Uri("http://3.com"), EProgress.Waiting);

      Parallel.ForEach(cd, item =>
      {
          try
          {
              cd[item.Key] = EProgress.Processing;
              // to do something
              cd[item.Key] = EProgress.Complete;
          }
          catch (Exception)
          {
              cd[item.Key] = EProgress.Error;
              throw;
          }
      });
  }

  public enum EProgress
  {
      Waiting,    // waiting to be processed
      Processing, // currently being processed
      Complete,   // finished successfully
      Error       // finished with an error
  }

After the run you can see which elements ended up in the Error state and what state the rest of them are in.
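A minimal sketch of that check (the urlsToRetry variable name is my own; it assumes the cd dictionary and EProgress enum from the snippet above). Note that if the loop body rethrows, Parallel.ForEach surfaces the exception as an AggregateException, so wrap the call in a try/catch before inspecting the dictionary:

  // sketch, assuming the cd dictionary and EProgress enum shown above:
  // everything that is not Complete (i.e. Error or still Waiting) needs another pass
  var urlsToRetry = cd.Where(p => p.Value != EProgress.Complete)
                      .Select(p => p.Key)
                      .ToList();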

    It seems to me that it makes sense to switch to task-based parallelism and not bother with low-level multithreading, since your workload is network-bound anyway.

    If ProcessUrl is an asynchronous function for processing a single address, we get the following trivial code:

     var resultTasks = lst.Select(ProcessUrl);
     var allResults = await Task.WhenAll(resultTasks);
     var successfulResults = allResults
         .Where(r => r != null)  // filter out the failed ones
         .Select(r => r.Value)   // go back to the plain byte type
         .ToList();              // and, finally, materialize the list

    According to the documentation, Task.WhenAll returns the results in the same order as the tasks were supplied, so allResults[i] corresponds to lst[i].
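    Since the order is preserved, the results can be matched back to their source URLs. Here is a minimal sketch of that idea (the Zip pairing is my illustration, assuming lst and ProcessUrl from this answer; it is not part of the original code):

     // sketch: remember which URLs were processed successfully,
     // relying on Task.WhenAll preserving the order of lst
     var allResults = await Task.WhenAll(lst.Select(ProcessUrl));
     var processedUrls = lst.Zip(allResults, (uri, result) => (uri, result))
                            .Where(pair => pair.result != null) // keep only the successes
                            .Select(pair => pair.uri)
                            .ToList();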


    Here is an example of what “single address processing” might look like:

     // processing of a single URL; may return null on failure
     async Task<byte?> ProcessUrl(Uri uri)
     {
         try
         {
             using (var client = new HttpClient())
             using (var stream = await client.GetStreamAsync(uri))
             {
                 // okay, now we have the stream
                 // the network-bound intensive counting is done right here
                 byte sumOfBytes = await SumBytesInStream(stream);
                 // the CPU-bound computation is offloaded to the thread pool
                 return await Task.Run(() => EncodeByte(sumOfBytes));
             }
         }
         catch (HttpRequestException)
         {
             return null; // failure, return null
         }
     }

    Helper functions:

     async Task<byte> SumBytesInStream(Stream s)
     {
         byte[] buf = new byte[65536];
         byte sum = 0;
         while (true)
         {
             var actuallyRead = await s.ReadAsync(buf, 0, buf.Length);
             if (actuallyRead == 0)
                 return sum;
             sum = (byte)(sum + buf.Take(actuallyRead).Sum(b => b));
         }
     }

     byte EncodeByte(byte b)
     {
         // lots of complicated cryptography goes on here
         return (byte)(b + 1);
     }

      It is better, of course, to operate with tasks: it comes out both simpler and faster. In my purely subjective view, it is worth storing the data in two collections: as soon as a request is processed successfully, its address is added to the collection of processed ones. Accordingly, use concurrent collections for this, for example ConcurrentQueue; see the sketch below.
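      A minimal sketch of that two-collection idea (the names pending, processed and ProcessUrlAsync are illustrative assumptions, not code from this answer): the URLs to do sit in a ConcurrentQueue, and every successfully handled address is pushed into a second concurrent collection, so after a failure only the unprocessed part has to be restarted.

       // sketch: a queue of pending URLs plus a bag of successfully processed ones;
       // ProcessUrlAsync stands for whatever actually downloads and handles one URL
       var pending = new ConcurrentQueue<Uri>(lst);
       var processed = new ConcurrentBag<Uri>();

       var workers = Enumerable.Range(0, 4).Select(async _ =>
       {
           while (pending.TryDequeue(out var uri))
           {
               try
               {
                   await ProcessUrlAsync(uri); // the actual request and data handling
                   processed.Add(uri);         // remember the successfully handled address
               }
               catch (HttpRequestException)
               {
                   // the queue still holds everything not yet taken, and 'processed'
                   // tells us what is already done, so a restart can skip those URLs
               }
           }
       });

       await Task.WhenAll(workers);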