@VladD, there isn't enough space in the comments, so I'm leaving this as an answer. I may have more questions on the same topic, so I'd rather not close it yet. I've just started at this company, and at first glance it looks quite serious. Judging by my first post, you may have overestimated the complexity of the task; I complicated it somewhat myself. In fact, they only asked for multithreading and GZip compression, nothing more. Splitting the data stream into blocks was my own idea, because without it the implementation is fairly trivial. I just wanted to stand out by taking on the harder version; I think that will count in my favor. And I really need this job right now.
UPD 1: So what should I do then, write these blocks straight to a file? I don't see any other option so far. The extension is known in advance, and I think the output location can be derived from the location of the file being compressed.
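Writing compressed blocks straight to the output file does work, as long as each block carries enough information for the decompressor to find block boundaries later, e.g. a length prefix. A minimal single-threaded sketch of that idea; `BlockWriter`, `BlockSize`, and the 4-byte length prefix are my own choices, not anything from the task's spec:

```csharp
using System;
using System.IO;
using System.IO.Compression;

class BlockWriter
{
    const int BlockSize = 1 << 20; // 1 MiB of raw data per block (arbitrary)

    public static void Compress(string inputPath, string outputPath)
    {
        using var input = File.OpenRead(inputPath);
        using var output = File.Create(outputPath);
        var buffer = new byte[BlockSize];
        int read;
        while ((read = input.Read(buffer, 0, buffer.Length)) > 0)
        {
            // Compress one block into memory first, so we know its size.
            using var ms = new MemoryStream();
            using (var gz = new GZipStream(ms, CompressionMode.Compress, leaveOpen: true))
                gz.Write(buffer, 0, read);
            var block = ms.ToArray();
            // The length prefix lets the decompressor locate block boundaries.
            output.Write(BitConverter.GetBytes(block.Length), 0, 4);
            output.Write(block, 0, block.Length);
        }
    }
}
```

Decompression then reads the 4-byte length, reads that many bytes, and feeds them to a `GZipStream` in `Decompress` mode, block by block.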
UPD 2: Okay, today I'll try to implement packing the blocks one after another. I just can't make sense of that project's source code, it's hard going. If only they had at least left comments on the methods :(
UPD 3: I think I've written it, but I don't understand how to start a task inside the loop that compresses a block. Here is the code . I'm thoroughly confused by now. And yes, do I need to free the memory after a block has been compressed? The CompressChunk method needs rewriting; I've clearly botched something there...
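On starting a task per block in a loop: the usual pitfall is that all tasks end up sharing the same read buffer, so each iteration must hand its task a private copy of the data. As for freeing memory, in .NET you don't free it by hand; once nothing references a raw chunk anymore, the garbage collector reclaims it. A sketch under those assumptions (`ParallelCompressor` and `CompressAll` are my names, and this `CompressChunk` is only my guess at what yours should look like, since I can't see your linked code):

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.IO.Compression;
using System.Threading.Tasks;

class ParallelCompressor
{
    public static byte[] CompressChunk(byte[] chunk)
    {
        using var ms = new MemoryStream();
        // Dispose the GZipStream before reading ms, so all data is flushed.
        using (var gz = new GZipStream(ms, CompressionMode.Compress, leaveOpen: true))
            gz.Write(chunk, 0, chunk.Length);
        return ms.ToArray();
    }

    public static byte[][] CompressAll(Stream input, int blockSize)
    {
        var tasks = new List<Task<byte[]>>();
        var buffer = new byte[blockSize];
        int read;
        while ((read = input.Read(buffer, 0, buffer.Length)) > 0)
        {
            // Copy the chunk: the task must own its data, otherwise the next
            // iteration would overwrite the shared buffer before the task runs.
            var chunk = new byte[read];
            Array.Copy(buffer, chunk, read);
            tasks.Add(Task.Run(() => CompressChunk(chunk)));
        }
        Task.WaitAll(tasks.ToArray());
        // Tasks were added in read order, so the blocks stay in sequence.
        var results = new byte[tasks.Count][];
        for (int i = 0; i < tasks.Count; i++) results[i] = tasks[i].Result;
        return results;
    }
}
```

Note that this version starts one task per block with no cap, so for a huge file it can hold the whole file in memory at once; that is exactly why a bounded Producer/Consumer scheme was suggested.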
UPD 4: My strength is exhausted, and so is my hope of finishing everything in time. I've sent the letter along with everything I managed to write. @VladD, thank you so much for your help and explanations; without them I'd still be stuck in place. Thanks to @Veiked as well. I'll finish this thing some other time.
"I have a question: will the ordinary thread functions be enough for this task, or is it better to use, say, TPL?" The question should rather be the other way around :) TPL is a convenience wrapper. For the second part, you can try Producer/Consumer (look in the Исследования ("Research") section on the website). The size of your "equal" parts depends on the amount of memory: split the file into parts small enough to fit in RAM (and, accordingly, there should be a limit on the number of parts processed simultaneously). - Veikedo
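Veikedo's suggestion of Producer/Consumer with a limit on simultaneously processed parts can be sketched with a bounded `BlockingCollection<T>`: the producer reads raw blocks and blocks itself when the queue is full, so RAM usage stays capped. All names below (`BoundedPipeline`, `Run`, and the length-prefix output format) are my own illustration, not anything from the site's section:

```csharp
using System;
using System.Collections.Concurrent;
using System.IO;
using System.IO.Compression;
using System.Threading.Tasks;

class BoundedPipeline
{
    // boundedCapacity caps how many raw blocks can wait in memory at once —
    // this is the RAM limit Veikedo mentions.
    public static void Run(Stream input, Stream output, int blockSize, int boundedCapacity)
    {
        using var queue = new BlockingCollection<(int Index, byte[] Data)>(boundedCapacity);

        var producer = Task.Run(() =>
        {
            var buf = new byte[blockSize];
            int read, index = 0;
            while ((read = input.Read(buf, 0, buf.Length)) > 0)
            {
                var chunk = new byte[read];
                Array.Copy(buf, chunk, read);
                queue.Add((index++, chunk)); // blocks here when the queue is full
            }
            queue.CompleteAdding();
        });

        // A single consumer keeps blocks in order automatically; several
        // consumers would have to reorder blocks by Index before writing.
        foreach (var item in queue.GetConsumingEnumerable())
        {
            using var ms = new MemoryStream();
            using (var gz = new GZipStream(ms, CompressionMode.Compress, leaveOpen: true))
                gz.Write(item.Data, 0, item.Data.Length);
            var block = ms.ToArray();
            output.Write(BitConverter.GetBytes(block.Length), 0, 4);
            output.Write(block, 0, block.Length);
        }
        producer.Wait(); // observe any producer exception
    }
}
```

At most `boundedCapacity` uncompressed blocks plus the one being compressed are alive at any moment, so memory use is roughly `(boundedCapacity + 1) * blockSize` regardless of file size.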