I wrote several methods: read the file in parts, compress each part, put the ready blocks into a queue, and then write the blocks out to a file. I am using data compression, yet the output file is larger than the original. Why?

using System;
using System.Collections.Generic;
using System.IO;
using System.IO.Compression;

namespace ConsoleApplication57
{
    class Program
    {
        static void Main(string[] args)
        {
            string path = @"d:\Black Widow.m4a";
            string path_compres = @"d:\Compress.gz";
            // queue of raw data blocks
            Queue<KeyValuePair<int, byte[]>> queue_block = new Queue<KeyValuePair<int, byte[]>>();
            // queue of ready (compressed) blocks
            Queue<KeyValuePair<int, byte[]>> readyQueue = new Queue<KeyValuePair<int, byte[]>>();

            // open the stream and enqueue the blocks
            using (var fs = new FileStream(path, FileMode.Open))
                foreach (KeyValuePair<int, byte[]> block in Read_Blockk(fs))
                    queue_block.Enqueue(block);

            // compress each block and move it to the ready queue
            while (queue_block.Count > 0)
            {
                var block = queue_block.Dequeue();
                var compressionBlock = COmpress(block.Key, block.Value);
                readyQueue.Enqueue(compressionBlock);
            }

            // append the ready blocks to the output file
            while (readyQueue.Count != 0)
                Write_Final_File(path_compres, readyQueue.Dequeue().Value);

            Console.ReadKey();
        }

        public static IEnumerable<KeyValuePair<int, byte[]>> Read_Blockk(Stream stream)
        {
            const int size_block = 1024 * 1024; // buffer size = 1 MB
            int index = 0;                      // block number
            while (stream.Position < stream.Length)
            {
                // allocate the buffer for this block
                byte[] buffer = new byte[Math.Min(size_block, stream.Length - stream.Position)];
                stream.Read(buffer, 0, buffer.Length);
                yield return new KeyValuePair<int, byte[]>(index++, buffer);
            }
        }

        public static KeyValuePair<int, byte[]> COmpress(int index, byte[] block)
        {
            using (var ms = new MemoryStream())
            using (var gzStream = new GZipStream(ms, CompressionMode.Compress))
            {
                gzStream.Write(block, 0, block.Length);
                gzStream.Close(); // flush the gzip trailer before reading ms
                return new KeyValuePair<int, byte[]>(index, ms.ToArray());
            }
        }

        public static void Write_Final_File(string path, byte[] ReadyBlock)
        {
            using (var fsWrite = new FileStream(path, FileMode.Append, FileAccess.Write))
                fsWrite.Write(ReadyBlock, 0, ReadyBlock.Length);
        }
    }
}
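One side effect of the per-block design above is that every block pays its own gzip framing (roughly a 10-byte header and 8-byte trailer) and restarts the deflate dictionary from scratch. A minimal standalone sketch (class and method names are mine, not from the question) compares compressing data block by block against compressing it as one stream:

```csharp
using System;
using System.IO;
using System.IO.Compression;

class BlockVsStreamDemo
{
    // Compress a byte[] with a single GZipStream and return the result.
    public static byte[] Gzip(byte[] data)
    {
        using (var ms = new MemoryStream())
        {
            using (var gz = new GZipStream(ms, CompressionMode.Compress))
                gz.Write(data, 0, data.Length);
            // MemoryStream.ToArray() still works after the stream is closed.
            return ms.ToArray();
        }
    }

    static void Main()
    {
        // 100 KB of very compressible text, split into 1 KB blocks.
        byte[] data = new byte[100 * 1024];
        for (int i = 0; i < data.Length; i++) data[i] = (byte)('a' + i % 4);

        // Per-block compression: every block carries its own gzip
        // header/trailer and a fresh deflate dictionary.
        long perBlockTotal = 0;
        for (int off = 0; off < data.Length; off += 1024)
        {
            var block = new byte[1024];
            Array.Copy(data, off, block, 0, 1024);
            perBlockTotal += Gzip(block).Length;
        }

        // Whole-stream compression: one header, one trailer, one dictionary.
        long wholeStream = Gzip(data).Length;
        Console.WriteLine($"per-block total = {perBlockTotal}, whole stream = {wholeStream}");
    }
}
```

With 1 MB blocks, as in the question, this framing overhead is small (tens of bytes per megabyte), so it does not by itself explain a larger output file, but it always works against the per-block approach.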
  • Three options. 1) The file is small: besides the compressed data, headers are also written, and headers take space. 2) The content cannot be compressed, or you are trying to compress data that is already compressed (already-compressed formats such as jpg or mp3 usually do not shrink any further). 3) A plain archiver glitch: it builds an extended table instead of a short one. I have run into such a glitch myself. - nick_n_a
  • @nick_n_a I am compressing an mp3 and everything compresses; I can even decompress it and it plays back :)) Only the size of the resulting file bothers me. What is better to test compression on, i.e. what kind of data will compress well? - Vladimr Vladimirovoch
  • Uh, you are confusing rar and zip. If you compressed something with rar and then complain that zip does not compress it, that is not logical: zip is somewhat weaker. In that case take a rar library (if you can actually get one) and compress with its strongest compression method (BEST). - nick_n_a
  • Questions should include the desired behavior and a specific problem or error. Questions without an explicit description of the problem are useless to other visitors. What you compress, how you compress it, what you get, what you expected to get, and why you expected exactly that and nothing else - these are all necessary details. - Kromster
  • Instead of reinventing the wheel: SharpCompress :) There is no deep sense in writing yet another wrapper around the native Compress; if compression quality is required, you should take the appropriate tools. - NewView

1 answer

If you believe MSDN:

The compression functionality in DeflateStream and GZipStream is exposed as a stream. Data is read on a byte-by-byte basis, so it is not possible to perform multiple passes to determine the best method for compressing entire files or large blocks of data. DeflateStream and GZipStream are best used on uncompressed sources of data. If the source data is already compressed, using these classes may actually increase the size of the stream.
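This is exactly the situation in the question: an .m4a file is already compressed audio, so its bytes are close to random and deflate cannot shrink them, while each GZipStream still adds its own header and trailer. A minimal sketch (class and method names are mine, for illustration) reproduces the effect with pseudo-random bytes standing in for the already-compressed input:

```csharp
using System;
using System.IO;
using System.IO.Compression;

class GZipOverheadDemo
{
    // Compress a byte[] with a single GZipStream and return the result.
    public static byte[] Gzip(byte[] data)
    {
        using (var ms = new MemoryStream())
        {
            using (var gz = new GZipStream(ms, CompressionMode.Compress))
                gz.Write(data, 0, data.Length);
            return ms.ToArray();
        }
    }

    static void Main()
    {
        // 1 MB of pseudo-random bytes: incompressible, like an .m4a payload.
        var random = new byte[1024 * 1024];
        new Random(42).NextBytes(random);

        byte[] packed = Gzip(random);

        // Deflate falls back to "stored" blocks for high-entropy input
        // (a few bytes of framing per block), and the gzip wrapper adds
        // its own header and trailer, so the output is LARGER than the input.
        Console.WriteLine($"in = {random.Length}, out = {packed.Length}");
    }
}
```

So the fix is not in the queueing code: GZipStream works as intended, and already-compressed media simply has nothing left for it to compress.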