Task: damage a file so that it becomes impossible to read (and cannot be successfully recovered if it is later deleted programmatically on the same machine, using tools available to an average user searching Google).
Hiding the fact that the file was overwritten is not of interest. The size and number of files can be anything (they can be determined at runtime).
Storage type: any (can be determined at runtime).
File type: any.
What algorithm offers a good balance of speed and effectiveness?
The first thing that came to mind was to open the file as a stream and walk through it at a fixed step, "poking" in zero bytes. What can be improved?
C# code:
private void FileDamage(Stream stream)
{
    using (stream)
    {
        const int upper = 1000;
        var step = stream.Length / upper;
        if (step == 0) step = 1;
        // Seek to absolute offsets: a relative seek ignores the +1 position
        // advance from each WriteByte, so the writes drift and eventually
        // land past the end of the file, extending it instead of damaging it.
        for (long i = 0; i < upper && i * step < stream.Length; i++)
        {
            stream.Seek(i * step, SeekOrigin.Begin);
            stream.WriteByte(0);
        }
    }
}

P.S. I do not want to overwrite the file completely (let alone multiple times). The question is what is sufficient in this case.
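For comparison, a variant I am also considering: instead of scattered single zero bytes (which many parsers tolerate, and which are easy to repair by hand), overwrite the header region plus a handful of random-offset blocks with random data. This is only a minimal sketch; the 4096-byte header size and the 16 blocks of 512 bytes are assumptions, not tested values.

using System;
using System.IO;
using System.Security.Cryptography;

// Sketch: corrupt the header region plus a few random blocks.
// Header size (4096) and block count/size (16 x 512) are assumptions.
static void DamageFile(string path)
{
    using (var rng = RandomNumberGenerator.Create())
    using (var stream = new FileStream(path, FileMode.Open, FileAccess.ReadWrite))
    {
        // Most container formats keep the metadata needed to open the
        // file in the first few kilobytes, so trash the header first.
        var header = new byte[(int)Math.Min(4096, stream.Length)];
        rng.GetBytes(header);
        stream.Write(header, 0, header.Length);

        // Then trash a handful of blocks at random offsets.
        var block = new byte[512];
        var offsets = new Random();
        for (int i = 0; i < 16 && stream.Length > block.Length; i++)
        {
            long offset = (long)(offsets.NextDouble() * (stream.Length - block.Length));
            rng.GetBytes(block);
            stream.Seek(offset, SeekOrigin.Begin);
            stream.Write(block, 0, block.Length);
        }
        stream.Flush(true); // flush to disk, not just to the OS cache
    }
}

The idea is that formats like ZIP, JPEG, PDF, and Office documents keep the structures needed to open the file near its start, so corrupting the header alone already defeats a normal open, while the random blocks make partial carving less useful. Whether this is "sufficient" is exactly my question.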