I use a backup script (cited below). After moving to another hosting provider a problem appeared (most likely a lack of memory): the script fails with an error on an arbitrary file while writing to the archive. The script is recursive. Besides creating the archive, it offers it for download and deletes old versions from the server when there are more than three.

Please help me optimize this script (please don't suggest backups using server tools; this particular approach is required). Link to the script file

Closed because the essence of the question is unclear to the participants Dmitriy Simushev, aleksandr barakin, Streletz, D-side, and sercxjo, 31 May '16 at 12:33.

Try to write more detailed questions. To get an answer, explain exactly where you see the problem, how to reproduce it, what result you expect, and so on. Give an example that clearly demonstrates the problem. If the question can be reformulated according to the rules set out in the help center, edit it.

  • The recursive script by Marvin Menzerath (2012-2014) was used as a basis. - DaVinchi
  • You should post the script somewhere it can be viewed in full in the browser. Few people want to download anything. What do the logs say? - naym
  • Everything related to the question should be in the question itself. Links can only serve as a supplement. - Dmitriy Simushev

2 answers

There are several possible solutions:
1. Most likely it is hitting a timeout; increase the timeout.
2. Set up git via cron to auto-commit and push to a remote.
3. Or accept that your cheap (or even free) hosting is the problem, and optimizing anything for it is unrealistic.
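A minimal sketch of the cron-plus-git idea from point 2 (the path /var/www/site, the branch name, and the 03:00 schedule are assumptions; a git repository with a configured remote named origin must already exist in that directory):

```shell
# crontab entry: commit and push the whole site every night at 03:00
0 3 * * * cd /var/www/site && git add -A && git commit -m "auto backup" && git push origin master
```

This replaces the zip archive entirely: the remote repository keeps the history, so old versions don't have to be rotated by hand.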

Add output to the script:

function zipData($source, $destination) {
    if (extension_loaded('zip')) {
        if (file_exists($source)) {
            $zip = new ZipArchive();
            echo 'create zip archive'.PHP_EOL;
            if ($zip->open($destination, ZIPARCHIVE::CREATE)) {
                $source = realpath($source);
                echo realpath($source).PHP_EOL;
                if (is_dir($source)) {
                    $iterator = new RecursiveDirectoryIterator($source);
                    // skip dot files while iterating
                    $iterator->setFlags(RecursiveDirectoryIterator::SKIP_DOTS);
                    $files = new RecursiveIteratorIterator($iterator, RecursiveIteratorIterator::SELF_FIRST);
                    foreach ($files as $file) {
                        // echo $file . '<BR>';
                        echo $file.PHP_EOL;
                        if (strpos($file, 'backup') === false) {
                            $file = realpath($file);
                            if (is_dir($file)) {
                                $zip->addEmptyDir(str_replace($source . '/', '', $file . '/'));
                            } else if (is_file($file)) {
                                $zip->addFromString(str_replace($source . '/', '', $file), file_get_contents($file));
                            }
                        }
                    }
                } else if (is_file($source)) {
                    $zip->addFromString(basename($source), file_get_contents($source));
                }
            }
            return $zip->close();
        }
    }
    return false;
}

and run it with php -f backup.php, where backup.php is your script. Then we can see where and why it fell over. You should also add error_reporting(E_ALL); and ini_set('display_errors', '1'); at the beginning of the script so that errors are displayed during execution.
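As a sketch, the same diagnostics can be switched on from the command line without editing the script at all (the -d flags override php.ini settings for this run only; backup.php is the script name assumed above):

```shell
# run the backup with all errors reported and printed to the console
php -d error_reporting=E_ALL -d display_errors=1 -f backup.php
```

If the process is being killed from outside (for example by the kernel OOM killer rather than a PHP limit), nothing will be printed; in that case check the server logs after the run.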

  • Hosting: digitalocean.com (top tariff). It is important to me that this particular script works! The problem is not a timeout or memory: the script gets killed after roughly 3-10 seconds (it varies). These problems appeared after moving to this hosting. If you have any optimization suggestions, please post code. - DaVinchi

If the problem really is the script running out of memory, you can try increasing the memory_limit value. You can also split the work into several parts and run them separately.
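For example, a one-off run with a raised limit might look like this (a sketch: the 512M value and the backup.php name are assumptions, so pick a limit your host actually allows):

```shell
# run once with a higher memory limit; -d memory_limit=-1 would remove the limit entirely
php -d memory_limit=512M -f backup.php
```

Splitting the work would mean calling zipData separately for each top-level subdirectory (one process per part), so no single run has to hold the whole site in memory.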