It so happened that I need to download videos by direct links from other sources. We will receive the links from users and process them by downloading the content they point to.

The project's main language is PHP. What is the most efficient way to download the videos to the server?

1. system( 'wget http://www.php.ru/mp4.mp4' );

2. $handle = fopen($name, 'wb'); fwrite($handle, $file); fclose($handle);

3. file_put_contents ( "filexxx.mp4", file_get_contents ( 'file.mp4' ));

File sizes go up to 500 MB.

These are just a few options, but they seem to me to be representative of this class of methods.

    3 answers

    There is no need to solve this problem in PHP alone.

    Use a task queue (e.g. Gearman). You push a "task" into it, in this case a link to the file. A worker, a bash script in this setup, runs constantly and waits for a new task to arrive, downloads the specified file with wget, and when it finishes writes the URL, the path, and the date/time into the database.

    Your front end can then poll via ajax every second to check whether the file has finished downloading or not.
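
    As a rough illustration of the queue side, here is a minimal PHP sketch. It assumes the PECL gearman extension and a gearmand server on localhost; the function name download_video, the target directory, and the URL are placeholders:

     // Producer (web request): only registers the task and returns immediately.
     $client = new GearmanClient();
     $client->addServer('127.0.0.1', 4730);
     $client->doBackground('download_video', 'http://example.com/file.mp4');

     // Worker (separate CLI process): waits for tasks and runs wget for each one.
     $worker = new GearmanWorker();
     $worker->addServer('127.0.0.1', 4730);
     $worker->addFunction('download_video', function (GearmanJob $job) {
         $url    = $job->workload();
         $target = '/var/videos/' . basename(parse_url($url, PHP_URL_PATH));
         // wget streams straight to disk, so the PHP worker stays small in memory;
         // once it finishes you would record the url, path and date/time in the database.
         exec('wget -q -O ' . escapeshellarg($target) . ' ' . escapeshellarg($url));
     });
     while ($worker->work());

    The front end then only has to check, for example with an ajax request, whether the record for that URL has appeared in the database.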


    wget does not eat memory. Here is how to check (idea and code are from here ). Open two terminals to the server. In one, prepare a bash script like the following and run it:

     while true
     do
         ps -eo size,command | awk '/wget/&&!/ps.*awk/{print $1, "KB", $2}'
         sleep 5
     done

    It will show, every 5 seconds, the memory taken by the wget process. In the second terminal, start downloading some large file with wget. For example, I took a Debian distribution image (3.7 GB):

     wget http://cdimage.debian.org/debian-cd/8.3.0/i386/iso-dvd/debian-8.3.0-i386-DVD-1.iso 

    In the first window I consistently saw the same modest memory usage:

     ./test.sh
     428 KB wget
     428 KB wget
     428 KB wget
     428 KB wget
     428 KB wget
     428 KB wget
     428 KB wget

    If you are downloading to a partition that lives in memory, then yes, memory may get used up, but by the file being downloaded, not by wget.

    • From what I have seen, RAM gets filled up even when downloading with wget from bash. If that is not the case, I will accept this answer as more correct than mine and mark it accordingly (disputed). - Vlad
    • Where have you seen such misinformation? I updated the answer after running the experiment myself. - Sergiks
    • "If you are downloading to a partition that lives in memory, then yes, it may eat memory, but not wget" is what I meant (I just did not phrase it correctly). I observed it on my hardware while downloading. I withdraw the objection)) - Vlad

    Hardly any of these options is viable unless you have plenty of free RAM and you are the only one running the script. Otherwise you will sooner or later be disappointed in the form of a server crash due to a lack of RAM or CPU power.

    I solved this problem with ordinary chunk-by-chunk reading of the file (250 MB in my case). The only thing that is advisable is to run it through exec, or to create a bridge script that does the work (see the launcher sketch after the code below).

    The solution itself is taken straight from the php.net manual and proved to work perfectly. It has been changed a little, but not in essence, and you can optimize it further.

     $handle = fopen($url, "r");          // $url is the link we are downloading from
     $res_f  = fopen('result.mp4', 'a+'); // the file we save into
     while (!feof($handle)) {
         $buffer = fgets($handle, 4096);  // read the source in small chunks
         fwrite($res_f, $buffer);         // append the chunk to the result file
     }
     fclose($handle);
     fclose($res_f);
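
    To run this in the background as suggested above, here is a hedged sketch of a launcher via exec; the script name download.php is an assumption, and it would contain a loop like the one just shown:

     // The web request only starts the downloader and returns at once;
     // output is discarded and the trailing & detaches the process.
     $url = 'http://example.com/file.mp4'; // link received from the user
     $cmd = 'php ' . escapeshellarg(__DIR__ . '/download.php') . ' '
          . escapeshellarg($url) . ' > /dev/null 2>&1 &';
     exec($cmd);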

      I use file_put_contents for such purposes and recommend it to you (a short sketch follows after the comments below).

      • But with huge files, doesn't memory / the system get clogged up? After all, it is quite likely that with several simultaneous requests everything will fall over, or am I wrong? - Vlad
      • @Maksym, if the server is not particularly powerful, there will be a lot of requests, and the files are large, then obviously wget is the better choice. - Rammsteinik
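
      For reference, here is a small sketch of the file_put_contents approach mentioned above (URL and file names are placeholders). The second variant passes an open stream, which lets PHP copy the data in chunks instead of holding the whole file in memory:

       // Simple form: reads the entire remote file into memory first.
       file_put_contents('result.mp4', file_get_contents('http://example.com/file.mp4'));

       // Stream form: file_put_contents also accepts a stream resource and copies it chunk by chunk.
       $src = fopen('http://example.com/file.mp4', 'r');
       file_put_contents('result.mp4', $src);
       fclose($src);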