I have a large table A of 4 GB (46,549,259 rows), and I need to fill table C using selections from other tables. The indexes on table A are in place. One query takes 1,657 s.

I tried to write a daemon following this article: http://bx-cert.ru/advices/50/realizatsiya-prostogo-demona-na-php/

I have a CentOS virtual machine. I'm afraid to start a daemon on the hosting, I'm not confident enough yet... So I copied the code:

<?php
$stop = false;

/**
 * pcntl_fork() forks the current process.
 */
$pid = pcntl_fork();

if ($pid == -1) {
    /**
     * The fork failed; report it to the console.
     */
    die('Error fork process' . PHP_EOL);
} elseif ($pid) {
    /**
     * Only the parent process enters this branch; we kill it and report that to the console.
     */
    die('Die parent process' . PHP_EOL);
} else {
    /**
     * Infinite loop
     */
    while (!$stop) {
        /*
         * Daemon body
         */
        for ($i = 0; $i < 10; $i++) {
            file_put_contents('/var/data/tmp/' . $i . '.txt', time());
            sleep(2);
        }
    }
}

/**
 * Make the child process the session leader; this is needed for spawning further processes.
 */
posix_setsid();
?>

and saved it as daemon.php. I put it into the /var/data/ folder on the virtual machine. There was no data folder, so I created it. Then in the console I ran /usr/bin/php /var/data/daemon.php & > /dev/null. It showed the process and the word "Stopped", and the daemon itself did nothing: there are no 10 files in the /var/data/tmp/ folder. Why? If you could show me a working daemon example, I would be very grateful; I don't know what to do. For my case (transferring the table), is a single-process daemon like the one in the article enough, or do I need a multi-process daemon?
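P.S. Could the problem be the redirect coming after the &? As far as I understand, /usr/bin/php /var/data/daemon.php & > /dev/null backgrounds the script and then runs > /dev/null as a separate, empty command, so maybe it should be nohup /usr/bin/php /var/data/daemon.php > /dev/null 2>&1 & instead. I am not sure that this is what stops it, though.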

  • You have to be more careful with daemons: you have to watch out for memory leaks. And what's wrong with a script that runs from cron and processes 20-30 records per run? - naym
  • Thanks for the idea, I had given up hope of getting an answer. I did try running it under nohup. The script works through the table in LIMIT batches: with 46,549,259 rows in the table, I set the first pass at 300,000 rows, and the query processes those rows limit by limit. I ran out of disk space, as the log shows "Got error 28 from storage engine". I can't picture how to "process 20-30 records": is that a LIMIT in the query, or should the script itself handle only 20-30 records per run? - user191862
  • If the latter, do I understand correctly that the script should be run with parameters, along the lines of the batch sketch below? - user191862
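A minimal sketch of the cron-batch approach from the comments, assuming MySQL accessed through PDO and an auto-increment id on table A; the DSN, the column names (col1, col2), the layout of target table C, and the state file /var/data/batch.pos are all made up for illustration:

<?php
// Hypothetical cron-driven batch worker: copies a small slice of table A
// into table C on each run and remembers its position between runs.
$pdo = new PDO('mysql:host=localhost;dbname=test;charset=utf8', 'user', 'pass', [
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
]);

$stateFile = '/var/data/batch.pos';   // remembers where the previous run stopped
$lastId    = is_file($stateFile) ? (int)file_get_contents($stateFile) : 0;
$batch     = 30;                      // small batch, so each run finishes fast

// Walk the primary key instead of paging with OFFSET: on a 46M-row table
// "LIMIT x OFFSET y" gets slower as y grows, "WHERE id > :last" does not.
$select = $pdo->prepare('SELECT id, col1, col2 FROM A WHERE id > ? ORDER BY id LIMIT ?');
$select->bindValue(1, $lastId, PDO::PARAM_INT);
$select->bindValue(2, $batch, PDO::PARAM_INT);
$select->execute();

$insert = $pdo->prepare('INSERT INTO C (a_id, col1, col2) VALUES (?, ?, ?)');
foreach ($select as $row) {
    $insert->execute([$row['id'], $row['col1'], $row['col2']]);
    $lastId = (int)$row['id'];
}

// Persist the position; the next cron run continues from here.
file_put_contents($stateFile, $lastId);

Each run copies one small batch and records where it stopped, so nothing lives long enough to leak memory, and a crashed run simply resumes from the saved id. Passing the start id and batch size as CLI arguments instead of reading a state file would work the same way.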
