In general, as traffic grew, problems started on the hosting. As I understand it, the cause is the insufficient Entry processes parameter; at the moment this value is 30.

About the site: it is a PHP engine and, I think, creates an insignificant load (there are similar ones). It has a search form that uses curl to look up information on other sites and return it to the user.

I pestered the hoster about the 30/30 Entry processes; their answer: "The number of running processes can accumulate when scripts get stuck in a loop, or try to access some external resources that are currently unavailable."

My questions:

  1. Do the faults on the screenshot mean pages of the site that failed to load? That is, do 23,000 faults mean that users requested pages of the site and saw the 5xx Resource Limit Is Reached error exactly those 23,000 times?

  2. Do I understand correctly that users submitting the form create curl requests, and those requests produce these Entry processes? And that because the curl requests take a long time to get a response, the Entry processes do not finish in time?

Screenshots are attached below.

[Screen 1] [Screen 2]

    1 answer

    Do I understand correctly that users submitting the form create curl requests, and those requests produce these Entry processes? And that because the curl requests take a long time to get a response, the Entry processes do not finish in time?

    Without seeing your site it is difficult to say anything unequivocally. Is your hosting on CloudLinux? (i.e., is it a dedicated server with a dedicated amount of resources that only your site consumes, or do neighboring sites of other users consume resources too?) How many sites does the search query? Are you not using curl_multi?
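    (For reference, a minimal sketch of what curl_multi could look like here, assuming the search queries several remote sites per request; the URLs and the result-merging step are placeholders, not the asker's actual code:)

        <?php
        // Query several remote sites in parallel with curl_multi, so one entry
        // process waits once (for the slowest site) instead of once per site.
        $urls = [
            'https://example-site-1.test/search?q=term', // placeholder endpoints
            'https://example-site-2.test/search?q=term',
        ];

        $mh = curl_multi_init();
        $handles = [];
        foreach ($urls as $url) {
            $ch = curl_init($url);
            curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
            curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);
            curl_setopt($ch, CURLOPT_TIMEOUT, 10);
            curl_multi_add_handle($mh, $ch);
            $handles[] = $ch;
        }

        // Drive all transfers until every handle has finished.
        do {
            curl_multi_exec($mh, $running);
            curl_multi_select($mh, 1.0);
        } while ($running > 0);

        foreach ($handles as $ch) {
            $body = curl_multi_getcontent($ch);
            // ... merge $body into the combined search results ...
            curl_multi_remove_handle($mh, $ch);
            curl_close($ch);
        }
        curl_multi_close($mh);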

    If we assume the problem is in curl, then it is quite possible that, with no timeout set, the following situation arises:

    You have only 30 entry processes. A person opens the site and submits a query... The request goes out to a remote site (which is overloaded or having hosting problems)... You wait for its answer... that is 1 process hanging... Then this person refreshes the page, enters another query, and creates a second process... Then another 28 people come in and run a search within, say, 1-2 minutes... and you have hit the limit of 30 simultaneously running processes...

    Solution: kill the hanging processes. The faster you kill them, the less likely it is that within N seconds 30 people will all hit the overloaded site. (That is, two conditions must coincide here: #1, 30 people on your site are searching for something within N seconds, and #2, the remote site is overloaded and fails to answer within those same 10 seconds.) By reducing the wait time you are only postponing the problem, i.e., this is a temporary solution...

    Solutions:

    1) Increase the Entry processes limit

    2) Reduce the curl wait time (killing hanging processes)

    Try setting these parameters for curl:

        ini_set('max_execution_time', 20);            // cap the PHP script itself at 20 s
        curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10); // max 10 s to establish the connection
        curl_setopt($ch, CURLOPT_TIMEOUT, 10);        // max 10 s for the whole transfer
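    With those timeouts set, a timed-out request can be detected and failed fast, which is what actually frees the entry process. A hedged sketch (the $url and the error message are illustrative, not the asker's code):

        <?php
        // Fail fast when the remote site is slow instead of hanging the process.
        $ch = curl_init($url); // $url: the remote search endpoint (placeholder)
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10); // max 10 s to connect
        curl_setopt($ch, CURLOPT_TIMEOUT, 10);        // max 10 s for the whole transfer
        $body = curl_exec($ch);
        if ($body === false) {
            // curl_error() reports the reason (e.g. a timeout); exit immediately
            // so the entry process is released.
            $err = curl_error($ch);
            curl_close($ch);
            exit('The remote site is not responding (' . $err . '); please retry later.');
        }
        curl_close($ch);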

    3) I recommend fetching all the information from those sites via cron, so that users search over information you already pulled from those sites a couple of hours ago. I.e., once every 3-4 hours you pull all the information from the remote resources... All users then search that information on your side (without any external requests). Advantages of the approach: the load is several times lower, no traffic is consumed when users search, and the information is always available. No need to increase Entry processes, etc.
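    A minimal sketch of that approach, assuming MySQL via PDO; the table, credentials, source list, and cron schedule are made up for illustration:

        <?php
        // fetch_remote.php - run from cron, e.g.: 0 */4 * * * php /path/to/fetch_remote.php
        // Assumed schema: CREATE TABLE remote_items (
        //     id INT AUTO_INCREMENT PRIMARY KEY,
        //     source VARCHAR(255), payload TEXT, fetched_at DATETIME);
        $pdo = new PDO('mysql:host=localhost;dbname=site', 'user', 'pass');
        $stmt = $pdo->prepare(
            'INSERT INTO remote_items (source, payload, fetched_at) VALUES (?, ?, NOW())'
        );

        $sources = ['https://example-remote-1.test/export', 'https://example-remote-2.test/export'];
        foreach ($sources as $url) {
            $ch = curl_init($url);
            curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
            curl_setopt($ch, CURLOPT_TIMEOUT, 30); // cron can afford a longer timeout than a user request
            $payload = curl_exec($ch);
            curl_close($ch);
            if ($payload === false) {
                continue; // skip an unavailable source; the previous snapshot stays usable
            }
            $stmt->execute([$url, $payload]);
        }
        // The user-facing search then queries only remote_items - no external requests at all.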

    4) Check whether the remote site is available before each search request (or better, not before every one, but before every 5th-10th request). If the site is unavailable, set a flag that stays relevant for, say, 5-10 minutes; while the flag is relevant, do not run the search for any user, and show everyone the message "repeat the request after N minutes"...
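    A hedged sketch of that flag, file-based purely for illustration (the path, the every-10th-request probe, and the 5-minute window are assumptions):

        <?php
        // Before running a user search, consult a "remote site is down" flag.
        $flagFile = '/tmp/remote_down.flag'; // placeholder path
        $flagTtl  = 300;                     // the flag stays relevant for 5 minutes

        if (file_exists($flagFile) && time() - filemtime($flagFile) < $flagTtl) {
            exit('The remote site is overloaded; please repeat the request in a few minutes.');
        }

        // Probe availability only on roughly every 10th request to keep overhead low.
        if (random_int(1, 10) === 1) {
            $ch = curl_init('https://example-remote.test/'); // placeholder URL
            curl_setopt($ch, CURLOPT_NOBODY, true);          // headers only, no body download
            curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 3);
            curl_setopt($ch, CURLOPT_TIMEOUT, 3);
            $ok = curl_exec($ch) !== false;
            curl_close($ch);
            if (!$ok) {
                touch($flagFile); // raise the flag; searches are blocked for the next 5 minutes
                exit('The remote site is overloaded; please repeat the request in a few minutes.');
            }
        }
        // ... otherwise proceed with the normal curl search ...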

    Be sure to keep logs... recording how many visitors there were... at what time... what they submitted... maybe your search form was simply being hammered... And I hope you did not forget to put a captcha on the search?
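    For the logs, even a one-line append per search request is enough to see whether the form was being hammered (the log path and field name are illustrative):

        <?php
        // Append one line per search: timestamp, client IP, submitted query.
        $line = sprintf(
            "%s\t%s\t%s\n",
            date('Y-m-d H:i:s'),
            $_SERVER['REMOTE_ADDR'] ?? '-',
            $_POST['q'] ?? ''      // assumed name of the search field
        );
        file_put_contents('/var/log/site/search.log', $line, FILE_APPEND | LOCK_EX);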

    • Yes, it is CloudLinux, and I do not use curl_multi. What do you mean by killing processes? The data comes back quickly, in 1-2 seconds. Another thing is that I do not track the availability of the sites; maybe they were unavailable and that is why processes were hanging? Yes, I will try the timeouts, I have not used them yet. Cron will not work: the queries differ every time, though they sometimes repeat. I want to cache request-responses in the database (see the sketch after these comments), but for now I am rather weak in PHP and short on time. - Jean-Claude
    • Then set low timeouts... and if that does not help... you will need to check the site for availability before sending the request, and if the site is unavailable, block the search for all other users for N minutes... - Lesiuk Alexey
    • Entry processes is the number of processes hanging in parallel; that is, if everything is fine, the scripts run so quickly that each process dies within a split second and is free to run again... in your case all 30 processes are busy and not being released... - Lesiuk Alexey
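    Since the asker mentions wanting to cache request-responses in the database, here is a minimal sketch of that idea (the table, the 3-hour TTL, and the run_external_search() helper are hypothetical): key the cache by a hash of the normalized query, serve fresh hits locally, and only go out over curl on a miss.

        <?php
        // Hedged sketch of request-response caching in MySQL.
        // Assumed schema: CREATE TABLE search_cache (
        //     query_hash CHAR(32) PRIMARY KEY, response MEDIUMTEXT, cached_at DATETIME);
        $pdo  = new PDO('mysql:host=localhost;dbname=site', 'user', 'pass');
        $ttl  = 3 * 3600;                // reuse answers for 3 hours
        $q    = trim($_POST['q'] ?? ''); // assumed search field
        $hash = md5(mb_strtolower($q));  // normalize, then key by hash

        $stmt = $pdo->prepare(
            'SELECT response FROM search_cache WHERE query_hash = ? AND cached_at > ?'
        );
        $stmt->execute([$hash, date('Y-m-d H:i:s', time() - $ttl)]);
        $cached = $stmt->fetchColumn();

        if ($cached !== false) {
            echo $cached; // fresh hit: no external request, no hanging entry process
        } else {
            $response = run_external_search($q); // hypothetical: the existing curl search
            $pdo->prepare(
                'REPLACE INTO search_cache (query_hash, response, cached_at) VALUES (?, ?, NOW())'
            )->execute([$hash, $response]);
            echo $response;
        }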