Do I understand correctly that users submit the form, this fires off curl requests, and each of those occupies an entry process? And that because those curl requests take a long time to respond, the entry processes don't finish in time?
Without seeing your site it is hard to say anything definitive. Is your hosting on CloudLinux? (i.e. is it a dedicated server whose resources only your site consumes, or do neighbouring sites of other users share them?) How many remote sites are you querying? Are you using curl_multi?
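(In case you query several sites per search, here is a minimal curl_multi sketch; the URLs are made up and your request parameters will differ:)

$urls = [
    'https://example-shop-1.test/search?q=foo',
    'https://example-shop-2.test/search?q=foo',
];

$mh = curl_multi_init();
$handles = [];
foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);   // give up on unreachable hosts quickly
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);         // hard cap on the whole transfer
    curl_multi_add_handle($mh, $ch);
    $handles[$url] = $ch;
}

// Run all transfers concurrently instead of one after another.
do {
    $status = curl_multi_exec($mh, $running);
    if ($running) {
        curl_multi_select($mh);
    }
} while ($running && $status === CURLM_OK);

$results = [];
foreach ($handles as $url => $ch) {
    $results[$url] = curl_errno($ch) === 0 ? curl_multi_getcontent($ch) : null;
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);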
If we assume the problem is in cURL and no timeout is set, then the following situation is quite possible:
You have only 30 entry processes. A visitor comes to the site and submits a search. The request goes out to the remote site (which is overloaded or has hosting problems), you wait for its answer, and one process hangs. The visitor then refreshes the page, submits another query, and creates a second process. Then another 28 people come in and run a search within, say, 1-2 minutes, and you hit your limit of 30 simultaneously running processes.
Solution: kill the hanging processes. The faster you kill them, the less likely it is that within N seconds 30 people will all be waiting on the overloaded site (two conditions have to coincide here: 1) 30 people on your site are searching within those N seconds, and 2) the remote site is overloaded enough not to answer within those same 10 seconds). By shortening the wait time you are only postponing the problem, i.e. this is a temporary fix.
Solutions:
1) Increase the entry process limit.
2) Reduce the cURL wait time (so hung requests get killed off).
Try setting these parameters for CURL:
ini_set('max_execution_time', 20);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10); // give up if the connection is not established within 10 s
curl_setopt($ch, CURLOPT_TIMEOUT, 10);        // abort the whole request after 10 s
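(If you also want the code to notice that a request was cut off by the timeout, a minimal sketch; the URL and variable names are assumptions:)

$ch = curl_init('https://example-remote-site.test/search?q=foo'); // hypothetical remote URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10);
curl_setopt($ch, CURLOPT_TIMEOUT, 10);
$body = curl_exec($ch);
if ($body === false && curl_errno($ch) === CURLE_OPERATION_TIMEDOUT) {
    // the remote site did not answer in time -- fail fast and free the entry process
    $body = null;
}
curl_close($ch);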
3) I recommend pulling all the information from those sites via cron, so users search data you already fetched a few hours earlier. I.e. once every 3-4 hours you download everything from the remote resources, and all users search that data on your side, without any external requests. Advantages of this approach: the load is several times lower, no traffic is spent on user searches, and the information is always available. No need to increase the entry process limit, etc. (a rough sketch of such a cron script is below).
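(A minimal sketch of the cron fetcher; the source URLs, cache path and JSON format are assumptions, your data may live in a database table instead:)

// fetch_remote.php -- run from cron every 3-4 hours, e.g.:
// 0 */3 * * * php /var/www/site/fetch_remote.php

$sources = [
    'https://example-shop-1.test/export.json',
    'https://example-shop-2.test/export.json',
];

$items = [];
foreach ($sources as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10);
    curl_setopt($ch, CURLOPT_TIMEOUT, 60); // a slow fetch only blocks the cron job, not the users
    $body = curl_exec($ch);
    curl_close($ch);
    if ($body !== false) {
        $decoded = json_decode($body, true);
        if (is_array($decoded)) {
            $items = array_merge($items, $decoded);
        }
    }
}

// Atomically replace the local cache that the search script reads from.
file_put_contents('/var/www/site/cache/items.json.tmp', json_encode($items));
rename('/var/www/site/cache/items.json.tmp', '/var/www/site/cache/items.json');

The user-facing search then only reads items.json and never touches the remote sites.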
4) Check whether the remote site is reachable before each search request (or better, not before every one, but before every 5th-10th request). If the site is unavailable, set a flag that stays valid for, say, 5-10 minutes; while the flag is valid, do not run the external search for any user and show everyone a message asking them to repeat the request in N minutes (a possible implementation is sketched below).
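(A minimal sketch of that unavailability flag using a small file as the flag store; with APCu, Redis or a DB table the idea is the same. The paths, probe URL and timings are assumptions:)

function remoteIsAvailable(string $probeUrl, string $flagFile, int $flagTtl = 300): bool
{
    // If a recent "down" flag exists, skip the external request entirely.
    if (is_file($flagFile) && (time() - filemtime($flagFile)) < $flagTtl) {
        return false;
    }
    // Probe only occasionally (roughly every 10th search) to keep the overhead low.
    if (mt_rand(1, 10) !== 1) {
        return true;
    }
    $ch = curl_init($probeUrl);
    curl_setopt($ch, CURLOPT_NOBODY, true);      // HEAD-style request, we only need reachability
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);
    curl_setopt($ch, CURLOPT_TIMEOUT, 5);
    $ok = curl_exec($ch) !== false;
    curl_close($ch);
    if (!$ok) {
        touch($flagFile);                        // remember that the site is down for the next $flagTtl seconds
        return false;
    }
    return true;
}

// Usage in the search handler (hypothetical URL and message):
// if (!remoteIsAvailable('https://example-shop-1.test/', '/var/www/site/cache/remote_down.flag')) {
//     exit('The remote catalogue is temporarily unavailable, please retry in a few minutes.');
// }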
Be sure to keep logs that record how many visitors there were, at what time, and what they submitted; maybe your search form is simply being hammered. And I hope you didn't forget to put a CAPTCHA on the search form?
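(A minimal sketch of such a search log; the log path and the form field name are assumptions:)

$line = sprintf(
    "%s\t%s\t%s\n",
    date('c'),                          // when
    $_SERVER['REMOTE_ADDR'] ?? '-',     // who
    $_POST['query'] ?? ''               // what they submitted (hypothetical field name)
);
file_put_contents('/var/www/site/logs/search.log', $line, FILE_APPEND | LOCK_EX);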