
I have a website on shared hosting with an Entry Processes limit of 30. A PHP cron job periodically fetches data from another URL using cURL. The relevant code is below.

    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_CUSTOMREQUEST, "GET");
    curl_setopt($ch, CURLOPT_HTTPHEADER, $header);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10); // connect timeout, in seconds
    $response = curl_exec($ch);

Most of the time the website runs fine, with only around 4-5 entry processes in use. But whenever the target $url doesn't respond for some reason (which happens frequently), I quickly hit the Entry Processes limit and all further requests are denied.

CURLOPT_CONNECTTIMEOUT doesn't seem to work as expected. How can I avoid this situation? I have checked other cURL options, but none seem to help.

2 Comments

  • Have you forgotten to call curl_close($ch)? Commented Jan 22, 2020 at 10:30
  • I have closed cURL; I just forgot to include it in the question. Commented Jan 22, 2020 at 10:44

1 Answer


A 10-second connect timeout is very long. Depending on how many requests you serve, and whether every request triggers this call, all of your available processes could end up just waiting for answers from that server.

You could consider lowering that number.

cURL also has a second timeout, CURLOPT_TIMEOUT, which limits the total request time. Try setting it as well: if the connection to the server is established within 10 seconds but the server then takes 60 seconds to serve your request, your current timeout doesn't help, because it only limits the connect phase.
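
For illustration, a minimal sketch of the request with both timeouts set; the 5- and 15-second values are only examples, and $url and $header are the variables from the question:

    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);            // $url as in the question
    curl_setopt($ch, CURLOPT_HTTPHEADER, $header);  // $header as in the question
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);    // give up if connecting takes > 5s
    curl_setopt($ch, CURLOPT_TIMEOUT, 15);          // give up if the whole request takes > 15s
    $response = curl_exec($ch);
    if ($response === false) {
        // Log the error so hung or failed requests are visible.
        error_log('cURL failed: ' . curl_error($ch));
    }
    curl_close($ch);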

If you don't want to depend on what cURL is doing, you can also set a time limit for the PHP process itself with set_time_limit(). If you set it to, e.g., 30 seconds, PHP will stop execution after that time whether or not cURL is done. This should be done before the cURL calls.
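
As a rough sketch (the 30-second cap is an example value, and $url is the variable from the question):

    // Hard cap: per the advice above, PHP stops the script after 30 seconds
    // whether or not cURL has finished. Must run before the cURL calls.
    set_time_limit(30);

    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $response = curl_exec($ch);
    curl_close($ch);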


3 Comments

Thank you for the detailed answer. This script runs once per minute in the background; web requests don't trigger it. So even if it gets stuck, it should only take up one more entry process, yet all 30 entry processes fill up when the remote URL is unresponsive. What could be the cause?
This sounds like the requests stay pending for a very long time and the waiting processes pile up. Have you tried adding that second timeout? In theory cURL should have a default timeout and should not wait indefinitely, but you never know. You could also set a time limit for the PHP process itself with set_time_limit(). If you set that to, e.g., 30 seconds, PHP will stop execution after that time whether or not cURL is done.
Thank you for the second parameter. I have tested with it, and it seems the connection is dropped if the server doesn't respond. I am testing on the live site; the remote URL is currently working, so if it fails to respond again after some time, I will check. Thank you for your answer.
