
On my Linux server I have the following cron:

* * * * * php /var/www/core/v1/general-api/artisan schedule:run >> /dev/null 2>&1

The cron job runs correctly. I have a scheduled command defined in my Kernel.php as such:

    protected function schedule(Schedule $schedule)
    {
        $schedule->command('pickup:save')
            ->dailyAt('01:00');
        $schedule->command('queue:restart')->hourly();
    }

The scheduled task at 1AM runs my custom command php artisan pickup:save. The only thing this command does is dispatch a Job I have defined:

    public function handle()
    {
        $job = (new SaveDailyPropertyPickup());
        dispatch($job);
    }

So this job is dispatched, and since I am using the database driver for my queues, a new row is inserted into the jobs table.

Everything works perfectly up to here.

Since I need a queue listener to process the queue, and since it has to run basically forever, I start it like this:

nohup php artisan queue:listen --tries=3 &

This writes all the output from nohup to a file called nohup.out in my /home directory.
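(For reference, a process supervisor such as Supervisor is the usual way to keep a queue worker alive instead of nohup. The sketch below is a minimal Supervisor program entry, assuming the artisan path from the cron above; the program name, user, and log path are placeholders:)

    [program:laravel-queue]
    command=php /var/www/core/v1/general-api/artisan queue:work --tries=3
    autostart=true
    autorestart=true
    user=www-data
    redirect_stderr=true
    stdout_logfile=/var/log/laravel-queue.log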

What happens is this: The first time, queue is processed and the code defined in the handle function of my SaveDailyPropertyPickup job is executed.

AFTER it is executed once, my queue listener just exits. When I check the nohup.out logs, I can see the following error:

In Process.php line 1335:

  The process "'/usr/bin/php7.1' 'artisan' queue:work '' --once --queue='default' 
  --delay=0 --memory=128 --sleep=3 --tries=3" exceeded the timeout of 60 seconds.

I checked this answer, and it says to specify the timeout as 0 when starting the queue listener, but there are also answers recommending against this approach. I haven't tried it, so I don't know if it will work in my situation.

Any recommendations for my current situation?

The Laravel version is 5.4.

Thanks

• What does the job do? Maybe it takes longer than 60 seconds? Commented Mar 30, 2019 at 0:59

1 Answer


Call it with the --timeout parameter: figure out how long your job takes and scale from there.

nohup php artisan queue:listen --tries=3 --timeout=600

In your config you need to update retry_after; it has to be larger than the timeout, to avoid the same job running twice at the same time. Assuming you use beanstalkd:

    'beanstalkd' => [
        ...
        'retry_after' => 630,
        ...
    ],
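(Note that the question uses the database queue driver rather than beanstalkd; in that case the same setting lives on the database connection in config/queue.php. A sketch with matching values — adjust retry_after so it always exceeds your --timeout:)

    // config/queue.php — database connection
    'database' => [
        'driver' => 'database',
        'table' => 'jobs',
        'queue' => 'default',
        'retry_after' => 630,
    ],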

In more professional settings, I often end up with one queue for short-running jobs and one for long-running operations.
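(A sketch of that two-queue split, reusing the job class from the question; the queue name "long" and the timeout value are illustrative, not from the original post:)

    // In the command's handle(): push the slow job onto a dedicated queue
    $job = (new SaveDailyPropertyPickup())->onQueue('long');
    dispatch($job);

Then run one listener per queue, each with a timeout suited to that queue's jobs:

    nohup php artisan queue:listen --queue=default --tries=3 &
    nohup php artisan queue:listen --queue=long --tries=3 --timeout=1800 &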


Comments

Hi, thanks for the answer. The job in question takes quite a while: last I checked, at least 15-18 minutes. It has to query a DB, process the result set, then insert it into another DB. Another small issue with explicitly specifying the timeout is that, as the amount of data in the DB accumulates each day, the job will take longer and longer. So is it OK to, for example, specify a timeout like 1000?
I would create two queues: one for quick jobs and one for long-running jobs. A high timeout is not good on a quick job that crashes, since it then has to wait the full timeout before it can conclude it crashed. Another strategy could be to split your job into smaller jobs: say you want to parse 1000 elements, make a job for 10 of those elements and then create 100 jobs.
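(That splitting strategy could look roughly like this in the dispatching command; ProcessPickupChunk is a hypothetical job class, not from the original post:)

    // Hypothetical: dispatch one job per chunk of 10 elements
    foreach (array_chunk($elements, 10) as $chunk) {
        dispatch(new ProcessPickupChunk($chunk));
    }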
Understood. So is it bad practice to have an arbitrary, high timeout value on a job that takes a long time? Also, I updated the server with your suggested changes; I will let you know if the problem is solved at 1AM today (in 7 hours).
So is it bad practice to have an arbitrary, high timeout value on a job that takes a long time? Yes. I have seen production environments with problems where, because the timeout was 4 hours, it took a long while before a crashed job was detected and retried.
To update you: the scheduler works now and it doesn't time out anymore. The first run took 21 minutes, so I set the --timeout to 1350 and retry_after to 1400. It works now, but the time taken to complete will keep increasing as the amount of data increases. I tried to break it up into smaller jobs as you suggested, but unfortunately I can't break up the time-consuming part; the parts I CAN break up don't take that long. Anyway, you answered my original question, so thank you very much for your help.
