
My use case:

My application will have any number of Users, and each of those Users will have various Jobs they want to process. These jobs should be able to happen concurrently -- some of them, anyway.

So, for example, if User1 sends through a command like UpdateStock and CompleteSale, and User2 sends through a command like UpdateStock, each of those commands needs to be queued up and allowed to run concurrently with each other.

I've been looking into Laravel Queues and it looks like, if I were to put all 3 jobs into one queue, they would only run one at a time, but if I put them into 3 different queues, they'd all be able to run concurrently. Do I understand that right?

So that brings me to my next issue then: I'll need one queue, per user, per command. But of course new users are signing up all the time, and if I need to run php artisan queue:listen --queue=user1:command1 & php artisan queue:listen --queue=user1:command2 & php artisan queue:listen --queue=user2:command1 ... to get this to work then... how is that even possible? When a new user could be added any minute...

I mean I could write some php that executes php artisan queue:listen --queue=userx:commandx programmatically after every time I add something to the queue, but I'm not sure that makes sense. What if that command has already run -- will it have 2 processes trying to process the same queue?

Perhaps I'm thinking about it the wrong way though. I'm feeling kinda lost about the whole thing. I could use some advice.

2 Answers


To answer the first point: yes, if you have 3 queues, each will process concurrently, provided there are queue workers listening to each of them.

As for the rest of the scenario, it sounds like another approach is needed as

one queue, per user, per command

sounds a little overkill!

In my experience, I would suggest having a handful of queues, such as "low", "default" and "high" priority queues, with the higher-priority queues having more workers (and potentially a shorter wait time). You can configure these using a process manager like Supervisor if you are running in a Linux environment.

[program:high-priority-worker]
process_name=%(program_name)s_%(process_num)02d
command=php /var/www/app/artisan queue:work redis --sleep=3 --tries=3 --queue=high
autostart=true
autorestart=true
user=user
startsecs=0
numprocs=8
redirect_stderr=true
stdout_logfile=/var/log/www/app/worker.log

[program:default-priority-worker]
process_name=%(program_name)s_%(process_num)02d
command=php /var/www/app/artisan queue:work redis --sleep=3 --tries=3
autostart=true
autorestart=true
user=user
startsecs=0
numprocs=4
redirect_stderr=true
stdout_logfile=/var/log/www/app/worker.log

[program:low-priority-worker]
process_name=%(program_name)s_%(process_num)02d
command=php /var/www/app/artisan queue:work redis --sleep=3 --tries=3 --queue=low
autostart=true
autorestart=true
user=user
startsecs=0
numprocs=2
redirect_stderr=true
stdout_logfile=/var/log/www/app/worker.log
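
On the application side, routing a job to one of these queues is just a matter of naming the queue at dispatch time. A minimal sketch, assuming a hypothetical `UpdateStock` job class:

```php
<?php

use App\Jobs\UpdateStock; // hypothetical job class

// Send a time-sensitive job to the "high" queue that the
// 8 Supervisor workers above are watching...
UpdateStock::dispatch($productId)->onQueue('high');

// ...and a less urgent one to the "low" queue.
UpdateStock::dispatch($productId)->onQueue('low');
```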

If you are concerned about ensuring a sequence of jobs is executed in order, you may also want to look into Job Chaining (https://laravel.com/docs/6.x/queues#job-chaining).
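
A minimal sketch of chaining in the Laravel 6.x syntax, using made-up job class names matching your example:

```php
<?php

use App\Jobs\UpdateStock;   // hypothetical job classes
use App\Jobs\CompleteSale;

// CompleteSale only runs after UpdateStock finishes successfully;
// if UpdateStock fails, the rest of the chain is not executed.
UpdateStock::withChain([
    new CompleteSale,
])->dispatch();
```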


7 Comments

So quick question: say I set up Supervisor and set my high queue to have numprocs=8 -- does that mean that if I add 8 jobs to the 'high' queue, they will (or could) all be processed concurrently? Because they're all happening in one of the 8 processes dedicated to that queue?
Also, on your middle worker, you didn't specify a queue -- does that mean that that only processes jobs that were registered without specifying a queue, or specifying the default queue, and it won't process jobs that specify another queue?
So the first queue in Laravel is called "default" and is used when no queue is specified; you can be explicit if you want, but it's up to you. As for the processes, Supervisor would spawn 8 workers for the high-priority queue, and each would poll the queue every 3 seconds. This means you would have concurrent throughput for up to 8 jobs in the "high" queue. When a job is picked up, it should be locked by the worker that picked it up, but this was an issue in earlier versions of Laravel.
That's brilliant advice Spholt, I might end up going through this route. I thought my only routes for concurrency were (a) countless queues, or (b) using Guzzle async requests and trying to get real clever with them, but this looks like it's probably a better option than literally every other option I thought of.
I'm going to leave the question open for a while but I doubt I'll get an answer more useful than this. I'll mark this as correct in a couple hours.

if I were to put all 3 jobs into one queue, they would only run one at a time, but if I put them into 3 different queues, they'd all be able to run concurrently. Do I understand that right?

Yes, the jobs are processed one after another if you only have one queue (with a single worker).

What you are really talking about here is scaling.

The way I think most would go about this is to have a sufficient number of queues, and then be able to scale that up. What you are proposing yourself (starting a queue for every new user) is also an option. Having multiple listeners for the same queue gave me problems: I did it by mistake some years ago, and the jobs were not properly "locked", meaning the same job would be executed multiple times.

Laravel Horizon brings you balancing options, meaning you could have new jobs go into free queues. So you do not have to dedicate one queue per user.
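
For reference, a Horizon supervisor with auto-balancing is configured in `config/horizon.php` along these lines (the queue names and numbers here are illustrative, not a recommendation):

```php
// config/horizon.php (fragment)
'environments' => [
    'production' => [
        'supervisor-1' => [
            'connection' => 'redis',
            'queue'      => ['high', 'default', 'low'],
            'balance'    => 'auto', // shift processes toward busy queues
            'processes'  => 10,     // total worker pool to balance
            'tries'      => 3,
        ],
    ],
],
```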

In case you need total concurrency (the jobs have to start at the exact same time), balancing in that sense is probably not enough.

Having multiple queues update the same stock numbers sounds a little risky though. I could think of some database locking issues and possible race conditions.
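
One way to guard against that inside a job is a pessimistic lock around the stock update. A sketch using Eloquent's `lockForUpdate()`, assuming a hypothetical `Stock` model:

```php
<?php

use App\Stock; // hypothetical Eloquent model
use Illuminate\Support\Facades\DB;

DB::transaction(function () use ($sku, $quantitySold) {
    // SELECT ... FOR UPDATE: other workers touching this row
    // block until this transaction commits.
    $stock = Stock::where('sku', $sku)->lockForUpdate()->first();

    $stock->quantity -= $quantitySold;
    $stock->save();
});
```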

To give you a more specific recommendation, I would need to understand why you need one queue per user in your case/example.

2 Comments

To give you a more specific recommendation, I would need to understand why you need one queue per user in your case/example. -- the thinking is this: these jobs all do updates and fetches against APIs like Amazon, eBay and Shopify. Each of these APIs implements a 'leaky bucket': I can make, say, 40 calls immediately, but only one every half second after that. Each of my customers has a unique bucket, so User1, User2, ... UserN can all make requests to Amazon at the same time.
So I might even have one queue per user, per Channel -- a queue for User1Amazon, User2Amazon, User1Ebay, User2Ebay -- or a queue per user per job type -- User1UpdateStock, User2UpdateStock, User1UpdateOrder, User2UpdateOrder
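
Given that leaky-bucket constraint, an alternative to one queue per user per channel is a shared queue plus a per-user rate limiter. Laravel's `Redis::throttle` can express this inside a job's `handle()` method; the class, key name and limits below are made up for illustration, and the sketch assumes the class uses Laravel's usual queue traits so `release()` is available:

```php
<?php

namespace App\Jobs;

use Illuminate\Support\Facades\Redis;

class SyncAmazonStock // hypothetical job class
{
    public $userId;

    public function __construct($userId)
    {
        $this->userId = $userId;
    }

    public function handle()
    {
        // One bucket per user per channel, e.g. "throttle:42:amazon".
        Redis::throttle('throttle:'.$this->userId.':amazon')
            ->allow(2)->every(1) // at most 2 calls per second per user
            ->then(function () {
                // Safe to call the Amazon API here.
            }, function () {
                // Bucket full: put the job back on the queue for 10 seconds.
                $this->release(10);
            });
    }
}
```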
