Round-robin processing of jobs across queues

I have a number of users on a site who may request long-running tasks. These are, of course, handled with Hangfire. The issue is that we don't want User B to wait an unfair amount of time for their results just because User A submitted many jobs.

At the moment the site simply enqueues each new job in the default queue. Is it possible to configure multiple servers to pick jobs from one queue after another?

I've experimented with adding jobs to custom queues for each user. However, that means we'd need to spin up a new server for each user, as it doesn't appear possible to attach queues to a server once it's running?
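For reference, this is roughly what we have now and where the limitation seems to be (a minimal sketch; the connection string and the job body are placeholders for our real ones):

```csharp
using System;
using Hangfire;

public static class CurrentSetup
{
    public static void Configure()
    {
        GlobalConfiguration.Configuration.UseSqlServerStorage("<connection string>");

        // Every job currently ends up in the "default" queue.
        BackgroundJob.Enqueue(() => Console.WriteLine("long-running work"));

        // Queues are fixed in the options when the server is constructed, which
        // is why a queue per user appears to require a new server per user.
        var server = new BackgroundJobServer(new BackgroundJobServerOptions
        {
            Queues = new[] { "default" }
        });
    }
}
```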

What if you add a static set of queues, like priority-1, priority-2, …, priority-N, and assign each user to their own priority, so that a single server instance processes all of these queues?
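Something like this, as a rough sketch (I'm using underscores because queue names are normally restricted to lowercase letters, digits and underscores):

```csharp
using Hangfire;

public static class PrioritySetup
{
    public static BackgroundJobServer StartServer()
    {
        // A single server instance processing all of the static priority queues.
        return new BackgroundJobServer(new BackgroundJobServerOptions
        {
            Queues = new[] { "priority_1", "priority_2", "priority_3", "priority_4", "priority_5" }
        });
    }
}
```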

Thanks for the reply.

I did think of this. The problem is that the full set of queues would need to be defined when the server is started, and if two users get assigned to the same queue we're back to the same issue.

Ideally we'd have the ability for a server to subscribe to EVERY queue (or some filtered set) and then take one job from each queue in turn, regardless of when the jobs were added.

But what if you have a fixed number of queues:

  • priority-1
  • priority-2
  • priority-3
  • priority-4
  • priority-5

Each user is assigned to one of these static priorities, and there are no other priorities. Why wouldn't this work for you? How do you rank your users?
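As a sketch of the kind of assignment I mean (the modulo bucketing is purely illustrative; substitute whatever ranking you actually use):

```csharp
using Hangfire;
using Hangfire.States;

public static class UserPriorityRouting
{
    private static readonly BackgroundJobClient Client = new BackgroundJobClient();

    public static void EnqueueForUser(int userId, int uploadId)
    {
        // Purely illustrative: spread users evenly over the five static queues.
        // Replace with a real ranking if some users should come first.
        var queue = "priority_" + ((userId % 5) + 1);

        Client.Create(() => ProcessUpload(uploadId), new EnqueuedState(queue));
    }

    public static void ProcessUpload(int uploadId) { /* long-running work */ }
}
```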

All the users are equal, but suppose one user submits 500 jobs in one go and another user then submits only 1 job. We don't want it to be possible for the one-job user to have to wait for the 500 to be processed first.
We don't know how many jobs to expect or from how many users.
So with a fixed set of priority queues it still seems possible for a user wanting just one job to have to wait a long time.

Another approach: have, say, two queues, normal and low, and put all jobs in the normal queue by default. However, if a user exceeds a threshold, say 10 jobs, put their jobs in the low queue instead. What do you think?
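Roughly like this, as a sketch; the pending-job counter is something you would have to track yourself (shown here as an in-memory dictionary only for illustration):

```csharp
using System.Collections.Concurrent;
using Hangfire;
using Hangfire.States;

public static class ThresholdRouting
{
    private const int Threshold = 10;
    private static readonly BackgroundJobClient Client = new BackgroundJobClient();

    // Pending jobs per user; in practice you'd keep this somewhere durable and
    // decrement it when a job finishes.
    private static readonly ConcurrentDictionary<int, int> Pending = new ConcurrentDictionary<int, int>();

    public static void EnqueueForUser(int userId, int uploadId)
    {
        var pending = Pending.AddOrUpdate(userId, 1, (id, count) => count + 1);
        var queue = pending > Threshold ? "low" : "normal";

        // The server would listen on both: Queues = new[] { "normal", "low" }.
        Client.Create(() => ProcessUpload(uploadId), new EnqueuedState(queue));
    }

    public static void ProcessUpload(int uploadId) { /* long-running work */ }
}
```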

(Apologies for digging this back up)

Using a high/low priority system is probably not the best way to do things either, because a user's job may itself create additional jobs. (The job is a file upload that requires processing, but it might be a zipped file that triggers a job for each contained file.)
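To illustrate the problem: the parent job would have to carry its queue (or the user) along so that any jobs it spawns land in the same queue, something like this sketch (the method names are hypothetical):

```csharp
using System.IO;
using System.IO.Compression;
using Hangfire;
using Hangfire.States;

public static class UploadProcessing
{
    private static readonly BackgroundJobClient Client = new BackgroundJobClient();

    // The queue name travels with the job so that anything it spawns stays in
    // the same per-user queue as the parent.
    public static void ProcessUpload(string path, string queue)
    {
        if (Path.GetExtension(path) == ".zip")
        {
            using (var archive = ZipFile.OpenRead(path))
            {
                foreach (var entry in archive.Entries)
                {
                    // One child job per contained file, enqueued to the parent's queue.
                    Client.Create(
                        () => ProcessZipEntry(path, entry.FullName, queue),
                        new EnqueuedState(queue));
                }
            }
        }
        else
        {
            ProcessSingleFile(path);
        }
    }

    // Keeps the queue parameter in case it needs to spawn further jobs itself.
    public static void ProcessZipEntry(string zipPath, string entryName, string queue) { /* extract + process */ }

    public static void ProcessSingleFile(string path) { /* process */ }
}
```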

If I were to create a queue per user, I'm concerned about the repercussions:
If we have user-1, user-2 and user-3 queues, and queues are processed in priority order, doesn't that mean user-1's jobs will still take priority? Would the server execute all of user-1's tasks before starting on user-2's?

I am doing it slightly differently, but maybe it will help in your case. I create a queue per unique object, using a queue-name generator, so every user would probably end up with their own queue. Collisions are possible, of course, but not very probable (37 possible characters per position, 20 positions). If you prefix with usr_ you still have 16 positions, which is quite a lot of unique users :wink:
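The generator is nothing special, something along these lines (a sketch; the usr_ prefix and the 20-character length are just what I described above):

```csharp
using System.Linq;
using System.Security.Cryptography;
using System.Text;

public static class QueueNameGenerator
{
    // Queue names are limited to lowercase letters, digits and underscores:
    // 26 + 10 + 1 = 37 possible characters per position.
    private const string Alphabet = "abcdefghijklmnopqrstuvwxyz0123456789_";

    // 20 characters total: the "usr_" prefix plus 16 hash-derived characters.
    public static string ForUser(string userId)
    {
        using (var sha = SHA256.Create())
        {
            var hash = sha.ComputeHash(Encoding.UTF8.GetBytes(userId));
            var chars = hash.Take(16).Select(b => Alphabet[b % Alphabet.Length]);
            return "usr_" + new string(chars.ToArray());
        }
    }
}
```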

I start a regular Hangfire server for the default queue. In parallel I start a service that fetches all queues other than default and puts their names into an in-memory queue of its own. After that I do something like: while MyQueue.DeQueue succeeds, run a server for that queue for a given time, e.g. 2 seconds, then dispose the server, end while.

When the queue is empty, start all over.
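Roughly, the service loop looks like this (a sketch; it uses the monitoring API to discover queue names, and the 2-second slice is arbitrary):

```csharp
using System;
using System.Linq;
using System.Threading;
using Hangfire;

public static class RoundRobinService
{
    public static void Run(CancellationToken token)
    {
        while (!token.IsCancellationRequested)
        {
            // Discover every queue except "default", which the regular
            // always-on server is already handling.
            var queues = JobStorage.Current.GetMonitoringApi().Queues()
                .Where(q => q.Name != "default")
                .Select(q => q.Name)
                .ToList();

            if (queues.Count == 0)
            {
                Thread.Sleep(TimeSpan.FromSeconds(5));
                continue;
            }

            foreach (var queue in queues)
            {
                // Spin up a temporary server bound to a single queue, let it
                // work for a short slice, then dispose it and move on.
                using (new BackgroundJobServer(new BackgroundJobServerOptions
                {
                    Queues = new[] { queue },
                    WorkerCount = 1
                }))
                {
                    Thread.Sleep(TimeSpan.FromSeconds(2));
                }
            }

            // Once every queue has had a turn, the outer loop starts over.
        }
    }
}
```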