JobFilterAttribute Enforce Retry First

Is there a way to prevent processing of additional jobs in a queue if there is a job in retry for the same queue?

Example:
enqueue Job1 () => account.Update(500)
enqueue Job2 () => account.Update(-100)
enqueue Job3 () => account.Update(-100)

If Job1 fails for whatever reason, I don’t want Job2 and Job3 to process until Job1 either succeeds or exceeds its retry count.

I’d prefer not to have Job2 and Job3 fail and go into retry alongside it, if that can be avoided.

Setting PerformingContext.Canceled = true in OnPerforming() in a JobFilterAttribute just seemed to mark the job as successful.
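
For reference, the attempt looked roughly like this (a simplified sketch; the actual “is anything in retry for this queue” check is omitted and the attribute name is just a placeholder):

using Hangfire.Common;
using Hangfire.Server;

public class RequireNoRetriesAttribute : JobFilterAttribute, IServerFilter
{
    public void OnPerforming(PerformingContext context)
    {
        // Intended: if another job for this queue is sitting in retry, skip this one.
        // (That check is left out here.) Setting Canceled = true does stop the method
        // from running, but the job is not re-queued -- it just ends up looking successful.
        context.Canceled = true;
    }

    public void OnPerformed(PerformedContext context)
    {
        // Nothing to clean up in this sketch.
    }
}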

Could you use continuations for this? Assuming you can enqueue your work in such a way, it should solve your problem.

BackgroundJob.ContinueWith(jobId, () => Console.WriteLine("Continuation!"));

Andrew,

Thank you for the reply.

I suppose that would be feasible in this case, though not very practical. I would need a method that finds the existing job and attaches the new one to the bottom of a ContinueWith “chain”.
enqueue Job1 () => account.Update(500) returns job id "13"
BackgroundJob.ContinueWith("13", () => account.Update(-100)) returns job id "26"
BackgroundJob.ContinueWith("26", () => account.Update(-100))

I’m also not sure whether there is support for breaking the chain and saying “ignore the prior job and start here” in the case of a job that can’t process.
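
(From a quick look at the API, ContinueWith appears to have an overload that takes a JobContinuationOptions value, e.g. OnAnyFinishedState, so something like the line below might run the continuation even if the parent never succeeds — but I haven’t tested whether it actually covers that case.)

BackgroundJob.ContinueWith("26", () => account.Update(-100), JobContinuationOptions.OnAnyFinishedState);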

What I found after posting this question is that I was trying to get Hangfire to work like a message queue that preserves order, which is not something it’s really designed to do.

I did find some ways to bend things to make it work, but I ultimately moved away from Hangfire queues and used a message queue to hold all of the messages in order. I then used a JobFilterAttribute to ensure that no single queue was being processed by more than one job at a time, which guarantees message order.
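
A trimmed-down sketch of that kind of filter (not my exact code — it’s essentially the same pattern as Hangfire’s built-in DisableConcurrentExecution attribute, just keyed on the queue instead of the job method; the "queue-lock" resource name and the hard-coded queue are placeholders):

using System;
using Hangfire.Common;
using Hangfire.Server;

public class SerializeQueueExecutionAttribute : JobFilterAttribute, IServerFilter
{
    private static readonly TimeSpan LockTimeout = TimeSpan.FromMinutes(1);

    public void OnPerforming(PerformingContext context)
    {
        // Take a storage-level distributed lock keyed by the queue so only one job
        // for that queue runs at a time; other workers block here until it is released.
        var queueName = "default"; // simplified; in practice derived from the message/job
        var distributedLock = context.Connection.AcquireDistributedLock(
            "queue-lock:" + queueName, LockTimeout);
        context.Items["queue-lock"] = distributedLock;
    }

    public void OnPerformed(PerformedContext context)
    {
        // Release the lock whether the job succeeded or failed.
        object handle;
        if (context.Items.TryGetValue("queue-lock", out handle))
        {
            ((IDisposable)handle).Dispose();
        }
    }
}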