Enqueuing jobs sharing the same interface, is it possible?


So in my case, I have multiple background jobs sharing the same interface, registered using Autofac. I noticed that when I try to enqueue a job using that interface as the type parameter in the generic overload of the method, it never starts. I also noticed that the job type in the database JSON is the interface rather than the concrete job type, and the AutofacJobActivator throws an exception stating that it cannot resolve the component. According to the exception, the component is not registered.

In my app I am resolving the jobs using named instances, but I was unable to find such an option in Hangfire. Is there a way to use named instances with Hangfire, or some other workaround for this case?

Thanks! :slight_smile:


Can you post what your Autofac registration looks like, please? I’ve seen something like this when the correct type is not registered with As<T>(). You might also have to tell the Autofac activator to use the .WithAttributeFiltering() option so it knows to look for named types. If you have multiple jobs which all implement IJob, for example, and you Enqueue<JobOne>(x => x.Run()), then you need to register As<JobOne>(). If you Enqueue<IJob>(x => x.Run()), then you need to tell Hangfire which IJob to spin up, and I am not sure that is possible.
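For illustration, this is roughly what I mean (the type names here are made up, and this assumes the Hangfire.Autofac package):

```csharp
// Jobs sharing one interface (hypothetical names).
public interface IJob { void Run(); }
public class JobOne : IJob { public void Run() { /* work */ } }

var builder = new ContainerBuilder();

// Register the concrete type so the activator can resolve it
// when the job is enqueued as Enqueue<JobOne>(...).
builder.RegisterType<JobOne>().As<JobOne>();

GlobalConfiguration.Configuration.UseAutofacActivator(builder.Build());

// Works: the stored job type is the concrete JobOne.
BackgroundJob.Enqueue<JobOne>(x => x.Run());

// Ambiguous: the stored job type would be IJob, and the activator
// cannot tell which implementation you meant.
// BackgroundJob.Enqueue<IJob>(x => x.Run());
```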


Hi teabaggs,

Thank you for the response!

My jobs are registered like this:


I tried to register them both ways, with As<> and without. No luck.

I think I’m facing the last case you mentioned. In my code I have something like this:

 IJob myJob = LifetimeScope.ResolveNamed<IJob>("firstJob");
 BackgroundJob.Enqueue<IJob>(job => myJob.Execute(userId));

Am I missing something, or is this truly not possible with Hangfire?

Hi Iliyan,

You won’t be able to do what you want out of the box; a few things prevent this from working the way you’d like. First, Hangfire doesn’t care that myJob was previously resolved to a concrete type, because Enqueue and Enqueue<T> don’t really do any work: they just save some details about the job, its parameters, and its schedule into whatever job storage you are using. Later, Hangfire sees it has work to do and tries to activate and execute the job. The problem you are facing is that BackgroundJob.Enqueue<IJob>(job => myJob.Execute(userId)); does not give Hangfire or the Autofac activator enough information to decide which type of IJob to create. Notice that the “firstJob” name is not included anywhere in the signature of the Enqueue method. Also note that even if it were, the Autofac activator would not care, because it doesn’t resolve named types.

So now you have a few options. Since you already know the concrete type (you resolve the named instance on the line before), you can Enqueue<MyFirstJob>(job => job.Execute(userId)); rather than using the IJob interface, and that will work. Alternatively, you could add marker interfaces like IFirstJob and Enqueue<IFirstJob>(...); then Autofac will know which concrete type to activate.
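The marker-interface option could look roughly like this (the type names are illustrative, not from your code):

```csharp
// One marker interface per job, each extending the shared IJob.
public interface IFirstJob : IJob { }

public class MyFirstJob : IFirstJob
{
    public void Execute(int userId) { /* work */ }
}

// Autofac maps each marker to exactly one concrete type:
builder.RegisterType<MyFirstJob>().As<IFirstJob>();

// Now the type Hangfire stores is unambiguous, and no
// named resolution is needed:
BackgroundJob.Enqueue<IFirstJob>(job => job.Execute(userId));
```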

Or you could poke around the Hangfire and Autofac activator source and make named type resolution work. This is, of course, the more involved solution, and you would have to do a few things. First, save the name along with the type and the other job context information when a job is enqueued and written to job storage. Then, when a job is picked up to be processed and executed, pass the name as well as the type to the job activator, so the activator can resolve a named instance of the type. This path looks like it would require modifying core Hangfire code, so I would not recommend it, but you could post an issue on GitHub and see what other people think. If more metadata were included in the job context (in this case, the name property), and the job activator were given the job context itself rather than just a type, then the activator could figure it out and resolve a named type.


I think it can actually be solved without altering the Hangfire core codebase.

  1. Add extension methods like EnqueueNamed<T>(string name, ...) for IBackgroundJobClient etc.
  2. Save name as one of the job parameters.
  3. Provide a custom job activator (based on the current Autofac activator code), which would check whether a service name is present among the job parameters, and resolve a named or nameless instance accordingly.
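A rough sketch of step 3, assuming Hangfire’s JobActivator/JobActivatorScope extension points and Autofac’s ResolveNamed (the "ServiceName" parameter key and all type names are my own invention; verify the exact signatures against the versions you use):

```csharp
// Assumes the EnqueueNamed<T> extension from step 2 has saved the
// service name under the "ServiceName" job parameter (a made-up key).
public class NamedAutofacJobActivator : JobActivator
{
    private readonly ILifetimeScope _rootScope;

    public NamedAutofacJobActivator(ILifetimeScope rootScope)
    {
        _rootScope = rootScope;
    }

    public override JobActivatorScope BeginScope(JobActivatorContext context)
    {
        var serviceName = context.GetJobParameter<string>("ServiceName");
        return new NamedScope(_rootScope.BeginLifetimeScope(), serviceName);
    }

    private sealed class NamedScope : JobActivatorScope
    {
        private readonly ILifetimeScope _lifetimeScope;
        private readonly string _serviceName;

        public NamedScope(ILifetimeScope lifetimeScope, string serviceName)
        {
            _lifetimeScope = lifetimeScope;
            _serviceName = serviceName;
        }

        public override object Resolve(Type type)
        {
            // Fall back to ordinary resolution when no name was saved.
            return _serviceName != null
                ? _lifetimeScope.ResolveNamed(_serviceName, type)
                : _lifetimeScope.Resolve(type);
        }

        public override void DisposeScope()
        {
            _lifetimeScope.Dispose();
        }
    }
}
```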

But I’d rather suggest not using named services at all, because they feel like a rather dubious feature of Autofac. It should be the DI container’s decision which implementation to use, while named services (somewhat) make the client decide.

I did the following to create a factory for jobs.


public abstract class BaseJob
{
    public string SomeSharedProperty { get; }
}

This contains any public members you may want to share between your IJob implementations.

IJob Interface

public interface IJob<in TJobData> where TJobData : IJobData
{
    Task ExecuteAsync(TJobData jobData, PerformContext context);
}

IJobData interface (just for constraints on the job data classes)

public interface IJobData
{
}

JobCreator factory method

public class JobCreator<TJob, TJobData>
    where TJob : IJob<TJobData>, new()
    where TJobData : IJobData
{
    public async Task ExecuteAsync(TJobData jobData, PerformContext context)
    {
        var job = new TJob();
        await job.ExecuteAsync(jobData, context);
    }
}
JobClient.Enqueue<JobCreator<ForgotPasswordJob, ForgotPasswordJobData>>(job => job.ExecuteAsync(jobData,null));

ForgotPasswordJob inherits BaseJob and implements IJob<ForgotPasswordJobData>.

Sample Job class

public class ForgotPasswordJob : BaseJob, IJob<ForgotPasswordJobData>
{
    public async Task ExecuteAsync(ForgotPasswordJobData jobData, PerformContext context)
    {
        // ...
    }
}