How to attach job filters to jobs created with BatchJob.StartNew() [Hangfire Pro]


#1

I’d like to attach an OnStateElection callback (i.e. an IElectStateFilter) to a job created using BatchJob.StartNew(). Is this possible? If not, will global filters be applied to batch jobs?


#2

I’ve been trying to do something similar. I’d really like a way to inject a filter into a specific instance of BatchJobClient so I can set up job data for any job that instance of the client creates. I couldn’t find a good way to do this, mostly because GlobalBatchFilters is internal (and for good reason). Maybe I’m going about this incorrectly.

The problem stems from the web server creating a job that is handled on distributed servers, which in turn create more jobs but also need to carry a “scope/tenant” that my custom JobActivator uses to drive the correct database connections via DI. I’ll likely write up a similar ticket with an example, but the concept is identical to your question.


#3

Yes, your scenario sounds very similar to mine. My current workaround is to embed a metadata object into each job via JobStorage.Current.GetConnection().SetJobParameter(). However, this requires the filter to process this object on every job and ultimately do nothing 99% of the time.
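For anyone curious, the workaround looks roughly like this. A sketch only — “TenantMetadata”, IMyService, and the JSON payload are placeholder names from my code, not Hangfire’s:

```csharp
using Hangfire;
using Hangfire.Storage;

// Create the job first, then stamp metadata onto it as a job parameter.
var jobId = BackgroundJob.Enqueue<IMyService>(s => s.DoWork());

using (IStorageConnection connection = JobStorage.Current.GetConnection())
{
    // SetJobParameter stores a raw string; serialize however you like (JSON here).
    connection.SetJobParameter(jobId, "TenantMetadata", "{\"TenantId\":\"acme\"}");
}

// The downside: every worker-side filter has to check for this parameter,
// even though most jobs won't have it -- that's the 99%-do-nothing overhead.
```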


#4

My current solution is to decorate the enqueued methods that need a scope id with a hand-rolled C# attribute, MyScopeIdParameterAttribute, and to pass the scope id as a method argument to any of the jobs created from my “entry job”. The entry job gets the scope parameter with no problem, since it’s easily accessible from the web server that enqueues the entry job, and it works normally as an IClientFilter.

When my JobActivator prepares the scoped IoC container, it checks whether the job being executed has a MyScopeIdParameterAttribute and, if so, parses the job args. The bad part is that the method signatures gain a parameter the methods themselves never use; it exists only so the Hangfire infrastructure can set up the correct dependencies.
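To make that concrete, here’s a rough sketch of the attribute approach. The names (MyScopeIdParameterAttribute, EntryJobs, TenantJobActivator) and the argument-index convention are my own, not Hangfire’s:

```csharp
using System;
using System.Reflection;
using Hangfire;

// Marks which argument of a job method carries the scope/tenant id.
[AttributeUsage(AttributeTargets.Method)]
public class MyScopeIdParameterAttribute : Attribute
{
    public int ArgumentIndex { get; }
    public MyScopeIdParameterAttribute(int argumentIndex) => ArgumentIndex = argumentIndex;
}

public class EntryJobs
{
    // scopeId is never used by the method body; it exists only so the
    // activator can read it back out of the job's serialized arguments.
    [MyScopeIdParameter(0)]
    public void RunEntryJob(string scopeId) { /* real work... */ }
}

public class TenantJobActivator : JobActivator
{
    public override JobActivatorScope BeginScope(JobActivatorContext context)
    {
        // Read the attribute off the job method and pull the scope id from the args.
        var method = context.BackgroundJob.Job.Method;
        var attribute = method.GetCustomAttribute<MyScopeIdParameterAttribute>();

        string scopeId = attribute != null
            ? (string)context.BackgroundJob.Job.Args[attribute.ArgumentIndex]
            : null;

        // ...use scopeId to build the tenant-scoped container here...
        return base.BeginScope(context); // placeholder for the real scoped container
    }
}
```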

I tried the job-storage approach you described, but it didn’t seem any better for my codebase.

I still think the ideal solution is to tuck an IClientFilter into a single instance of the batch job client via the job activator, so that ONLY that instance of the client gets it.

Maybe someone who knows more will chime in; I’m still pretty new to Hangfire.


#5

We have a single job that starts our workflow; we pass a batch client into the start job and attach a filter as needed. Maybe this will help you, Marty. Let me know if you have questions. We use StructureMap, but the concept here should be reusable for many things.

We ended up using BeginScope to prepare a custom client for use in our prepare job via the JobActivator.

// The important code from JobActivator
public override JobActivatorScope BeginScope(JobActivatorContext context)
{
	var jobContextInfo = context.GetJobParameter<JobContextInfo>("JobContextInfo");
	var nestedContainer = container.GetNestedContainerForTenant(jobContextInfo.TenantId);
	nestedContainer.Configure(x => x.For<JobContextInfo>().Use(jobContextInfo));

	// Copy all global providers and then add our scoped filters
	var filterProvider = new JobFilterProviderCollection();
	foreach (var jobFilterProvider in JobFilterProviders.Providers)
	{
		filterProvider.Add(jobFilterProvider);
	}

	var customFilterCollection = new JobFilterCollection
	{
		nestedContainer.GetInstance<JobContextFilterAttribute>(),
	};

	filterProvider.Add(customFilterCollection);

	// Create batch job client with our filters
	var client = new BatchJobClient(JobStorage.Current, new BatchFactory(), new BackgroundJobFactory(filterProvider));
	nestedContainer.Configure(x => x.For<IBatchJobClient>().Use(client));

	return new StructureMapDependencyScope(nestedContainer);
}

// The custom filter to supply the data
public class JobContextFilterAttribute : IClientFilter
{
	private readonly JobContextInfo contextInfo;

	public JobContextFilterAttribute(JobContextInfo contextInfo)
	{
		this.contextInfo = contextInfo;
	}

	public void OnCreating(CreatingContext filterContext)
	{
		filterContext.SetJobParameter("JobContextInfo", this.contextInfo);
	}

	// IClientFilter also requires OnCreated; there's nothing to do after creation here.
	public void OnCreated(CreatedContext filterContext)
	{
	}
}
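With that wiring in place, the consuming side is simple: any job resolved from the nested container receives the tenant-specific client, and every job it creates passes through the filter. A sketch of how we consume it (PrepareJob and IWorkerJobs are placeholder names, and I’m assuming the usual StartNew call on the batch client):

```csharp
public class PrepareJob
{
	private readonly IBatchJobClient batchClient;

	// Injected from the nested container, so this client already
	// carries JobContextFilterAttribute for the current tenant.
	public PrepareJob(IBatchJobClient batchClient)
	{
		this.batchClient = batchClient;
	}

	public void Run()
	{
		// Each child job goes through OnCreating, so "JobContextInfo"
		// is stamped onto it with no per-call effort.
		batchClient.StartNew(batch =>
		{
			batch.Enqueue<IWorkerJobs>(j => j.DoWork());
		});
	}
}
```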