You’re correct that child containers might help solve this.
For SimpleInjector, you will need to use the ExecutionContextScopeLifestyle (from the SimpleInjector.Extensions.ExecutionContextScoping package) and register your DbContext with a hybrid lifestyle, using the Lifestyle.CreateHybrid method:
Container.Register<IFoo, Foo>(Lifestyle.CreateHybrid(
    () => System.Web.HttpContext.Current != null,
    new WebApiRequestLifestyle(),
    new ExecutionContextScopeLifestyle()));
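The same hybrid registration applies to the DbContext itself; here is a minimal sketch, assuming an Entity Framework context named MyDbContext (substitute your own type):

Container.Register<MyDbContext>(Lifestyle.CreateHybrid(
    () => System.Web.HttpContext.Current != null,
    new WebApiRequestLifestyle(),
    new ExecutionContextScopeLifestyle()));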
The tricky part is that you now have to actually create and start an execution context scope, and then end it, when executing a job. I found the easiest way to do this is to leverage the Hangfire job activator and call the following from within ActivateJob():
_container.BeginExecutionContextScope()
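As a rough sketch (not necessarily what the stock SimpleInjectorJobActivator from the integration package does), a container-backed activator could look like this:

using System;
using Hangfire;
using SimpleInjector;
using SimpleInjector.Extensions.ExecutionContextScoping;

public class SimpleInjectorJobActivator : JobActivator
{
    private readonly Container _container;

    public SimpleInjectorJobActivator(Container container)
    {
        _container = container;
    }

    public override object ActivateJob(Type jobType)
    {
        // Start an execution context scope for this job if one isn't already
        // active, so the hybrid-scoped registrations resolve on this thread.
        if (_container.GetCurrentExecutionContextScope() == null)
            _container.BeginExecutionContextScope();

        return _container.GetInstance(jobType);
    }
}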
The final piece is to create a Hangfire job filter and handle the OnPerformed hook, in order to dispose of your execution context scope once the job has completed (been performed):
using Hangfire.Common;
using Hangfire.Server;
using SimpleInjector;
using SimpleInjector.Extensions.ExecutionContextScoping;

public class SimpleInjectorExecutionContextScopeEndFilter : JobFilterAttribute, IServerFilter
{
    private readonly Container _container;

    public SimpleInjectorExecutionContextScopeEndFilter(Container container)
    {
        _container = container;
    }

    public void OnPerforming(PerformingContext filterContext)
    {
        // Nothing to do here; the scope is started by the job activator.
    }

    public void OnPerformed(PerformedContext filterContext)
    {
        // Dispose the execution context scope opened for this job, if any.
        var scope = _container.GetCurrentExecutionContextScope();
        if (scope != null)
            scope.Dispose();
    }
}
You can then register this in your Hangfire bootstrapper alongside your activator:
GlobalConfiguration.Configuration.UseActivator(new SimpleInjectorJobActivator(container));
GlobalJobFilters.Filters.Add(new SimpleInjectorExecutionContextScopeEndFilter(container));
When enqueueing jobs from your web controllers, make sure you enqueue against the registered service type so the container constructs it from the Hangfire execution scope (i.e. don't inject the dependency into your controller and use that instance in your queueing code):
[HttpGet]
[Route("api/foo/job")]
public IHttpActionResult QueueFoo(string fooData)
{
    var queueId = BackgroundJob.Enqueue<IFoo>(foo =>
        foo.DoStuff(fooData, User.Identity.Name));

    return Ok(queueId);
}
I will caveat with this: if you are doing a lot of concurrent work, parallel threads, etc., this is not a good approach when using DB connections. You will eventually hit odd errors saying there is already an open data reader, or that Multiple Active Result Sets (MARS) is not enabled, and so on. This is because the DB connection is scoped to the job, not the thread, so if you open a connection and then run other queries in parallel while that query/command is still executing, ADO.NET will not handle it well.
You will need to take a step back, look at your architecture, and work out what you're trying to do in parallel and how granular your HF job needs to be. If you do need to create parallel worker threads within a job, you may want to switch from injected DB contexts to creating atomic DB contexts within the repository methods, with a using statement to dispose of them appropriately. Welcome to the world of complex web and non-web scopes with databases!
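A minimal sketch of that last pattern, assuming a hypothetical FooRepository and an Entity Framework context named MyDbContext with a Foos set (your types will differ):

public class FooRepository
{
    public Foo GetFoo(int id)
    {
        // Each call creates and disposes its own context, so parallel callers
        // never share a single DbContext or its underlying connection.
        using (var context = new MyDbContext())
        {
            return context.Foos.Find(id);
        }
    }
}

Because every repository method owns its own short-lived context, parallel job code no longer fights over a single job-scoped connection.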