
A proper thread-safe memory cache

The .NET Core 2.2 IMemoryCache is thread safe in theory, but if you call GetOrCreateAsync from multiple threads at the same time, the factory Func can be invoked more than once, which could be a bad thing. A very simple fix is to use a semaphore.

Declare the semaphore and let only one concurrent request be granted.

private readonly SemaphoreSlim _cacheLock = new SemaphoreSlim(1);

Let one request into the cache at a time and release the semaphore when done, preferably in a finally block so an exception in the factory does not leave the lock taken.

await _cacheLock.WaitAsync();
try
{
    var data = await _cache.GetOrCreateAsync(key, entry => ...);
}
finally { _cacheLock.Release(); } // release even if the factory throws
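
Put together, a sketch of the whole pattern could look like this; the CachedDataService and LoadFromSourceAsync names and the five-minute expiration are made up for illustration:

using System;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Caching.Memory;

public class CachedDataService
{
    private readonly IMemoryCache _cache;
    // Only one concurrent request is allowed into the factory
    private readonly SemaphoreSlim _cacheLock = new SemaphoreSlim(1);

    public CachedDataService(IMemoryCache cache)
    {
        _cache = cache;
    }

    public async Task<string> GetDataAsync(string key)
    {
        await _cacheLock.WaitAsync();
        try
        {
            // Only the first caller runs the factory; later callers get the cached value
            return await _cache.GetOrCreateAsync(key, entry =>
            {
                entry.AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(5);
                return LoadFromSourceAsync(key);
            });
        }
        finally
        {
            _cacheLock.Release();
        }
    }

    // Placeholder for the expensive call whose result you want to cache
    private Task<string> LoadFromSourceAsync(string key) =>
        Task.FromResult("data for " + key);
}

The semaphore is released in a finally block so an exception thrown by the factory cannot leave the cache permanently locked.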

Parallel-executed Tasks with isolated scopes

My current customer's infrastructure is heavily dependent on external data suppliers. Because of the nature of the data, the system often has to do the requests in real time while the end customer is waiting for the response. Parallel tasks come in handy when you want to aggregate data from several endpoints, both because they put less strain on the thread pool and because your response time is faster, since you do not have to wait for each call to complete before starting the next (parallel vs sequential).
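
As a minimal sketch of that parallel aggregation, assuming two made-up supplier clients (IPriceSupplier and IStockSupplier), both calls can be started up front and awaited together with Task.WhenAll:

using System.Threading.Tasks;

public class AggregationService
{
    private readonly IPriceSupplier _prices;
    private readonly IStockSupplier _stock;

    public AggregationService(IPriceSupplier prices, IStockSupplier stock)
    {
        _prices = prices;
        _stock = stock;
    }

    public async Task<(decimal Price, int InStock)> GetProductDataAsync(int productId)
    {
        // Start both supplier calls without awaiting them one at a time
        var priceTask = _prices.GetPriceAsync(productId);
        var stockTask = _stock.GetStockAsync(productId);

        // Total wait time is roughly the slowest call, not the sum of both
        await Task.WhenAll(priceTask, stockTask);

        // Both tasks have already completed, so these awaits return immediately
        return (await priceTask, await stockTask);
    }
}

// Made-up stand-ins for the external supplier endpoints
public interface IPriceSupplier { Task<decimal> GetPriceAsync(int productId); }
public interface IStockSupplier { Task<int> GetStockAsync(int productId); }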

The problem starts with frameworks that do not play nice with sharing their resources across multiple Tasks/Threads; an example of this is the Entity Framework DbContext. One way is to marshal the lifetime of the context yourself and spawn one for each parallel task, but this is not a solid design: if you use an IoC container you want every object in the current graph to receive the same instance of the DbContext without bothering with lifetime code. I created a little class called TaskRunner for this purpose (more…)
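
The TaskRunner class itself is not shown here, but as a rough sketch of the underlying idea, assuming Microsoft.Extensions.DependencyInjection and a made-up SupplierImportJob registered as scoped alongside the DbContext, each parallel task can be given its own IServiceScope so that everything resolved inside that task shares one DbContext while the tasks stay isolated from each other:

using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using Microsoft.Extensions.DependencyInjection;

public class ParallelScopeExample
{
    private readonly IServiceScopeFactory _scopeFactory;

    public ParallelScopeExample(IServiceScopeFactory scopeFactory)
    {
        _scopeFactory = scopeFactory;
    }

    public Task RunAllAsync(IEnumerable<int> supplierIds)
    {
        // One task per supplier, each running in its own isolated DI scope
        var tasks = supplierIds.Select(RunInScopeAsync);
        return Task.WhenAll(tasks);
    }

    private async Task RunInScopeAsync(int supplierId)
    {
        using (var scope = _scopeFactory.CreateScope())
        {
            // Everything resolved from this scope shares the same scoped
            // DbContext instance, without any manual lifetime code
            var job = scope.ServiceProvider.GetRequiredService<SupplierImportJob>();
            await job.ExecuteAsync(supplierId);
        }
    }
}

// Made-up scoped service with a DbContext somewhere in its dependency graph
public class SupplierImportJob
{
    public Task ExecuteAsync(int supplierId) => Task.CompletedTask;
}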