DeeDee is a high-performance, low-memory library for dispatching and handling in-process requests. It allows you to create a pipeline of actions that are performed on these requests, with support for short-circuiting.
Because this library uses Source Generators, it targets .NET 6.

Install the package from NuGet:

```
Install-Package DeeDee
```
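If you prefer the .NET CLI, the equivalent command is:

```
dotnet add package DeeDee
```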
If your application already uses `Microsoft.Extensions.DependencyInjection`, simply call `services.AddDispatcher()`.

If you use another DI framework, you will need to wire up a `ServiceProvider` delegate to represent your container, for example:

```csharp
services.AddSingleton<DeeDee.Models.ServiceProvider>(ctx => ctx.GetService);
```

as well as register the `IDispatcher` in your container:

```csharp
services.AddSingleton<IDispatcher, Dispatcher>();
```
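Once registered, `IDispatcher` can be injected like any other dependency. A minimal usage sketch (the class name is illustrative; `GetARequest` is defined in the examples below):

```csharp
public class OrderService
{
    private readonly IDispatcher _dispatcher;

    // IDispatcher is resolved from the container like any other service
    public OrderService(IDispatcher dispatcher) => _dispatcher = dispatcher;

    // Dispatches a request through the pipeline (GetARequest is defined below)
    public Task HandleAsync() => _dispatcher.SendAsync(new GetARequest());
}
```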
There are two types of requests that are supported:

- `IRequest`, which returns no data. This maps to `void` for sync calls or `Task` for async calls.
- `IRequest<TResponse>`, which returns a `TResponse`. This maps to `TResponse` for sync calls and `Task<TResponse>` for async calls.
Examples
Send a request that will return nothing
```csharp
public class GetARequest : IRequest
{
}

public class HandlerA : IPipelineActionAsync<GetARequest>
{
    public Task InvokeAsync(GetARequest request, PipelineContext context, NextAsync next, CancellationToken cancellationToken = default)
    {
        // Handle the request; there is nothing to return
        return Task.CompletedTask;
    }
}
```

```csharp
await _dispatcher.SendAsync(new GetARequest());
```
Send a request that will return an int
```csharp
public class GetBRequest : IRequest<int>
{
}

public class HandlerB : IPipelineActionAsync<GetBRequest, int>
{
    public Task<int> InvokeAsync(GetBRequest request, PipelineContext<int> context, NextAsync<int> next, CancellationToken cancellationToken = default)
    {
        // Return the response for this request
        return Task.FromResult(42);
    }
}
```

```csharp
var result = await dispatcher.SendAsync(new GetBRequest());
```
Thanks to the source generation, the `SendAsync` method now has two overloads, one for each request type that exists, with full IDE visibility.
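For the two requests above, the generated overloads look roughly like this (a sketch only; the exact shape, and any extra parameters such as a `CancellationToken`, may differ):

```csharp
// Rough sketch of the generated overloads, not the actual generated code
Task SendAsync(GetARequest request);       // IRequest      -> Task
Task<int> SendAsync(GetBRequest request);  // IRequest<int> -> Task<int>
```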
You can also build up a pipeline of these handlers:
```csharp
public class GetBRequest : IRequest<int>
{
    public bool Toggle { get; }

    public GetBRequest(bool toggle)
    {
        Toggle = toggle;
    }
}

[Step(1)]
public class ValidateHandlerB : IPipelineActionAsync<GetBRequest, int>
{
    public Task<int> InvokeAsync(GetBRequest request, PipelineContext<int> context, NextAsync<int> next, CancellationToken cancellationToken = default)
    {
        // If not toggled, short-circuit the rest of the pipeline
        if (!request.Toggle)
            return Task.FromResult(42);

        // You can store data that will be available to all handlers in the pipeline
        context.AddItem("SomeKey", "SomeValue");

        // Call the next action in the pipeline
        return next(request, context, cancellationToken);
    }
}

[Step(2)]
public class HandlerB : IPipelineActionAsync<GetBRequest, int>
{
    public Task<int> InvokeAsync(GetBRequest request, PipelineContext<int> context, NextAsync<int> next, CancellationToken cancellationToken = default)
    {
        // Retrieve data from the context bag
        if (!context.TryGetValue("SomeKey", out var value))
            throw new Exception();

        // Either set the Result if there are more actions in your pipeline...
        context.Result = 42;
        return next(request, context, cancellationToken);

        // ...OR directly return the result
        // return Task.FromResult(42);
    }
}
```

```csharp
var result = await dispatcher.SendAsync(new GetBRequest(toggle: true));
```
The `Step()` attribute determines the order of your handlers in the pipeline.
The `PipelineContext` uses a custom `FrugalDictionary` that will only allocate when it reaches 11 or more items.
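A minimal sketch of the idea behind such a structure (purely illustrative, not DeeDee's implementation, and using three inline slots instead of ten for brevity): keep the first few entries in plain fields and only allocate a backing `Dictionary` once those slots run out.

```csharp
// Illustrative only: not DeeDee's FrugalDictionary. Duplicate keys are ignored for brevity.
public struct FrugalMapSketch<TKey, TValue> where TKey : notnull
{
    private TKey? _k0, _k1, _k2;                  // inline slots: no extra allocation
    private TValue? _v0, _v1, _v2;
    private int _count;
    private Dictionary<TKey, TValue>? _overflow;  // allocated lazily

    public void Add(TKey key, TValue value)
    {
        switch (_count)
        {
            case 0: _k0 = key; _v0 = value; break;
            case 1: _k1 = key; _v1 = value; break;
            case 2: _k2 = key; _v2 = value; break;
            default:
                // The first (and only) allocation happens once the inline slots are full
                _overflow ??= new Dictionary<TKey, TValue>();
                _overflow[key] = value;
                break;
        }
        _count++;
    }

    public bool TryGetValue(TKey key, out TValue? value)
    {
        var cmp = EqualityComparer<TKey>.Default;
        if (_count > 0 && cmp.Equals(_k0!, key)) { value = _v0; return true; }
        if (_count > 1 && cmp.Equals(_k1!, key)) { value = _v1; return true; }
        if (_count > 2 && cmp.Equals(_k2!, key)) { value = _v2; return true; }
        if (_overflow is not null) return _overflow.TryGetValue(key, out value);
        value = default;
        return false;
    }
}
```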
The sync version of a handler implements `IPipelineAction` instead of `IPipelineActionAsync`.
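A rough sketch of what such a handler could look like, assuming the sync signature mirrors the async one (the member names and the `Next` delegate below are assumptions; check the library for the exact shape):

```csharp
// Assumed shape only: the exact members of IPipelineAction may differ
public class SyncHandlerB : IPipelineAction<GetBRequest, int>
{
    public int Invoke(GetBRequest request, PipelineContext<int> context, Next<int> next)
    {
        // Return the result directly, or call next(...) to continue the pipeline
        return 42;
    }
}
```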
Because the dispatch code is generated at compile time, all types are known ahead of time, which avoids reflection and repeated `ConcurrentDictionary` lookups.
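To illustrate the difference, here is a library-agnostic sketch contrasting runtime-lookup dispatch with source-generated-style dispatch (none of these types come from DeeDee):

```csharp
// Library-agnostic sketch; none of these types come from DeeDee.
using System.Collections.Concurrent;

public interface IMyRequest<TResponse> { }
public sealed record Ping : IMyRequest<string>;

public sealed class PingHandler
{
    public Task<string> HandleAsync(Ping request) => Task.FromResult("pong");
}

// Reflection-style dispatch: the handler is found at runtime through a shared cache
// (a real mediator would resolve the handler type via reflection in the factory).
public sealed class RuntimeLookupDispatcher
{
    private readonly ConcurrentDictionary<Type, object> _handlers = new();

    public Task<TResponse> SendAsync<TResponse>(IMyRequest<TResponse> request)
    {
        // Runtime type inspection plus a dictionary lookup on every call
        var handler = _handlers.GetOrAdd(request.GetType(), _ => new PingHandler());
        return (Task<TResponse>)(object)((PingHandler)handler).HandleAsync((Ping)request);
    }
}

// Source-generated-style dispatch: one strongly typed overload per request,
// so the handler type is known at compile time and no lookup is needed.
public sealed class GeneratedStyleDispatcher
{
    private readonly PingHandler _pingHandler = new();

    public Task<string> SendAsync(Ping request) => _pingHandler.HandleAsync(request);
}
```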
Below are benchmarks run against the highly popular MediatR library:
| Method | Mean | Error | StdDev | Ratio | RatioSD | Gen 0 | Gen 1 | Gen 2 | Allocated |
|-------- |----------:|----------:|----------:|------:|--------:|-------:|------:|------:|----------:|
| DeeDee | 60.89 ns | 2.324 ns | 6.742 ns | 1.00 | 0.00 | 0.0057 | - | - | 24 B |
| Mediatr | 337.21 ns | 10.659 ns | 30.583 ns | 5.59 | 0.80 | 0.1106 | - | - | 464 B |
DeeDee is almost 6x faster and allocates almost 20x less.