Auto Batched Requests
One of the best ways to improve performance and efficiency and to reduce latency is to minimize the number of network requests required, which is one of the reasons we've always encouraged coarse-grained API designs, which also lend themselves to better encapsulation and re-use.
A common use-case that can be improved is clients making multiple requests to the same API but, in the absence of a batched API alternative or control over the server implementation, defaulting to making N+1 web service requests.
Thanks to its message-based design, ServiceStack is able to enable high-level generic functionality like Request Batching, which is now implicitly available for all Services without any additional effort: multiple requests of the same type can be sent together in a single HTTP Request.
This is enabled in all .NET Service Clients via the new SendAll() and SendAllOneWay() APIs, e.g:
var client = new JsonServiceClient(BaseUrl);

var requests = new[]
{
    new Request { Id = 1, Name = "Foo" },
    new Request { Id = 2, Name = "Bar" },
    new Request { Id = 3, Name = "Baz" },
};

List<Response> responses = client.SendAll(requests);
The API works as you would expect: multiple requests can be sent together and the Service Client will return a list of all responses in the same order the requests were sent.
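For fire-and-forget scenarios the same batch can instead be sent with SendAllOneWay(), which discards any responses; a minimal sketch reusing the client and requests from above:

```csharp
// Fire-and-forget: POSTs the same batch of Request DTOs,
// but the client does not wait on or return any responses
client.SendAllOneWay(requests);
```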
And on the back-end, your Services are none the wiser, remaining focused on handling a single Request DTO. In the case below the Service does some work then stores the response in Redis before returning it:
public class MyServices : Service
{
    public object Any(Request request)
    {
        var response = DoWork(request);
        Redis.Store(response);
        return response;
    }
}
From the Service's point of view nothing changes. Request DTOs still get executed one at a time, through all existing filters, just as if they were sent on their own. They're just delivered together within a single HTTP Request, in this case POST'ed as JSON to the /json/reply/Request[] pre-defined route:
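On the wire this is just a JSON array of Request DTOs; an illustrative raw HTTP request (hypothetical host, values taken from the client example above) would look something like:

```
POST /json/reply/Request[] HTTP/1.1
Host: example.org
Content-Type: application/json

[{"Id":1,"Name":"Foo"},{"Id":2,"Name":"Bar"},{"Id":3,"Name":"Baz"}]
```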
If a client was previously calling the same API 100 times, the overhead of 100 HTTP Requests is reduced to just 1 HTTP Request when batched, although the above Service would still call Redis 100 times to store each Response.
If this API later becomes really hot and you want to improve it even further, you can add a custom implementation that accepts a Request[], which will only get called once, with access to all the Request DTOs together. In this case the custom implementation can take advantage of Redis's own batched APIs, reducing this further to 1 Redis operation:
public class MyServices : Service
{
    public object Any(Request request)
    {
        var response = DoWork(request);
        Redis.Store(response);
        return response;
    }

    public object Any(Request[] requests)
    {
        var responses = requests.Map(DoWork);
        Redis.StoreAll(responses);
        return responses;
    }
}
So with this custom implementation we've gone from 100 HTTP Requests + 100 Redis Operations to 1 HTTP Request + 1 Redis Operation.
Another scenario where you may consider using a Custom Batched Implementation is if you wanted to execute all requests within a single RDBMS transaction, which with OrmLite would look something like:
public class MyServices : Service
{
    public object Any(Request request)
    {
        var response = DoWork(request);
        Db.Insert(request);
        return response;
    }

    public object Any(Request[] requests)
    {
        using (var trans = Db.OpenTransaction())
        {
            var responses = requests.Map(x => Any(x));
            trans.Commit();
            return responses;
        }
    }
}
Just like with normal Batched Requests, Custom Batched implementations are still executed one at a time through all request/response filters, taking advantage of any existing logic/validation.
If you instead only want multiple Requests to be treated as a single Request through the entire pipeline, you can create a new Request DTO that inherits from List<TRequest>, which then gets treated as a normal Request DTO, e.g:
public class Requests : List<Request> {}

public class MyServices : Service
{
    ...

    public object Any(Requests requests)
    {
        var responses = requests.Map(DoWork);
        Redis.StoreAll(responses);
        return responses;
    }
}
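Since Requests is an ordinary Request DTO, clients call it with a normal Send() rather than SendAll(). A minimal sketch, assuming the shared DTO is additionally marked with IReturn<List<Response>> so the client can infer the response type (the marker interface is an assumption, not shown in the original):

```csharp
// Hypothetical: IReturn<> marker lets Send() infer List<Response>
public class Requests : List<Request>, IReturn<List<Response>> {}

// Sent as a single Request DTO through the entire pipeline
var responses = client.Send(new Requests
{
    new Request { Id = 1, Name = "Foo" },
    new Request { Id = 2, Name = "Bar" },
});
```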
More examples of Auto Batched Requests and its behavior can be found in the ReplyAllTests suite.