A benchmarking project that measures the cold-start and execution-time differences between Native AOT and JIT-compiled .NET 10 Lambda functions. It also serves as a reusable template for building production .NET Lambda functions on AWS.
- Overview
- Architecture
- Project Structure
- Prerequisites
- Deployment
- Running the Benchmark
- Analysing Results
- Sample Results
- Using as a Template
## Overview
The project deploys a .NET 10 Lambda function that exposes a `GET /test-data` HTTP API endpoint. The function queries a DynamoDB table and returns the results. It is designed to be deployed in two configurations to compare performance:
| Configuration | Description |
|---|---|
| AOT | Published with `PublishAot=true`, producing a self-contained native binary. Faster cold starts, lower init overhead. |
| JIT | Deployed with the standard `dotnet10` runtime. The CLR compiles code at runtime (just-in-time). |
The benchmark script forces a cold start on every invocation by updating an environment variable before each call, and records Init Duration and Execution Duration from the Lambda REPORT log line.
## Architecture

```
HTTP Request
      │
      ▼
API Gateway (HTTP API)
      │
      ▼
AWS Lambda (dotnet10 / Native AOT)
      │
      ▼
Amazon DynamoDB
```
The Lambda function uses:
- Amazon Lambda Annotations for attribute-based function and route declaration
- Source-generated JSON serialisation (`System.Text.Json`) for AOT compatibility — no runtime reflection
- Dependency injection via `[LambdaStartup]` and `Microsoft.Extensions.DependencyInjection`
- DynamoDB SDK via `AWSSDK.DynamoDBv2`
The DynamoDB table name is injected via the `DYNAMODB_TABLE_NAME` environment variable.
## Project Structure

```
dotnet-lambda-bench/
├── DotnetLambdaBench.sln                 # Solution file
├── bench.sh                              # Benchmark runner script
├── analyse.js                            # Results analysis script (Node.js)
├── aot_cold_starts.csv                   # Collected AOT benchmark data
├── jit_cold_starts.csv                   # Collected JIT benchmark data
│
├── DotnetLambdaBench.LambdaAot/          # Lambda function project
│   ├── Function.cs                       # Handler: GET /test-data
│   ├── Startup.cs                        # DI container configuration
│   ├── Assembly.cs                       # Assembly-level Lambda attributes
│   ├── serverless.template               # SAM template (managed by Annotations)
│   └── aws-lambda-tools-defaults.json    # Lambda CLI deployment defaults
│
├── DotnetLambdaBench.DataAccess/         # Data access layer
│   ├── Domain/Data.cs                    # Domain model
│   └── Queries/DataQuery.cs              # DynamoDB query logic
│
└── .github/
    ├── workflows/ci.yml                  # CI/CD pipeline (deploys on push to main)
    └── actions/deploy-dotnet-lambda/     # Reusable composite action for deployment
        └── action.yml
```
## Prerequisites
- .NET 10 SDK
- AWS CLI configured with appropriate credentials
- `jq` — required by `bench.sh` for JSON manipulation
- Node.js — required by `analyse.js`
- An AWS Lambda function deployed and accessible via the AWS CLI
- A DynamoDB table with items queryable by a partition key `Id`
## Deployment
Pushing to main automatically deploys the Lambda function. The workflow:

- Checks out the code
- Authenticates to AWS using OIDC (`id-token: write`)
- Publishes the project with Native AOT for `linux-x64`
- Zips the publish output and updates the Lambda function code
The workflow is defined in `.github/workflows/ci.yml`. Update the following values to target your environment:

```yaml
role-to-assume: arn:aws:iam::<ACCOUNT_ID>:role/<ROLE_NAME>
aws-region: <YOUR_REGION>
function-name: <YOUR_FUNCTION_NAME>
```

**AOT (Native AOT):**

```shell
cd DotnetLambdaBench.LambdaAot
dotnet publish -c Release -r linux-x64 /p:PublishAot=true
cd bin/Release/net10.0/linux-x64/publish
zip -r ../../../../../lambda.zip .
aws lambda update-function-code \
  --function-name <YOUR_FUNCTION_NAME> \
  --zip-file fileb://lambda.zip
```

**JIT (Standard runtime):**

```shell
cd DotnetLambdaBench.LambdaAot
dotnet publish -c Release -r linux-x64 /p:PublishAot=false
# zip and deploy as above
```

Note: AOT requires a Linux `x64` build target because Lambda runs on Linux. Build on Linux or use a Docker-based build if cross-compiling from macOS/Windows.
## Running the Benchmark
The `bench.sh` script forces a cold start on each iteration by updating the `COLD_START_BUSTER` environment variable before every invocation, causing Lambda to initialise a fresh execution environment.
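A single iteration can be sketched as below. The `aws` calls are shown as comments because they need credentials and a deployed function; the `parse_report` helper is an illustration of how the values can be extracted, not the actual `bench.sh` code:

```shell
#!/usr/bin/env bash
# Force a cold start by changing an environment variable, then invoke and
# decode the log tail (illustrative; requires AWS credentials):
#
#   aws lambda update-function-configuration --function-name "$FUNCTION_NAME" \
#     --environment "Variables={COLD_START_BUSTER=$(date +%s)}"
#   aws lambda invoke --function-name "$FUNCTION_NAME" --log-type Tail \
#     --query LogResult --output text out.json | base64 -d > log.txt

# Extract Init Duration and Duration (execution time) from a REPORT line.
# A warm start has no "Init Duration" field, so init defaults to 0.
parse_report() {
  local line="$1" init exec
  init=$(echo "$line" | sed -n 's/.*Init Duration: \([0-9.]*\) ms.*/\1/p')
  exec=$(echo "$line" | sed -n 's/.*RequestId: [^[:space:]]*[[:space:]]*Duration: \([0-9.]*\) ms.*/\1/p')
  echo "${init:-0},${exec}"
}

# Sample REPORT line in the format Lambda emits:
report='REPORT RequestId: 8f5a Duration: 111.86 ms Billed Duration: 112 ms Memory Size: 512 MB Max Memory Used: 70 MB Init Duration: 214.80 ms'
parse_report "$report"   # 214.80,111.86
```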
Edit the variables at the top of `bench.sh`:

```shell
FUNCTION_NAME="dotnet-lambda-bench"   # Your Lambda function name
ITERATIONS=100                        # Number of cold-start iterations
OUTPUT_FILE="cold_starts.csv"         # Output CSV path
```

Then make the script executable and run it once per configuration:

```shell
chmod +x bench.sh

# Run for AOT deployment
./bench.sh
mv cold_starts.csv aot_cold_starts.csv

# Redeploy as JIT, then run again
./bench.sh
mv cold_starts.csv jit_cold_starts.csv
```

The script records Init Duration and Execution Duration from the Lambda REPORT log line into a CSV file:
```
Iteration,InitDuration_ms,ExecutionDuration_ms
1,214.80,111.86
2,165.31,117.05
...
```
Iterations where no Init Duration is present (i.e. a warm start was recorded) are logged as a warning and recorded with 0 for init duration.
## Analysing Results
The `analyse.js` script reads a cold-starts CSV and outputs Average, P90 and P99 statistics for both Init Duration (cold-start overhead) and Execution Duration (handler runtime).
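For reference, the Average, P90 and P99 figures can be reproduced in a few lines of shell. The nearest-rank percentile method below is an assumption; `analyse.js` may compute percentiles differently:

```shell
# Compute average, P90 and P99 (nearest-rank) for one value per input line.
percentiles() {
  sort -n | awk '
    { v[NR] = $1; sum += $1 }
    END {
      # nearest-rank: take the value at index ceil(N * p)
      p90 = v[int(NR * 0.90 + 0.999999)]
      p99 = v[int(NR * 0.99 + 0.999999)]
      printf "avg=%.2f p90=%.2f p99=%.2f\n", sum / NR, p90, p99
    }'
}

printf '%s\n' 100 110 120 130 140 150 160 170 180 190 | percentiles
# avg=145.00 p90=180.00 p99=190.00
```

A real run would pipe a CSV column instead, e.g. `tail -n +2 aot_cold_starts.csv | cut -d, -f2 | percentiles`.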
Update the filename in `analyse.js` to point to the CSV you want to analyse, then run:

```shell
# Analyse JIT results
node analyse.js

# To analyse AOT results, update the filename in analyse.js to 'aot_cold_starts.csv'
node analyse.js
```

Example output:
```
========================================
📊 BENCHMARK RESULTS (100 Iterations)
========================================

🥶 COLD STARTS (Init Duration):
  Average: 175.83 ms
  P90: 218.57 ms
  P99: 228.68 ms

🔥 EXECUTION TIME (Handler Duration):
  Average: 114.42 ms
  P90: 122.81 ms
  P99: 125.76 ms
========================================
```
## Sample Results
Benchmarks were run over 100 iterations on a 512 MB Lambda function in the af-south-1 region using the dotnet10 runtime.
**Cold starts (Init Duration):**

| Metric | AOT | JIT | AOT Improvement |
|---|---|---|---|
| Average | ~176 ms | ~481 ms | ~2.7× faster |
| P90 | ~219 ms | ~729 ms | ~3.3× faster |
| P99 | ~229 ms | ~751 ms | ~3.3× faster |
**Execution time (Handler Duration):**

| Metric | AOT | JIT | AOT Improvement |
|---|---|---|---|
| Average | ~114 ms | ~527 ms | ~4.6× faster |
| P90 | ~123 ms | ~554 ms | ~4.5× faster |
| P99 | ~126 ms | ~562 ms | ~4.5× faster |
Execution time includes the DynamoDB query. The significant JIT execution overhead on cold starts reflects JIT compilation cost on first execution.
## Using as a Template
This project is structured to serve as a starting point for .NET Lambda functions. Key patterns to carry forward:
AOT requires ahead-of-time knowledge of serialised types. Register all types in a `JsonSerializerContext`:
```csharp
[JsonSourceGenerationOptions(PropertyNamingPolicy = JsonKnownNamingPolicy.CamelCase)]
[JsonSerializable(typeof(APIGatewayHttpApiV2ProxyRequest))]
[JsonSerializable(typeof(APIGatewayHttpApiV2ProxyResponse))]
[JsonSerializable(typeof(YourResponseType))]
public partial class LambdaFunctionJsonSerializerContext : JsonSerializerContext { }
```

And register the serialiser in `Assembly.cs`:

```csharp
[assembly: LambdaSerializer(typeof(SourceGeneratorLambdaJsonSerializer<LambdaFunctionJsonSerializerContext>))]
```

Add services in `Startup.cs`:
```csharp
[LambdaStartup]
public class Startup
{
    public void ConfigureServices(IServiceCollection services)
    {
        services.AddAWSService<IAmazonDynamoDB>();
        services.AddSingleton<string>(_ => Environment.GetEnvironmentVariable("YOUR_ENV_VAR")
            ?? throw new InvalidOperationException("YOUR_ENV_VAR is not set."));
        services.AddScoped<YourService>();
    }
}
```

When constructor-injected types may be trimmed, preserve them explicitly:

```csharp
[DynamicDependency(DynamicallyAccessedMemberTypes.All, typeof(YourFunction))]
public YourFunction(YourService service) { ... }
```

In your `.csproj`:
```xml
<PublishAot>true</PublishAot>
<StripSymbols>true</StripSymbols>   <!-- Reduces binary size on Linux -->
<TrimMode>partial</TrimMode>        <!-- Trims only assemblies marked as trimmable -->
```

The composite GitHub Action in `.github/actions/deploy-dotnet-lambda/action.yml` can be reused across repositories. It accepts `project-path`, `function-name`, `region`, and `publish-aot` as inputs, making it straightforward to add to any .NET Lambda workflow.
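As an illustration, a workflow might invoke the composite action like this. The input names come from the description above; the checkout step, permissions block and example values are assumptions:

```yaml
jobs:
  deploy:
    runs-on: ubuntu-latest
    permissions:
      id-token: write   # required for OIDC authentication to AWS
      contents: read
    steps:
      - uses: actions/checkout@v4
      - uses: ./.github/actions/deploy-dotnet-lambda
        with:
          project-path: DotnetLambdaBench.LambdaAot
          function-name: dotnet-lambda-bench
          region: af-south-1
          publish-aot: "true"
```

To reuse the action from another repository, reference it as `<owner>/<repo>/.github/actions/deploy-dotnet-lambda@main` instead of the local `./` path.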