A Deep Dive into AOT vs ReadyToRun vs Regular .NET
AWS Lambda Benchmarks & Best Practices
User Request
⬇️
[ 6.7 seconds waiting... ]
⬇️
Response
"Why is this so slow?" 🤔
Three Ways to Run .NET
| Model | Description |
|---|---|
| Regular .NET | JIT compilation at runtime |
| ReadyToRun (R2R) | Hybrid: precompiled + JIT fallback |
| NativeAOT | Full ahead-of-time compilation |
Each has its place in the ecosystem
Application Start Flow:
1. Load IL assemblies
2. Initialize runtime (CLR) ⏱️ Slow
3. JIT compile on first call ⏱️ Slow
4. Execute native code
5. Tier 0 → Tier 1 optimization
IL = Intermediate Language | JIT = Just-In-Time
Application Start Flow:
1. Load R2R + IL assemblies
2. Initialize runtime (CLR) ⏱️ Still needed
3. Execute precompiled code ⚡ Fast
4. JIT only for generics/edge cases
Key Point: Ships BOTH IL and precompiled native images
Application Start Flow:
1. Execute native binary ⚡ Instant
2. No runtime initialization ⚡ Eliminated
3. No JIT compilation ⚡ Eliminated
4. Predictable performance ⚡ Consistent
🎯 Single native executable, no CLR needed!
REGULAR .NET:
[Download] → [Extract] → [Init CLR] → [Load IL] → [JIT] → [Execute]
NATIVEAOT:
[Download] → [Extract] → [Execute]
AOT eliminates the two slowest steps! ⚡
The Serverless Challenge
Traditional .NET cold starts include:
1. Download & extract deployment package
2. Load IL assemblies
3. Initialize runtime (CLR) ⏱️
4. JIT compile on first call ⏱️
NativeAOT eliminates steps 3 & 4 entirely!
AWS Lambda Managed Runtimes:
dotnet8 ✅ Available
dotnet9 ❌ Not yet
dotnet10 ❌ Not yet
With NativeAOT (provided.al2023):
.NET 8 ✅ Works
.NET 9 ✅ Works
.NET 10 ✅ Works
.NET 11+ ✅ Will work
Don't wait for AWS to support new runtimes!
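Deployment to the custom runtime looks the same for every .NET version. A sketch of the AWS CLI call (function name and role ARN are placeholders; the `echo` makes this a dry run — remove it and substitute real values to actually create the function):

```shell
# Sketch: create a NativeAOT Lambda on the provided.al2023 custom runtime.
# FUNCTION_NAME and ROLE_ARN are placeholders, not real resources.
FUNCTION_NAME=lambda-aot-demo
ROLE_ARN=arn:aws:iam::123456789012:role/lambda-exec

# For a custom runtime the "handler" is just the executable name: bootstrap.
CMD="aws lambda create-function \
  --function-name $FUNCTION_NAME \
  --runtime provided.al2023 \
  --handler bootstrap \
  --architectures x86_64 \
  --zip-file fileb://lambda.zip \
  --role $ROLE_ARN"

echo "$CMD"   # dry run: prints the command instead of executing it
```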
Real AWS Lambda Benchmarks
| Parameter | Value |
|---|---|
| Platform | AWS Lambda |
| Runtimes | dotnet8, provided.al2023 |
| Memory | 512MB (Regular/R2R), 256MB (AOT) |
| Business Logic | DynamoDB write operation |
| Cold Start Test | Invoke after 10min inactivity |
| Warm Test | 100 rapid invocations |
7.1× faster (Regular → AOT .NET 10)
6.5× faster (Regular → AOT .NET 9)
52% less memory (Regular → AOT .NET 10)
| Function | .NET | Cold Start | Warm Avg | Memory |
|---|---|---|---|---|
| Regular | 8 | 6680 ms | 91 ms | 88-93 MB |
| ReadyToRun | 8 | 4389 ms | 99 ms | 89-96 MB |
| AOT | 8 | 1447 ms | 19 ms | 46-48 MB |
| AOT | 9 | 1006 ms | 14 ms ⚡ | 43-46 MB |
| AOT | 10 | 951 ms ⚡ | 17 ms | 42-45 MB ⚡ |
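The headline multiples can be spot-checked from the table above. Using midpoints of the reported memory ranges (90.5 MB and 43.5 MB — my own simplification), the cold-start ratio lands at ≈7.0× (the quoted 7.1× presumably reflects run-to-run variance) and the memory savings at 52%:

```shell
# Spot-check of the headline figures, computed from the benchmark table.
awk 'BEGIN {
  printf "cold-start speedup: %.1fx\n", 6680 / 951             # Regular .NET 8 -> AOT .NET 10
  printf "memory savings: %.0f%%\n", (1 - 43.5 / 90.5) * 100   # midpoints of 88-93 vs 42-45 MB
}'
```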
| Mode | Avg Duration | Memory | Cost |
|---|---|---|---|
| Regular | 91 ms | 512 MB | $9.80 |
| ReadyToRun | 99 ms | 512 MB | $10.40 |
| AOT .NET 9 | 14 ms | 256 MB | $2.60 |
| AOT .NET 10 | 17 ms | 256 MB | $2.70 |
Savings: 73% lower cost with AOT! 💰
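A back-of-envelope reproduction of the cost table, assuming ~10M invocations/month (my assumption — the table doesn't state a volume) at published x86 Lambda rates ($0.0000166667 per GB-second plus $0.20 per 1M requests). Warm durations only; real bills also include cold starts, which likely explains why the Regular row comes out slightly under $9.80:

```shell
# Hypothetical monthly-cost model: duration (ms) and memory (MB) -> USD.
cost() {
  awk -v ms="$1" -v mb="$2" 'BEGIN {
    invokes = 10000000                             # assumed 10M invocations/month
    gbs = (ms / 1000) * (mb / 1024) * invokes      # GB-seconds billed
    printf "%.2f\n", gbs * 0.0000166667 + (invokes / 1000000) * 0.20
  }'
}

cost 91 512   # Regular .NET 8: 91 ms @ 512 MB
cost 14 256   # AOT .NET 9:     14 ms @ 256 MB
```

The AOT figure matches the table almost exactly; the gap on the Regular row is the cold-start overhead the simple model ignores.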
Project Configuration
<!-- LambdaRegularDemo.csproj -->
<PropertyGroup>
<OutputType>Library</OutputType>
<TargetFramework>net8.0</TargetFramework>
<RuntimeIdentifier>linux-x64</RuntimeIdentifier>
<PublishAot>false</PublishAot>
<PublishReadyToRun>false</PublishReadyToRun>
</PropertyGroup>
OutputType = Library (uses Lambda runtime)
<!-- LambdaReadyToRunDemo.csproj -->
<PropertyGroup>
<OutputType>Library</OutputType>
<TargetFramework>net8.0</TargetFramework>
<RuntimeIdentifier>linux-x64</RuntimeIdentifier>
<PublishReadyToRun>true</PublishReadyToRun>
<TrimMode>partial</TrimMode>
</PropertyGroup>
Just add PublishReadyToRun=true — a one-line change (TrimMode=partial is optional).
<!-- LambdaAOTDemo9.csproj -->
<PropertyGroup>
<OutputType>Exe</OutputType> <!-- MUST be Exe -->
<TargetFramework>net9.0</TargetFramework>
<RuntimeIdentifier>linux-x64</RuntimeIdentifier>
<PublishAot>true</PublishAot>
<SelfContained>true</SelfContained>
<StripSymbols>true</StripSymbols>
<TrimMode>partial</TrimMode>
<InvariantGlobalization>true</InvariantGlobalization>
</PropertyGroup>
Key: OutputType=Exe produces native executable
# Regular / ReadyToRun
dotnet publish -c Release -o ./publish
(cd publish && zip -r ../lambda.zip .)
# Output: multiple DLLs (+ runtimeconfig.json) at the ZIP root

# NativeAOT
dotnet publish -c Release -o ./publish
mv ./publish/LambdaAOTDemo9 ./publish/bootstrap
(cd publish && zip ../lambda.zip bootstrap)
# Output: single 'bootstrap' binary — Lambda requires it at the ZIP root,
# so zip from inside the publish directory, not with a path prefix
Regular Package (1.37 MB):
├── LambdaRegularDemo.dll
├── Shared.dll
├── Amazon.Lambda.Core.dll
├── AWSSDK.DynamoDBv2.dll
├── ... (many more DLLs)
└── runtimeconfig.json
NativeAOT Package (5.56 MB):
└── bootstrap
Single native executable!
Best Practices for AOT
NativeAOT uses STATIC ANALYSIS at build time.
Dynamic reflection breaks this!
❌ PROBLEMATIC:
• Type.GetType("MyClass")
• Activator.CreateInstance(type)
• Assembly.Load("Plugin")
• JsonSerializer.Serialize(obj) // Default uses reflection!
// ❌ Reflection-based serialization (breaks under AOT trimming):
var json = JsonSerializer.Serialize(input);

// ✅ Fix: compile-time source generation
[JsonSerializable(typeof(Dictionary<string, string>))]
[JsonSerializable(typeof(Guid))]
public partial class AOTJsonContext
    : JsonSerializerContext { }

// Usage:
var json = JsonSerializer.Serialize(input,
    AOTJsonContext.Default.DictionaryStringString);
// ✅ AOT-Friendly: Constructor Injection
public class Function
{
    private readonly IDynamoDBRepository _repo;

    public Function(IDynamoDBRepository repo)
    {
        _repo = repo; // Type known at compile time
    }

    public async Task<Guid> FunctionHandler(
        Dictionary<string, string> input,
        ILambdaContext context)
    {
        // Cancel work if we approach the Lambda timeout
        using var cts = new CancellationTokenSource(context.RemainingTime);
        return await _repo.CreateAsync(cts.Token);
    }
}
// ❌ Assembly scanning depends on runtime reflection
// (AddFromAssembly stands in for scanning helpers, e.g. Scrutor):
services.AddFromAssembly(
    typeof(Startup).Assembly);

// ✅ AOT-friendly: explicit registrations, every type known at compile time
public void ConfigureServices(
    IServiceCollection services)
{
    services.AddSingleton<IAmazonDynamoDB,
        AmazonDynamoDBClient>();
    services.AddSingleton<IDynamoDBRepository,
        DynamoDbRepository>();
    services.AddTransient<Function>();
}
warning IL2026: Using member 'Type.GetType(string)'
which has 'RequiresUnreferencedCodeAttribute' can break
functionality when trimming application code.
Action Items:
• Enable the trim analyzer: <EnableTrimAnalyzer>true</EnableTrimAnalyzer>
• Use [JsonSerializable] source generation for serialization
Step-by-Step Approach
Don't migrate everything at once. Start with non-critical functions.
1. Enable the trim analyzer:
<PropertyGroup>
<EnableTrimAnalyzer>true</EnableTrimAnalyzer>
</PropertyGroup>
2. Identify blockers from the analyzer warnings (e.g. IL2026 above)
3. Quick win while you migrate — ReadyToRun (34% faster cold starts here):
<PropertyGroup>
<PublishReadyToRun>true</PublishReadyToRun>
<TrimMode>partial</TrimMode>
</PropertyGroup>
Decision Guide
Need cold start < 1s? → AOT
Traffic > 1M/month? → AOT
Budget-sensitive? → AOT
Complex reflection? → Regular or R2R
Plugin architecture? → Regular
Rapid prototyping? → Regular
Migration testing? → ReadyToRun
| Scenario | Recommendation | Why |
|---|---|---|
| User-facing APIs | ✅ AOT | 7× faster cold start |
| Event processors | ✅ AOT | Frequent cold starts |
| Scheduled tasks | ✅ AOT | Always cold start |
| High volume (>10M/mo) | ✅ AOT | 73% cost savings |
| Plugin systems | ✅ Regular | Dynamic loading needed |
| Heavy ORM (EF Core) | ⚠️ Regular/R2R | Not fully AOT-compatible |
| MVPs/Prototypes | ✅ Regular | Fastest iteration |
| Limitation | Regular | R2R | AOT |
|---|---|---|---|
| Reflection | ✅ Full | ✅ Full | ❌ Limited |
| Dynamic loading | ✅ Yes | ✅ Yes | ❌ No |
| Entity Framework | ✅ Full | ✅ Full | ⚠️ Partial |
| Build time | ⚡ Fast | ⚡ Fast | 🐌 2-5× slower |
| Cold start | Slow | Medium | Fast |
// Even if staying on Regular .NET:
// 1. Adopt JSON source generation
[JsonSerializable(typeof(MyModel))]
public partial class AppJsonContext : JsonSerializerContext { }
// 2. Use constructor injection
public MyService(IRepository repo) => _repo = repo;
// 3. Enable trim analyzers
// <EnableTrimAnalyzer>true</EnableTrimAnalyzer>
// 4. Avoid reflection patterns
// No Type.GetType(), Assembly.Load()
This makes future AOT migration trivial!
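The reflection audit can be automated with a simple grep over the source tree. A hypothetical sketch (the demo directory and file below are fabricated purely to make it runnable):

```shell
# Hypothetical audit: scan a source tree for the AOT-hostile reflection
# patterns listed earlier. /tmp/aot-audit-demo is a fabricated example tree.
mkdir -p /tmp/aot-audit-demo
cat > /tmp/aot-audit-demo/Function.cs <<'EOF'
var t = Type.GetType("MyClass");
EOF

# Flag any use of runtime type lookup, activation, or assembly loading.
if grep -rnE 'Type\.GetType|Activator\.CreateInstance|Assembly\.Load' /tmp/aot-audit-demo; then
  echo "reflection found: review before enabling PublishAot"
fi
```

In practice, point the grep at your real `src/` directory; zero matches is a good sign your code is AOT-ready.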
Microsoft Documentation:
AWS Documentation:
This Repository:
Let's discuss!
GitHub: github.com/whitewAw
Repository: dotnet-lambda-aot-performance-comparison
Go Native! 🚀