Concurrency: Lazy&lt;Task&gt; and Async/Await
This uses a single-flight async cache: one in-flight fetch per key, shared via a cached Task, with eviction on failure and a jittered TTL to avoid stampedes.
```csharp
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

public sealed class CurrencyRateCache
{
    /*
     * We store one Lazy<Task<CacheEntry>> per currency pair.
     *
     * Why Task?
     * - Represents in-flight or completed async work
     * - Multiple callers can await the same Task
     *
     * Why Lazy?
     * - Ensures the Task is created and started only once
     *
     * Why ConcurrentDictionary?
     * - Atomic GetOrAdd under concurrency
     */
    private readonly ConcurrentDictionary<string, Lazy<Task<CacheEntry>>> _cache
        = new();

    // Public API used by the controller / service
    public async Task<decimal> GetRateAsync(string from, string to)
    {
        var key = $"{from}-{to}";

        // Atomically get or create the Lazy<Task<CacheEntry>>
        var lazyTask = _cache.GetOrAdd(key, _ => CreateLazyEntry(key));

        CacheEntry entry;
        try
        {
            // Everyone awaits the SAME task here
            entry = await lazyTask.Value;
        }
        catch
        {
            // If the task failed, remove it so future calls can retry
            _cache.TryRemove(key, out _);
            throw;
        }

        // Check expiration AFTER awaiting the task, against the
        // current clock (a timestamp captured before the await
        // would be stale by the time the fetch completes)
        if (entry.ExpiresAt > DateTime.UtcNow)
        {
            return entry.Rate;
        }

        // Expired: remove and retry (single-flight again)
        _cache.TryRemove(key, out _);
        return await GetRateAsync(from, to);
    }

    /*
     * Creates a Lazy wrapper around the async fetch.
     * The fetch will execute at most once per key.
     */
    private Lazy<Task<CacheEntry>> CreateLazyEntry(string key)
    {
        return new Lazy<Task<CacheEntry>>(
            async () =>
            {
                var rate = await FetchFromProviderAsync(key);
                return new CacheEntry(
                    rate,
                    // 5-minute TTL plus jitter in SECONDS
                    // (adding the jitter as minutes would skew the TTL badly)
                    ExpiresAt: DateTime.UtcNow.AddMinutes(5).AddSeconds(JitterSeconds()));
            },
            isThreadSafe: true);
    }

    /*
     * Simulates an external API call.
     * In real life this would be HTTP, gRPC, etc.
     */
    private async Task<decimal> FetchFromProviderAsync(string key)
    {
        await Task.Delay(500); // simulate latency
        return Random.Shared.Next(80, 120) / 100m;
    }

    /*
     * Small random offset to avoid thundering herd on expiration.
     */
    private static int JitterSeconds()
        => Random.Shared.Next(-10, 10);

    /*
     * Immutable cache entry.
     * Immutable data avoids synchronization issues.
     */
    private sealed record CacheEntry(decimal Rate, DateTime ExpiresAt);
}
```
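A quick usage sketch (the driver code below is illustrative, not from the original): several concurrent callers for the same currency pair share one fetch.

```csharp
using System;
using System.Linq;
using System.Threading.Tasks;

// Ten concurrent requests for the same pair trigger a single
// provider call; every caller awaits the same cached Task.
var cache = new CurrencyRateCache();

var tasks = Enumerable.Range(0, 10)
    .Select(_ => cache.GetRateAsync("USD", "EUR"));

decimal[] rates = await Task.WhenAll(tasks);

// All callers observe the rate from the one shared fetch.
Console.WriteLine(rates.Distinct().Count()); // 1
```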
Key Ideas

- **Cache the Task, not the value:** prevents duplicate fetches; everyone awaits the same work.
- **Lazy ensures single execution:** the Task is created only once, with thread-safe initialization.
- **Failed tasks are evicted:** prevents a poisoned cache and allows retries.
- **Expiration is checked after the await:** avoids race conditions; correct under concurrency.
- **No blocking locks:** fully async, high throughput.
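To see why the Lazy wrapper matters, here is a sketch (names like `NaiveCache` and `FetchAsync` are illustrative) contrasting it with caching the Task directly: `GetOrAdd` does not run its factory under a lock, so racing threads can each start a fetch and all but one result is thrown away.

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

public static class NaiveCache
{
    private static readonly ConcurrentDictionary<string, Task<decimal>> _tasks = new();

    // BUG: two racing threads can BOTH invoke FetchAsync here;
    // GetOrAdd keeps only one Task, but the duplicate fetch has
    // already started, defeating single-flight.
    public static Task<decimal> GetAsync(string key)
        => _tasks.GetOrAdd(key, FetchAsync);

    // With Lazy<Task<T>>, racing threads may create several Lazy
    // wrappers, but only the winner of GetOrAdd ever has .Value
    // read, and Lazy guarantees its factory runs at most once.

    private static async Task<decimal> FetchAsync(string key)
    {
        await Task.Delay(500); // simulated expensive call
        return 1.0m;
    }
}
```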
Check-list:

| Aspect | Result |
| --- | --- |
| Concurrency | ✅ Safe |
| Performance | ✅ High |
| Async correctness | ✅ |
| Lock contention | ❌ None |
Very strong advice: Do not use this in an interview unless the interviewers are open to discussion!
Most technical interviews are rushed. You will often hear the phrase "I am time-conscious" while being asked the most complicated tasks, simply because the interviewers picked a task from a quiz site without considering how long the wider context takes, and they expect you to type like crazy.
If you mention phrases like "this can be done without locking" or "this scenario doesn't fit real-life production environments", they may simply assume you are clueless about concurrency, async/await, etc. :D
In all fairness, what you need to do in an interview setting is learn how to describe these tough concepts in just a few words.
My advice: bite the bullet and do your best to implement the scenario with locks or SemaphoreSlim without missing the key concepts. Don't forget to mention the main pitfalls and the things to avoid, and do not assume the interviewers already know them!
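If the interview expects explicit locking, a SemaphoreSlim-based version of the same single-flight idea might look like this (a sketch; the class and method names are mine, not from the original):

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;
using System.Threading.Tasks;

// Single-flight cache using one SemaphoreSlim per key,
// with a double-check: look up before and after acquiring
// the lock so only the first caller performs the fetch.
public sealed class LockedRateCache
{
    private readonly ConcurrentDictionary<string, decimal> _rates = new();
    private readonly ConcurrentDictionary<string, SemaphoreSlim> _locks = new();

    public async Task<decimal> GetRateAsync(string key)
    {
        if (_rates.TryGetValue(key, out var cached))
            return cached;

        var gate = _locks.GetOrAdd(key, _ => new SemaphoreSlim(1, 1));
        await gate.WaitAsync(); // pitfall: lock(...) cannot contain an await
        try
        {
            // Re-check: another caller may have filled the cache
            // while we were waiting on the semaphore.
            if (_rates.TryGetValue(key, out cached))
                return cached;

            var rate = await FetchFromProviderAsync(key);
            _rates[key] = rate;
            return rate;
        }
        finally
        {
            gate.Release(); // pitfall: always release in finally
        }
    }

    private static async Task<decimal> FetchFromProviderAsync(string key)
    {
        await Task.Delay(500); // simulated provider latency
        return 1.0m;
    }
}
```

This version omits TTL and failure eviction for brevity; mentioning those trade-offs out loud is exactly the kind of pitfall-awareness worth showing.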
In your daily tasks, you can apply concurrency without locking the entire application :D