Creating an in-memory cache for a .NET 6 Web API

Caching is the technique of storing frequently accessed data at a temporary location for quicker access in the future. ASP.NET Core supports two types of caching out of the box:

  • In-Memory Caching – This stores data on the application server memory.

  • Distributed Caching – This stores data on an external service that multiple application servers can share.

In-Memory Caching in ASP.NET Core is the simplest form of cache, in which the application stores data in the memory of the web server. It is based on the IMemoryCache interface, which represents a cache object stored in the application’s memory. Since the application maintains the cache in the server’s own memory, if we want to run the app on multiple servers we should ensure sessions are sticky. A sticky session is a mechanism that makes all requests from a given client go to the same server.

Implementing in-memory cache

1. Register the Cache Service

Adding a local cache in .NET 6 is simple. For the first step, all we need to do is register the cache service at application startup. Check the code below:

var builder = WebApplication.CreateBuilder(args);

// Add services to the container.
builder.Services.AddControllers();
builder.Services.AddEndpointsApiExplorer();
builder.Services.AddSwaggerGen();

// Register the in-memory cache service.
builder.Services.AddMemoryCache();

var app = builder.Build();

// Configure the HTTP request pipeline.
if (app.Environment.IsDevelopment())
{
    app.UseSwagger();
    app.UseSwaggerUI();
}

app.UseHttpsRedirection();
app.UseAuthorization();
app.MapControllers();
app.Run();

The code above is part of the Program class of the Web API. Note that the cache service is registered through the AddMemoryCache method.

2. Get or Add new data to the Cache Service

For this sample, I created a new controller to test the cache service. This controller has a GET method that will simulate a long-running process when its result is not in the local cache.

After registering the cache service in the Web API, we need to inject an instance of the IMemoryCache interface into the controller to handle the data in the local cache.

See the code below:

using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Caching.Memory;

namespace MemoryCache.API.Controllers;

[ApiController]
[Route("[controller]")]
public class CacheController : ControllerBase
{
    private readonly IMemoryCache _memoryCache;

    public CacheController(IMemoryCache memoryCache) => _memoryCache = memoryCache;

    [HttpGet]
    public async Task<IActionResult> Get(string cacheKey)
    {
        // Return the cached value if one exists for this key.
        if (_memoryCache.TryGetValue(cacheKey, out var item))
            return Ok(new { item });

        item = await LongRunningProcess();

        var options = new MemoryCacheEntryOptions
        {
            AbsoluteExpirationRelativeToNow = TimeSpan.FromSeconds(3600),
            SlidingExpiration = TimeSpan.FromSeconds(1200)
        };

        _memoryCache.Set(cacheKey, item, options);
        return Ok(new { item });
    }

    private static async Task<int> LongRunningProcess()
    {
        await Task.Delay(1000); // Simulate a slow call.
        return new Random().Next(1000, 2000);
    }
}

The first step of this method is to verify whether the local cache already holds data associated with the provided key. If the data is found in the local cache, it is returned to the consumer. Otherwise, the GET method calls the LongRunningProcess method to simulate a slow execution.

The LongRunningProcess method simulates slow processing: it delays for one second on each call and then returns a random number.

Returning to the GET method: when LongRunningProcess returns a number, it will be stored in the local cache. First, a MemoryCacheEntryOptions object defines how this data will be persisted in the local cache:

  • AbsoluteExpirationRelativeToNow: gets or sets an absolute expiration time, relative to the current moment.

  • SlidingExpiration: gets or sets how long a cache entry can be inactive before it will be removed.
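To make the interaction between the two settings concrete, here is a minimal standalone sketch (not part of the article's controller) that uses MemoryCache directly; it assumes the Microsoft.Extensions.Caching.Memory package is referenced, and the key name and values are illustrative only:

```csharp
using Microsoft.Extensions.Caching.Memory;

// Create a standalone cache instance (outside ASP.NET Core DI) for illustration.
var cache = new MemoryCache(new MemoryCacheOptions());

var options = new MemoryCacheEntryOptions
{
    // The entry is evicted no later than 60 seconds from now,
    // no matter how often it is read.
    AbsoluteExpirationRelativeToNow = TimeSpan.FromSeconds(60),
    // The entry is evicted earlier if it goes unread for 20 seconds.
    SlidingExpiration = TimeSpan.FromSeconds(20)
};

cache.Set("demo-key", 42, options);

// Each successful read resets the sliding window,
// but the absolute deadline still applies.
if (cache.TryGetValue("demo-key", out int value))
{
    Console.WriteLine(value);
}
```

When both options are set, the sliding expiration keeps extending the entry's life while it is being read, and the absolute expiration acts as a hard upper bound.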

The last step is to add the data to the local cache and return it to the consumer. Note that before returning an OK response, the data is stored in the local cache through the IMemoryCache.Set method.

Through this method, we provide the data that will be stored in the local cache and how long it will be kept there, following the rules defined in the MemoryCacheEntryOptions object.
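As an aside, IMemoryCache also exposes a GetOrCreateAsync extension method that collapses the TryGetValue/Set pair into a single call. A minimal sketch, assuming the same simulated LongRunningProcess from the controller (the class and method names here are illustrative, not from the article):

```csharp
using Microsoft.Extensions.Caching.Memory;

public class CachedItemService
{
    private readonly IMemoryCache _memoryCache;

    public CachedItemService(IMemoryCache memoryCache) => _memoryCache = memoryCache;

    // Returns the cached value for the key, or runs the factory once,
    // caches the result with the given expiration rules, and returns it.
    public Task<int> GetItemAsync(string cacheKey) =>
        _memoryCache.GetOrCreateAsync(cacheKey, async entry =>
        {
            entry.AbsoluteExpirationRelativeToNow = TimeSpan.FromSeconds(3600);
            entry.SlidingExpiration = TimeSpan.FromSeconds(1200);
            return await LongRunningProcess();
        });

    private static async Task<int> LongRunningProcess()
    {
        await Task.Delay(1000); // Simulate a slow call.
        return new Random().Next(1000, 2000);
    }
}
```

The behavior is equivalent to the explicit TryGetValue/Set flow in the controller, just more compact.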

Advantages and Disadvantages of in-memory caching

We have seen how In-Memory caching improves the performance of data access. However, it has some limitations as well that we need to be aware of.

Advantages:

  1. Faster Data Access – When we access data from the cache, it is very fast, as no network communication outside the application is involved.

  2. Highly Reliable – An in-memory cache is considered highly reliable because it resides within the app server’s memory. The cache will work as long as the application is running.

  3. Easy to Implement – Implementing an in-memory cache takes only a few simple steps, without any additional infrastructure or third-party components, which makes it a good option for small to mid-scale apps.

Disadvantages:

  1. Sticky Session Overhead – For large-scale apps running on multiple application servers, there is an overhead to maintaining sticky sessions.

  2. Server Resource Consumption – If not properly configured, the cache may consume a lot of the app server’s resources, especially memory.
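The resource-consumption risk can be mitigated by giving the cache a size limit. A sketch of this, noting that SizeLimit has no built-in unit: the application decides what "size" means, and once a limit is set, every entry must declare its size or it will not be stored (the values below are arbitrary examples):

```csharp
using Microsoft.Extensions.Caching.Memory;

// Cap the cache at 100 "units"; what a unit means is up to the application.
var cache = new MemoryCache(new MemoryCacheOptions
{
    SizeLimit = 100
});

var options = new MemoryCacheEntryOptions()
    .SetSize(1) // This entry counts as 1 unit toward the limit.
    .SetSlidingExpiration(TimeSpan.FromSeconds(1200));

cache.Set("key", "value", options);
```

In a Web API, the same limit can be applied at registration time via AddMemoryCache's options callback; when the limit is reached, the cache evicts entries instead of growing without bound.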

Resource: Medium - Luis Rodrigues

The Tech Platform
