In Memory caching in .NET Core

What is in-memory caching?

It is a powerful technique that stores data in your server's memory, making data retrieval lightning-fast.

In-memory caching is a common scaling technique that can enhance performance for frequently needed data.

When should we use it?

We should use in-memory caching:

  • When we need frequent access to data that doesn’t change.
  • When we are scaling our applications for performance.
  • In high-traffic applications, where fast reads matter most.

Pros and cons of using in-memory cache

Benefits:

  • Speed
  • Scalability
  • Reliability

Cons:

  1. Volatility: Data stored in memory is lost when the application ends or the server restarts.
  2. Cost: More RAM is required, which can increase costs.
  3. Complexity: Implementing caching mechanisms can be complex and requires careful planning.

How to implement it in .NET 6.0

Register the memory cache service in the Program.cs file of your API to enable in-memory caching.

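A minimal Program.cs for .NET 6 could look like this (the controller registration lines are included only for context):

```csharp
// Program.cs — minimal .NET 6 setup
var builder = WebApplication.CreateBuilder(args);

builder.Services.AddControllers();
builder.Services.AddMemoryCache(); // registers IMemoryCache with the DI container

var app = builder.Build();

app.MapControllers();
app.Run();
```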

Inject IMemoryCache into the class where you are going to add the caching code.

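For example, in a controller (the class name and route below are illustrative):

```csharp
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Caching.Memory;

[ApiController]
[Route("api/[controller]")]
public class PrimeController : ControllerBase
{
    private readonly IMemoryCache _memoryCache;

    // IMemoryCache is resolved from the container registered by AddMemoryCache()
    public PrimeController(IMemoryCache memoryCache)
    {
        _memoryCache = memoryCache;
    }
}
```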

The next step is setting and reading the cache; these are a few of the available methods that we can use:

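```csharp
// Commonly used IMemoryCache members (keys and values here are placeholders):
_memoryCache.TryGetValue("key", out string? value);   // read; returns false on a cache miss
_memoryCache.Get<string>("key");                       // read; returns default on a miss
_memoryCache.Set("key", "value");                      // write; overwrites an existing entry
_memoryCache.GetOrCreate("key", entry => "value");     // read, or create the entry if missing
_memoryCache.Remove("key");                            // evict the entry for that key
```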

We are going to use TryGetValue and Set for retrieving and setting the cached values, respectively.

I have created a DTO to store the data in the cache.
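A hypothetical DTO for the prime-check scenario could look like this (the property names are assumptions):

```csharp
// Hypothetical DTO — property names are illustrative, not the article's originals
public class PrimeNumberDto
{
    public int Number { get; set; }
    public bool IsPrime { get; set; }
    public DateTime CheckedAt { get; set; }
}
```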

Let’s now see how we can use it; we need to pass the key while getting the cached results.

Similarly, while setting the cache, we have to pass the key (which in our case is a number in string format), the data object, and the cache entry options.
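Putting it together, a sketch of the flow could look like the following; the method name and the prime-check helper are assumptions, but the pattern (look up by key, compute on a miss, then Set with options) follows the description above.

```csharp
public PrimeNumberDto CheckNumber(int number)
{
    // The key is the number in string format, as described above
    string key = number.ToString();

    // Getting: pass the key; TryGetValue returns false on a cache miss
    if (_memoryCache.TryGetValue(key, out PrimeNumberDto? cached) && cached is not null)
    {
        return cached;
    }

    var result = new PrimeNumberDto
    {
        Number = number,
        IsPrime = IsPrime(number),
        CheckedAt = DateTime.UtcNow
    };

    // Setting: pass the key, the data object, and the cache entry options
    var cacheEntryOptions = new MemoryCacheEntryOptions
    {
        SlidingExpiration = TimeSpan.FromMinutes(5),
        AbsoluteExpirationRelativeToNow = TimeSpan.FromHours(1)
    };
    _memoryCache.Set(key, result, cacheEntryOptions);

    return result;
}

// Simple trial-division check, included only to keep the example self-contained
private static bool IsPrime(int n)
{
    if (n < 2) return false;
    for (int i = 2; i * i <= n; i++)
    {
        if (n % i == 0) return false;
    }
    return true;
}
```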

What are the cache entry options?

These options help us define the behavior of our cached data. Two of them are worth mentioning (a short sketch follows the list):

  • Sliding Expiration: the maximum timespan a cached value can stay inactive (not accessed) before it becomes eligible for removal. For example, remove a cached value if it has been inactive for 5 minutes.
  • Absolute Expiration: the duration after which a cached value is automatically removed, regardless of how often it is accessed. For example, expire cached values after one hour.
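For example, options matching the durations above could be built like this (the values are illustrative):

```csharp
var cacheEntryOptions = new MemoryCacheEntryOptions()
    .SetSlidingExpiration(TimeSpan.FromMinutes(5))  // removed if not accessed for 5 minutes
    .SetAbsoluteExpiration(TimeSpan.FromHours(1));  // removed one hour after creation, regardless of access
```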

You can explore the other cache methods as well; for example, to remove a cached entry we can use the Remove method and pass it the key of the cached object.
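For instance, assuming key is the same key that was used when the entry was set:

```csharp
_memoryCache.Remove(key); // evicts the cached entry for that key
```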

How to handle concurrent requests

We can handle concurrent requests either by locking or by using a semaphore. I prefer a semaphore because it lets us set a maximum limit on the number of concurrent requests.

We can add it like this:
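The sketch below shows one way to guard the cache-miss path with a SemaphoreSlim; the limit of one concurrent entrant, the method name, and the DTO are assumptions carried over from the earlier sketches.

```csharp
private static readonly SemaphoreSlim _semaphore = new(initialCount: 1, maxCount: 1);

public async Task<PrimeNumberDto> CheckNumberAsync(int number)
{
    string key = number.ToString();

    if (_memoryCache.TryGetValue(key, out PrimeNumberDto? cached) && cached is not null)
    {
        return cached;
    }

    await _semaphore.WaitAsync();
    try
    {
        // Re-check after entering: another request may have populated the cache meanwhile
        if (_memoryCache.TryGetValue(key, out cached) && cached is not null)
        {
            return cached;
        }

        var result = new PrimeNumberDto
        {
            Number = number,
            IsPrime = IsPrime(number),
            CheckedAt = DateTime.UtcNow
        };

        _memoryCache.Set(key, result, new MemoryCacheEntryOptions()
            .SetSlidingExpiration(TimeSpan.FromMinutes(5)));

        return result;
    }
    finally
    {
        _semaphore.Release();
    }
}
```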

Cache Scenario with GitHub Code

I have implemented caching for a web API that receives requests and verifies whether a number is prime, storing the results of already-requested numbers along the way. Get the code from my GitHub Repo.

This article was originally published at https://mwaseemzakir.substack.com/.

Whenever you're ready, there are 3 ways I can help you:

  1. Subscribe to my YouTube channel: for in-depth tutorials, coding tips, and industry insights.
  2. Promote yourself to 9,000+ subscribers: by sponsoring this newsletter.
  3. Join my Patreon community: get access to all of my blogs and articles in one place.