Rishabh Kochar

In-Memory Caching Technique

Updated: Aug 19, 2021



In-Memory Cache

Photo by Clément Hélardot on Unsplash

Caching isn’t an architecture in itself; it’s just an optimization. Still, it is an extremely important part of an application: it provides fast response times and enables effortless performance improvements in many use cases. Caching is especially useful when computing a value is expensive or when loading external resources is involved.

We recently encountered a situation where we needed to cache certain data that took around 600ms to fetch from its source. Since the data set was small, we didn’t want to use Redis: the cost of maintaining a Redis server would have been quite high for that small amount of data. So we explored in-memory caching and came across several options:

  1. ConcurrentHashMap

  2. Google’s Guava

  3. Caffeine

The basic approach was to use a ConcurrentHashMap, but the disadvantage of using a plain map for caching is that you have to implement the eviction of entries yourself, e.g. to keep the size below a given limit. When you develop for a concurrent environment the task gets even more complicated, and the code quickly turns messy:

import java.util.concurrent.ConcurrentHashMap;

public class Cache {

    private static final long MAX_SIZE = 100;

    private final ConcurrentHashMap<String, String> map;

    public Cache() {
        map = new ConcurrentHashMap<>();
    }

    public String getEntry(String key) {
        String result = createCacheEntry(key);
        removeOldestCacheEntryIfNecessary();
        return result;
    }

    private String createCacheEntry(String key) {
        String result = map.get(key);
        if (result == null) {
            // Create the value and insert it only if another thread has not done so already.
            String created = createRandom();
            String previous = map.putIfAbsent(key, created);
            result = (previous != null) ? previous : created;
        }
        return result;
    }

    private void removeOldestCacheEntryIfNecessary() {
        // Naive eviction: once the map grows past the limit, drop an arbitrary entry.
        if (map.size() > MAX_SIZE) {
            String keyToDelete = map.keys().nextElement();
            map.remove(keyToDelete);
        }
    }

    private String createRandom() {
        return "I am resource which you want to cache...!!!!!";
    }
}

The smelliest part of this code, apart from the null checks, is the eviction: to keep the map size below 100 we have to intercept every add operation.


Then we came across Google’s Guava Cache. Guava provides a very powerful memory-based caching mechanism through its LoadingCache<K, V> interface: values are loaded into the cache automatically, and it offers many utility methods that are useful for caching needs.

The above piece of code can then be written as:

import com.google.common.cache.CacheBuilder;
import com.google.common.cache.CacheLoader;
import com.google.common.cache.LoadingCache;

public class Cache {

    private static final long MAX_SIZE = 100;

    private final LoadingCache<String, String> cache;

    public Cache() {
        cache = CacheBuilder.newBuilder()
                .maximumSize(MAX_SIZE)
                .build(new CacheLoader<String, String>() {
                    @Override
                    public String load(String key) throws Exception {
                        return createRandom();
                    }
                });
    }

    public String getEntry(String key) {
        // getUnchecked loads the value through the CacheLoader on a cache miss.
        return cache.getUnchecked(key);
    }

    private String createRandom() {
        return "I am resource which you want to cache...!!!!!";
    }
}

The ugly code has now vanished. The thread-safe storage and the eviction are all handled by Guava’s internal implementation. Also, Guava provides a nice API that makes our code much more readable. If you want to understand the caches in detail, the Guava User Guide does a great job.
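Guava’s CacheBuilder also supports time-based expiry and hit-rate statistics. Here is a rough sketch (the class name, size, and TTL below are illustrative, not the values from our setup):

import java.util.concurrent.TimeUnit;

import com.google.common.cache.CacheBuilder;
import com.google.common.cache.CacheLoader;
import com.google.common.cache.CacheStats;
import com.google.common.cache.LoadingCache;

public class ConfiguredGuavaCache {

    private final LoadingCache<String, String> cache = CacheBuilder.newBuilder()
            .maximumSize(100)                       // size-based eviction
            .expireAfterWrite(30, TimeUnit.MINUTES) // time-based eviction
            .recordStats()                          // enable hit/miss statistics
            .build(new CacheLoader<String, String>() {
                @Override
                public String load(String key) {
                    return "I am resource which you want to cache...!!!!!";
                }
            });

    public String getEntry(String key) {
        return cache.getUnchecked(key);
    }

    public CacheStats stats() {
        // stats().hitRate() gives the fraction of lookups served from the cache.
        return cache.stats();
    }
}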


Not that Guava was a problem, but we found a library with an even better hit rate: Caffeine. Caffeine is effectively a rewrite of Guava’s cache, and its API can return CompletableFutures out of the box, allowing asynchronous automatic loading of entries into the cache. So in the end we went forward with Caffeine.
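As a minimal sketch of that asynchronous API (the size and TTL here are placeholders, and the loader just returns a dummy value), Caffeine’s buildAsync gives you an AsyncLoadingCache whose lookups return a CompletableFuture:

import java.util.concurrent.CompletableFuture;
import java.util.concurrent.TimeUnit;

import com.github.benmanes.caffeine.cache.AsyncLoadingCache;
import com.github.benmanes.caffeine.cache.Caffeine;

public class AsyncCaffeineCache {

    private final AsyncLoadingCache<String, String> cache = Caffeine.newBuilder()
            .maximumSize(100)
            .expireAfterWrite(30, TimeUnit.MINUTES)
            // The loader runs asynchronously on a cache miss.
            .buildAsync(key -> "I am resource which you want to cache...!!!!!");

    public CompletableFuture<String> getEntry(String key) {
        // Returns immediately; the future completes once the value has been loaded.
        return cache.get(key);
    }
}

In our service, though, we plugged Caffeine into Spring’s caching abstraction instead: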

import java.util.concurrent.TimeUnit;

import com.github.benmanes.caffeine.cache.Caffeine;
import org.springframework.cache.CacheManager;
import org.springframework.cache.annotation.Cacheable;
import org.springframework.cache.annotation.EnableCaching;
import org.springframework.cache.caffeine.CaffeineCacheManager;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.stereotype.Service;

@Configuration
@EnableCaching
public class CaffeineConfiguration {

    private static final long MAX_SIZE = 100;
    private static final long TTL = 30;

    @Bean
    public Caffeine<Object, Object> caffeineConfig() {
        return Caffeine.newBuilder()
                .maximumSize(MAX_SIZE)
                .expireAfterWrite(TTL, TimeUnit.MINUTES);
    }

    @Bean(name = "caffeineCacheManager")
    public CacheManager caffeineCacheManager(Caffeine<Object, Object> caffeine) {
        CaffeineCacheManager caffeineCacheManager = new CaffeineCacheManager();
        caffeineCacheManager.setCaffeine(caffeine);
        return caffeineCacheManager;
    }
}

@Service
public class Cache {

    @Cacheable(value = "cacheName", key = "#key", cacheManager = "caffeineCacheManager")
    public String getEntry(String key) {
        // Only executed on a cache miss; the result is stored under "cacheName".
        return createRandom();
    }

    private String createRandom() {
        return "I am resource which you want to cache...!!!!!";
    }
}

Another advantage we found is that Caffeine can be configured through a CacheManager, so in the future, if you want to move to Redis or any other caching library, all you have to do is update the cache manager; the rest is taken care of by the @Cacheable annotation.
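For example, here is a minimal sketch of what that switch could look like with spring-data-redis (the RedisConnectionFactory wiring is assumed, and we keep the same bean name so the @Cacheable annotation above does not need to change):

import org.springframework.cache.CacheManager;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.redis.cache.RedisCacheManager;
import org.springframework.data.redis.connection.RedisConnectionFactory;

@Configuration
public class RedisCacheManagerConfiguration {

    // Replaces the Caffeine-backed manager; the caching code itself stays untouched.
    @Bean(name = "caffeineCacheManager")
    public CacheManager cacheManager(RedisConnectionFactory connectionFactory) {
        return RedisCacheManager.create(connectionFactory);
    }
}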


You can check the benchmarks for Google’s Guava and Caffeine here, and here is a guide to Caffeine. Also read our blog on AI with Human Capital Management Software.
