Understanding Caching: Types, Benefits and Considerations


In today’s fast-moving digital world, getting information quickly isn’t just a bonus—it’s essential. That’s where caching comes in. Think of it as a super-smart shortcut for your data. It helps speed things up and keeps everything running smoothly.

Let’s break down what caching is, why it’s so important, and how you can use it to make your apps faster and more efficient.

WHAT IS CACHING? Caching is the process of storing data for future reference to speed up data access and reduce the load on the primary data source. Think of it as a way to remember frequently asked questions and their answers so you don’t have to look them up repeatedly.
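That "remember the answer instead of looking it up again" idea can be sketched in a few lines of Python. This is just an illustrative cache-aside sketch; `fetch_from_database` is a made-up stand-in for whatever slow primary source you have:

```python
# A minimal cache-aside sketch: check the cache first, fall back to the
# primary source on a miss, and remember the answer for next time.

cache = {}

def fetch_from_database(question):
    # Stand-in for an expensive call to the primary data source.
    return f"answer to {question!r}"

def get_answer(question):
    if question in cache:                    # cache hit: no slow lookup
        return cache[question]
    answer = fetch_from_database(question)   # cache miss: do the slow lookup
    cache[question] = answer                 # store it for future callers
    return answer
```

The first call for a given question pays the full cost; every later call is answered straight from the dictionary.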

Let's use a basic real-life example: the chocolate or crisps dilemma.

Without a Cache:

Person A: "Hey Tanda, do you like chocolate or crisps?"
Tanda: "I like chocolate, not crisps."
Person A: "Thanks."
Person B: "Hey Tanda, do you like chocolate or crisps?"
Tanda: "I like chocolate, not crisps."
Person C: "I need to ask..."
Tanda: "I like chocolate, not crisps."

Without a cache, I gotta repeat the same information multiple times, which is inefficient and time-consuming.

With a Cache:

Person A: "Hey Tanda, do you like chocolate or crisps?"
Tanda: "Hey Katy The Cache, tell anyone else who asks me in the next month that 'I like chocolate, not crisps.'"
Person B: "Tanda, do you like chocolate or crisps?"
Katy The Cache: "Tanda likes chocolate, not crisps."
Person B: "Thanks."
Person C: "I need to ask Tanda what she likes too."
Katy The Cache: "She likes chocolate, not crisps."
Person D: "I need to ask..."
Katy The Cache: "She likes chocolate, not crisps."

With a cache, the same information is readily available, saving time and effort for everyone involved. You ain’t gotta ask me twice!
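Katy The Cache behaves a lot like a memoised function: the first question does the real work, and every later one is answered from memory. A small Python sketch using the standard library's `lru_cache` (the counter is only there to show how many times Tanda actually gets asked):

```python
from functools import lru_cache

calls = {"count": 0}  # tracks how often Tanda really has to answer

@lru_cache(maxsize=None)
def ask_tanda(question):
    # This body only runs when the answer isn't already cached.
    calls["count"] += 1
    return "I like chocolate, not crisps."

# Persons A, B, C and D all ask the same thing...
for _ in range(4):
    answer = ask_tanda("chocolate or crisps?")
# ...but Tanda only answered once; the cache handled the rest.
```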

Let’s bring this back to my time at ASOS and think about a product page. Imagine you’re looking at a Nike Hoodie.

Some details like the product name, description or available sizes don’t change often, so it makes sense to cache that information for a while. However, the price is a different story—it can change during sale times, so it needs to be kept up-to-date and shouldn’t be cached for too long.
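One way to express "cache the name for ages, the price only briefly" is a per-entry time-to-live (TTL). Here's a rough sketch with made-up TTL values, not how any particular site actually does it:

```python
import time

class TTLCache:
    """Tiny TTL cache: each entry expires independently."""

    def __init__(self):
        self._store = {}  # key -> (value, expiry timestamp)

    def set(self, key, value, ttl_seconds):
        self._store[key] = (value, time.monotonic() + ttl_seconds)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if time.monotonic() >= expires_at:   # stale: evict and report a miss
            del self._store[key]
            return None
        return value

cache = TTLCache()
# Stable details can live in the cache for hours...
cache.set("product:123:name", "Nike Hoodie", ttl_seconds=6 * 60 * 60)
# ...while the price gets a much shorter lifetime.
cache.set("product:123:price", 45.00, ttl_seconds=60)
```

Once an entry's TTL runs out, the next read misses and the caller fetches a fresh value from the primary source.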

Without getting too technical, scaling an application to handle a lot of requests can get pricey, especially when many of those requests are for the same information. For example, on Black Friday, when tons of people are browsing products, we boost our cache capacity to manage the surge in traffic. This lets us handle all those requests smoothly without having to scale our entire system to crazy levels.

So let's look at the following cache types:

In-Memory Cache
Short-term storage within the application, e.g. storing user session data.
Pros:
- Reduces calls to the data source (database).
- Lowers web service load.
- Provides quick data access.
Cons:
- Increased maintenance.
- Scalability issues.
- Synchronisation challenges with the database.

Persistent In-Process Cache
Similar to the in-memory cache above, but with better persistence. Persistence means the data is stored in a way that ensures it remains available and intact even after the program has finished running or the system has been turned off.

Distributed Cache
A shared cache accessible by multiple processes, often used in large-scale applications. Example: Redis for distributed systems.
Pros:
- Scalability across multiple servers.
- Consistency in data access.
Cons:
- Complexity in setup and maintenance.
- Potential latency issues if not managed properly.

So... how do you choose between a local in-memory cache and a distributed cache (like Redis)?

In-Memory Cache
Ideal for: applications with limited, short-term caching needs.
Keep in mind that it requires careful handling to ensure cache persistence across app restarts.

Distributed Cache (Redis)
Ideal for: large-scale applications needing shared, distributed caching.
Excellent for handling large volumes of data with high availability. Imagine *name drops again* ASOS and the number of requests they would get for the same product over and over again.
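Whichever backend you pick, the read-through pattern looks much the same; with Redis you'd typically pair GET with SETEX so entries expire on their own. Here's a sketch of that pattern. The `StubCache` class just mimics the get/setex shape of a redis-py client so the example is self-contained; in a real deployment you'd pass a real `redis.Redis` client instead:

```python
def get_or_fetch(client, key, ttl_seconds, fetch):
    """Read-through helper: try the shared cache, fall back to `fetch`,
    then write the result back with an expiry (Redis-style SETEX)."""
    value = client.get(key)
    if value is not None:
        return value              # served from the shared cache
    value = fetch()               # miss: hit the primary source
    client.setex(key, ttl_seconds, value)
    return value

class StubCache:
    # Stand-in with the same get/setex shape as a redis-py client
    # (expiry behaviour omitted to keep the sketch short).
    def __init__(self):
        self.data = {}
    def get(self, key):
        return self.data.get(key)
    def setex(self, key, ttl, value):
        self.data[key] = value

client = StubCache()
price = get_or_fetch(client, "product:123:price", 60, lambda: "45.00")
```

Because the cache is shared, every app server asking for `product:123:price` after the first request gets the cached value without touching the database.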

In summary, if anyone asks you the pros and cons of caching, you can say the following:

Pros:
- Reduced database calls: frees up resources and improves database performance.
- Reduced web service load: lessens the demand on external services.
- Quick data access: speeds up response times for users.

Cons:
- Maintenance: requires careful management and updating.
- Scalability issues: especially for in-memory caches.
- Data synchronisation: keeping the cache in sync with the primary data source can be challenging.

So, that’s a quick rundown on caching and why it’s important. Hopefully, it gave you some good insights. Using a cache is key for speeding up data access and meeting what users expect from modern apps. By knowing the different types of caches and their pros and cons, you can pick the right one for your needs. Whether you go for an in-memory cache for fast, short-term use or a distributed cache like Redis for bigger projects, caching will really boost your app’s performance and user experience.

I first wrote this blog post back in 2021 while working in the Product team at ASOS, and honestly, my appreciation for caches has only grown since then. Hopefully, this gives you a better idea of why caching is so important.

#techwithtanda
