March 19, 2024


Why caching is important to improve your system’s performance

We often hear about caching data on servers, but what exactly does that mean? How do we cache data? And what are the different approaches?

Caching is nothing new; we've been caching things for ages, and we do it in our everyday lives as well. It would be wrong to say that caching is a very technical concept, because it isn't. Usually, what we do in everyday life as a matter of common sense ends up being implemented in the tech world, and caching is no different. Let me give you a couple of practical examples.

We all need cash, and we usually keep some on us. The usage of cash is going down these days, thanks to internet banking and instant money transfers, but we still carry some around. Whenever we want to pay by cash, we could just walk to a bank or an ATM and withdraw it on the spot. But we don't do that for every purchase; we keep some cash on hand so we don't waste time going to the ATM each and every time. In other words, we're caching some cash on us.

The same goes for food and groceries. I read this example somewhere else, so I don't want to claim credit for coming up with it. But think about it: even though most of us live within walking distance of a grocery store, we still stock at least a week's worth of groceries at home. Why? We could just walk to the store whenever we want to cook. But to save ourselves a trip every few hours, we cache groceries: milk, vegetables, fruits, and so on. That's another example of caching. Now, let's see what it means to cache data in a system.


Why and how to cache data

There are many stages at which data is cached in a system. First, let's see why we need caching in any system at all. Take the most common example: a web application with a frontend, a backend, and a database. This is a very common architecture today. Whenever a user working on the frontend wants some data on the screen, the frontend makes an API call to the backend. The backend, in turn, makes a call to the database, which returns the data. The backend then returns this data to the frontend, and the frontend shows it to the user.
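Here's a minimal sketch of that uncached path, assuming an Express backend; queryUserFromDb is a hypothetical stand-in for the real database call:

```typescript
import express from "express";

const app = express();

// Hypothetical stand-in for the real database query layer.
async function queryUserFromDb(userId: string): Promise<{ id: string; name: string }> {
  // A real implementation would make a SQL/ORM round trip here, every single time.
  return { id: userId, name: "Sunny" };
}

// Every request to this endpoint travels all the way to the database,
// even if the exact same user was requested a moment ago.
app.get("/users/:id", async (req, res) => {
  const user = await queryUserFromDb(req.params.id);
  res.json(user);
});

app.listen(3000);
```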

This process of data transfer involves multiple calls to multiple systems, and that means latency, potentially a lot of it depending on the scale of the system, how distributed it is, and other factors. Latency is never good. If the user requests this data frequently, there will be frequent calls to the server, which doesn't help either. To avoid this, the frontend can keep frequently requested data in the browser, in its local cache. There are questions around refreshing the data in the cache and how often that should happen, but we'll get to those later.
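As a rough sketch of what such a browser-side cache might look like, here's a small wrapper around fetch() that keeps responses in localStorage; the /api/users endpoint and the 60-second lifetime are just assumptions for illustration:

```typescript
// A small browser-side cache around fetch(), stored in localStorage.
const CACHE_TTL_MS = 60_000; // assumed lifetime: 60 seconds

async function getUserCached(userId: string): Promise<unknown> {
  const key = `user:${userId}`;
  const cached = localStorage.getItem(key);

  if (cached) {
    const { data, storedAt } = JSON.parse(cached);
    // Serve straight from the local cache while the entry is still fresh.
    if (Date.now() - storedAt < CACHE_TTL_MS) {
      return data;
    }
  }

  // Cache miss or stale entry: hit the backend, then refresh the cache.
  const response = await fetch(`/api/users/${userId}`);
  const data = await response.json();
  localStorage.setItem(key, JSON.stringify({ data, storedAt: Date.now() }));
  return data;
}
```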

Next, let's suppose the frontend doesn't implement a cache for some reason. Another option is caching at the API level. It's common to cache certain API calls along with their results, so those requests never reach our server at all; the cache layer simply returns the data the frontend asked for. And if the frontend is asking not for an API response but for a resource, such as an image file or a CSS file, we'd be looking at something like a CDN, where the data is cached at regional edge servers.
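One common way to get this kind of caching is plain HTTP caching headers, so that a CDN or the browser's HTTP cache can answer repeat requests without reaching our server. A minimal sketch, again assuming Express, with arbitrary lifetimes:

```typescript
import express from "express";

const app = express();

// Cacheable API response: a CDN or the browser's HTTP cache can serve
// repeat requests for up to 5 minutes without touching our server.
app.get("/api/products", (_req, res) => {
  res.set("Cache-Control", "public, max-age=300");
  res.json([{ id: 1, name: "Keyboard" }]);
});

// Static assets such as images or CSS change rarely, so they can be
// cached much more aggressively at the CDN's regional edge servers.
app.use(
  "/static",
  express.static("public", { maxAge: "30d", immutable: true })
);

app.listen(3000);
```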

At the next level, we cache data in front of the database itself; you could also call this caching at the backend level. In most cases, the backend services use an in-memory store such as Redis or Memcached to avoid making a call to the database layer, because reading data from memory is much quicker than reading it from disk.
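A minimal sketch of this cache-aside pattern with Redis might look like the following; fetchProfileFromDb is a hypothetical stand-in for the real query layer, and the one-hour expiry is arbitrary:

```typescript
import { createClient } from "redis";

const redis = createClient(); // defaults to localhost:6379
await redis.connect();

// Hypothetical stand-in for the real database query.
async function fetchProfileFromDb(userId: string): Promise<string> {
  return JSON.stringify({ id: userId, name: "Sunny" });
}

// Cache-aside: try memory first, fall back to the database on a miss,
// then populate the cache so the next read never touches the database.
async function getProfile(userId: string): Promise<string> {
  const key = `profile:${userId}`;

  const cached = await redis.get(key);
  if (cached !== null) {
    return cached; // served from memory, no database call at all
  }

  const profile = await fetchProfileFromDb(userId);
  await redis.set(key, profile, { EX: 3600 }); // keep it for an hour
  return profile;
}
```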

So most frequently read objects are kept in memory, and the database isn't called nearly as often. In my experience, this alone improves the performance of a system drastically.


How to refresh data in a cache

After caching data, the next obvious question is: how do we refresh the data in the cache? This depends entirely on the system you're dealing with. Some systems cache data for seconds, some for days, so I can't really give a generic answer here. But we can talk about two approaches, or design patterns, for updating the data in a cache: the synchronous update and the asynchronous update. Let's look at each in detail.

In the synchronous pattern, you update the cache whenever you update the data in your database. Take the example of a user updating their profile; user profiles are among the most commonly cached objects. When the user updates their profile, the data goes into the database, and as soon as it is written there, the system invalidates the cache entry and updates it with the new data. Because the cache is updated along with the database, this is called the synchronous approach.
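A minimal sketch of the synchronous pattern, reusing the same Redis setup as above; updateProfileInDb is again a hypothetical helper, and whether you overwrite the key or just delete it is a design choice:

```typescript
import { createClient } from "redis";

const redis = createClient();
await redis.connect();

// Hypothetical stand-in for the real UPDATE query.
async function updateProfileInDb(userId: string, profile: object): Promise<void> {
  /* ... UPDATE users SET ... WHERE id = ... */
}

// Synchronous update: the cache is refreshed in the same code path that
// writes to the database, so readers never see a stale profile.
async function updateProfile(userId: string, profile: object): Promise<void> {
  await updateProfileInDb(userId, profile);       // 1. write to the database
  const key = `profile:${userId}`;
  await redis.del(key);                           // 2. invalidate the old entry
  await redis.set(key, JSON.stringify(profile), { // 3. cache the fresh copy
    EX: 3600,
  });
}
```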

Next, in the asynchronous approach, the data in the cache is not updated as soon as the database is. This is the case in most distributed systems. You've probably seen services telling you that your data will be updated in roughly 24 hours; they give that 24-hour window so that the cache, and the distributed database layer as well, will have been updated by then. In database terms, this is called eventual consistency. The cache layer has its own expiry time, and only after that expires does the cache get refreshed.
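The simplest way to get this behaviour is to give each cached entry a TTL and leave the cache alone on writes. A sketch, with the 24-hour lifetime chosen only to mirror the example above:

```typescript
import { createClient } from "redis";

const redis = createClient();
await redis.connect();

const ONE_DAY_SECONDS = 24 * 60 * 60;

// Hypothetical stand-in for the real UPDATE query.
async function updateProfileInDb(userId: string, profile: object): Promise<void> {
  /* ... UPDATE users SET ... WHERE id = ... */
}

// Asynchronous approach: the write path only touches the database.
// Readers keep seeing the old cached profile until its TTL runs out.
async function updateProfile(userId: string, profile: object): Promise<void> {
  await updateProfileInDb(userId, profile);
  // Deliberately no cache call here.
}

// On a read miss, the fresh value is cached with a 24-hour lifetime, which is
// where promises like "your data will be updated in roughly 24 hours" come from.
async function cacheProfile(userId: string, profileJson: string): Promise<void> {
  await redis.set(`profile:${userId}`, profileJson, { EX: ONE_DAY_SECONDS });
}
```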


So, I hope I was clear enough about this, and if not, you can always contact me in the comments below or connect with me on Twitter. And if you think I’ve got it wrong somewhere, please do correct me.


And if you like what you see here, or on my Medium blog, and would like to see more of such helpful technical posts in the future, consider supporting me on Patreon and Github.

