Caching mechanism

Caching is one of the simplest ways to increase an application's throughput. The principle is straightforward: once data is read from data storage, it is kept as close as possible to the processing server, and subsequent requests are served directly from the cache instead of the underlying data storage. The essence of caching is minimizing the amount of work a server has to do. HTTP has a caching mechanism embedded in the protocol itself, which is one of the reasons it scales so well.
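As a rough sketch of how HTTP-style freshness works (the class name and fields here are illustrative, not part of any framework), a cached response can be reused for as long as its age stays below the max-age the server declared when it produced the response:

```python
import time

class CachedResponse:
    """Holds a response body plus the time it was stored and its max-age.
    Illustrative only; real HTTP caches follow the full Cache-Control rules."""
    def __init__(self, body, max_age):
        self.body = body
        self.stored_at = time.time()
        self.max_age = max_age  # seconds, as in a Cache-Control: max-age header

    def is_fresh(self):
        # A response is fresh while its age is below max-age;
        # fresh responses can be served without contacting the origin server.
        return (time.time() - self.stored_at) < self.max_age

resp = CachedResponse(body=b"hello", max_age=60)
print(resp.is_fresh())  # True immediately after caching
```

Every layer that understands these semantics, from the browser to intermediate proxies, can answer a request without doing any server-side work, which is what makes the protocol's built-in caching scale.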

With respect to microservices, we can cache at three levels: client side, proxy, and server side. Let's look at each of them.

First, we have client-side caching. With client-side caching, the client stores cached results, so the client is responsible for deciding when a cached entry can still be reused and when it must be fetched again.
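A minimal sketch of this idea, assuming a simple time-to-live policy (the `ClientCache` class and the `fetch_user` helper are hypothetical names for illustration, not from any particular library):

```python
import time

class ClientCache:
    """A minimal client-side cache: the client itself stores results
    and decides when an entry is too old to reuse."""
    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self.entries = {}  # key -> (value, stored_at)

    def get(self, key, fetch):
        entry = self.entries.get(key)
        if entry is not None:
            value, stored_at = entry
            if time.time() - stored_at < self.ttl:
                return value          # served from the local cache
        value = fetch(key)            # cache miss or stale entry: call the service
        self.entries[key] = (value, time.time())
        return value

calls = []
def fetch_user(key):
    # Hypothetical stand-in for a remote service call
    calls.append(key)
    return {"id": key, "name": "Alice"}

cache = ClientCache(ttl_seconds=30)
cache.get("user:1", fetch_user)
cache.get("user:1", fetch_user)   # second call is a cache hit
print(len(calls))  # → 1: the service was only called once
```

The trade-off is that each client carries the burden of invalidation: if the server's data changes before the TTL expires, the client serves stale results until the entry ages out.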
