In both cases, caching can be performed client-side and server-side. Client-side caching is done by the process that provides the user interface for a system, such as a web browser or desktop application. Server-side caching is done by the process that provides the business services that run remotely.

The most basic type of cache is an in-memory store. It's held in the address space of a single process and accessed directly by the code that runs in that process. Because the data is kept in local memory, this type of cache is quick to access, and it can provide an effective means for storing modest amounts of static data. The size of a cache is typically constrained by the amount of memory available on the machine that hosts the process. If you need to cache more information than is physically possible in memory, you can write cached data to the local file system. This data will be slower to access than data held in memory, but it should still be faster and more reliable than retrieving data across a network.

If you have multiple instances of an application that uses this model running concurrently, each application instance has its own independent cache holding its own copy of the data. Think of a cache as a snapshot of the original data at some point in the past. If this data isn't static, it's likely that different application instances hold different versions of the data in their caches. Therefore, the same query performed by these instances can return different results, as shown in Figure 1.

Figure 1: Using an in-memory cache in different instances of an application.

Using a shared cache can help alleviate the concern that data might differ in each cache, which can occur with in-memory caching. Shared caching ensures that different application instances see the same view of cached data. It locates the cache in a separate location, typically hosted as part of a separate service, as shown in Figure 2.

An important benefit of the shared caching approach is the scalability it provides. Many shared cache services are implemented by using a cluster of servers and use software to distribute the data across the cluster transparently. An application instance simply sends a request to the cache service; the underlying infrastructure determines the location of the cached data in the cluster. You can easily scale the cache by adding more servers.
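The stale-data problem described above can be demonstrated with a short sketch. The class and the `database` dictionary here are hypothetical stand-ins, not any particular product's API: two application instances each hold a private in-memory cache, the source data changes between their first reads, and the same query then returns different results from each instance.

```python
import time

class InMemoryCache:
    """A minimal private (in-process) cache: each instance holds its own copy."""
    def __init__(self, ttl_seconds=60):
        self._store = {}      # key -> (value, expiry timestamp)
        self._ttl = ttl_seconds

    def get(self, key, loader):
        """Return the cached value, or load and cache it on a miss or expiry."""
        entry = self._store.get(key)
        if entry is not None and entry[1] > time.monotonic():
            return entry[0]
        value = loader(key)
        self._store[key] = (value, time.monotonic() + self._ttl)
        return value

# A stand-in for the underlying data store (hypothetical, for illustration).
database = {"greeting": "hello"}

instance_a = InMemoryCache()
instance_b = InMemoryCache()

instance_a.get("greeting", database.get)   # instance A caches "hello"
database["greeting"] = "bonjour"           # the source data then changes
instance_b.get("greeting", database.get)   # instance B caches the new value

# The same query now returns different results from each instance:
print(instance_a.get("greeting", database.get))  # "hello" (stale snapshot)
print(instance_b.get("greeting", database.get))  # "bonjour"
```

Each cache is a snapshot taken at a different moment, which is exactly the divergence Figure 1 illustrates.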
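By contrast, a shared cache gives every instance the same view. The sketch below uses hypothetical names (`SharedCache`, `AppInstance`) to stand in for a separately hosted cache service; a real deployment would reach such a service over the network rather than through an in-process object.

```python
import threading

class SharedCache:
    """A minimal stand-in for a shared cache service hosted separately.
    All application instances read and write the same store."""
    def __init__(self):
        self._store = {}
        self._lock = threading.Lock()  # a real service handles concurrent access for us

    def set(self, key, value):
        with self._lock:
            self._store[key] = value

    def get(self, key):
        with self._lock:
            return self._store.get(key)

class AppInstance:
    """A hypothetical application instance that delegates caching to the shared service."""
    def __init__(self, cache):
        self.cache = cache

    def lookup(self, key):
        return self.cache.get(key)

shared = SharedCache()
app_1 = AppInstance(shared)
app_2 = AppInstance(shared)

shared.set("greeting", "hello")
# Every instance sees the same view of the cached data:
print(app_1.lookup("greeting"))  # "hello"
print(app_2.lookup("greeting"))  # "hello"
```

Because both instances consult the same store, an update made through one is immediately visible to the other, avoiding the divergent snapshots of the private-cache model.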
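The "software to distribute the data across the cluster transparently" can be pictured as routing each key to one server by hashing it. This is only an illustrative sketch with made-up server names: it uses simple modulo partitioning, whereas production cache clusters typically use schemes such as consistent hashing or hash slots so that adding a server remaps only a small fraction of the keys.

```python
import hashlib

# Hypothetical cluster nodes; the application never needs to know these.
servers = ["cache-0", "cache-1", "cache-2"]

def server_for(key: str) -> str:
    """Deterministically map a key to one server in the cluster.

    The application instance just sends its request to the cache service;
    routing like this is what the infrastructure does under the hood.
    """
    digest = int(hashlib.sha256(key.encode()).hexdigest(), 16)
    return servers[digest % len(servers)]

print(server_for("greeting"))  # always the same node for the same key
```

Scaling the cache then amounts to adding entries to the server list and letting the routing layer redistribute keys.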