Monday, August 02, 2010

Website Performance: Choose Appropriate Cache

Caching plays a key role in speeding up a web application. Before looking at what and when to cache, consider which type of cache suits your environment.

If your application is deployed on a single server, then caching content on that server itself will suffice.
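For single-server deployments, an in-process cache is often enough. A minimal sketch using Python's functools.lru_cache, with a made-up lookup function standing in for an expensive query:

```python
from functools import lru_cache

# Hypothetical expensive lookup; in a real app this might be a database query.
@lru_cache(maxsize=1024)
def get_user_profile(user_id):
    # Simulate an expensive computation or query.
    return {"id": user_id, "name": "user-%d" % user_id}

profile = get_user_profile(42)        # computed once, then cached
profile_again = get_user_profile(42)  # served from the in-process cache
```

The cache lives in the application's own memory, so it works only as long as all requests hit the same server.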

For a distributed architecture (application deployed on multiple servers), a distributed cache should be used.

The usual argument against a distributed cache is that it involves accessing a remote machine, which is expensive.

Let's look at how expensive this operation is.

Following are a few interesting numbers picked from a presentation by Jeff Dean (of Google):

Time taken to read 1 MB sequentially from memory: 250,000 ns (that's nanoseconds)
Time taken for a round trip within the same data center: 500,000 ns

So, reading 1 MB from a remote server's memory should take roughly 750,000 ns (0.75 ms).

Considering that a 1-second page load time is good enough, this is less than 1/1000th of that budget. Thus, for web applications, reading from a remote server's memory will not degrade performance by any noticeable amount.
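The arithmetic above can be checked back-of-the-envelope style, using the two numbers quoted from the presentation:

```python
# All times in nanoseconds, taken from the figures quoted above.
read_1mb_from_memory_ns = 250_000
datacenter_round_trip_ns = 500_000

# Reading 1 MB from a remote server's memory = round trip + memory read.
remote_read_ns = read_1mb_from_memory_ns + datacenter_round_trip_ns
remote_read_ms = remote_read_ns / 1_000_000  # convert ns to ms

page_load_budget_ms = 1_000  # a 1-second page load
fraction_of_budget = remote_read_ms / page_load_budget_ms

print(remote_read_ms)       # 0.75 ms
print(fraction_of_budget)   # 0.00075, i.e. less than 1/1000th of the budget
```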

When using a distributed cache, it's advisable to provision a bit more capacity than what is required. This ensures that the failure of a single server does not overload the application.

Example: If you need 4 GB of cache memory and you are using 4 servers (with 1 GB of cache on each), add another (5th) server with 1 GB of memory for caching. This way, when one server goes down, the application's performance will not be impacted much.
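This "one spare server" rule can be sketched as a small sizing helper; the function name and the GB figures are illustrative assumptions, not a standard formula:

```python
import math

def cache_servers_needed(required_gb, per_server_gb):
    """Servers needed to hold the cache, plus one spare to absorb a failure."""
    return math.ceil(required_gb / per_server_gb) + 1

# The example from the text: 4 GB needed, 1 GB of cache per server.
print(cache_servers_needed(4, 1))  # 5 servers: 4 to hold the data, 1 spare
```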

One of the most popular distributed cache implementations is Memcached. It's used by companies like Wikipedia, Flickr, YouTube and Twitter.
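A key idea behind Memcached is that the client, not the servers, decides where each key lives: it hashes the key and picks a server from the pool. A simplified sketch of that key-to-server mapping (the server addresses are made up, and real clients typically use consistent hashing rather than a plain modulo):

```python
import hashlib

# Hypothetical pool of cache servers; addresses are illustrative only.
SERVERS = ["10.0.0.1:11211", "10.0.0.2:11211", "10.0.0.3:11211"]

def server_for(key):
    """Map a cache key to one server: hash the key, then take it mod pool size."""
    digest = hashlib.md5(key.encode("utf-8")).hexdigest()
    return SERVERS[int(digest, 16) % len(SERVERS)]

# The same key always maps to the same server, so every application
# server in the cluster looks up "user:42" in the same place.
print(server_for("user:42"))
```

Because the mapping is deterministic, the combined memory of all servers behaves like one large cache with no coordination between the cache servers themselves.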
