For a while I have been looking for a reason to use the new Windows Azure Caching service, an in-memory cache hosted by Microsoft.
Unfortunately, with its price starting at 100 DKK/month for 128 MB, it is not really feasible for my small private projects.
But the other day at work I was asked to help one of our consultant teams improve their Azure solution for a customer who had already prepaid lots of Windows Azure resources that would simply disappear if left unused, so suddenly price was not an issue.
Their challenge was that they had built a website, hosted by a web role in a cloud service, that relied heavily on an in-memory cache. They had built it around HttpRuntime.Cache, but had major performance issues when the cache was cold and had to be repopulated from a slow backend system. So they were looking for a fast, low-impact (code-wise) alternative to their current cache implementation. The Windows Azure Caching service seemed like a good fit, as their total caching need was well within the 128 MB of the smallest plan.
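To give an idea of the starting point, their original implementation followed the usual HttpRuntime.Cache pattern, roughly like this (a simplified sketch; `Product`, `SlowBackend`, and the cache key are made-up names for illustration, not from the actual project):

```csharp
using System.Collections.Generic;
using System.Web;
using System.Web.Caching;

public static class ProductCache
{
    private const string CacheKey = "AllProducts"; // hypothetical key

    public static List<Product> GetProducts()
    {
        // Fast path: serve from the in-process cache.
        var cached = HttpRuntime.Cache[CacheKey] as List<Product>;
        if (cached != null)
            return cached;

        // Cold cache: repopulate from the slow backend system.
        var products = SlowBackend.LoadAllProducts();
        HttpRuntime.Cache.Insert(
            CacheKey,
            products,
            null,
            Cache.NoAbsoluteExpiration,  // no expiration; a timer job refreshed entries instead
            Cache.NoSlidingExpiration);
        return products;
    }
}
```

Because HttpRuntime.Cache is per-instance and in-process, every recycled or newly started role instance hits the slow backend again, which is exactly the cold-cache problem they wanted to get rid of.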
Here is what I learned from helping them change their caching implementation.
- In their project, cache items were saved without an expiration and were refreshed frequently by a timer job. Cleaning up a cache configured this way becomes somewhat of an annoyance, because there is no way to query the cache for old items and delete them unless you know their cache keys. If you want to wipe the cache, you are forced to delete it and recreate it (a process that takes 5-10 minutes).
- When you recreate a cache with the same name you used before, you might run into DNS problems where your worker/web role cannot find the cache. Our solution was to use a new name; being patient would probably work too 🙂
- The Azure Caching service has some really nice features when it comes to serialization of your cache objects. A good write-up can be found here: http://blog.stephencleary.com/2013/12/azure-cache-serialization-with-json.html. We chose to go with the BinaryFormatter instead of the NetDataContractSerializer, because the latter would have required more changes to the existing code.
- Installing the cache client in the project was quite painless: we used the NuGet package, which added the needed assemblies and configuration changes. We did, however, have to specify the AccessKey for the cache, although the documentation says it is not required when your cache is hosted within the same Azure subscription as your cloud service. We also had to change the CopyLocal property on several of the assemblies to get them into the deployment package, as they would otherwise be missing when running the solution in Azure.
- A single object in the cache cannot be larger than 8 MB. This was not something we were aware of up front, and it became a showstopper for us, since the cache objects in the implementation were large lists of data.
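For reference, the NuGet package drops a dataCacheClients section into web.config, which is where both the access key and the serializer choice end up. The fragment below is a rough reconstruction from memory, with placeholder values in brackets; double-check element names against the package's own additions before relying on it:

```xml
<dataCacheClients>
  <dataCacheClient name="default">
    <!-- [yourcachename] is a placeholder for the cache endpoint -->
    <autoDiscover isEnabled="true" identifier="[yourcachename].cache.windows.net" />
    <securityProperties mode="Message" sslEnabled="false">
      <!-- The AccessKey we had to fill in, despite the documentation
           saying it is optional within the same subscription -->
      <messageSecurity authorizationInfo="[access key]" />
    </securityProperties>
    <!-- Switch from the default NetDataContractSerializer to
         BinaryFormatter to avoid touching the existing types -->
    <serializationProperties serializerType="BinaryFormatter" />
  </dataCacheClient>
</dataCacheClients>
```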
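The per-item limit is worth guarding against in code, since the cache client only fails at insert time. A sketch of the Put pattern with a size check up front (the helper name and the idea of pre-measuring are my own, not from the project; the measured size is only approximate, but BinaryFormatter matches the serializer the client is configured with):

```csharp
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;
using Microsoft.ApplicationServer.Caching;

public static class AzureCacheExample
{
    // "default" is the dataCacheClient name from web.config.
    private static readonly DataCache Cache =
        new DataCacheFactory().GetDefaultCache();

    public static void PutIfItFits(string key, object value)
    {
        // Serialize once up front to estimate the item size the
        // service will see. 8 MB is the per-object limit that bit us.
        using (var ms = new MemoryStream())
        {
            new BinaryFormatter().Serialize(ms, value);
            if (ms.Length > 8 * 1024 * 1024)
                throw new InvalidDataException(
                    "Item exceeds the 8 MB per-object limit of Azure Caching.");
        }

        // No expiration, matching the timer-job refresh scheme.
        Cache.Put(key, value);
    }
}
```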
We ended up not using the cache because of the size limitation, as working around it would have required a lot of rewriting in the application. Instead, we serialized the large lists of objects into Azure Blob Storage and read the data back from there when the cache was cold. This turned out to perform well, and in the end it is a much cheaper solution for the customer.
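The blob-storage fallback can be sketched like this, using the storage client library of that era (Microsoft.WindowsAzure.Storage). The class, container name, and generic shape are my own illustration; error handling and retries are omitted:

```csharp
using System.Collections.Generic;
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

public class BlobBackedCache
{
    private readonly CloudBlobContainer _container;

    public BlobBackedCache(string connectionString)
    {
        var account = CloudStorageAccount.Parse(connectionString);
        _container = account.CreateCloudBlobClient()
                            .GetContainerReference("cache-snapshots");
        _container.CreateIfNotExists();
    }

    // Serialize the large list once and park it in blob storage.
    public void Save<T>(string name, List<T> items)
    {
        var blob = _container.GetBlockBlobReference(name);
        using (var ms = new MemoryStream())
        {
            new BinaryFormatter().Serialize(ms, items);
            ms.Position = 0;
            blob.UploadFromStream(ms);
        }
    }

    // On a cold cache, read the snapshot back from blob storage
    // instead of hitting the slow backend system.
    public List<T> Load<T>(string name)
    {
        var blob = _container.GetBlockBlobReference(name);
        using (var ms = new MemoryStream())
        {
            blob.DownloadToStream(ms);
            ms.Position = 0;
            return (List<T>)new BinaryFormatter().Deserialize(ms);
        }
    }
}
```

Reading a pre-serialized snapshot from blob storage avoids both the 8 MB per-item limit and the per-hour cache fee, at the cost of one storage round-trip on a cold start.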
In retrospect, I wish they had designed their application differently in terms of cache usage, but given the limited time we had to change things, I think the solution we ended up with was a good one.