Caching HTTP responses can dramatically improve the performance of your app if what you generate is not actually very dynamic. There are many free caching frameworks in Java; the most popular seem to be ehcache, oscache, jcs and JBoss Cache.
ehcache is quite simple to use and its code is clean. It provides a CachingFilter that you can put in your webapp to cache HTTP responses transparently. However, since the framework only lets you store Objects (which makes sense for most uses), I was wondering how it cached the HttpResponse, which is a stream. The answer disappointed me a bit: they simply copy the response into a ByteArrayOutputStream and call toBytes() to store the resulting byte array in the cache. While this is fine for an in-memory cache store (the whole response will end up in the cache anyway, although I am not sure whether they check for particularly big responses to avoid caching them), I don't think it is that good for a disk cache store.
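To make that concrete, here is a minimal sketch of the kind of response wrapper these filters rely on, assuming the Servlet 2.x API; the class and method names are mine for illustration, not ehcache's actual ones.

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;

import javax.servlet.ServletOutputStream;
import javax.servlet.http.HttpServletResponse;
import javax.servlet.http.HttpServletResponseWrapper;

// Illustrative wrapper: the whole response body is accumulated in memory
// before the filter stores it in the cache.
public class BufferingResponseWrapper extends HttpServletResponseWrapper {

    private final ByteArrayOutputStream buffer = new ByteArrayOutputStream();

    public BufferingResponseWrapper(HttpServletResponse response) {
        super(response);
    }

    @Override
    public ServletOutputStream getOutputStream() {
        return new ServletOutputStream() {
            @Override
            public void write(int b) throws IOException {
                buffer.write(b); // every byte of the body lands in memory here
            }
        };
    }

    // A real filter would also override getWriter() in the same way.

    // Called by the filter after chain.doFilter(); the resulting byte[]
    // is what ends up as the cached value.
    public byte[] toBytes() {
        return buffer.toByteArray();
    }
}
```

The whole body sits in the ByteArrayOutputStream before anything is handed to the cache, which is exactly the part I find wasteful for a disk store.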
Ideally one would like the response to be stored through a bounded buffer, to avoid holding the whole response in memory. This would allow much higher concurrent use. I think it is doable by writing your own CachingFilter and using a blocking queue from the java.util.concurrent utilities to block the writer when the buffer is full, as in the sketch below.
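Here is one way the idea could look, purely as a sketch: the class name, chunk handling and queue size are assumptions, not an existing API. A bounded java.util.concurrent queue sits between the filter thread and a disk-writer thread, so the filter blocks instead of buffering the whole response.

```java
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Bounded-buffer sketch: response bytes are handed to a writer thread
// through a fixed-size queue, so only a few chunks are ever in memory
// and the producer blocks when the disk writer falls behind.
public class DiskSpoolingOutputStream extends OutputStream {

    private static final byte[] EOF = new byte[0]; // poison pill

    private final BlockingQueue<byte[]> queue = new ArrayBlockingQueue<byte[]>(16);
    private final Thread writer;

    public DiskSpoolingOutputStream(final String cacheFile) {
        writer = new Thread(new Runnable() {
            public void run() {
                try {
                    OutputStream out = new FileOutputStream(cacheFile);
                    try {
                        byte[] chunk;
                        while ((chunk = queue.take()) != EOF) {
                            out.write(chunk);
                        }
                    } finally {
                        out.close();
                    }
                } catch (Exception e) {
                    // a real filter would record the failure and skip caching
                }
            }
        });
        writer.start();
    }

    @Override
    public void write(byte[] b, int off, int len) throws IOException {
        byte[] chunk = new byte[len];
        System.arraycopy(b, off, chunk, 0, len);
        try {
            queue.put(chunk); // blocks when the 16-slot buffer is full
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            throw new IOException("interrupted while spooling to cache");
        }
    }

    @Override
    public void write(int b) throws IOException {
        write(new byte[] { (byte) b }, 0, 1);
    }

    @Override
    public void close() throws IOException {
        try {
            queue.put(EOF);
            writer.join();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }
}
```

A real filter would still have to tee the bytes to the client while spooling them to disk, and decide what to do with the cache entry if the writer fails, but the point is that memory use stays bounded per request.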
I googled for this kind of thing without success. I only found solutions similar to ehcache's (for example Sun's CachingResponseWrapper and CachingFilter, or oscache's CacheFilter, which is a bit more careful but still ends with a toBytes()). I wonder why this has not already been done and made public.
Resin has a cache system integrated into the app server. You use CacheFilters to indicate the cache lifetime to the app server.