Memcache Status
I use this script to keep track of the cache usage for various sites. It presently only supports memcached because that's all that I use, but leave a comment with patches for other systems and I'll add them.
- memcache
- cache
- status
A very simple decorator that caches both on-class and in memcached:

    @method_cache(3600)
    def some_intensive_method(self):
        return  # do intensive stuff

Alternatively, if you just want to keep it per request and forgo memcaching, just do:

    @method_cache()
    def some_intensive_method(self):
        return  # do intensive stuff
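The decorator itself isn't reproduced here, but a minimal sketch of the idea might look like the following: stash the result as an attribute on the object for the life of the request, and also push it to memcached when a timeout is given. The key scheme and attribute name below are assumptions, not the snippet's own code.

    from functools import wraps

    from django.core.cache import cache

    def method_cache(seconds=0):
        """Cache a method's result on the object and, if seconds > 0, in memcached.

        A sketch only: the real snippet's signature and key format may differ.
        """
        def decorator(func):
            @wraps(func)
            def wrapper(self, *args, **kwargs):
                attr = '_cache_%s' % func.__name__
                if hasattr(self, attr):            # per-request, on-object cache
                    return getattr(self, attr)
                key = 'method_cache:%s.%s:%s' % (
                    self.__class__.__name__, func.__name__,
                    getattr(self, 'pk', id(self)))  # assumes a model-ish object
                result = cache.get(key) if seconds else None
                if result is None:
                    result = func(self, *args, **kwargs)
                    if seconds:
                        cache.set(key, result, seconds)
                setattr(self, attr, result)
                return result
            return wrapper
        return decorator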
If any cache keys you generate include user (staff or public) supplied data, they may be too long, contain invalid characters (at least by memcached's standards), or both. This helper will sub out any invalid characters and md5 the key if it's too long. Additionally, it will insert the CACHE_MIDDLEWARE_KEY_PREFIX from django.conf.settings for you. If your memcache instances are used by multiple applications (Django or otherwise), this can help ensure your keys are unique to a particular app on a particular site.
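A rough sketch of what such a helper could look like is below. The 250-character limit, the character class treated as invalid, and the function name are assumptions for illustration; the actual snippet may differ.

    import hashlib
    import re

    from django.conf import settings

    MAX_KEY_LENGTH = 250  # memcached's documented key length limit
    _INVALID_CHARS = re.compile(r'[^\x21-\x7e]')  # whitespace, control and non-ASCII chars

    def clean_cache_key(key):
        """Return a memcached-safe key built from an arbitrary string."""
        key = '%s:%s' % (settings.CACHE_MIDDLEWARE_KEY_PREFIX, key)
        key = _INVALID_CHARS.sub('', key)
        if len(key) > MAX_KEY_LENGTH:
            key = hashlib.md5(key.encode('utf-8')).hexdigest()
        return key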
This was inspired by [this memcache status snippet](http://www.djangosnippets.org/snippets/54/). However, this version uses the quasi-internal cache._cache.get_stats(), and it compiles a list of stats for each server that you specify in your CACHE_BACKEND setting.
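For reference, a sketch of the core idea, assuming the underlying client is python-memcached (whose get_stats() returns one (server, stats_dict) pair per configured server); the function name here is illustrative.

    from django.core.cache import cache

    def memcached_stats():
        """Collect raw stats from every memcached server behind Django's cache."""
        try:
            return cache._cache.get_stats()   # quasi-internal handle to the client
        except AttributeError:
            return []                         # backend isn't memcached

    for server, stats in memcached_stats():
        print(server, stats.get('curr_items'), stats.get('bytes'))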
Based on [snippet #1212](http://djangosnippets.org/snippets/1212/) along with its comments, with the for loop replaced by translate. Example usage:

    from django.core.cache import cache
    from mysnippet import cache_key_clean

    expensive_func = lambda x: 'x{0}x'.format(x)
    input_string = "I wanted a nice value."

    key = cache_key_clean(input_string)
    result = cache.get(key)
    if result is None:
        result = expensive_func(input_string)
        cache.set(key, result)
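The translate-based character scrub could look roughly like this (using Python 3's str.translate; which characters count as invalid is an assumption here, not taken from the snippet):

    # Build a table that deletes control characters, spaces and DEL.
    _INVALID = ''.join(chr(c) for c in list(range(33)) + [127])
    _SCRUB_TABLE = str.maketrans('', '', _INVALID)

    def scrub(key):
        """Drop characters memcached won't accept in a key."""
        return key.translate(_SCRUB_TABLE)

    print(scrub('I wanted a nice value.'))  # -> Iwantedanicevalue.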
Django before 1.3 (I'm not sure of the exact version) didn't apply the prefix settings to the cache template tag or to cache calls in views; they were only used for whole-page caching. This is a small custom cache backend based on memcached.CacheClass. Feel free to add any comments.
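One possible shape for such a backend, written against the old CacheClass API and shown only as a sketch (the class name, prefix format, and the set of overridden methods are assumptions, not the snippet's code):

    from django.conf import settings
    from django.core.cache.backends import memcached

    def _prefixed(key):
        return '%s:%s' % (settings.CACHE_MIDDLEWARE_KEY_PREFIX, key)

    class PrefixedCacheClass(memcached.CacheClass):
        """Memcached backend that applies the prefix to every key, not just pages."""

        def add(self, key, value, timeout=0):
            return super(PrefixedCacheClass, self).add(_prefixed(key), value, timeout)

        def get(self, key, default=None):
            return super(PrefixedCacheClass, self).get(_prefixed(key), default)

        def set(self, key, value, timeout=0):
            return super(PrefixedCacheClass, self).set(_prefixed(key), value, timeout)

        def delete(self, key):
            return super(PrefixedCacheClass, self).delete(_prefixed(key))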
The standard memcache client uses pickle as its serialization format. It can be handy to use JSON instead, especially when another component (e.g. a backend) doesn't understand pickle but does understand JSON.
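A sketch of one way to do this with the python-memcached client, which accepts pickler/unpickler hooks constructed around a file object (the class below and its use are an assumption about your setup, not the snippet's own code):

    import json

    import memcache  # python-memcached

    class JsonSerializer:
        """Pickler/Unpickler look-alike that reads and writes JSON."""

        def __init__(self, f, protocol=None):
            self.f = f

        def dump(self, obj):
            self.f.write(json.dumps(obj).encode('utf-8'))

        def load(self):
            return json.loads(self.f.read().decode('utf-8'))

    client = memcache.Client(['127.0.0.1:11211'],
                             pickler=JsonSerializer, unpickler=JsonSerializer)
    client.set('greeting', {'hello': 'world'})
    print(client.get('greeting'))

Note that the client only falls back to the serializer for values it can't store natively; plain strings and integers are stored as-is.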
I've been working with a data set where a single object won't fit into memcached's 1 MB slab limit. These two functions have been useful to me for debugging the size of the data structure once pickled, and for checking whether said pickled data structure is greater than 1 MB. These functions assume CACHE_BACKEND is memcached, obviously.
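The functions themselves aren't reproduced here, but helpers in this spirit might look like the following (the names and the fixed 1 MB limit are illustrative):

    import pickle

    MEMCACHED_LIMIT = 1024 * 1024  # memcached's default per-item limit (1 MB)

    def pickled_size(obj):
        """Size in bytes of obj once pickled, i.e. what memcached would store."""
        return len(pickle.dumps(obj, pickle.HIGHEST_PROTOCOL))

    def fits_in_memcached(obj, limit=MEMCACHED_LIMIT):
        """True if the pickled object stays under memcached's slab limit."""
        return pickled_size(obj) <= limit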