That's true.
So, does this look interesting to anyone?
Or am I the only one doing signal-based cache invalidation? :)
On Nov 15, 2:27 am, "Honza Král" <[EMAIL PROTECTED]> wrote:
> On Nov 14, 2007 9:13 PM, Sergey Kirillov <[EMAIL PROTECTED]> wrote:
>
> > For shared cache backends, like Memcached it works fine. You just need
> > to be sure that signal handlers will be registered in all processes
> > (i.e. put them in models.py)
>
> sorry, my bad - we do a similar thing, but dynamic, and ...
On Nov 14, 2007 9:13 PM, Sergey Kirillov <[EMAIL PROTECTED]> wrote:
>
> For shared cache backends, like Memcached it works fine. You just need
> to be sure that signal handlers will be registered in all processes
> (i.e. put them in models.py)
sorry, my bad - we do a similar thing, but dynamic, and ...
For shared cache backends like Memcached, it works fine. You just need
to be sure that the signal handlers get registered in all processes
(i.e. put them in models.py).
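For example, something along these lines in models.py (a rough sketch;
get_categories, Category and the import path are made-up placeholders,
not code from this thread):

# myapp/models.py -- every server process imports this module, so the
# invalidation handler gets registered everywhere.
from django.db import models
from django.db.models import signals
from django.dispatch import dispatcher

from myapp.caching import get_categories  # placeholder: a function wrapped with @cached

class Category(models.Model):
    name = models.CharField(maxlength=100)

# Drop the shared cache entry whenever a Category is saved or deleted.
dispatcher.connect(get_categories.invalidate, signal=signals.post_save, sender=Category)
dispatcher.connect(get_categories.invalidate, signal=signals.post_delete, sender=Category)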
On Nov 14, 4:20 pm, "Honza Král" <[EMAIL PROTECTED]> wrote:
On Nov 14, 2007 2:41 PM, Sergey Kirillov <[EMAIL PROTECTED]> wrote:
>
> Hi all
>
> In my project I frequently encountered a situation where I need to
> cache some data, and then invalidate it on signal.
>
> So I wrote following decorator:
>
> def cached(slot_name, timeout=None):
>     [rest of the decorator snipped - see the original post below]
Hi all,

In my project I frequently encounter a situation where I need to
cache some data and then invalidate it on a signal.

So I wrote the following decorator:
def cached(slot_name, timeout=None):
    def decorator(function):
        def invalidate():
            cache.delete(slot_name)
        ...
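(The post is cut off here. A minimal sketch of how the rest of such a
decorator typically looks - a read-through wrapper plus the invalidate()
hook - a reconstruction, not necessarily the code that was originally posted:)

from django.core.cache import cache

def cached(slot_name, timeout=None):
    def decorator(function):
        def invalidate():
            # Remove the cached value; the next call recomputes it.
            cache.delete(slot_name)
        def wrapper(*args, **kwargs):
            result = cache.get(slot_name)
            if result is None:
                result = function(*args, **kwargs)
                if timeout is None:
                    cache.set(slot_name, result)
                else:
                    cache.set(slot_name, result, timeout)
            return result
        # Expose the hook so it can be connected to a signal.
        wrapper.invalidate = invalidate
        return wrapper
    return decorator

# Usage sketch (Category is a placeholder model): cache the category list
# under one slot name, and let a signal handler call get_categories.invalidate().
@cached('category_list', timeout=300)
def get_categories():
    return list(Category.objects.all())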