How can I combine these two:
Werkzeug's @cached_property decorator: http://werkzeug.pocoo.org/docs/0.11/utils/#werkzeug.utils.cached_property
SQLAlchemy's @hybrid_property decorator: http://docs.sqlalchemy.org/en/latest/orm/extensions/hybrid.html#sqlalchemy.ext.hybrid.hybrid_property
Use case:
I have a hybrid property that performs a fairly expensive calculation, and it's okay if the result is cached. I tried stacking both decorators on a test method, and no matter which one comes first, the outer one complains that the thing it wraps is not callable.
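Roughly what I tried (a minimal sketch; the class and attribute names are placeholders for the real model):

from werkzeug.utils import cached_property
from sqlalchemy.ext.hybrid import hybrid_property

class Example(object):
    @cached_property      # swapping the order of the two decorators fails the same way
    @hybrid_property
    def foo(self):
        return "expensive calculations"  # stand-in for the real computation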
This is a bit tricky to get right, since both cached_property and hybrid_property expect to wrap a plain method and to act as a property-like descriptor themselves. You end up extending one of them, or both.
The nicest thing I could come up with is this. It basically inlines the logic of cached_property into hybrid_property's __get__. Note that it caches the property values per instance, but not for class-level access.
from sqlalchemy.ext.hybrid import hybrid_property

_missing = object()  # sentinel object for missing values

class cached_hybrid_property(hybrid_property):
    def __get__(self, instance, owner):
        if instance is None:
            # getting the property for the class
            return self.expr(owner)
        else:
            # getting the property for an instance
            name = self.fget.__name__
            value = instance.__dict__.get(name, _missing)
            if value is _missing:
                # not cached yet: compute once and store it on the instance
                value = self.fget(instance)
                instance.__dict__[name] = value
            return value
class Example(object):
    @cached_hybrid_property
    def foo(self):
        return "expensive calculations"
At first I thought you could simply use functools.lru_cache instead of cached_property. Then I realized that you likely want an instance-specific cache, rather than the global cache keyed by the instance that lru_cache gives you. There's no standard library utility for caching method calls per instance (Python 3.8 later added functools.cached_property for plain properties, but it doesn't help with the hybrid part).
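For comparison, the lru_cache route would look roughly like this (a sketch only, with the caveats described below):

from functools import lru_cache

class Example(object):
    @property
    @lru_cache()   # one cache shared by all instances, keyed by self (default maxsize=128)
    def foo(self):
        return "expensive calculations"  # stand-in for the real computation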
To illustrate the problem with lru_cache, consider this simplistic version of caching:
CACHE = {}

class Example(object):
    @property
    def foo(self):
        if self not in CACHE:
            CACHE[self] = ...  # do the actual computation
        return CACHE[self]
This will store the cached values of foo for every Example instance your program ever creates - in other words, it can leak memory. lru_cache is a bit smarter, since it limits the size of the cache, but then you might end up re-computing values you still need once they get evicted. A better solution is to attach the cached values to the Example instances they belong to, as cached_property does.
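(If you really do want an external cache, one alternative - not what the solution above uses - is a weakref.WeakKeyDictionary, whose entries disappear once their instance is garbage-collected:)

from weakref import WeakKeyDictionary

CACHE = WeakKeyDictionary()  # entries vanish when their Example instance is collected

class Example(object):
    @property
    def foo(self):
        if self not in CACHE:
            CACHE[self] = "expensive calculations"  # stand-in for the real computation
        return CACHE[self]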