Lifter offers a built-in caching mechanism that can be used to store query results and retrieve them later.
This is an efficient way to reduce your application's I/O, avoid hitting rate limits, reduce exposure to network latency, etc.
Caching is configured on store creation, via the following API:
```python
from lifter import caches
from lifter import models
from lifter.backend import http

class MyModel(models.Model):
    class Meta:
        app_name = 'my_app'
        name = 'my_model'

cache = caches.DummyCache()
store = http.RESTStore(identifier='my_store', cache=cache)
manager = store.query(MyModel)
```
You can use the same Cache instance across multiple stores if you want; this won't lead to cache collisions.
How does it work?¶
When a cache is configured for a given store and the store executes a query, the following happens:
- The store identifier, the model app, the model name and the query are hashed together to form a cache key
- The cache is then queried using that key
- If a result is found with that key, it’s returned directly without sending the query to the underlying backend
- If no result is found, the query is processed normally, but the result will be stored in the cache for later use
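The steps above can be sketched as follows. This is a simplified illustration, not lifter's actual implementation: the `make_cache_key` and `execute_with_cache` helpers, and the use of MD5 over a colon-joined string, are assumptions made for the example.

```python
import hashlib

def make_cache_key(store_identifier, app_name, model_name, query_repr):
    # Hypothetical helper: hash the store identifier, model app, model name
    # and query together, so distinct stores or models never share a key
    raw = ':'.join([store_identifier, app_name, model_name, query_repr])
    return hashlib.md5(raw.encode('utf-8')).hexdigest()

def execute_with_cache(cache, backend, store_identifier, app_name,
                       model_name, query_repr):
    # Hypothetical helper illustrating the lookup-then-execute flow
    key = make_cache_key(store_identifier, app_name, model_name, query_repr)
    try:
        # If a result is already cached under that key, return it directly
        return cache[key]
    except KeyError:
        # Otherwise run the query normally and cache the result for later use
        result = backend(query_repr)
        cache[key] = result
        return result
```

Because the store identifier is part of the key, a single cache instance can serve several stores without collisions.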
Once a cache is configured for a store, it is automatically used:
```python
# This will execute the query and store results in the cache
manager.all().count()

# For this one, the query won't execute, since the value is present in the cache
manager.all().count()
```
The following arguments are available to all cache instances; all are optional.
default_timeout¶
The default timeout, in seconds, that will be used for cached values. Defaults to None, meaning the value will never expire.
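To illustrate the timeout semantics, here is a minimal sketch of a dict-backed cache with expiry. This is not lifter's DummyCache: the `TimeoutCache` class, the `_NotSet` sentinel (mirroring the NotSet default seen in the `set()` signature below), and the `time.monotonic`-based expiry logic are assumptions made for the example.

```python
import time

class _NotSet:
    """Sentinel distinguishing "no timeout given" from an explicit None."""

class TimeoutCache:
    """Hypothetical dict-backed cache honouring a per-instance default timeout."""

    def __init__(self, default_timeout=None):
        # None means cached values never expire
        self.default_timeout = default_timeout
        self._data = {}

    def set(self, key, value, timeout=_NotSet):
        if timeout is _NotSet:
            timeout = self.default_timeout
        # Store the absolute expiry time, or None for "never expires"
        expires_at = None if timeout is None else time.monotonic() + timeout
        self._data[key] = (value, expires_at)

    def get(self, key, default=None):
        try:
            value, expires_at = self._data[key]
        except KeyError:
            return default
        if expires_at is not None and time.monotonic() >= expires_at:
            # Expired: drop the entry and behave like a cache miss
            del self._data[key]
            return default
        return value
```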
disable()¶
Returns a context manager to bypass the cache:

```python
with cache.disable():
    # Will ignore the cache
    manager.count()
```
enable()¶
Returns a context manager to force enabling the cache if it is disabled:

```python
with cache.enable():
    manager.count()
```
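The enable/disable toggling could be implemented with `contextlib.contextmanager`. This is a sketch of the behaviour described above, not lifter's implementation; the `ToggleableCache` class and its `enabled` flag are assumptions made for the example.

```python
from contextlib import contextmanager

class ToggleableCache:
    """Hypothetical cache whose use can be toggled with context managers."""

    def __init__(self):
        self.enabled = True
        self._data = {}

    @contextmanager
    def disable(self):
        # Temporarily bypass the cache, restoring the previous state on exit
        previous = self.enabled
        self.enabled = False
        try:
            yield self
        finally:
            self.enabled = previous

    @contextmanager
    def enable(self):
        # Force the cache on, even inside a disable() block
        previous = self.enabled
        self.enabled = True
        try:
            yield self
        finally:
            self.enabled = previous

    def get(self, key, default=None):
        if not self.enabled:
            return default
        return self._data.get(key, default)

    def set(self, key, value):
        if self.enabled:
            self._data[key] = value
```

Saving and restoring the previous state in a `finally` block is what makes the two context managers nest correctly.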
get(key, default=None, reraise=False)¶
Get the given key from the cache, if present. A default value can be provided and is returned when the requested key is not present; otherwise, None will be returned.
- key (str) – the key to query
- default – the value to return if the key does not exist in cache
- reraise – whether an exception should be raised if no value is found; defaults to False.
```python
cache.set('my_key', 'my_value')

cache.get('my_key')
>>> 'my_value'

cache.get('not_present', 'default_value')
>>> 'default_value'

cache.get('not_present', reraise=True)
>>> raise lifter.exceptions.NotInCache
```
set(key, value, timeout=<class 'lifter.caches.NotSet'>)¶
Set the given key to the given value in the cache. A timeout may be provided; otherwise, Cache.default_timeout will be used.
- key (str) – the key to which the value will be bound
- value – the value to store in the cache
- timeout (integer or None) – the expiration delay, in seconds, for the value. None means it will never expire.
```python
# this cached value will expire after half an hour
cache.set('my_key', 'value', 1800)
```
Available cache backends¶
At the moment, the only cache backend available is DummyCache, which stores values in a Python dictionary.
You can use its code as a starting point to implement your own backends, using Redis or Memcached, for example.