Modules
wheezy.caching
class wheezy.caching.CacheClient(namespaces, default_namespace)
CacheClient serves as a mediator between a single entry point that implements the Cache contract and one or more namespaces, each targeted at a concrete cache implementation.
CacheClient lets you partition the application cache by namespace, effectively hiding details from client code.
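The namespace-partitioning idea can be sketched with plain dictionaries. SketchCacheClient below is a hypothetical stand-in for illustration only, not the library implementation; it routes each operation to the cache registered for the requested namespace:

```python
# A minimal sketch of namespace routing: each namespace maps to its own
# backing cache (plain dicts here stand in for real cache clients).
class SketchCacheClient:
    def __init__(self, namespaces, default_namespace):
        self.namespaces = namespaces            # namespace -> cache
        self.default_namespace = default_namespace

    def set(self, key, value, namespace=None):
        self.namespaces[namespace or self.default_namespace][key] = value
        return True

    def get(self, key, namespace=None):
        return self.namespaces[namespace or self.default_namespace].get(key)

client = SketchCacheClient(
    namespaces={'membership': {}, 'funds': {}},
    default_namespace='membership')
client.set('user:1', 'alice')                   # stored in 'membership'
client.set('balance:1', 100, namespace='funds')
print(client.get('user:1'))                     # alice
print(client.get('balance:1', namespace='funds'))  # 100
```

Client code addresses one entry point; which concrete cache serves a key is decided by the namespace alone.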
add(key, value, time=0, namespace=None)
Sets a key’s value, if and only if the item is not already in the cache.

add_multi(mapping, time=0, namespace=None)
Adds multiple values at once, with no effect for keys already in cache.

decr(key, delta=1, namespace=None, initial_value=None)
Atomically decrements a key’s value. The value, if too large, will wrap around.
If the key does not yet exist in the cache and you specify an initial_value, the key’s value will be set to this initial value and then decremented. If the key does not exist and no initial_value is specified, the key’s value will not be set.
get_multi(keys, namespace=None)
Looks up multiple keys from cache in one operation. This is the recommended way to do bulk loads.

incr(key, delta=1, namespace=None, initial_value=None)
Atomically increments a key’s value. The value, if too large, will wrap around.
If the key does not yet exist in the cache and you specify an initial_value, the key’s value will be set to this initial value and then incremented. If the key does not exist and no initial_value is specified, the key’s value will not be set.

replace(key, value, time=0, namespace=None)
Replaces a key’s value, failing if the item isn’t already in the cache.

replace_multi(mapping, time=0, namespace=None)
Replaces multiple values at once, with no effect for keys not in cache.
class wheezy.caching.CacheDependency(cache, time=0, namespace=None)
CacheDependency introduces a wire between cache items so they can be invalidated via a single operation, thus simplifying the code necessary to manage dependencies in cache.
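The wiring can be sketched over a plain dict standing in for the cache; the link/invalidate helpers below are hypothetical illustrations of the behavior described above, not the CacheDependency API:

```python
# A hypothetical sketch of dependency-style invalidation: a master key
# accumulates the keys linked to it, so deleting via the master key
# removes every linked item in a single operation.
cache = {}

def link(master_key, key):
    # record that *key* depends on *master_key*
    cache.setdefault(master_key, []).append(key)

def invalidate(master_key):
    # drop the master key and every key wired to it
    for key in cache.pop(master_key, []):
        cache.pop(key, None)

cache['item:1'] = 'a'
cache['item:2'] = 'b'
link('master', 'item:1')
link('master', 'item:2')
invalidate('master')
print('item:1' in cache, 'item:2' in cache)  # False False
```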
class wheezy.caching.MemoryCache(buckets=60, bucket_interval=15)
An effective in-memory cache implementation.
add(key, value, time=0, namespace=None)
Sets a key’s value, if and only if the item is not already in the cache.
>>> c = MemoryCache()
>>> c.add('k', 'v', 100)
True
>>> c.add('k', 'v', 100)
False

add_multi(mapping, time=0, namespace=None)
Adds multiple values at once, with no effect for keys already in cache.
>>> c = MemoryCache()
>>> c.add_multi({'k': 'v'}, 100)
[]
>>> c.add_multi({'k': 'v'}, 100)
['k']

decr(key, delta=1, namespace=None, initial_value=None)
Atomically decrements a key’s value. The value, if too large, will wrap around.
If the key does not yet exist in the cache and you specify an initial_value, the key’s value will be set to this initial value and then decremented. If the key does not exist and no initial_value is specified, the key’s value will not be set.
>>> c = MemoryCache()
>>> c.decr('k')
>>> c.decr('k', initial_value=10)
9
>>> c.decr('k')
8
delete(key, seconds=0, namespace=None)
Deletes a key from cache.
If key is not found, returns False:
>>> c = MemoryCache()
>>> c.delete('k')
False
>>> c.store('k', 'v', 100)
True
>>> c.delete('k')
True
If the item in cache has expired:
>>> c.items['k'] = CacheItem('k', 'v', 1)
>>> c.delete('k')
False
delete_multi(keys, seconds=0, namespace=None)
Deletes multiple keys at once.
>>> c = MemoryCache()
>>> c.delete_multi(('k1', 'k2', 'k3'))
True
>>> c.store_multi({'k1': 1, 'k2': 2}, 100)
[]
>>> c.delete_multi(('k1', 'k2'))
True
If the item in cache has expired:
>>> c.items['k'] = CacheItem('k', 'v', 1)
>>> c.get_multi(('k', ))
{}

flush_all()
Deletes everything in cache.
>>> c = MemoryCache()
>>> c.set_multi({'k1': 1, 'k2': 2}, 100)
[]
>>> c.flush_all()
True
get(key, namespace=None)
Looks up a single key.
If key is not found, returns None:
>>> c = MemoryCache()
>>> c.get('k')
Otherwise returns the value:
>>> c.set('k', 'v', 100)
True
>>> c.get('k')
'v'
If the item in cache has expired:
>>> c.items['k'] = CacheItem('k', 'v', 1)
>>> c.get('k')
get_multi(keys, namespace=None)
Looks up multiple keys from cache in one operation. This is the recommended way to do bulk loads.
>>> c = MemoryCache()
>>> c.get_multi(('k1', 'k2', 'k3'))
{}
>>> c.store('k1', 'v1', 100)
True
>>> c.store('k2', 'v2', 100)
True
>>> sorted(c.get_multi(('k1', 'k2')).items())
[('k1', 'v1'), ('k2', 'v2')]
If the item in cache has expired:
>>> c.items['k'] = CacheItem('k', 'v', 1)
>>> c.get_multi(('k', ))
{}

incr(key, delta=1, namespace=None, initial_value=None)
Atomically increments a key’s value. The value, if too large, will wrap around.
If the key does not yet exist in the cache and you specify an initial_value, the key’s value will be set to this initial value and then incremented. If the key does not exist and no initial_value is specified, the key’s value will not be set.
>>> c = MemoryCache()
>>> c.incr('k')
>>> c.incr('k', initial_value=0)
1
>>> c.incr('k')
2
If the item in cache has expired:
>>> c.items['k'] = CacheItem('k', 1, 1)
>>> c.incr('k')
replace(key, value, time=0, namespace=None)
Replaces a key’s value, failing if the item isn’t already in the cache.
>>> c = MemoryCache()
>>> c.replace('k', 'v', 100)
False
>>> c.add('k', 'v', 100)
True
>>> c.replace('k', 'v', 100)
True

replace_multi(mapping, time=0, namespace=None)
Replaces multiple values at once, with no effect for keys not in cache.
>>> c = MemoryCache()
>>> c.replace_multi({'k': 'v'}, 100)
['k']
>>> c.add_multi({'k': 'v'}, 100)
[]
>>> c.replace_multi({'k': 'v'}, 100)
[]

set(key, value, time=0, namespace=None)
Sets a key’s value, regardless of previous contents in cache.
>>> c = MemoryCache()
>>> c.set('k', 'v', 100)
True

set_multi(mapping, time=0, namespace=None)
Sets multiple keys’ values at once.
>>> c = MemoryCache()
>>> c.set_multi({'k1': 1, 'k2': 2}, 100)
[]
store(key, value, time=0, op=0)
If the item in cache has expired:
>>> c = MemoryCache()
>>> c.items['k'] = CacheItem('k', 'v', 1)
>>> c.store('k', 'v', 100)
True
If an item in expire_buckets has expired:
>>> c = MemoryCache()
>>> i = int((int(unixtime()) % c.period)
...         / c.interval) - 1
>>> c.expire_buckets[i] = (allocate_lock(), [('x', 10)])
>>> c.store('k', 'v', 100)
True

store_multi(mapping, time=0, op=0)
If the item in cache has expired:
>>> c = MemoryCache()
>>> c.items['k'] = CacheItem('k', 'v', 1)
>>> c.store_multi({'k': 'v'}, 100)
[]
If an item in expire_buckets has expired:
>>> c = MemoryCache()
>>> i = int((int(unixtime()) % c.period)
...         / c.interval) - 1
>>> c.expire_buckets[i] = (allocate_lock(), [('x', 10)])
>>> c.store_multi({'k': 'v'}, 100)
[]
class wheezy.caching.NullCache
NullCache is a cache implementation that does nothing: it silently accepts cache operations that result in no change to state.
add(key, value, time=0, namespace=None)
Sets a key’s value, if and only if the item is not already in the cache.
>>> c = NullCache()
>>> c.add('k', 'v')
True

add_multi(mapping, time=0, namespace=None)
Adds multiple values at once, with no effect for keys already in cache.
>>> c = NullCache()
>>> c.add_multi({})
[]

decr(key, delta=1, namespace=None, initial_value=None)
Atomically decrements a key’s value. The value, if too large, will wrap around.
If the key does not yet exist in the cache and you specify an initial_value, the key’s value will be set to this initial value and then decremented. If the key does not exist and no initial_value is specified, the key’s value will not be set.
>>> c = NullCache()
>>> c.decr('k')
delete(key, seconds=0, namespace=None)
Deletes a key from cache.
>>> c = NullCache()
>>> c.delete('k')
True

delete_multi(keys, seconds=0, namespace=None)
Deletes multiple keys at once.
>>> c = NullCache()
>>> c.delete_multi([])
True

get_multi(keys, namespace=None)
Looks up multiple keys from cache in one operation. This is the recommended way to do bulk loads.
>>> c = NullCache()
>>> c.get_multi([])
{}
incr(key, delta=1, namespace=None, initial_value=None)
Atomically increments a key’s value. The value, if too large, will wrap around.
If the key does not yet exist in the cache and you specify an initial_value, the key’s value will be set to this initial value and then incremented. If the key does not exist and no initial_value is specified, the key’s value will not be set.
>>> c = NullCache()
>>> c.incr('k')

replace(key, value, time=0, namespace=None)
Replaces a key’s value, failing if the item isn’t already in the cache.
>>> c = NullCache()
>>> c.replace('k', 'v')
True

replace_multi(mapping, time=0, namespace=None)
Replaces multiple values at once, with no effect for keys not in cache.
>>> c = NullCache()
>>> c.replace_multi({})
[]
wheezy.caching.client
client module.
class wheezy.caching.client.CacheClient(namespaces, default_namespace)
CacheClient serves as a mediator between a single entry point that implements the Cache contract and one or more namespaces, each targeted at a concrete cache implementation.
CacheClient lets you partition the application cache by namespace, effectively hiding details from client code.
add(key, value, time=0, namespace=None)
Sets a key’s value, if and only if the item is not already in the cache.

add_multi(mapping, time=0, namespace=None)
Adds multiple values at once, with no effect for keys already in cache.

decr(key, delta=1, namespace=None, initial_value=None)
Atomically decrements a key’s value. The value, if too large, will wrap around.
If the key does not yet exist in the cache and you specify an initial_value, the key’s value will be set to this initial value and then decremented. If the key does not exist and no initial_value is specified, the key’s value will not be set.
get_multi(keys, namespace=None)
Looks up multiple keys from cache in one operation. This is the recommended way to do bulk loads.

incr(key, delta=1, namespace=None, initial_value=None)
Atomically increments a key’s value. The value, if too large, will wrap around.
If the key does not yet exist in the cache and you specify an initial_value, the key’s value will be set to this initial value and then incremented. If the key does not exist and no initial_value is specified, the key’s value will not be set.

replace(key, value, time=0, namespace=None)
Replaces a key’s value, failing if the item isn’t already in the cache.

replace_multi(mapping, time=0, namespace=None)
Replaces multiple values at once, with no effect for keys not in cache.
wheezy.caching.dependency
dependency module.
class wheezy.caching.dependency.CacheDependency(cache, time=0, namespace=None)
CacheDependency introduces a wire between cache items so they can be invalidated via a single operation, thus simplifying the code necessary to manage dependencies in cache.
wheezy.caching.encoding
encoding module.
wheezy.caching.encoding.base64_encode(key)
Encodes key with base64 encoding.
>>> result = base64_encode('my key')
>>> result == 'bXkga2V5'.encode('latin1')
True
wheezy.caching.encoding.encode_keys(mapping, key_encode)
Encodes all keys in mapping with the key_encode callable. Returns a tuple of: key mapping (encoded key => key) and value mapping (encoded key => value).
>>> mapping = {'k1': 1, 'k2': 2}
>>> keys, mapping = encode_keys(mapping,
...     lambda k: str(base64_encode(k).decode('latin1')))
>>> sorted(keys.items())
[('azE=', 'k1'), ('azI=', 'k2')]
>>> sorted(mapping.items())
[('azE=', 1), ('azI=', 2)]
wheezy.caching.encoding.hash_encode(hash_factory)
Encodes key with the given hash function.
See the list of available hashes in the hashlib module from the Python Standard Library. Additional algorithms may also be available depending upon the OpenSSL library that Python uses on your platform.
>>> try:
...     from hashlib import sha1
...     key_encode = hash_encode(sha1)
...     r = base64_encode(key_encode('my key'))
...     assert r == 'RigVwkWdSuGyFu7au08PzUMloU8='.encode('latin1')
... except ImportError:  # Python 2.4
...     pass
wheezy.caching.lockout
lockout module.
class wheezy.caching.lockout.Counter(key_func, count, period, duration, reset=True, alert=None)
A container of various attributes used by lockout.

class wheezy.caching.lockout.Locker(cache, forbid_action, namespace=None, key_prefix='c', **terms)
Used to define lockout terms.
class wheezy.caching.lockout.Lockout(name, counters, forbid_action, cache, namespace, key_prefix)
A lockout is used to enforce terms of use policy.
forbid_locked(wrapped=None, action=None)
A decorator that forbids access (by a call to forbid_action) to func once the counter threshold is reached (the lock is set).
You can override the default forbid action with action.
See test_lockout.py for an example.

guard(func)
A guard decorator is applied to a func that returns a boolean indicating success or failure. Each failure increments the counters. The counters that support reset (and related locks) are deleted on success.
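The counter-and-lock mechanics can be sketched with in-process dicts. The names below (record_failure, is_locked) and the thresholds are hypothetical illustrations of the terms-of-use idea, not the Locker/Lockout API:

```python
# A hypothetical sketch of lockout counting: each failure within a period
# bumps a counter; once the count threshold is reached a lock is set and
# access stays forbidden for the lock duration.
import time

counters, locks = {}, {}
COUNT, PERIOD, DURATION = 3, 60.0, 600.0    # illustrative thresholds

def is_locked(key, now=None):
    now = time.monotonic() if now is None else now
    return locks.get(key, 0.0) > now

def record_failure(key, now=None):
    now = time.monotonic() if now is None else now
    n, deadline = counters.get(key, (0, now + PERIOD))
    if now > deadline:                       # period elapsed: start over
        n, deadline = 0, now + PERIOD
    counters[key] = (n + 1, deadline)
    if n + 1 >= COUNT:
        locks[key] = now + DURATION          # lock set; forbid action fires

for _ in range(3):
    record_failure('ip:10.0.0.1', now=0.0)
print(is_locked('ip:10.0.0.1', now=1.0))     # True
```

In the library the counters and locks live in the cache (keyed via key_func and key_prefix) rather than in process-local dicts, so the terms are enforced across workers.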
wheezy.caching.logging
logging module.
wheezy.caching.memcache
memcache module.
class wheezy.caching.memcache.MemcachedClient(*args, **kwargs)
A wrapper around the python-memcache Client that adapts it to the cache contract.
add(key, value, time=0, namespace=None)
Sets a key’s value, if and only if the item is not already in the cache.

add_multi(mapping, time=0, namespace=None)
Adds multiple values at once, with no effect for keys already in cache.

decr(key, delta=1, namespace=None, initial_value=None)
Atomically decrements a key’s value. The value, if too large, will wrap around.
If the key does not yet exist in the cache and you specify an initial_value, the key’s value will be set to this initial value and then decremented. If the key does not exist and no initial_value is specified, the key’s value will not be set.
get_multi(keys, namespace=None)
Looks up multiple keys from cache in one operation. This is the recommended way to do bulk loads.

incr(key, delta=1, namespace=None, initial_value=None)
Atomically increments a key’s value. The value, if too large, will wrap around.
If the key does not yet exist in the cache and you specify an initial_value, the key’s value will be set to this initial value and then incremented. If the key does not exist and no initial_value is specified, the key’s value will not be set.

replace(key, value, time=0, namespace=None)
Replaces a key’s value, failing if the item isn’t already in the cache.

replace_multi(mapping, time=0, namespace=None)
Replaces multiple values at once, with no effect for keys not in cache.
wheezy.caching.memory
memory module.
class wheezy.caching.memory.CacheItem(key, value, expires)
A single item stored in cache.
class wheezy.caching.memory.MemoryCache(buckets=60, bucket_interval=15)
An effective in-memory cache implementation.
add(key, value, time=0, namespace=None)
Sets a key’s value, if and only if the item is not already in the cache.
>>> c = MemoryCache()
>>> c.add('k', 'v', 100)
True
>>> c.add('k', 'v', 100)
False

add_multi(mapping, time=0, namespace=None)
Adds multiple values at once, with no effect for keys already in cache.
>>> c = MemoryCache()
>>> c.add_multi({'k': 'v'}, 100)
[]
>>> c.add_multi({'k': 'v'}, 100)
['k']

decr(key, delta=1, namespace=None, initial_value=None)
Atomically decrements a key’s value. The value, if too large, will wrap around.
If the key does not yet exist in the cache and you specify an initial_value, the key’s value will be set to this initial value and then decremented. If the key does not exist and no initial_value is specified, the key’s value will not be set.
>>> c = MemoryCache()
>>> c.decr('k')
>>> c.decr('k', initial_value=10)
9
>>> c.decr('k')
8
delete(key, seconds=0, namespace=None)
Deletes a key from cache.
If key is not found, returns False:
>>> c = MemoryCache()
>>> c.delete('k')
False
>>> c.store('k', 'v', 100)
True
>>> c.delete('k')
True
If the item in cache has expired:
>>> c.items['k'] = CacheItem('k', 'v', 1)
>>> c.delete('k')
False
delete_multi(keys, seconds=0, namespace=None)
Deletes multiple keys at once.
>>> c = MemoryCache()
>>> c.delete_multi(('k1', 'k2', 'k3'))
True
>>> c.store_multi({'k1': 1, 'k2': 2}, 100)
[]
>>> c.delete_multi(('k1', 'k2'))
True
If the item in cache has expired:
>>> c.items['k'] = CacheItem('k', 'v', 1)
>>> c.get_multi(('k', ))
{}

flush_all()
Deletes everything in cache.
>>> c = MemoryCache()
>>> c.set_multi({'k1': 1, 'k2': 2}, 100)
[]
>>> c.flush_all()
True
get(key, namespace=None)
Looks up a single key.
If key is not found, returns None:
>>> c = MemoryCache()
>>> c.get('k')
Otherwise returns the value:
>>> c.set('k', 'v', 100)
True
>>> c.get('k')
'v'
If the item in cache has expired:
>>> c.items['k'] = CacheItem('k', 'v', 1)
>>> c.get('k')
get_multi(keys, namespace=None)
Looks up multiple keys from cache in one operation. This is the recommended way to do bulk loads.
>>> c = MemoryCache()
>>> c.get_multi(('k1', 'k2', 'k3'))
{}
>>> c.store('k1', 'v1', 100)
True
>>> c.store('k2', 'v2', 100)
True
>>> sorted(c.get_multi(('k1', 'k2')).items())
[('k1', 'v1'), ('k2', 'v2')]
If the item in cache has expired:
>>> c.items['k'] = CacheItem('k', 'v', 1)
>>> c.get_multi(('k', ))
{}

incr(key, delta=1, namespace=None, initial_value=None)
Atomically increments a key’s value. The value, if too large, will wrap around.
If the key does not yet exist in the cache and you specify an initial_value, the key’s value will be set to this initial value and then incremented. If the key does not exist and no initial_value is specified, the key’s value will not be set.
>>> c = MemoryCache()
>>> c.incr('k')
>>> c.incr('k', initial_value=0)
1
>>> c.incr('k')
2
If the item in cache has expired:
>>> c.items['k'] = CacheItem('k', 1, 1)
>>> c.incr('k')
replace(key, value, time=0, namespace=None)
Replaces a key’s value, failing if the item isn’t already in the cache.
>>> c = MemoryCache()
>>> c.replace('k', 'v', 100)
False
>>> c.add('k', 'v', 100)
True
>>> c.replace('k', 'v', 100)
True

replace_multi(mapping, time=0, namespace=None)
Replaces multiple values at once, with no effect for keys not in cache.
>>> c = MemoryCache()
>>> c.replace_multi({'k': 'v'}, 100)
['k']
>>> c.add_multi({'k': 'v'}, 100)
[]
>>> c.replace_multi({'k': 'v'}, 100)
[]

set(key, value, time=0, namespace=None)
Sets a key’s value, regardless of previous contents in cache.
>>> c = MemoryCache()
>>> c.set('k', 'v', 100)
True

set_multi(mapping, time=0, namespace=None)
Sets multiple keys’ values at once.
>>> c = MemoryCache()
>>> c.set_multi({'k1': 1, 'k2': 2}, 100)
[]
store(key, value, time=0, op=0)
If the item in cache has expired:
>>> c = MemoryCache()
>>> c.items['k'] = CacheItem('k', 'v', 1)
>>> c.store('k', 'v', 100)
True
If an item in expire_buckets has expired:
>>> c = MemoryCache()
>>> i = int((int(unixtime()) % c.period)
...         / c.interval) - 1
>>> c.expire_buckets[i] = (allocate_lock(), [('x', 10)])
>>> c.store('k', 'v', 100)
True

store_multi(mapping, time=0, op=0)
If the item in cache has expired:
>>> c = MemoryCache()
>>> c.items['k'] = CacheItem('k', 'v', 1)
>>> c.store_multi({'k': 'v'}, 100)
[]
If an item in expire_buckets has expired:
>>> c = MemoryCache()
>>> i = int((int(unixtime()) % c.period)
...         / c.interval) - 1
>>> c.expire_buckets[i] = (allocate_lock(), [('x', 10)])
>>> c.store_multi({'k': 'v'}, 100)
[]
wheezy.caching.memory.expires(now, time)
If time is below one month:
>>> expires(10, 1)
11
If more than a month:
>>> expires(10, 3000000)
3000000
Otherwise:
>>> expires(0, 0)
2147483647
>>> expires(0, -1)
2147483647
wheezy.caching.memory.find_expired(bucket_items, now)
If there are no expired items in the bucket, returns an empty list:
>>> bucket_items = [('k1', 1), ('k2', 2), ('k3', 3)]
>>> find_expired(bucket_items, 0)
[]
>>> bucket_items
[('k1', 1), ('k2', 2), ('k3', 3)]
Expired items are returned in the list and deleted from the bucket:
>>> find_expired(bucket_items, 2)
['k1']
>>> bucket_items
[('k2', 2), ('k3', 3)]
wheezy.caching.null
interface module.
class wheezy.caching.null.NullCache
NullCache is a cache implementation that does nothing: it silently accepts cache operations that result in no change to state.
add(key, value, time=0, namespace=None)
Sets a key’s value, if and only if the item is not already in the cache.
>>> c = NullCache()
>>> c.add('k', 'v')
True

add_multi(mapping, time=0, namespace=None)
Adds multiple values at once, with no effect for keys already in cache.
>>> c = NullCache()
>>> c.add_multi({})
[]

decr(key, delta=1, namespace=None, initial_value=None)
Atomically decrements a key’s value. The value, if too large, will wrap around.
If the key does not yet exist in the cache and you specify an initial_value, the key’s value will be set to this initial value and then decremented. If the key does not exist and no initial_value is specified, the key’s value will not be set.
>>> c = NullCache()
>>> c.decr('k')
delete(key, seconds=0, namespace=None)
Deletes a key from cache.
>>> c = NullCache()
>>> c.delete('k')
True

delete_multi(keys, seconds=0, namespace=None)
Deletes multiple keys at once.
>>> c = NullCache()
>>> c.delete_multi([])
True

get_multi(keys, namespace=None)
Looks up multiple keys from cache in one operation. This is the recommended way to do bulk loads.
>>> c = NullCache()
>>> c.get_multi([])
{}
incr(key, delta=1, namespace=None, initial_value=None)
Atomically increments a key’s value. The value, if too large, will wrap around.
If the key does not yet exist in the cache and you specify an initial_value, the key’s value will be set to this initial value and then incremented. If the key does not exist and no initial_value is specified, the key’s value will not be set.
>>> c = NullCache()
>>> c.incr('k')

replace(key, value, time=0, namespace=None)
Replaces a key’s value, failing if the item isn’t already in the cache.
>>> c = NullCache()
>>> c.replace('k', 'v')
True

replace_multi(mapping, time=0, namespace=None)
Replaces multiple values at once, with no effect for keys not in cache.
>>> c = NullCache()
>>> c.replace_multi({})
[]
wheezy.caching.patterns
patterns module.
class wheezy.caching.patterns.Cached(cache, key_builder=None, time=0, namespace=None, timeout=10, key_prefix='one_pass:')
Specializes access to cache by using a number of common settings for various cache operations and patterns.
add(key, value, dependency_key=None)
Sets a key’s value, if and only if the item is not already in the cache.

get_multi(keys)
Looks up multiple keys from cache in one operation. This is the recommended way to do bulk loads.
get_or_add(key, create_factory, dependency_key_factory)
Cache pattern: get an item by key from cache; if it is not available, use create_factory to acquire one. If the result is not None, use the cache add operation to store the result and, if that operation succeeds, use dependency_key_factory to get an instance of dependency_key to link with key.

get_or_create(key, create_factory, dependency_key_factory=None)
Cache pattern: get an item by key from cache; if it is not available, see one_pass_create.

get_or_set(key, create_factory, dependency_key_factory=None)
Cache pattern: get an item by key from cache; if it is not available, use create_factory to acquire one. If the result is not None, use the cache set operation to store the result and use dependency_key_factory to get an instance of dependency_key to link with key.
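The get-or-set flow can be sketched over a plain dict standing in for the cache; get_or_set_sketch below is a hypothetical illustration of the pattern, not the Cached implementation:

```python
# A hypothetical sketch of the get-or-set pattern: return the cached
# value if present, otherwise build it with the factory and store it
# (only when the factory actually returned a value).
cache = {}

def get_or_set_sketch(key, create_factory):
    value = cache.get(key)
    if value is None:
        value = create_factory()
        if value is not None:
            cache[key] = value          # cache set operation
    return value

calls = []

def load_items():
    calls.append(1)                     # track factory invocations
    return ['a', 'b']

print(get_or_set_sketch('items', load_items))  # ['a', 'b'] (factory runs)
print(get_or_set_sketch('items', load_items))  # ['a', 'b'] (cache hit)
print(len(calls))                              # 1
```

The second call never touches the factory, which is the point of the pattern: expensive work runs only on a cache miss.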
get_or_set_multi(make_key, create_factory, args)
Cache pattern: get_multi items by make_key over args from cache; if any are missing, use create_factory to acquire them. If results are available, use the cache set_multi operation to store them. Returns the cached items, if any.

one_pass_create(key, create_factory, dependency_key_factory=None)
Cache pattern: try to enter one pass: (1) if entered, use create_factory to get a value; if the result is not None, use the cache set operation to store the result and use dependency_key_factory to get an instance of dependency_key to link with key; (2) if not entered, wait until the one pass is available and, if it has not timed out, get the item by key from cache.
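The one-pass idea (only one caller recomputes a missing item while concurrent callers wait and then read it) can be sketched with a threading lock. This is a hypothetical illustration of the protocol, not the one_pass_create implementation:

```python
# A hypothetical sketch of the one-pass protocol: the caller that enters
# the pass computes and stores the value; callers that fail to enter
# wait for the pass to finish and then read the stored item.
import threading

cache = {}
one_pass = threading.Lock()

def one_pass_create_sketch(key, create_factory, timeout=10):
    if one_pass.acquire(blocking=False):     # (1) entered the pass
        try:
            value = create_factory()
            if value is not None:
                cache[key] = value           # cache set operation
            return value
        finally:
            one_pass.release()
    # (2) not entered: wait until the pass is released, then read
    if one_pass.acquire(timeout=timeout):
        one_pass.release()
        return cache.get(key)
    return None                              # timed out

print(one_pass_create_sketch('k', lambda: 'v'))  # 'v'
print(cache['k'])                                # 'v'
```

In the library the pass is coordinated through the cache itself (see OnePass below), so it works across processes, not just threads.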
replace_multi(mapping)
Replaces multiple values at once, with no effect for keys not in cache.

set(key, value, dependency_key=None)
Sets a key’s value, regardless of previous contents in cache.
wraps_get_or_add(wrapped=None, make_key=None)
Returns a specialized decorator for the get_or_add cache pattern.
Example:
kb = key_builder('repo')
cached = Cached(cache, kb, time=60)

@cached.wraps_get_or_add
def list_items(self, locale):
    pass

wraps_get_or_create(wrapped=None, make_key=None)
Returns a specialized decorator for the get_or_create cache pattern.
Example:
kb = key_builder('repo')
cached = Cached(cache, kb, time=60)

@cached.wraps_get_or_create
def list_items(self, locale):
    pass
class wheezy.caching.patterns.OnePass(cache, key, time=10, namespace=None)
A solution to the Thundering Herd problem.
See http://en.wikipedia.org/wiki/Thundering_herd_problem
Typical use:
with OnePass(cache, 'op:' + key) as one_pass:
    if one_pass.acquired:
        # update *key* in cache
    elif one_pass.wait():
        # obtain *key* from cache
    else:
        # timeout
wheezy.caching.patterns.key_builder(key_prefix='')
Returns a key builder that allows building a make-cache-key function at runtime.
>>> def list_items(self, locale='en', sort_order=1):
...     pass
>>> repo_key_builder = key_builder('repo')
>>> make_key = repo_key_builder(list_items)
>>> make_key('self')
"repo-list_items:'en':1"
>>> make_key('self', 'uk')
"repo-list_items:'uk':1"
>>> make_key('self', sort_order=0)
"repo-list_items:'en':0"
Here is an example of a make key function:
def key_list_items(self, locale='en', sort_order=1):
    return "repo-list_items:%r:%r" % (locale, sort_order)
wheezy.caching.pylibmc
pylibmc module.
class wheezy.caching.pylibmc.MemcachedClient(pool, key_encode=None)
A wrapper around the pylibmc Client that adapts it to the cache contract.
add(key, value, time=0, namespace=None)
Sets a key’s value, if and only if the item is not already in the cache.

add_multi(mapping, time=0, namespace=None)
Adds multiple values at once, with no effect for keys already in cache.

decr(key, delta=1, namespace=None, initial_value=None)
Atomically decrements a key’s value. The value, if too large, will wrap around.
If the key does not yet exist in the cache and you specify an initial_value, the key’s value will be set to this initial value and then decremented. If the key does not exist and no initial_value is specified, the key’s value will not be set.
get_multi(keys, namespace=None)
Looks up multiple keys from cache in one operation. This is the recommended way to do bulk loads.

incr(key, delta=1, namespace=None, initial_value=None)
Atomically increments a key’s value. The value, if too large, will wrap around.
If the key does not yet exist in the cache and you specify an initial_value, the key’s value will be set to this initial value and then incremented. If the key does not exist and no initial_value is specified, the key’s value will not be set.

replace(key, value, time=0, namespace=None)
Replaces a key’s value, failing if the item isn’t already in the cache.

replace_multi(mapping, time=0, namespace=None)
Replaces multiple values at once, with no effect for keys not in cache.
wheezy.caching.utils
utils module.
wheezy.caching.utils.total_seconds(delta)
Returns the total number of seconds for the given delta.
delta can be a datetime.timedelta:
>>> total_seconds(timedelta(hours=2))
7200
or an int:
>>> total_seconds(100)
100
Otherwise raises TypeError:
>>> total_seconds('100')  # doctest: +ELLIPSIS
Traceback (most recent call last):
    ...
TypeError: ...