
fastcache 0.3.1

C implementation of Python 3 functools.lru_cache


C implementation of Python 3's functools.lru_cache. Provides a 10-30x speedup over the standard library's pure-Python implementation and passes the standard library's lru_cache test suite.

Provides two least-recently-used (LRU) caching function decorators:

clru_cache - built-in (faster)
>>> from fastcache import clru_cache
>>> @clru_cache(maxsize=128, typed=False, state=None)
... def f(a, b):
...     return (a, ) + (b, )
>>> type(f)
<class '_lrucache.cache'>
lru_cache - Python wrapper around clru_cache (slower)
>>> from fastcache import lru_cache
>>> @lru_cache(maxsize=128, typed=False, state=None)
... def f(a, b):
...     return (a, ) + (b, )
>>> type(f)
<class 'function'>

(c)lru_cache(maxsize=128, typed=False, state=None)

Least-recently-used cache decorator.

If maxsize is set to None, the LRU features are disabled and the cache can grow without bound.
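As a sketch of the unbounded case, using the stdlib functools.lru_cache as a stand-in (fastcache passes the stdlib test suite, so the semantics match):

```python
from functools import lru_cache  # fastcache.lru_cache mirrors this behavior

@lru_cache(maxsize=None)  # LRU eviction disabled; the cache grows without bound
def double(x):
    return 2 * x

for x in range(1000):
    double(x)

info = double.cache_info()
assert info.maxsize is None
assert info.currsize == 1000  # every distinct argument stays cached
```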

If typed is True, arguments of different types will be cached separately. For example, f(3.0) and f(3) will be treated as distinct calls with distinct results.
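A quick illustration of typed=True, again sketched with the stdlib functools.lru_cache, whose behavior fastcache matches:

```python
from functools import lru_cache  # fastcache.lru_cache behaves the same way

@lru_cache(maxsize=128, typed=True)
def square(x):
    return x * x

square(3)    # miss: cached under the int key
square(3.0)  # miss: floats are cached separately because typed=True
assert square.cache_info().misses == 2
assert square.cache_info().hits == 0
```

With typed=False, the second call would have been a cache hit, since hash(3) == hash(3.0) and 3 == 3.0.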

If state is a list, the items in the list will be incorporated into the argument hash.

Arguments to the cached function must be hashable.
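For example, an unhashable argument such as a list raises TypeError (shown here with the stdlib functools.lru_cache; the same requirement applies to fastcache):

```python
from functools import lru_cache  # same hashability requirement as fastcache

@lru_cache(maxsize=16)
def total(values):
    return sum(values)

result = total((1, 2, 3))  # tuple arguments are hashable, so this caches fine
try:
    total([1, 2, 3])       # lists are unhashable; the call raises TypeError
    raised = False
except TypeError:
    raised = True
```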

View the cache statistics named tuple (hits, misses, maxsize, currsize) with f.cache_info(). Clear the cache and statistics with f.cache_clear(). Access the underlying function with f.__wrapped__.
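The introspection API in action, sketched with the stdlib functools.lru_cache (fastcache exposes the same cache_info, cache_clear, and __wrapped__ attributes):

```python
from functools import lru_cache  # fastcache exposes the same introspection API

@lru_cache(maxsize=32)
def fib(n):
    return n if n < 2 else fib(n - 1) + fib(n - 2)

fib(10)
hits, misses, maxsize, currsize = fib.cache_info()
assert (hits, misses) == (8, 11)   # 11 distinct values of n computed, 8 reused
fib.cache_clear()                  # empty the cache and reset the statistics
assert fib.cache_info().currsize == 0
assert fib.__wrapped__(1) == 1     # call the raw function, bypassing the cache
```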


File                               Type    Py Version  Uploaded on  Size
fastcache-0.3.1.tar.gz (md5, pgp)  Source              2014-07-02   11KB