This part of the documentation covers all the interfaces of requests-cache
Core functions for configuring cache and monkey patching requests
Configure cache storage and patch requests library to transparently cache responses
Returns True if cache has url, False otherwise
Context manager for temporarily disabling the cache
>>> with requests_cache.disabled():
...     requests.get('http://httpbin.org/ip')
...     requests.get('http://httpbin.org/get')
Context manager for temporarily enabling the cache
>>> with requests_cache.enabled():
...     requests.get('http://httpbin.org/ip')
...     requests.get('http://httpbin.org/get')
Clear cache
Undo requests monkey patch
Redo requests monkey patch
Returns internal cache object
Deletes all cache for url
Contains the BaseCache class, which can be used as an in-memory cache backend or extended to support persistence.
Base class for cache implementations; can be used as an in-memory cache.
To extend it, you can provide dictionary-like objects for url_map and responses, or override the public methods.
url -> key_in_cache mapping
key_in_cache -> response mapping
Save response to cache
Note
Response is reduced before saving (with reduce_response()) to make it picklable
Retrieves the response and timestamp for url if it is stored in the cache; otherwise returns default
Returns: tuple (response, datetime)
Note
Response is restored after unpickling with restore_response()
Delete url from the cache. Also deletes all urls from the response history
Clear cache
Returns True if cache has url, False otherwise
Reduce response object to make it compatible with pickle
Restore response object after unpickling
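Putting the pieces of this interface together, here is a minimal in-memory sketch of a BaseCache-style backend. The class name and the key function are illustrative assumptions; only the two mappings (url_map and responses) and the method names mirror the interface described above:

```python
from datetime import datetime

class InMemoryCache:
    """Minimal dict-backed sketch of the BaseCache interface."""

    def __init__(self):
        self.url_map = {}    # url -> key_in_cache
        self.responses = {}  # key_in_cache -> (response, timestamp)

    def save_response(self, url, response):
        key = str(hash(url))  # simplistic key function, for illustration only
        self.url_map[url] = key
        self.responses[key] = (response, datetime.now())

    def get_response_and_time(self, url, default=(None, None)):
        key = self.url_map.get(url)
        if key is None:
            return default
        return self.responses[key]

    def has_url(self, url):
        return url in self.url_map

    def del_cached_url(self, url):
        key = self.url_map.pop(url, None)
        self.responses.pop(key, None)

    def clear(self):
        self.url_map.clear()
        self.responses.clear()
```

Swapping the two plain dicts for persistent dictionary-like objects (such as the sqlite- and mongo-backed ones below) is what turns this in-memory cache into a persistent one.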
sqlite3 cache backend
sqlite cache backend.
Reading is fast and saving is a bit slower. It can store large amounts of data with low memory usage.
Dictionary-like objects for saving large data sets to a sqlite database
DbDict - a dictionary-like object for saving large datasets to a sqlite database
It’s possible to create multiple DbDict instances, which will be stored as separate tables in one database:
d1 = DbDict('test', 'table1')
d2 = DbDict('test', 'table2')
d3 = DbDict('test', 'table3')
All data will be stored in the test.sqlite database, in the corresponding tables: table1, table2 and table3.
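A simplified sketch of such a sqlite-backed dictionary, built on the standard library's sqlite3 module. The class name is an assumption and only a fragment of the dict protocol is shown; the real DbDict supports the full mapping interface:

```python
import sqlite3

class SqliteDict:
    """Illustrative sketch: a dict-like object backed by a sqlite table."""

    def __init__(self, filename, table='data'):
        self.table = table
        self.con = sqlite3.connect(filename)
        self.con.execute(
            f'CREATE TABLE IF NOT EXISTS {table} (key TEXT PRIMARY KEY, value TEXT)'
        )

    def __setitem__(self, key, value):
        # Upsert: replace the row if the key already exists.
        self.con.execute(
            f'INSERT OR REPLACE INTO {self.table} (key, value) VALUES (?, ?)',
            (key, value),
        )
        self.con.commit()

    def __getitem__(self, key):
        row = self.con.execute(
            f'SELECT value FROM {self.table} WHERE key = ?', (key,)
        ).fetchone()
        if row is None:
            raise KeyError(key)
        return row[0]
```

Passing different table names to instances sharing one filename is what lets several DbDicts live in a single database file.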
Transactions can be committed if this property is set to True
Commits pending transaction if can_commit or force is True
Parameters: force – force commit, ignore can_commit
Context manager used to speed up insertion of a large number of records
>>> d1 = DbDict('test')
>>> with d1.bulk_commit():
... for i in range(1000):
... d1[i] = i * 2
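The speedup comes from suppressing the per-item commit while inside the context manager and committing once at the end. A self-contained sketch of that mechanism, using assumed names and the stdlib sqlite3 module:

```python
import sqlite3
from contextlib import contextmanager

class BulkCommitDict:
    """Sketch of the bulk_commit() mechanism: batch commits via a flag."""

    def __init__(self, filename):
        self.con = sqlite3.connect(filename)
        self.con.execute(
            'CREATE TABLE IF NOT EXISTS data (key TEXT PRIMARY KEY, value TEXT)'
        )
        self.can_commit = True

    def commit(self, force=False):
        if force or self.can_commit:
            self.con.commit()

    @contextmanager
    def bulk_commit(self):
        self.can_commit = False  # suppress per-item commits
        try:
            yield
            self.commit(force=True)  # one commit for the whole batch
        finally:
            self.can_commit = True

    def __setitem__(self, key, value):
        self.con.execute(
            'INSERT OR REPLACE INTO data VALUES (?, ?)', (str(key), str(value))
        )
        self.commit()  # no-op inside bulk_commit()
```

One transaction for a thousand inserts is far cheaper than a thousand single-row transactions, which is the whole point of the context manager.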
Same as DbDict, but pickles values before saving
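Pickling on write and unpickling on read is what lets arbitrary Python objects be stored in a backend that only holds bytes or strings. A sketch of that wrapper idea, with an assumed class name, over any dict-like store:

```python
import pickle

class PickleDict:
    """Sketch: wrap any dict-like store, pickling values transparently."""

    def __init__(self, store):
        self.store = store  # any dict-like backend, e.g. a DbDict

    def __setitem__(self, key, value):
        self.store[key] = pickle.dumps(value)

    def __getitem__(self, key):
        return pickle.loads(self.store[key])
```

This is why responses must be made picklable (see reduce_response() above) before they reach a pickling backend.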
Dictionary-like objects for saving large data sets to a mongodb database
MongoDict - a dictionary-like interface for a mongo database
Same as MongoDict, but pickles values before saving