Bug Reports, Feedback, and Discussion#

If you discover a bug or want to request a new feature, please create an issue.

If you want to discuss ideas about the project in general, or have a more open-ended question or feedback, please use Discussions.

Development Status#

Requests-cache is under active development! Contributions are very welcome, and will be attributed on the Contributors page.

How to Help#

If you are interested in helping out, here are a few ways to get started:

  • Give feedback on open issues

  • Make or suggest improvements for the documentation; see #355 for details.

  • See the help-wanted issue label

  • See the shelved issue label for features that have been previously proposed and are not currently planned, but not completely ruled out either

  • If you find an issue you want to work on, please comment on it so others know it’s in progress

Dev Installation#

To set up for local development (requires poetry):

git clone https://github.com/requests-cache/requests-cache.git
cd requests-cache
poetry install -v -E all

Linting & Formatting#

Several code linting and formatting tools are used in this project. All of them are run by GitHub Actions on pull requests, and you can also run them locally with:

nox -e lint

Pre-Commit Hooks#

Optionally, you can use pre-commit to automatically run all of these checks before a commit is made:

pre-commit install

This can save you time: errors are shown immediately rather than after CI jobs complete, and the checks still run even if you forget to invoke them manually before committing.
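For reference, pre-commit reads its hook list from `.pre-commit-config.yaml` in the repository root. A minimal config looks something like this (illustrative only; the hooks shown are not necessarily the ones this project uses):

```yaml
# Illustrative .pre-commit-config.yaml; see the repo for the actual hook list
repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v4.4.0
    hooks:
      - id: trailing-whitespace
      - id: end-of-file-fixer
```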

You can disable these hooks at any time with:

pre-commit uninstall


Test Layout#

  • Tests are divided into unit and integration tests:

    • Unit tests can be run without any additional setup, and don’t depend on any external services.

    • Integration tests depend on additional services, which are easiest to run using Docker (see Integration Tests section below).

  • The test suite includes shared pytest fixtures that apply the most common mocking steps and other test setup.
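To illustrate the unit/integration split, here is a hypothetical unit-style test (not one of the project's real tests or fixtures): because the transport is mocked with the standard library, it needs no network access and no running services.

```python
# Hypothetical example: a unit-style test with no external dependencies.
# The HTTP session is replaced by a mock, so no network or backend is needed.
from unittest.mock import MagicMock


def test_response_served_without_network():
    session = MagicMock()
    session.get.return_value.status_code = 200
    session.get.return_value.from_cache = True

    response = session.get("https://example.com")
    assert response.status_code == 200
    assert response.from_cache
```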

Running Tests#

  • Run pytest to run all tests

  • Run pytest tests/unit to run only unit tests

  • Run pytest tests/integration to run only integration tests

For CI jobs (including PRs), these tests will be run for each supported Python version. You can use nox to do this locally, if needed:

nox -e test

Or to run tests for a specific Python version:

nox -e test-3.10

To generate a coverage report:

nox -e cov

See nox --list for a full list of available commands.

Integration Test Containers#

A live web server and backend databases are required to run integration tests, and a docker-compose config is included to make this easier. First, install Docker and docker-compose.

Then, run:

docker-compose up -d
pytest tests/integration

Integration Test Alternatives#

If you can’t easily run Docker containers in your environment but still want to run some of the integration tests, you can use pytest-httpbin instead of the httpbin container. This just requires installing an extra package and setting an environment variable:

pip install pytest-httpbin
pytest tests/integration/

For backend databases, you can install and run them on the host instead of in a container, as long as they are running on the default port.


Documentation#

Sphinx is used to generate documentation.

To build the docs locally:

nox -e docs

To preview:

# macOS:
open docs/_build/html/index.html
# Linux:
xdg-open docs/_build/html/index.html

You can also use sphinx-autobuild to rebuild the docs and live reload in the browser whenever doc contents change:

nox -e livedocs


Readthedocs#

Sometimes, there are differences in the Readthedocs build environment that can cause builds to succeed locally but fail remotely. To help debug this, you can use the readthedocs/build container to build the docs. A configured build container is included in docs/docker-compose.yml to simplify this.

Run with:

# Optionally add --build to rebuild with updated dependencies
docker-compose -f docs/docker-compose.yml up -d
docker exec readthedocs make all

Pull Requests#

Here are some general guidelines for submitting a pull request:

  • If the changes are trivial, just briefly explain the changes in the PR description

  • Otherwise, please submit an issue describing the proposed change prior to submitting a PR

  • Add unit test coverage for your changes

  • If your changes add or modify user-facing behavior, add documentation describing those changes

  • Submit the PR to be merged into the main branch

Notes for Maintainers#


  • Releases are built and published to PyPI based on git tags.

  • Milestones will be used to track progress on major and minor releases.

  • GitHub Actions will build and deploy packages to PyPI on tagged commits on the main branch.

Release steps:

  • Update the version in both pyproject.toml and the requests_cache/ package

  • Update the release notes

  • Generate a sample cache for the new version (used by unit tests) with the script in tests/

  • Merge changes into the main branch

  • Push a new tag, e.g.: git tag v0.1 && git push origin --tags

  • This will trigger a deployment. Verify that it completes successfully and that the new version can be installed from PyPI with pip install requests-cache

  • A readthedocs build will be triggered by the new tag. Verify that this completes successfully.

Downstream builds:

  • We also maintain a Conda package, which is automatically built and published by conda-forge whenever a new release is published to PyPI. The feedstock repo only needs to be updated manually if there are changes to dependencies.

  • For reference: repology lists additional downstream packages maintained by other developers.


Pre-Releases#

Pre-release builds are convenient for letting testers try out in-development changes. Versions with the suffix .dev (among others) can be deployed to PyPI and installed by users with pip install --pre, and are otherwise ignored by pip install:

# Install latest pre-release build:
pip install -U --pre requests-cache

# Install latest stable build
pip install -U requests-cache