Scanpy provides extensive developer documentation, most of which applies to this repo, too. This document does not reproduce that entire content; instead, it summarizes the most important information to get you started on contributing.
We assume that you are already familiar with git and with making pull requests on GitHub. If not, please refer to the scanpy developer guide.
Installing dev dependencies#
In addition to the packages needed to use this package, you need additional Python packages to run tests and build
the documentation. It’s easy to install them using
cd cookiecutter-scverse-instance
pip install -e ".[dev,test,doc]"
This template uses pre-commit to enforce a consistent code style. On every commit, pre-commit checks will either automatically fix issues with the code or raise an error message. See pre-commit checks for a full list of checks enabled for this repository.
To enable pre-commit locally, simply run

pre-commit install

in the root of the repository. Pre-commit will automatically download all dependencies when it is run for the first time.
Alternatively, you can rely on the pre-commit.ci service enabled on GitHub. If you didn’t run pre-commit before pushing changes to GitHub, it will automatically commit fixes to your pull request or show an error message.
If pre-commit.ci added a commit on a branch you are still working on locally, simply use
git pull --rebase
to integrate the changes into yours. While pre-commit.ci is useful, we strongly encourage installing and running pre-commit locally first to understand its usage.
Remember to first install the package with

pip install -e ".[dev,test]"
Most IDEs integrate with pytest and provide a GUI to run tests. Alternatively, you can run all tests from the command line by executing

pytest

in the root of the repository. Continuous integration will automatically run the tests on all pull requests.
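As a minimal sketch of what a test discovered by pytest looks like (the function `add_numbers` below is a hypothetical placeholder; in practice you would import functions from your own package):

```python
# tests/test_basic.py -- a minimal pytest example.
# The function below is a placeholder standing in for something
# you would normally import from your package.

def add_numbers(a: int, b: int) -> int:
    """Stand-in for a function imported from your package."""
    return a + b


def test_add_numbers():
    # pytest discovers functions prefixed with "test_" automatically
    # and reports a failure if any assertion does not hold.
    assert add_numbers(2, 3) == 5
    assert add_numbers(-1, 1) == 0
```

Running `pytest` in the repository root will collect and execute all such `test_*` functions.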
Publishing a release#
Updating the version number#
Before making a release, you need to update the version number. Please adhere to Semantic Versioning; in brief:
Given a version number MAJOR.MINOR.PATCH, increment the:
MAJOR version when you make incompatible API changes,
MINOR version when you add functionality in a backwards compatible manner, and
PATCH version when you make backwards compatible bug fixes.
Additional labels for pre-release and build metadata are available as extensions to the MAJOR.MINOR.PATCH format.
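The bump rules above can be illustrated with a small sketch (simplified: bump2version itself also rewrites the version in your project files and creates a git tag):

```python
# Illustration of how each bump affects a MAJOR.MINOR.PATCH version
# string, following Semantic Versioning: lower-order parts reset to
# zero when a higher-order part is incremented.

def bump(version: str, part: str) -> str:
    major, minor, patch = (int(x) for x in version.split("."))
    if part == "major":
        return f"{major + 1}.0.0"
    if part == "minor":
        return f"{major}.{minor + 1}.0"
    if part == "patch":
        return f"{major}.{minor}.{patch + 1}"
    raise ValueError(f"unknown part: {part}")

print(bump("1.4.2", "patch"))  # 1.4.3
print(bump("1.4.2", "minor"))  # 1.5.0
print(bump("1.4.2", "major"))  # 2.0.0
```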
We use bump2version to automatically update the version number in all places and automatically create a git tag. Run one of the following commands in the root of the repository
bump2version patch
bump2version minor
bump2version major
Once you are done, run
git push --tags
to publish the created tag on GitHub.
Building and publishing the package on PyPI#
Python packages are not distributed as source code, but as distributions. The most common distribution format is the so-called wheel. To build a wheel, run
python -m build
This command creates a source archive and a wheel, which are required for publishing your package to PyPI. These files are created in the dist/ directory at the root of the repository.
Before uploading them to PyPI you can check that your distribution is valid by running:
twine check dist/*
and finally publishing it with:
twine upload dist/*
Provide your username and password when requested and then go check out your package on PyPI!
For more information, follow the Python packaging tutorial.
It is possible to automate this with GitHub actions, see also this feature request in the cookiecutter-scverse template.
Please write documentation for new or changed features and use-cases. This project uses sphinx with the following features:
the myst extension allows writing documentation in markdown/Markedly Structured Text
Sphinx autodoc typehints, to automatically reference annotated input and output types
See the scanpy developer docs for more information on how to write documentation.
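With sphinx-autodoc-typehints, parameter and return types are taken from the annotations, so the docstring only describes the meaning of each parameter. A sketch of such a function (the name and signature are illustrative, not part of this template):

```python
# Example of an annotated function that sphinx-autodoc-typehints can
# document: types come from the annotations and need not be repeated
# in the docstring.
from __future__ import annotations


def scale_values(values: list[float], factor: float = 2.0) -> list[float]:
    """Multiply each value by a constant factor.

    Parameters
    ----------
    values
        The numbers to scale.
    factor
        The multiplier applied to each element.

    Returns
    -------
    A new list with the scaled values.
    """
    return [v * factor for v in values]
```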
Tutorials with myst-nb and jupyter notebooks#
The documentation is set up to render jupyter notebooks stored in the docs/notebooks directory using myst-nb. Currently, only notebooks in .ipynb format are supported; they will be included with both their input and output cells.
It is your responsibility to update and re-run the notebook whenever necessary.
If you are interested in automatically running notebooks as part of the continuous integration, please check out this feature request in the cookiecutter-scverse template.
If you refer to objects from other packages, please add an entry to the intersphinx_mapping in docs/conf.py. Only if you do so can sphinx automatically create a link to the external documentation.
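A hedged sketch of what such an entry looks like in docs/conf.py (the exact set of mapped projects depends on what your package references; the two entries below are common examples, not template defaults):

```python
# Excerpt from docs/conf.py: intersphinx_mapping tells sphinx where to
# find the object inventories of other projects, so cross-references
# such as :class:`numpy.ndarray` resolve to the external documentation.
intersphinx_mapping = {
    "python": ("https://docs.python.org/3", None),
    "numpy": ("https://numpy.org/doc/stable/", None),
}
```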
If building the documentation fails because of a missing link that is outside your control, you can add an entry to the nitpick_ignore list in docs/conf.py.
Building the docs locally#
cd docs
make html
open _build/html/index.html
or use sphinx-autobuild
sphinx-autobuild docs docs/_build/html