NATS Tools
Useful tools to develop Python projects relying on NATS
Quick start
Installing the project
Users can install the project from GitHub using pip:
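A sketch of the command, assuming a hypothetical repository location (substitute the actual repository URL):

```bash
pip install git+https://github.com/<owner>/nats-tools.git
```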
Using NATSD
It's possible to control NATS servers using nats_tools.NATSD:
from nats_tools import NATSD
# Create a new nats-server daemon
natsd = NATSD(debug=True)
# Start the server
natsd.start()
# Stop the server
natsd.stop()
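NATSD also works as a context manager (as the custom fixture example further below shows), so the server is started on entry and stopped on exit without explicit start() and stop() calls. A minimal sketch:

```python
from nats_tools import NATSD

# Server is started when entering the block and stopped when leaving it
with NATSD(debug=True) as natsd:
    assert natsd.is_alive()
```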
Using pytest fixtures
Define an argument named natsd in your tests in order to get an already-started NATSD instance. The instance is stopped during test teardown.
def test_using_nats_server(natsd: NATSD):
    """A test which relies on nats-server being up and running."""
    assert natsd.is_alive()
    assert natsd.monitor.healthz() == {"status": "ok"}
Using the fixture, each test is executed using a unique nats-server running in its own process.
Parametrizing fixtures
It's possible to start a NATS server with custom configuration for each test using a parametrized fixture:
@parametrize_nats_server(port=5000)
def test_parametrize_nats_server_fixture_can_be_used(natsd: NATSD):
    assert natsd.port == 5000
Using custom fixtures
It's very easy to create a fixture similar to the natsd fixture:
import pytest
import typing as t
from nats_tools import NATSD
@pytest.fixture
def my_natsd() -> t.Iterator[NATSD]:
    """Yield a started NATSD instance for each test case."""
    with NATSD() as nats:
        yield nats
CI Helpers
The script install-nats-server.sh located in the scripts/ directory can be used in CI environments to install the NATS server.
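A minimal sketch of invoking it in a CI job (the script's options, if any, are not documented here):

```bash
# Install nats-server on the CI runner before running the test suite
bash scripts/install-nats-server.sh
```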
Developer installation
This project is packaged using setuptools and a pyproject.toml according to PEP 621.
With PEP 621, the Python community selected pyproject.toml as a standard way of specifying Python project metadata. Setuptools has adopted this standard and uses the information contained in this file as an input to the build process.
Install using script
The install script first creates a virtual environment, then updates packaging dependencies such as pip, setuptools and wheel within the virtual environment. Finally, it installs the project in development mode within the virtual environment. The virtual environment is always named .venv/.
Run the install.py script located in the scripts/ directory with the Python interpreter of your choice. The script accepts the following arguments:
- --dev: install extra dependencies required to contribute to development
- --docs: install extra dependencies required to build and serve documentation
- -e or --extras: a string of comma-separated extras such as "dev,docs"
- -a or --all: a boolean flag indicating that all extras should be installed
Example usage (a sketch of each command is shown after this list):
- Install with build extra only (default behaviour)
- Install with dev extra
- Install all extras
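Assuming python3 is the interpreter of choice, the commands could look like this:

```bash
# Install with build extra only (default behaviour)
python3 scripts/install.py

# Install with dev extra
python3 scripts/install.py --dev

# Install all extras
python3 scripts/install.py --all
```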
Note: The venv module must be installed for the Python interpreter used to run the install script. On Debian and Ubuntu systems, this package can be installed using the following command: sudo apt-get install python3-venv. On Windows systems, Python distributions have the venv module installed by default.
Development tasks
The file tasks.py is an invoke task file. It describes several tasks which developers can execute to perform various actions.
To list all available tasks, activate the project virtual environment and run the command inv --list:
$ inv --list
Available tasks:
build Build sdist and wheel, and optionally build documentation.
requirements Generate requirements.txt file
check Run mypy typechecking.
clean Clean build artifacts and optionally documentation artifacts as well as generated bytecode.
coverage Serve code coverage results and optionally run tests before serving results
docker Build cross-platform docker image for the project
docs Serve the documentation in development mode.
format Format source code using black and isort.
lint Lint source code using flake8.
test Run tests using pytest and optionally enable coverage.
wheelhouse Build wheelhouse for the project
Build project artifacts
The build task can be used to build a source distribution (sdist) and a wheel binary package by default. Optionally, it can also build the project documentation as a static website.
Usage (a sketch of each command is shown after this list):
- Build sdist and wheel only
- Build sdist, wheel and documentation
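A sketch of the two invocations; the flag used to also build the documentation is an assumption here:

```bash
# Build sdist and wheel only
inv build

# Build sdist, wheel and documentation (the --docs flag is an assumption)
inv build --docs
```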
Building wheelhouse
The wheelhouse task can be used to generate an installation bundle, also known as a wheelhouse.
pip wheel can be used to generate and package all of a project’s dependencies, with all the compilation performed, into a single directory that can be converted into a single archive. This archive then allows installation when index servers are unavailable and avoids time-consuming recompilation.
This command does not accept any argument, and generates the wheelhouse into dist/wheelhouse.
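For example:

```bash
# Build the wheelhouse into dist/wheelhouse
inv wheelhouse
```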
Run tests
The test task can be used to run tests using pytest. By default, test coverage is not enabled; the -c or --cov option must be provided to enable it.
Usage (a sketch of each command is shown after this list):
- Run tests without coverage
- Run tests with coverage
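Both invocations use only the options described above:

```bash
# Run tests without coverage
inv test

# Run tests with coverage
inv test --cov
```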
Visualize test coverage
The coverage task can be used to serve test coverage results on http://localhost:8000 by default. Use the --port option to use a different port. By default, test coverage results are expected to be present before running the task. To run tests before serving the results, use the --run option.
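For example, assuming the --port option accepts the port number as a value:

```bash
# Run tests first, then serve coverage results on port 8080
inv coverage --run --port 8080
```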
Run typechecking
The check task can be used to run mypy. By default, type checking is not run on tests; the -i or --include-tests option must be provided to include them.
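For example:

```bash
# Typecheck the package only
inv check

# Typecheck the package and the tests
inv check --include-tests
```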
Run linter
The lint task can be used to lint source code using flake8. This task does not accept any option. flake8 is configured in the setup.cfg file.
Format source code
The format task can be used to format source code using black and isort. This task does not accept any option. black is not configured in any way, but isort is configured in setup.cfg.
Serve the documentation
The docs task can be used to serve the documentation as a static website on http://localhost:8000 with auto-reload enabled by default. Use the --port option to change the listening port and the --no-watch option to disable auto-reload.
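For example:

```bash
# Serve the documentation on port 9000 without auto-reload
inv docs --port 9000 --no-watch
```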
Build Docker image
The docker task can be used to build a Docker image for the project. The Dockerfile can be found at the root of the repository. By default, the image is built for linux/amd64 only. Use the --platform argument to build the image for different architectures. Use inv docker --help to learn about allowed options.
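A sketch of a multi-platform build, assuming --platform accepts a comma-separated list of platforms as docker buildx does:

```bash
# Build the image for both amd64 and arm64
inv docker --platform linux/amd64,linux/arm64
```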
Git flow
Two branches exist:
- next: The development branch. All developers must merge commits to next through Pull Requests.
- main: The release branch. Developers must not commit to this branch. Only merges from the next branch with the fast-forward strategy are allowed on the main branch.
Each time new commits are pushed to main, semantic-release may perform a release bump according to commit messages.
Git commits
Developers are expected to write commit messages according to the Conventional Commits specification.
Commit messages which are not valid conventional commits are ignored in the changelog.
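For example, valid messages follow the type(scope): description pattern (hypothetical examples):

```text
feat(natsd): add healthz monitoring endpoint
fix: stop nats-server process on test teardown
docs: document pytest fixtures
```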
Changelog
The changelog is generated for each release candidate and each release according to commit messages found since the last release.
Changelog content is written to CHANGELOG.md by the @semantic-release/release-notes-generator plugin configured with the conventionalcommits preset.
Contributing to the documentation
Project documentation is written using the MkDocs static site generator. Documentation source files are written in Markdown and can be found in the docs/ directory.
Aside from documentation written in Markdown files, the Python API reference is generated from docstrings and type annotations found in the source code.
Acknowledgements
Most of the code used in this project comes directly from or is inspired by the nats-py testing module. nats-py is licensed under Apache-2.0.