This page provides a guide for developers wishing to contribute to Sphinx-Needs.
For bug reports and well-described technical feature requests, please use our issue tracker: https://github.com/useblocks/sphinx-needs/issues
For feature ideas and questions, please use our discussion board: https://github.com/useblocks/sphinx-needs/discussions
If you have already created a PR, you can submit it. Our CI workflow will run the checks (tests and code styles) and a maintainer will perform a review before we can merge it. Your PR should conform with the following rules:
- A meaningful description or link, which describes the change
- The changed code (for sure :) )
- Test cases for the change (important!)
- Updated documentation, if behavior changes or new options/directives are introduced
- An update of docs/changelog.rst
- If this is your first PR, feel free to add your name to the AUTHORS file
Sphinx-Needs requires only Poetry to be installed as a system dependency; the rest of the dependencies are 'bootstrapped' and installed in an isolated environment by Poetry.

- Install Poetry
- Install the project dependencies:
  poetry install
- Install Pre-Commit
- Install the Pre-Commit hooks:
  pre-commit install

To run the tests, also install the dependencies of our official documentation:

pip install -r docs/requirements.txt
Sphinx-Needs uses make to invoke most development-related actions.
Use make list to get a list of available targets.
To build the Sphinx-Needs documentation stored under /docs, run:

# Build HTML pages
make docs-html

or

# Build PDF pages
make docs-pdf

These targets always perform a clean build (they call make clean before the build). If you want to avoid this, run the related Sphinx commands directly under /docs (e.g. make docs).
To check that all links used in the documentation are still valid, run:

make docs-linkcheck
You can either run the tests directly using pytest in an existing environment:

pytest tests/

or use the provided Makefile:

make test

Note: some tests use syrupy to perform snapshot testing. These snapshots can be updated by running:

pytest tests/ --snapshot-update
Hint: Please make sure the dependencies of the official documentation are also installed:

pip install -r docs/requirements.txt
Sphinx-Needs uses pre-commit to format and check the source code. This can be run directly using:

pre-commit run --all-files

or via the provided Makefile:

make lint
Sphinx-Needs' own documentation is used to create a benchmark for each PR. If the runtime is more than 10% longer than the previous runs, the benchmark test fails.
Benchmark test cases are available under tests/benchmarks and can be executed locally via make benchmark.
The results for each PR/commit are added to a chart, which is available under http://useblocks.com/sphinx-needs/bench/index.html.
The benchmark data is stored on the benchmarks branch, which is also used as the source for GitHub Pages.
This project provides a test matrix for running the tests across a range of Python and Sphinx versions. This is used primarily for continuous integration.
Nox is used as a test runner.
Running the matrix tests requires additional system-wide dependencies:
- Install Nox
- Install Nox-Poetry
- You will also need multiple Python versions available. You can manage these using Pyenv.

You can run the test matrix by using the nox command:

nox

or using the provided Makefile:

make test-matrix
For a full list of available options, refer to the Nox documentation and the local noxfile.py.
See the Poetry documentation for a list of commands.
To run custom commands inside the isolated environment, prefix them with poetry run (i.e. poetry run <command>).
A release pipeline is installed for the CI.
It is triggered automatically when a tag is created and pushed. The tag must follow the format [0-9].[0-9]+.[0-9]; otherwise the release jobs won't trigger.
The release jobs build the source and wheel distributions and try to upload them to test.pypi.org and pypi.org.
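Reading the tag format as a regular expression with the dots treated literally (an interpretation on our part, not taken from the pipeline code), a quick local sanity check before pushing a tag could look like this:

```python
import re

# The release tag format from the docs, anchored so the
# whole tag must match; dots are escaped to match literally.
TAG_PATTERN = re.compile(r"[0-9]\.[0-9]+\.[0-9]")


def is_release_tag(tag: str) -> bool:
    """Return True if `tag` matches the release tag format."""
    return TAG_PATTERN.fullmatch(tag) is not None


# Matches: single digit, dot, one or more digits, dot, single digit.
is_release_tag("1.12.3")   # matches the pattern
is_release_tag("v1.2.3")   # leading "v" does not match
```

Note that, per this pattern, the last version component is a single digit, so a tag like 1.2.34 would not trigger the release jobs.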
The following is an outline of the build events which this extension adds to the Sphinx build process:

- After configuration has been initialised (config-inited event):
  - Register additional directives, directive options and warnings (load_config)
  - Check configuration consistency (check_configuration)
- Before reading changed documents (env-before-read-docs event):
  - Initialise BuildEnvironment variables (prepare_env)
  - Register services (prepare_env)
  - Register functions (prepare_env)
  - Initialise default extra options (prepare_env)
  - Initialise extra link types (prepare_env)
  - Ensure default configurations are set (prepare_env)
  - Start process timing, if enabled (prepare_env)
  - Load external needs (load_external_needs)
- For all removed and changed documents (env-purge-doc event):
  - Remove all cached need items that originate from the document (purge_needs)
- For changed documents (doctree-read event, priority 880 of transforms):
  - Determine and add data on parent sections and needs (analyse_need_locations)
  - Remove Need nodes marked as hidden (analyse_need_locations)
- When building in parallel mode (env-merge-info event), merge BuildEnvironment data (merge_data)
- After all documents have been read and transformed (env-updated event) (NOTE: these are skipped for the needs builder):
  - Copy vendored JS libraries (with CSS) to the build folder (install_lib_static_files)
  - Generate the permalink file (install_permalink_file)
  - Copy vendored CSS files to the build folder (install_styles_static_files)
- Note, the BuildEnvironment is cached at this point, only if any documents were updated.
- For all changed documents, or their dependants (doctree-resolved event):
  - Replace all Needextract nodes with a list of the collected Need nodes (process_creator)
  - Remove all Need nodes, if needs_include_needs is True (process_need_nodes)
  - Call dynamic functions, set as values on the need data items, and replace them with their return values (process_need_nodes -> resolve_dynamic_values)
  - Replace needs data variant values (process_need_nodes -> resolve_variants_options)
  - Check for dead links (process_need_nodes -> check_links)
  - Generate back links (process_need_nodes -> create_back_links)
  - Process constraints, for each Need node (process_need_nodes -> process_constraints)
  - Perform all modifications on need data items, due to Needextend nodes (process_need_nodes -> extend_needs_data)
  - Format each Need node to give the desired visual output (process_need_nodes -> print_need_nodes)
  - Process all other need-specific nodes, replacing them with the desired visual output (process_creator)
- At the end of the build (build-finished event):
  - Call all user-defined need data checks, a.k.a. needs_warnings (process_warnings)
  - Write needs.json to the output folder, if needs_build_json = True (build_needs_json)
  - Write a needs.json per ID to the output folder, if needs_build_json_per_id = True (build_needs_id_json)
  - Write all UML files to the output folder, if needs_build_needumls = True (build_needumls_pumls)
  - Print process timing, if needs_debug_measurement = True (process_timing)