
DM-39857: Turn off pytest-flake8 and add ruff configuration #145

Merged · 16 commits · Jul 5, 2023
3 changes: 1 addition & 2 deletions .github/workflows/build.yaml
@@ -43,8 +43,7 @@ jobs:
shell: bash -l {0}
run: |
conda install -y -q \
"flake8<5" \
pytest pytest-flake8 pytest-xdist pytest-openfiles pytest-cov
pytest pytest-xdist pytest-openfiles pytest-cov

- name: List installed packages
shell: bash -l {0}
4 changes: 2 additions & 2 deletions .github/workflows/build_docs.yaml
@@ -35,8 +35,8 @@ jobs:
run: pip install --no-deps -v .

- name: Install documenteer
run: pip install 'documenteer[pipelines]<0.8'
run: pip install 'documenteer[pipelines]>0.8,<0.9'

- name: Build documentation
working-directory: ./doc
run: package-docs build
run: package-docs build -n -W
5 changes: 5 additions & 0 deletions .github/workflows/lint.yaml
@@ -9,3 +9,8 @@ on:
jobs:
call-workflow:
uses: lsst/rubin_workflows/.github/workflows/lint.yaml@main
ruff:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: chartboost/ruff-action@v1
7 changes: 4 additions & 3 deletions .pre-commit-config.yaml
@@ -22,7 +22,8 @@ repos:
hooks:
- id: isort
name: isort (python)
- repo: https://github.com/PyCQA/flake8
rev: 6.0.0
- repo: https://github.com/astral-sh/ruff-pre-commit
# Ruff version.
rev: v0.0.275
hooks:
- id: flake8
- id: ruff
7 changes: 7 additions & 0 deletions doc/conf.py
@@ -11,3 +11,10 @@
html_short_title = project
doxylink = {}
exclude_patterns = ["changes/*"]

# Try to pull in links for butler and pipe_base.
intersphinx_mapping["lsst"] = ("https://pipelines.lsst.io/v/daily/", None) # noqa

nitpick_ignore = [
("py:class", "networkx.classes.digraph.DiGraph"),
]
18 changes: 9 additions & 9 deletions doc/lsst.ctrl.bps/quickstart.rst
@@ -29,21 +29,21 @@
a default value, ``lsst.ctrl.bps.htcondor.HTCondorService``, will be used.
Checking status of WMS services
-------------------------------

Run `bps ping` to check the status of the WMS services. This subcommand
Run ``bps ping`` to check the status of the WMS services. This subcommand
requires specifying the WMS plugin (see :ref:`bps-wmsclass`). If the plugin
provides such functionality, it will check whether the WMS services
necessary for workflow management (submission, reporting, canceling,
etc) are usable. If the WMS services require authentication, that will
also be tested.

If services are ready for use, then `bps ping` will log an INFO success
If services are ready for use, then ``bps ping`` will log an INFO success
message and exit with 0. If not, it will log ERROR messages and exit
with a non-0 exit code. If the WMS plugin did not implement the ping
functionality, a NotImplementedError will be thrown.

.. note::

`bps ping` does *not* test whether compute resources are available or
``bps ping`` does *not* test whether compute resources are available or
that jobs will run.
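The exit-code contract described above lends itself to a small wrapper. A minimal sketch, assuming the ``bps`` CLI is on PATH and that the plugin is passed via a ``--wms-service`` option (the option name is inferred from the CLI code elsewhere in this PR and may differ):

```python
import subprocess


def interpret_ping(returncode: int) -> bool:
    # Per the docs: bps ping exits 0 when the WMS services are usable,
    # and non-zero when they are not (including authentication failures).
    return returncode == 0


def ping_wms(wms_service: str) -> bool:
    """Run ``bps ping`` for one WMS plugin, e.g.
    ``lsst.ctrl.bps.htcondor.HTCondorService`` (illustrative wrapper only)."""
    proc = subprocess.run(["bps", "ping", "--wms-service", wms_service])
    return interpret_ping(proc.returncode)
```

As the note above warns, a successful ping still says nothing about whether compute resources are available or jobs will run.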

.. _bps-computesite:
@@ -673,7 +673,7 @@
BPS can be configured to either create per-job QuantumGraph files or use the
single full QuantumGraph file plus node numbers for each job. The default is
using per-job QuantumGraph files.

To use full QuantumGraph file, the submit YAML must set `whenSaveJobQgraph` to
To use full QuantumGraph file, the submit YAML must set ``whenSaveJobQgraph`` to
"NEVER" and the ``pipetask run`` command must include ``--qgraph-id {qgraphId}
--qgraph-node-id {qgraphNodeId}``. For example:

@@ -824,7 +824,7 @@
User-visible Changes

The major differences visible to users are:

- `bps report` shows new job called mergeExecutionButler in detailed view.
- ``bps report`` shows new job called mergeExecutionButler in detailed view.
This is what saves the run info into the central butler repository.
As with any job, it can succeed or fail. Different from other jobs, it
will execute at the end of a run regardless of whether a job failed or
@@ -1006,8 +1006,8 @@
Clustering
The description of all the Quanta to be executed by a submission exists in the
full QuantumGraph for the run. bps breaks that work up into compute jobs
where each compute job is assigned a subgraph of the full QuantumGraph. This
subgraph of Quanta is called a `cluster`. bps can be configured to use
different clustering algorithms by setting `clusterAlgorithm`. The default
subgraph of Quanta is called a "cluster". bps can be configured to use
different clustering algorithms by setting ``clusterAlgorithm``. The default
is single Quantum per Job.

Single Quantum per Job
@@ -1016,7 +1016,7 @@ Single Quantum per Job
This is the default clustering algorithm. Each job gets a cluster containing
a single Quantum.

Compute job names are based upon the Quantum dataId + `templateDataId`. The
Compute job names are based upon the Quantum dataId + ``templateDataId``. The
PipelineTask label is used for grouping jobs in bps report output.

Config Entries (not currently needed as it is the default):
@@ -1044,7 +1044,7 @@
The minimum configuration information is a label, a list of PipelineTask
labels, and a list of dimensions. Sometimes a submission may want to treat
two dimensions as the same thing (e.g., visit and exposure) in terms of
putting Quanta in the same cluster. That is handled in the config via
`equalDimensions` (a comma-separated list of dimA:dimB pairs).
``equalDimensions`` (a comma-separated list of dimA:dimB pairs).

Job dependencies are created based upon the Quanta dependencies. This means
that the naming and order of the clusters in the submission YAML does not
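The ``equalDimensions`` format described above (a comma-separated list of ``dimA:dimB`` pairs) is straightforward to parse. An illustrative helper, not bps's own implementation:

```python
def parse_equal_dimensions(value: str) -> list[tuple[str, str]]:
    """Split e.g. "visit:exposure" into [("visit", "exposure")]."""
    pairs = []
    for item in value.split(","):
        item = item.strip()
        if not item:
            continue  # tolerate empty entries / trailing commas
        dim_a, _, dim_b = item.partition(":")
        pairs.append((dim_a.strip(), dim_b.strip()))
    return pairs
```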
46 changes: 42 additions & 4 deletions pyproject.toml
@@ -38,8 +38,6 @@
dynamic = ["version"]

test = [
"pytest >= 3.2",
"flake8 >= 3.7.5",
"pytest-flake8 >= 1.0.4",
"pytest-openfiles >= 0.5.0"
]

@@ -106,8 +104,6 @@
line_length = 110
write_to = "python/lsst/ctrl/bps/version.py"

[tool.pytest.ini_options]
addopts = "--flake8"
flake8-ignore = ["W503", "E203", "N802", "N803", "N806", "N812", "N815", "N816"]

[tool.pydocstyle]
convention = "numpy"
@@ -119,3 +115,45 @@ convention = "numpy"
# not fit on one line.
# We do not require docstrings in __init__ files (D104).
add-ignore = ["D107", "D105", "D102", "D100", "D200", "D205", "D400", "D104"]

[tool.ruff]
exclude = [
"__init__.py",
]
ignore = [
"N802",
"N803",
"N806",
"N812",
"N815",
"N816",
"N999",
"D107",
"D105",
"D102",
"D104",
"D100",
"D200",
"D205",
"D400",
]
line-length = 110
select = [
"E", # pycodestyle
"F", # pyflakes
"N", # pep8-naming
"W", # pycodestyle
"D", # pydocstyle
"UP", # pyupgrade
"C4",
]
target-version = "py310"
extend-select = [
"RUF100", # Warn about unused noqa
]

[tool.ruff.pycodestyle]
max-doc-length = 79

[tool.ruff.pydocstyle]
convention = "numpy"
2 changes: 1 addition & 1 deletion python/lsst/ctrl/bps/bps_config.py
@@ -68,7 +68,7 @@ class BpsConfig(Config):

Parameters
----------
other : `str`, `dict`, `Config`, `BpsConfig`
other : `str`, `dict`, `~lsst.daf.butler.Config`, `BpsConfig`
Path to a yaml file or a dict/Config/BpsConfig containing configuration
to copy.
search_order : `list` [`str`], optional
2 changes: 1 addition & 1 deletion python/lsst/ctrl/bps/bps_reports.py
@@ -106,7 +106,7 @@ def from_table(cls, table):

Returns
-------
inst : `lsst.ctrl.bps.report.BaseRunReport`
inst : `lsst.ctrl.bps.bps_reports.BaseRunReport`
A report created based on the information in the provided table.
"""
inst = cls(table.dtype.descr)
2 changes: 1 addition & 1 deletion python/lsst/ctrl/bps/cli/bps.py
@@ -50,7 +50,7 @@ class BpsCli(LoaderCLI):
localCmdPkg = "lsst.ctrl.bps.cli.cmd"


@click.command(cls=BpsCli, context_settings=dict(help_option_names=["-h", "--help"]), epilog=epilog)
@click.command(cls=BpsCli, context_settings={"help_option_names": ["-h", "--help"]}, epilog=epilog)
@log_level_option(default=["INFO"])
@long_log_option()
@log_file_option()
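The ``dict(...)`` to ``{...}`` change above matches the style ruff's C4 (flake8-comprehensions) rules push toward; the two spellings build equal dictionaries:

```python
# dict() call vs. dict literal: identical result; the literal avoids a
# name lookup and function call, and is what ruff's C408 check prefers.
a = dict(help_option_names=["-h", "--help"])
b = {"help_option_names": ["-h", "--help"]}
assert a == b
```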
4 changes: 2 additions & 2 deletions python/lsst/ctrl/bps/cli/cmd/commands.py
@@ -122,7 +122,7 @@ def report(*args, **kwargs):
show_default=True,
help="Only cancel jobs submitted via bps.",
)
@click.option("--pass-thru", "pass_thru", default=str(), help="Pass the given string to the WMS service.")
@click.option("--pass-thru", "pass_thru", default="", help="Pass the given string to the WMS service.")
@click.option(
"--global/--no-global",
"is_global",
@@ -136,7 +136,7 @@
def cancel(*args, **kwargs):

@click.command(cls=BpsCommand)
@opt.wms_service_option()
@click.option("--pass-thru", "pass_thru", default=str(), help="Pass the given string to the WMS service.")
@click.option("--pass-thru", "pass_thru", default="", help="Pass the given string to the WMS service.")
def ping(*args, **kwargs):
"""Ping workflow services."""
# Note: Using return statement doesn't actually return the value