DM-27131: Reorganize pickling to allow subclasses to add parameters #153

Merged: 2 commits, Oct 12, 2020
7 changes: 4 additions & 3 deletions examples/argumentParser.py
@@ -22,13 +22,14 @@
#
"""Example showing use of the argument parser

Here are some examples that use the repository in obs_test (which is automatically setup
when pipe_base is setup):
Here are some examples that use the repository in obs_test (which is
automatically setup when pipe_base is setup):

./argumentParser.py $OBS_TEST_DIR/data/input --help
./argumentParser.py $OBS_TEST_DIR/data/input --id --show config data
./argumentParser.py $OBS_TEST_DIR/data/input --id filter=g --show data
./argumentParser.py $OBS_TEST_DIR/data/input --id filter=g --config oneFloat=1.5 --show config
./argumentParser.py $OBS_TEST_DIR/data/input --id filter=g \
--config oneFloat=1.5 --show config
"""
import lsst.pex.config as pexConfig
import lsst.pipe.base as pipeBase
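The docstring above only lists command-line invocations. As a rough sketch of what those invocations assume (not part of this diff; the config class and field default are modeled on the example script, and the Gen2 ``ArgumentParser``/``add_id_argument``/``parse_args`` API is assumed), the parser setup could look like:

```python
import lsst.pex.config as pexConfig
import lsst.pipe.base as pipeBase


class ExampleConfig(pexConfig.Config):
    """Config exercised by ``--config oneFloat=1.5`` in the examples above."""
    oneFloat = pexConfig.Field(dtype=float, default=3.14, doc="a sample float field")


if __name__ == "__main__":
    # Build a Gen2 ArgumentParser and register an --id argument for the
    # "raw" dataset type, matching the ``--id filter=g`` invocations above.
    parser = pipeBase.ArgumentParser(name="argumentParser")
    parser.add_id_argument("--id", "raw", help="data ID, e.g. --id visit=1 filter=g")
    parsedCmd = parser.parse_args(config=ExampleConfig())
    print(parsedCmd.config)
```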
12 changes: 8 additions & 4 deletions python/lsst/pipe/base/argumentParser.py
@@ -157,7 +157,8 @@ def castDataIds(self, butler):
try:
keyType = idKeyTypeDict[key]
except KeyError:
# OK, assume that it's a valid key and guess that it's a string
# OK, assume that it's a valid key and guess that it's a
# string
keyType = str

log = lsstLog.Log.getDefaultLogger()
@@ -735,7 +736,8 @@ def _parseDirectories(self, namespace):
mapperClass = dafPersist.Butler.getMapperClass(_fixPath(DEFAULT_INPUT_NAME, namespace.rawInput))
namespace.calib = _fixPath(DEFAULT_CALIB_NAME, namespace.rawCalib)

# If an output directory is specified, process it and assign it to the namespace
# If an output directory is specified, process it and assign it to the
# namespace
if namespace.rawOutput:
namespace.output = _fixPath(DEFAULT_OUTPUT_NAME, namespace.rawOutput)
else:
@@ -850,7 +852,8 @@ def handleCamera(self, namespace):
Namespace (an `argparse.Namespace`) with the following fields:

- ``camera``: the camera name.
- ``config``: the config passed to parse_args, with no overrides applied.
- ``config``: the config passed to parse_args, with no overrides
applied.
- ``obsPkg``: the ``obs_`` package for this camera.
- ``log``: a `lsst.log` Log.

@@ -1009,7 +1012,8 @@ class FilteredStream:
"""

def __init__(self, pattern):
# obey case if pattern isn't lowecase or requests NOIGNORECASE
# obey case if pattern isn't lowecase or requests
# NOIGNORECASE
mat = re.search(r"(.*):NOIGNORECASE$", pattern)

if mat:
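As an aside, the case-handling rule described in that comment is easy to demonstrate in isolation; the helper below is an illustrative sketch, not the module's actual ``FilteredStream`` code:

```python
import re


def compileShowPattern(pattern):
    # Illustrative sketch of the rule in the comment above: a trailing
    # ":NOIGNORECASE" or any uppercase character keeps the match case
    # sensitive; an all-lowercase pattern is matched case-insensitively.
    mat = re.search(r"(.*):NOIGNORECASE$", pattern)
    if mat:
        return re.compile(mat.group(1))
    if pattern != pattern.lower():
        return re.compile(pattern)
    return re.compile(pattern, re.IGNORECASE)


print(bool(compileShowPattern("config.doFoo").match("config.dofoo")))  # False: uppercase obeys case
print(bool(compileShowPattern("config.dofoo").match("config.DoFoo")))  # True: all lowercase ignores case
```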
75 changes: 42 additions & 33 deletions python/lsst/pipe/base/butlerQuantumContext.py
@@ -108,28 +108,32 @@ def get(self, dataset: typing.Union[InputQuantizedConnection,
Parameters
----------
dataset
This argument may either be an `InputQuantizedConnection` which describes
all the inputs of a quantum, a list of `~lsst.daf.butler.DatasetRef`, or
a single `~lsst.daf.butler.DatasetRef`. The function will get and return
This argument may either be an `InputQuantizedConnection` which
describes all the inputs of a quantum, a list of
`~lsst.daf.butler.DatasetRef`, or a single
`~lsst.daf.butler.DatasetRef`. The function will get and return
the corresponding datasets from the butler.

Returns
-------
return : `object`
This function returns arbitrary objects fetched from the bulter. The
structure these objects are returned in depends on the type of the input
argument. If the input dataset argument is a InputQuantizedConnection, then
the return type will be a dictionary with keys corresponding to the attributes
of the `InputQuantizedConnection` (which in turn are the attribute identifiers
of the connections). If the input argument is of type `list` of
`~lsst.daf.butler.DatasetRef` then the return type will be a list of objects.
If the input argument is a single `~lsst.daf.butler.DatasetRef` then a single
object will be returned.
This function returns arbitrary objects fetched from the bulter.
The structure these objects are returned in depends on the type of
the input argument. If the input dataset argument is a
`InputQuantizedConnection`, then the return type will be a
dictionary with keys corresponding to the attributes of the
`InputQuantizedConnection` (which in turn are the attribute
identifiers of the connections). If the input argument is of type
`list` of `~lsst.daf.butler.DatasetRef` then the return type will
be a list of objects. If the input argument is a single
`~lsst.daf.butler.DatasetRef` then a single object will be
returned.

Raises
------
ValueError
If a `DatasetRef` is passed to get that is not defined in the quantum object
Raised if a `DatasetRef` is passed to get that is not defined in
the quantum object
"""
if isinstance(dataset, InputQuantizedConnection):
retVal = {}
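For context, the call styles described in this ``get`` docstring usually appear inside ``PipelineTask.runQuantum``; the sketch below mirrors the default ``runQuantum`` pattern, with an illustrative task class (config and connections classes omitted):

```python
from lsst.pipe.base import PipelineTask


class ExampleTask(PipelineTask):
    # Config/connections classes omitted for brevity; the class is illustrative.

    def runQuantum(self, butlerQC, inputRefs, outputRefs):
        # Passing the whole InputQuantizedConnection returns a dict keyed by
        # connection name; a list of DatasetRef would return a list, and a
        # single DatasetRef a single object.
        inputs = butlerQC.get(inputRefs)
        outputs = self.run(**inputs)
        # The Struct returned by run() is matched attribute-by-attribute
        # against the OutputQuantizedConnection.
        butlerQC.put(outputs, outputRefs)
```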
@@ -154,26 +158,30 @@ def put(self, values: typing.Union[Struct, typing.List[typing.Any], object],
Parameters
----------
values : `Struct` or `list` of `object` or `object`
The data that should be put with the butler. If the type of the dataset
is `OutputQuantizedConnection` then this argument should be a `Struct`
with corresponding attribute names. Each attribute should then correspond
to either a list of object or a single object depending of the type of the
corresponding attribute on dataset. I.e. if dataset.calexp is [datasetRef1,
datasetRef2] then values.calexp should be [calexp1, calexp2]. Like wise
if there is a single ref, then only a single object need be passed. The same
restriction applies if dataset is directly a `list` of `DatasetRef` or a
single `DatasetRef`.
The data that should be put with the butler. If the type of the
dataset is `OutputQuantizedConnection` then this argument should be
a `Struct` with corresponding attribute names. Each attribute
should then correspond to either a list of object or a single
object depending of the type of the corresponding attribute on
dataset. I.e. if ``dataset.calexp`` is
``[datasetRef1, datasetRef2]`` then ``values.calexp`` should be
``[calexp1, calexp2]``. Like wise if there is a single ref, then
only a single object need be passed. The same restriction applies
if dataset is directly a `list` of `DatasetRef` or a single
`DatasetRef`.
dataset
This argument may either be an `InputQuantizedConnection` which describes
all the inputs of a quantum, a list of `lsst.daf.butler.DatasetRef`, or
a single `lsst.daf.butler.DatasetRef`. The function will get and return
This argument may either be an `InputQuantizedConnection` which
describes all the inputs of a quantum, a list of
`lsst.daf.butler.DatasetRef`, or a single
`lsst.daf.butler.DatasetRef`. The function will get and return
the corresponding datasets from the butler.

Raises
------
ValueError
If a `DatasetRef` is passed to put that is not defined in the quantum object, or
the type of values does not match what is expected from the type of dataset.
Raised if a `DatasetRef` is passed to put that is not defined in
the quantum object, or the type of values does not match what is
expected from the type of dataset.
"""
if isinstance(dataset, OutputQuantizedConnection):
if not isinstance(values, Struct):
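To make the ``values``/``dataset`` correspondence concrete, here is a small illustrative sketch; the connection name ``calexp`` comes from the docstring, while ``metadata`` and the helper function are invented for the example:

```python
from lsst.pipe.base import Struct


def putOutputs(butlerQC, outputRefs, calexp1, calexp2, taskMetadata):
    # Assume the output connections define ``calexp`` (two DatasetRefs)
    # and ``metadata`` (a single DatasetRef); names are illustrative.
    values = Struct(
        calexp=[calexp1, calexp2],  # one object per ref in outputRefs.calexp
        metadata=taskMetadata,      # a single object for the single ref
    )
    butlerQC.put(values, outputRefs)
```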
@@ -199,19 +207,20 @@ def put(self, values: typing.Union[Struct, typing.List[typing.Any], object],
raise TypeError("Dataset argument is not a type that can be used to put")

def _checkMembership(self, ref: typing.Union[typing.List[DatasetRef], DatasetRef], inout: set):
"""Internal function used to check if a DatasetRef is part of the input quantum
"""Internal function used to check if a DatasetRef is part of the input
quantum

This function will raise an exception if the ButlerQuantumContext is used to
get/put a DatasetRef which is not defined in the quantum.
This function will raise an exception if the ButlerQuantumContext is
used to get/put a DatasetRef which is not defined in the quantum.

Parameters
----------
ref : `list` of `DatasetRef` or `DatasetRef`
Either a list or a single `DatasetRef` to check
inout : `set`
The connection type to check, e.g. either an input or an output. This prevents
both types needing to be checked for every operation, which may be important
for Quanta with lots of `DatasetRef`s.
The connection type to check, e.g. either an input or an output.
This prevents both types needing to be checked for every operation,
which may be important for Quanta with lots of `DatasetRef`.
"""
if not isinstance(ref, list):
ref = [ref]
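The membership check described above amounts to normalising the argument to a list and verifying each ref was declared in the quantum; a standalone sketch (not the class's actual internals) is:

```python
import typing


def checkMembership(ref: typing.Union[typing.List[object], object], allowed: set):
    # Standalone sketch of the check described above: normalise to a list,
    # then require every ref to appear in the allowed input/output set.
    refs = ref if isinstance(ref, list) else [ref]
    for r in refs:
        if r not in allowed:
            raise ValueError(f"DatasetRef {r} is not defined in the quantum")
```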