Handle repeated options. (#2021)
* Handle repeated options.
Close #1941

* English language fix.
abellgithub committed May 21, 2018
1 parent 4e7fe68 commit d1a5e7d
Showing 6 changed files with 181 additions and 32 deletions.
32 changes: 29 additions & 3 deletions doc/apps/pipeline.rst
@@ -31,9 +31,8 @@ Substitutions

The ``pipeline`` command can accept command-line option substitutions and
they replace
-existing options that are specified in the input JSON pipeline. If
-multiple stages of the same name exist in the pipeline, `all` stages would
-be overridden. For example, to set the output and input LAS files for a
+existing options that are specified in the input JSON pipeline.
+For example, to set the output and input LAS files for a
pipeline that does a translation, the ``filename`` for the reader and the
writer can be overridden:

@@ -42,6 +41,33 @@ writer can be overridden:
$ pdal pipeline translate.json --writers.las.filename=output.laz \
--readers.las.filename=input.las

If multiple stages of the same name exist in the pipeline, `all` such
stages are overridden. In the following example, the command shown after
the pipeline overrides the `dimensions` option of both colorization
filters with the value "Red:1:1.0, Blue, Green::256.0":

::

    {
      "pipeline" : [
        "input.las",
        {
          "type" : "filters.colorization",
          "raster" : "raster1.tiff",
          "dimensions": "Red"
        },
        {
          "type" : "filters.colorization",
          "raster" : "raster2.tiff",
          "dimensions": "Blue"
        },
        "placeholder.laz"
      ]
    }

    $ pdal pipeline colorize.json \
        --filters.colorization.dimensions="Red:1:1.0, Blue, Green::256.0"

Option substitution can also refer to the tag of an individual stage,
using the syntax ``--stage.<tagname>.<option>``. This allows options to be
set on individual stages, even if there are multiple stages of the same
type, as in the sketch below.
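
A minimal sketch of tag-based substitution (the tags ``reader1`` and
``reader2`` and all filenames here are hypothetical):

::

    {
      "pipeline" : [
        {
          "type" : "readers.las",
          "tag" : "reader1",
          "filename" : "file1.las"
        },
        {
          "type" : "readers.las",
          "tag" : "reader2",
          "filename" : "file2.las"
        },
        "merged.las"
      ]
    }

    $ pdal pipeline tagged.json --stage.reader2.filename=other.las

This overrides ``filename`` only on the stage tagged ``reader2``, leaving
the first reader untouched.
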
56 changes: 46 additions & 10 deletions doc/pipeline.rst
@@ -113,6 +113,39 @@ with the :ref:`writers.gdal` writer:

.. _processing_modes:

Point Views and Multiple Outputs
................................................................................

Some filters produce multiple sets of points as output. :ref:`filters.splitter`,
for example, creates a point set for each tile (rectangular area) in which
input points exist. Each of these output sets is called a point view.
Point views are carried through a PDAL pipeline individually. Some writers
can produce separate output for each input point view. Such writers use a
placeholder character (``#``) in the output filename, which is replaced by
an incrementing integer for each input point view.

The following pipeline provides an example of writing multiple output
files from a single pipeline. The crop filter creates two output point views
(one for each specified geometry) and the writer creates output files
'output1.las' and 'output2.las' containing the two sets of points:

.. code-block:: json

    {
      "pipeline":[
        "input.las",
        {
          "type" : "filters.crop",
          "bounds" : [ "([0, 75], [0, 75])", "([50, 125], [50, 125])" ]
        },
        "output#.las"
      ]
    }

Processing Modes
--------------------------------------------------------------------------------

@@ -194,6 +227,8 @@ For more on PDAL stages and their options, check the PDAL documentation on
* A stage object may have additional members with names corresponding to
stage-specific option names and their respective values. Values provided as
JSON objects or arrays will be stringified and parsed within the stage.
Some options allow multiple inputs. In those cases, provide the option
values as a JSON array (see the example following this list).

* Applications can place a ``user_data`` node on any stage object and it will be
carried through to any serialized pipeline output.
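
For example, :ref:`filters.range` accepts its ``limits`` option multiple
times. A sketch of the array form, mirroring the test data added in this
commit:

::

    {
      "type": "filters.range",
      "limits": [
        "X[2:5]",
        "X[7:9]",
        "Y[103:105]"
      ]
    }

Each array element becomes a separate ``limits`` option on the stage.
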
@@ -205,6 +240,11 @@ Filename Globbing
characters. This can be useful if working with multiple input files in a
directory (e.g., merging all files).

Filename globbing works *only* in pipeline specifications. It doesn't work
when a filename is provided as a command-line option to an application such
as ``pdal pipeline`` or ``pdal translate``.
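
A minimal sketch of globbing inside a pipeline (the directory and filenames
are hypothetical):

::

    {
      "pipeline":[
        "tiles/*.las",
        "merged.las"
      ]
    }
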


Extended Examples
--------------------------------------------------------------------------------

@@ -331,7 +371,7 @@ written as ASCII text.
        {
          "type":"colorization",
          "raster":"autzen.tif",
-         "dimensions":"Red:1:1, Green:2:1, Blue:3:1"
+         "dimensions": ["Red:1:1", "Green:2:1", "Blue:3:1"]
        },
        {
          "filename":"junk.txt",
@@ -341,12 +381,11 @@ written as ASCII text.
]
}
-Merge & Reproject
+Reproject
................................................................................

This is our first example with multiple readers. The pipeline infers the reader types
-and assigns spatial reference information to each. Next, the
-:ref:`filters.merge` merges points from all previous readers, and the
+and assigns spatial reference information to each. The
:ref:`filters.reprojection` filter reprojects data to the specified output
spatial reference system.

@@ -362,9 +401,6 @@ spatial reference system.
"filename":"1.2-with-color.las",
"spatialreference":"EPSG:2027"
},
{
"type":"filters.merge"
},
{
"type":"reprojection",
"out_srs":"EPSG:2028"
@@ -419,9 +455,9 @@ is a consumer of data, and a Filter is an actor on data.

.. note::

-    As a C++ API consumer, you are generally not supposed to worry about the underlying
-    storage of the PointView, but there might be times when you simply just
-    "want the data." In those situations, you can use the
+    As a C++ API consumer, you are generally not supposed to worry about
+    the underlying storage of the PointView, but there might be times when
+    you simply "want the data." In those situations, you can use the
     :cpp:func:`pdal::PointView::getBytes` method to stream out the raw storage.


62 changes: 43 additions & 19 deletions pdal/PipelineReaderJSON.cpp
@@ -139,11 +139,12 @@ void PipelineReaderJSON::parsePipeline(Json::Value& tree)
void PipelineReaderJSON::readPipeline(std::istream& input)
{
    Json::Value root;
-   Json::Reader jsonReader;
-   if (!jsonReader.parse(input, root))
+   Json::CharReaderBuilder builder;
+   // Reject duplicate keys so repeated options can't silently collide.
+   builder["rejectDupKeys"] = true;
+   std::string err;
+   if (!parseFromStream(builder, input, &root, &err))
    {
-       std::string err = "JSON pipeline: Unable to parse pipeline:\n";
-       err += jsonReader.getFormattedErrorMessages();
+       err = "JSON pipeline: Unable to parse pipeline:\n" + err;
        throw pdal_error(err);
    }

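The duplicate-key rejection comes from JsonCpp's ``rejectDupKeys`` builder
setting. A minimal standalone sketch of the same behavior (plain JsonCpp,
outside PDAL; not part of this patch):

::

    #include <json/json.h>

    #include <iostream>
    #include <sstream>

    int main()
    {
        // Two "limits" keys in one object, as in range_bad_limits.json.
        std::istringstream input(
            R"({ "limits": "X[2:5]", "limits": "Y[103:105]" })");

        Json::CharReaderBuilder builder;
        builder["rejectDupKeys"] = true;

        Json::Value root;
        std::string err;
        if (!Json::parseFromStream(builder, input, &root, &err))
            std::cerr << "parse failed:\n" << err << "\n";
        return 0;
    }
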
@@ -306,38 +307,61 @@ std::vector<Stage *> PipelineReaderJSON::extractInputs(Json::Value& node,
return inputs;
}

+namespace
+{
+
+// Add a scalar JSON value to 'options' under 'name'.  Returns false for
+// arrays and objects, which the caller handles itself.
+bool extractOption(Options& options, const std::string& name,
+    const Json::Value& node)
+{
+    if (node.isString())
+        options.add(name, node.asString());
+    else if (node.isInt())
+        options.add(name, node.asInt64());
+    else if (node.isUInt())
+        options.add(name, node.asUInt64());
+    else if (node.isDouble())
+        options.add(name, node.asDouble());
+    else if (node.isBool())
+        options.add(name, node.asBool());
+    else if (node.isNull())
+        options.add(name, "");
+    else
+        return false;
+    return true;
+}
+
+} // unnamed namespace
+
Options PipelineReaderJSON::extractOptions(Json::Value& node)
{
    Options options;

    for (const std::string& name : node.getMemberNames())
    {
+       Json::Value& subnode(node[name]);
+
        if (name == "plugin")
        {
-           PluginManager<Stage>::loadPlugin(node[name].asString());
+           PluginManager<Stage>::loadPlugin(subnode.asString());

            // Don't actually put a "plugin" option on
            // any stage
            continue;
        }

-       if (node[name].isString())
-           options.add(name, node[name].asString());
-       else if (node[name].isInt())
-           options.add(name, node[name].asInt64());
-       else if (node[name].isUInt())
-           options.add(name, node[name].asUInt64());
-       else if (node[name].isDouble())
-           options.add(name, node[name].asDouble());
-       else if (node[name].isBool())
-           options.add(name, node[name].asBool());
-       else if (node[name].isNull())
-           options.add(name, "");
-       else if (node[name].isArray() || node[name].isObject())
+       if (extractOption(options, name, subnode))
+           continue;
+       // A repeated option arrives as a JSON array; add each element.
+       else if (subnode.isArray())
+       {
+           for (const Json::Value& val : subnode)
+               if (!extractOption(options, name, val))
+                   throw pdal_error("JSON pipeline: Invalid value type for "
+                       "option list '" + name + "'.");
+       }
+       else if (subnode.isObject())
        {
            Json::FastWriter w;
-           options.add(name, w.write(node[name]));
+           options.add(name, w.write(subnode));
        }
        else
            throw pdal_error("JSON pipeline: Value of stage option '" +
18 changes: 18 additions & 0 deletions test/data/pipeline/range_bad_limits.json.in
@@ -0,0 +1,18 @@
{
  "pipeline":[
    {
      "type": "readers.faux",
      "bounds": "([1,10], [101,110], [201,210])",
      "mode": "ramp",
      "count": 10
    },
    {
      "type": "filters.range",
      "limits": "Y[108:109]",
      "limits": "X[2:5]",
      "limits": "Z[1:1000]",
      "limits": "X[7:9]",
      "limits": "Y[103:105]"
    }
  ]
}
20 changes: 20 additions & 0 deletions test/data/pipeline/range_multi_limits.json.in
@@ -0,0 +1,20 @@
{
  "pipeline":[
    {
      "type": "readers.faux",
      "bounds": "([1,10], [101,110], [201,210])",
      "mode": "ramp",
      "count": 10
    },
    {
      "type": "filters.range",
      "limits": [
        "Y[108:109]",
        "X[2:5]",
        "Z[1:1000]",
        "X[7:9]",
        "Y[103:105]"
      ]
    }
  ]
}
25 changes: 25 additions & 0 deletions test/unit/apps/pcpipelineTestJSON.cpp
@@ -312,6 +312,31 @@ TEST(json, issue1417)
    run_pipeline("pipeline/issue1417.json", options);
}

// Make sure we handle repeated options properly
TEST(json, issue1941)
{
    PipelineManager manager;
    std::string file;

    file = Support::configuredpath("pipeline/range_multi_limits.json");
    manager.readPipeline(file);
    // filters.range ORs limits on the same dimension and ANDs limits
    // across dimensions, so five points (X = 3, 4, 5, 8, 9) survive.
    EXPECT_EQ(manager.execute(), (point_count_t)5);
    const PointViewSet& s = manager.views();
    EXPECT_EQ(s.size(), 1U);
    PointViewPtr view = *s.begin();
    EXPECT_EQ(view->getFieldAs<int>(Dimension::Id::X, 0), 3);
    EXPECT_EQ(view->getFieldAs<int>(Dimension::Id::X, 1), 4);
    EXPECT_EQ(view->getFieldAs<int>(Dimension::Id::X, 2), 5);
    EXPECT_EQ(view->getFieldAs<int>(Dimension::Id::X, 3), 8);
    EXPECT_EQ(view->getFieldAs<int>(Dimension::Id::X, 4), 9);

    PipelineManager manager2;

    // The same limits given as repeated keys must now fail to parse.
    file = Support::configuredpath("pipeline/range_bad_limits.json");
    EXPECT_THROW(manager2.readPipeline(file), pdal_error);
}


// Test that stage options passed via --stage.<tagname>.<option> work.
TEST(json, stagetags)
{
