
Commit

Merge remote-tracking branch 'origin/master'
abellgithub committed Jan 17, 2017
2 parents 8ac869c + ded16ad commit a6e0e27
Showing 24 changed files with 326 additions and 277 deletions.
2 changes: 1 addition & 1 deletion doc/download.rst
@@ -71,7 +71,7 @@ a call for help with building current Windows PDAL builds is at https://lists.os
RPMs
................................................................................

RPMs for PDAL are available at http://pdal.s3-website-us-east-1.amazonaws.com/rpms/
RPMs for PDAL are available at https://copr.fedorainfracloud.org/coprs/neteler/pdal/

Debian
................................................................................
5 changes: 5 additions & 0 deletions doc/stages/filters.predicate.rst
@@ -8,6 +8,11 @@ Like the :ref:`filters.programmable` filter, the predicate filter applies a
the stream by setting true/false values into a special "Mask" dimension in the
output point array.

.. note::

See :ref:`filters.programmable` for documentation about how to access
the ``metadata``, ``spatialreference``, and ``schema`` variables.

.. code-block:: python

  import numpy as np
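The predicate example above is collapsed in this view. As a minimal, hedged sketch of the ``Mask`` convention described earlier (the classification codes kept here are purely illustrative and not part of the original example):

.. code-block:: python

  import numpy as np

  def filter(ins, outs):
      # Keep only points whose Classification is 2 or 3; the chosen codes
      # are illustrative. Setting True/False into 'Mask' tells the
      # predicate filter which points to pass downstream.
      cls = ins['Classification']
      keep = np.logical_or(np.equal(cls, 2), np.equal(cls, 3))
      outs['Mask'] = keep
      return True
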
80 changes: 65 additions & 15 deletions doc/stages/filters.programmable.rst
@@ -3,15 +3,20 @@
filters.programmable
====================

The programmable filter takes a stream of points and applies a `Python`_
function to each point in the stream.
The programmable filter allows `Python`_ code to be embedded in a
:ref:`pipeline`. The code interacts with a `NumPy`_ array of the point data
and allows you to modify those points. Additionally, some global
:ref:`metadata` is available for Python functions to interact with.

The function must have two `NumPy`_ arrays as arguments, `ins` and `outs`. The
`ins` array represents input points, the `outs` array represents output points.
Each array contains all the dimensions of the point schema, for a number of
points (depending on how large a point buffer the pipeline is processing at the
time, a run-time consideration). Individual arrays for each dimension can be
read from the input point and written to the output point.
The function must have two `NumPy`_ arrays as arguments, ``ins`` and ``outs``.
The ``ins`` array represents the points before the ``filters.programmable``
filter and the ``outs`` array represents the points after filtering.

.. warning::

    Each array contains all the :ref:`dimensions` of the incoming ``ins``
    point schema. Each array in the ``outs`` list must match the ``ins``
    `NumPy`_ array of the same name in shape and type.


.. code-block:: python
@@ -24,14 +29,20 @@ read from the input point and written to the output point.
outs['Z'] = Z
return True
Note that the function always returns `True`. If the function returned `False`,
an error would be thrown and the translation shut down.
If you want to write a dimension that might not be available, you can use one
or more `add_dimension` options.
To filter points based on a `Python`_ function, use the
:ref:`filters.predicate` filter.
1) The function must always return `True` upon success. If the function returns
`False`, an error is thrown and the :ref:`pipeline` exits.

2) If you want to write a dimension that might not be available, you can use one
or more ``add_dimension`` options (see the sketch below).
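
As a hedged illustration of point 2 (the ``NormalizedIntensity`` dimension name and the scaling factor are hypothetical; the dimension is assumed to have been declared through the filter's ``add_dimension`` option, and ``Intensity`` is assumed to be present in the input):

.. code-block:: python

  import numpy as np

  def scale_intensity(ins, outs):
      # 'NormalizedIntensity' only exists because the pipeline declared it
      # via add_dimension; 'Intensity' comes from the upstream reader.
      outs['NormalizedIntensity'] = ins['Intensity'].astype(np.float64) / 65535.0
      return True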

.. note::

To filter points based on a `Python`_ function, use the
:ref:`filters.predicate` filter.

Example
-------
@@ -71,9 +82,48 @@ which scales up the Z coordinate by a factor of 10.
outs['Z'] = Z
return True
Module Globals
--------------------------------------------------------------------------------

Three global variables are added to the Python module when it is run, giving
access to :ref:`dimensions`, :ref:`metadata`, and coordinate system
information. Additionally, the function can set the ``metadata`` object to
modify metadata for the in-scope :ref:`filters.programmable`
:cpp:class:`pdal::Stage`.

.. code-block:: python

  def myfunc(ins, outs):
      print('schema: ', schema)
      print('srs: ', spatialreference)
      print('metadata: ', metadata)
      outs = ins
      return True

Updating metadata
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

The function can update the global ``metadata`` dictionary as needed. Declare
``metadata`` as a **global** Python variable in the function's scope, and the
updates will be reflected back into the pipeline from that stage forward.

.. code-block:: python

  def myfunc(ins, outs):
      global metadata
      metadata = {'name': 'root', 'value': 'a string', 'type': 'string',
                  'description': 'a description',
                  'children': [{'name': 'filters.programmable', 'value': 52,
                                'type': 'integer',
                                'description': 'a filter description',
                                'children': []},
                               {'name': 'readers.faux', 'value': 'another string',
                                'type': 'string',
                                'description': 'a reader description',
                                'children': []}]}
      return True

Standard output and error
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

A ``redirector`` module is available for scripts to output to PDAL's log stream
explicitly. The module handles redirecting ``sys.stderr`` and ``sys.stdout`` for you
transparently, but it can be used directly by scripts. See the PDAL source
code for more details.
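
For example (a minimal sketch, assuming the function is attached to the filter and that ``X`` and ``Z`` dimensions are present in the incoming schema), an ordinary ``print`` call ends up in PDAL's log rather than on the console:

.. code-block:: python

  def myfunc(ins, outs):
      # sys.stdout/sys.stderr are redirected to PDAL's log stream by the
      # redirector module, so this message appears in the PDAL log.
      print('filtering', len(ins['X']), 'points')
      outs['Z'] = ins['Z']
      return True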


Options
-------
--------------------------------------------------------------------------------

script
When reading a function from a separate `Python`_ file, the file name to read
2 changes: 1 addition & 1 deletion kernels/InfoKernel.cpp
@@ -330,7 +330,7 @@ MetadataNode InfoKernel::run(const std::string& filename)
void InfoKernel::dump(MetadataNode& root)
{
if (m_showSchema)
root.add(m_manager.pointTable().toMetadata().clone("schema"));
root.add(m_manager.pointTable().layout()->toMetadata().clone("schema"));

if (m_PointCloudSchemaOutput.size() > 0)
{
2 changes: 1 addition & 1 deletion pdal/PipelineExecutor.cpp
@@ -76,7 +76,7 @@ std::string PipelineExecutor::getSchema() const
throw pdal_error("Pipeline has not been executed!");

std::stringstream strm;
MetadataNode root = m_manager.pointTable().toMetadata().clone("schema");
MetadataNode root = m_manager.pointTable().layout()->toMetadata().clone("schema");
pdal::Utils::toJSON(root, strm);
return strm.str();
}
19 changes: 19 additions & 0 deletions pdal/PointLayout.cpp
@@ -318,5 +318,24 @@ Dimension::Type PointLayout::resolveType(Dimension::Type t1,
}
}

MetadataNode PointLayout::toMetadata() const
{
    MetadataNode root;

    for (const auto& id : dims())
    {
        MetadataNode dim("dimensions");
        dim.add("name", dimName(id));
        Dimension::Type t = dimType(id);
        dim.add("type", Dimension::toName(Dimension::base(t)));
        dim.add("size", dimSize(id));
        root.addList(dim);
    }

    return root;
}

} // namespace pdal


4 changes: 4 additions & 0 deletions pdal/PointLayout.hpp
@@ -41,6 +41,7 @@

#include <pdal/DimDetail.hpp>
#include <pdal/DimType.hpp>
#include <pdal/Metadata.hpp>

namespace pdal
{
@@ -225,6 +226,9 @@ class PointLayout
*/
PDAL_DLL const Dimension::Detail *dimDetail(Dimension::Id id) const;


PDAL_DLL MetadataNode toMetadata() const;

private:
PDAL_DLL virtual bool update(Dimension::Detail dd, const std::string& name);

15 changes: 1 addition & 14 deletions pdal/PointTable.cpp
@@ -108,20 +108,7 @@ char *PointTable::getPoint(PointId idx)

MetadataNode BasePointTable::toMetadata() const
{
const PointLayoutPtr l(layout());
MetadataNode root;

for (const auto& id : l->dims())
{
MetadataNode dim("dimensions");
dim.add("name", l->dimName(id));
Dimension::Type t = l->dimType(id);
dim.add("type", Dimension::toName(Dimension::base(t)));
dim.add("size", l->dimSize(id));
root.addList(dim);
}

return root;
return layout()->toMetadata();
}

} // namespace pdal
4 changes: 2 additions & 2 deletions pdal/PointView.hpp
@@ -56,7 +56,7 @@ namespace pdal
{
namespace plang
{
class BufferedInvocation;
class Invocation;
}

struct PointViewLess;
@@ -68,7 +68,7 @@ typedef std::set<PointViewPtr, PointViewLess> PointViewSet;

class PDAL_DLL PointView : public PointContainer
{
friend class plang::BufferedInvocation;
friend class plang::Invocation;
friend class PointIdxRef;
friend struct PointViewLess;
public:
5 changes: 4 additions & 1 deletion pdal/StageFactory.cpp
@@ -140,6 +140,7 @@ StringList StageFactory::extensions(const std::string& driver)
{ "writers.sbet", { "sbet" } },
{ "writers.derivative", { "derivative" } },
{ "writers.sqlite", { "sqlite" } },
{ "writers.gdal", { "tif", "tiff", "vrt" } },
};

return exts[driver];
@@ -216,7 +217,9 @@ std::string StageFactory::inferWriterDriver(const std::string& filename)
{ "txt", "writers.text" },
{ "xyz", "writers.text" },
{ "", "writers.text" },
{ "tif", "writers.gdal" }
{ "tif", "writers.gdal" },
{ "tiff", "writers.gdal" },
{ "vrt", "writers.gdal" }
};

// Strip off '.' and make lowercase.
121 changes: 0 additions & 121 deletions pdal/plang/BufferedInvocation.cpp

This file was deleted.
