Commit

Merge remote-tracking branch 'origin/master' into issue-2507
abellgithub committed May 30, 2019
2 parents 16359e0 + 96119a0 commit 110e7ae
Showing 42 changed files with 701 additions and 989 deletions.
10 changes: 5 additions & 5 deletions doc/quickstart.rst
@@ -38,9 +38,9 @@ Install Conda
Conda installation instructions can be found at the following links. Read
through them a bit for your platform so you have an idea what to expect.

* `Windows <https://conda.io/docs/user-guide/install/windows.html>`__
* `macOS <https://conda.io/docs/user-guide/install/macos.html>`__
* `Linux <https://conda.io/docs/user-guide/install/linux.html>`__
* `Windows <https://conda.io/projects/conda/en/latest/user-guide/install/windows.html>`__
* `macOS <https://conda.io/projects/conda/en/latest/user-guide/install/macos.html>`__
* `Linux <https://conda.io/projects/conda/en/latest/user-guide/install/linux.html>`__

.. note::

@@ -56,7 +56,7 @@ On macOS and Linux, all Conda commands are typed into a terminal window. On
Windows, commands are typed into the Anaconda Prompt window. Instructions can
be found in the Conda `Getting Started`_ guide.

.. _`Getting Started`: https://conda.io/docs/user-guide/getting-started.html#starting-conda
.. _`Getting Started`: https://conda.io/projects/conda/en/latest/user-guide/getting-started.html#starting-conda


Test Installation
@@ -79,7 +79,7 @@ the PDAL maintenance branch.
It is a good idea to install PDAL in its own environment (or
add it to an existing one). You will **NOT** want to add it to your default
environment named ``base``. Managing environments is beyond the scope of
the quickstart, but you can read more about it `here <https://conda.io/docs/user-guide/getting-started.html#managing-envs>`_.
the quickstart, but you can read more about it `here <https://conda.io/projects/conda/en/latest/user-guide/getting-started.html#managing-envs>`_.

To install the PDAL package so that we can use it to run PDAL commands, we run
the following command to create an environment named ``myenv``, installing PDAL
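The command itself is cut off by the collapsed diff. A sketch of a typical invocation, under the assumption that PDAL is installed from the ``conda-forge`` channel:

```shell
# Sketch only -- the exact command in the full quickstart may differ.
# Creates an environment named "myenv" and installs PDAL into it.
conda create --yes --name myenv --channel conda-forge pdal
```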
2 changes: 0 additions & 2 deletions doc/references.rst
@@ -52,8 +52,6 @@ Reference
.. [Mesh2009] ALoopingIcon. "Meshing Point Clouds." *MESHLAB STUFF*. n.p., 7 Sept. 2009. Web. 13 Nov. 2015.
.. [Mongus2012] Mongus, D., Zalik, B., 2012. Parameter-free ground filtering of LiDAR data for automatic DTM generation. ISPRS J. Photogramm. Remote Sens. 67, 1–12.
.. [Pingel2013] Pingel, Thomas J., Keith C. Clarke, and William A. McBride. “An Improved Simple Morphological Filter for the Terrain Classification of Airborne LIDAR Data.” ISPRS Journal of Photogrammetry and Remote Sensing 77 (2013): 21–30.
.. [Rusu2008] Rusu, Radu Bogdan, et al. "Towards 3D point cloud based object maps for household environments." Robotics and Autonomous Systems 56.11 (2008): 927-941.
40 changes: 0 additions & 40 deletions doc/stages/filters.kdistance.rst

This file was deleted.

@@ -1,9 +1,9 @@
.. _filters.mongoexpression:
.. _filters.mongo:

filters.mongoexpression
filters.mongo
========================

The **Mongo Expression Filter** applies query logic to the input
The **Mongo Filter** applies query logic to the input
point cloud based on a MongoDB-style query expression using the
point cloud attributes.

@@ -21,7 +21,7 @@ This example passes through only the points whose Classification is non-zero.
[
"input.las",
{
"type": "filters.expression",
"type": "filters.mongo",
"expression": {
"Classification": { "$ne": 0 }
}
@@ -38,7 +38,7 @@ is greater than 1.
[
"input.las",
{
"type": "filters.expression",
"type": "filters.mongo",
"expression": { "$and": [
{ "ReturnNumber": "NumberOfReturns" },
{ "NumberOfReturns": { "$gt": 1 } }
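The expression semantics shown in the two examples can be sketched in plain Python. This is a hypothetical illustration of how such MongoDB-style comparisons behave; the `matches` helper and its limited operator set are assumptions for illustration, not PDAL's implementation.

```python
# Hypothetical evaluator for a tiny subset of MongoDB-style expressions.
# Covers only the operators used in the examples above: $and, $ne, $gt,
# plus bare equality, where a string operand names another dimension.
def matches(expr, point):
    for key, cond in expr.items():
        if key == "$and":
            # Every sub-expression must match.
            if not all(matches(sub, point) for sub in cond):
                return False
        elif isinstance(cond, dict):
            value = point[key]
            for op, operand in cond.items():
                if op == "$ne" and not (value != operand):
                    return False
                if op == "$gt" and not (value > operand):
                    return False
        else:
            # {"ReturnNumber": "NumberOfReturns"} compares two dimensions.
            rhs = point.get(cond, cond) if isinstance(cond, str) else cond
            if point[key] != rhs:
                return False
    return True

last_return = {"Classification": 2, "ReturnNumber": 2, "NumberOfReturns": 2}
print(matches({"Classification": {"$ne": 0}}, last_return))  # True
print(matches({"$and": [{"ReturnNumber": "NumberOfReturns"},
                        {"NumberOfReturns": {"$gt": 1}}]}, last_return))  # True
```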
64 changes: 0 additions & 64 deletions doc/stages/filters.mongus.rst

This file was deleted.

18 changes: 7 additions & 11 deletions doc/stages/filters.rst
@@ -36,16 +36,15 @@ invalidate an existing KD-tree.
filters.cluster
filters.colorinterp
filters.colorization
filters.covariancefeatures
filters.dem
filters.eigenvalues
filters.estimaterank
filters.elm
filters.ferry
filters.hag
filters.info
filters.kdistance
filters.lof
filters.mongus
filters.neighborclassifier
filters.nndistance
filters.normal
@@ -74,6 +73,10 @@ invalidate an existing KD-tree.
:ref:`filters.colorization`
Fetch and assign RGB color information from a GDAL-readable datasource.

:ref:`filters.covariancefeatures`
Filter that calculates local features based on the covariance matrix of a
point's neighborhood.

:ref:`filters.eigenvalues`
Compute pointwise eigenvalues, based on k-nearest neighbors.

@@ -90,17 +93,10 @@ invalidate an existing KD-tree.
Compute pointwise height above ground estimate. Requires points to be
classified as ground/non-ground prior to estimating.

:ref:`filters.kdistance`
Compute pointwise K-Distance (the Euclidean distance to a point's k-th
nearest neighbor). [Deprecated - use :ref:`filters.nndistance`]

:ref:`filters.lof`
Compute pointwise Local Outlier Factor (along with K-Distance and Local
Reachability Distance).

:ref:`filters.mongus`
Label ground/non-ground returns using [Mongus2012]_.

:ref:`filters.neighborclassifier`
Update pointwise classification using k-nearest neighbor consensus voting.

@@ -196,7 +192,7 @@ the input. These filters will invalidate an existing KD-tree.
filters.iqr
filters.locate
filters.mad
filters.mongoexpression
filters.mongo
filters.range
filters.sample
filters.tail
@@ -227,7 +223,7 @@ the input. These filters will invalidate an existing KD-tree.
Cull points falling outside the computed Median Absolute Deviation for a
given dimension.

:ref:`filters.mongoexpression`
:ref:`filters.mongo`
Cull points using MongoDB-style expression syntax.

:ref:`filters.range`
77 changes: 77 additions & 0 deletions doc/stages/readers.e57.rst
@@ -0,0 +1,77 @@
.. _readers.e57:

readers.e57
===========

The **E57 Reader** supports reading from E57 files.

The reader supports E57 files with Cartesian point clouds.

.. note::

E57 files can contain multiple point clouds stored in a single
file. If that is the case, the reader will read all the points
from all of the internal point clouds as one.

Only dimensions present in all of the point clouds will be read.

.. note::

Point clouds stored in spherical format are not supported.

.. note::

The E57 `cartesianInvalidState` dimension is mapped to the Omit
PDAL dimension. A range filter can be used to filter out the
invalid points.

.. plugin::

.. streamable::


Example 1
---------

.. code-block:: json

    [
        {
            "type":"readers.e57",
            "filename":"inputfile.e57"
        },
        {
            "type":"writers.text",
            "filename":"outputfile.txt"
        }
    ]

Example 2
---------

.. code-block:: json

    [
        {
            "type":"readers.e57",
            "filename":"inputfile.e57"
        },
        {
            "type":"filters.range",
            "limits":"Omit[0:0]"
        },
        {
            "type":"writers.text",
            "filename":"outputfile.txt"
        }
    ]

Options
-------

_`filename`
E57 file to read [Required]

.. include:: reader_opts.rst
2 changes: 1 addition & 1 deletion doc/stages/readers.ept.rst
@@ -98,7 +98,7 @@ threads
Number of worker threads used to download and process EPT data. A
minimum of 4 will be used no matter what value is specified.
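As a sketch, the option is set directly on the reader stage. The EPT endpoint and output filename below are placeholders for illustration, not values taken from this document:

```json
[
    {
        "type": "readers.ept",
        "filename": "ept://na.entwine.io/nyc",
        "threads": 8
    },
    "output.las"
]
```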

.. _Entwine Point Tile: https://github.com/connormanning/entwine/blob/master/doc/entwine-point-tile.md
.. _Entwine Point Tile: https://entwine.io/entwine-point-tile.html
.. _Entwine: https://entwine.io/
.. _Potree: http://potree.entwine.io/data/nyc.html
.. _Plasio: http://speck.ly/?s=http%3A%2F%2Fc%5B0-7%5D.greyhound.io&r=ept%3A%2F%2Fna.entwine.io%2Fnyc&ca=-0&ce=49.06&ct=-8239196%2C4958509.308%2C337&cd=42640.943&cmd=125978.13&ps=2&pa=0.1&ze=1&c0s=remote%3A%2F%2Fimagery%3Furl%3Dhttp%3A%2F%2Fserver.arcgisonline.com%2FArcGIS%2Frest%2Fservices%2FWorld_Imagery%2FMapServer%2Ftile%2F%7B%7Bz%7D%7D%2F%7B%7By%7D%7D%2F%7B%7Bx%7D%7D.jpg
55 changes: 55 additions & 0 deletions doc/stages/writers.e57.rst
@@ -0,0 +1,55 @@
.. _writers.e57:

writers.e57
===========

The **E57 Writer** supports writing to E57 files.

The writer supports E57 files with Cartesian point clouds.

.. note::

E57 files can contain multiple point clouds stored in a single
file. The writer will only write a single cloud per file.

.. note::

Spherical format points are not supported.

.. note::

The E57 `cartesianInvalidState` dimension is mapped to the Omit
PDAL dimension. A range filter can be used to filter out the
invalid points.
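
A hedged sketch of that note in pipeline form, dropping the invalid points before writing a new E57 file (the filenames are placeholders):

```json
[
    {
        "type": "readers.e57",
        "filename": "input.e57"
    },
    {
        "type": "filters.range",
        "limits": "Omit[0:0]"
    },
    {
        "type": "writers.e57",
        "filename": "output.e57"
    }
]
```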

.. plugin::

.. streamable::


Example
-------

.. code-block:: json

    [
        {
            "type":"readers.las",
            "filename":"inputfile.las"
        },
        {
            "type":"writers.e57",
            "filename":"outputfile.e57",
            "doublePrecision":false
        }
    ]

Options
-------

_`filename`
E57 file to write [Required]

doublePrecision
Use double precision for storage (false by default).
2 changes: 1 addition & 1 deletion doc/stages/writers.sbet.rst
@@ -3,7 +3,7 @@
writers.sbet
============

The **SBET writer** writes files in the SBET format, used for exchange data from interital measurement units (IMUs).
The **SBET writer** writes files in the SBET format, used to exchange data from inertial measurement units (IMUs).

.. embed::

