Commit

Merge remote-tracking branch 'origin/master' into wkt1
abellgithub committed Mar 10, 2020
2 parents 4df43a6 + b8390e4 commit 93701af
Showing 157 changed files with 9,316 additions and 8,624 deletions.
16 changes: 16 additions & 0 deletions cmake/gtest.cmake
@@ -3,4 +3,20 @@ if (MSVC)
#link dynamically too (default is /MT[d])
option(gtest_force_shared_crt "Always use shared Visual C++ run-time DLL" ON)
endif()

set(GOOGLETEST_VERSION 1.10.0)
add_subdirectory(vendor/gtest)

find_package(absl)
if (absl_FOUND)
  cmake_policy(SET CMP0079 NEW)
  target_compile_definitions(gtest PUBLIC GTEST_HAS_ABSL=1)
  target_compile_definitions(gtest_main PUBLIC GTEST_HAS_ABSL=1)
  target_link_libraries(gtest PRIVATE absl::algorithm
                                      absl::base
                                      absl::debugging
                                      absl::numeric
                                      absl::strings
                                      absl::utility
                                      absl::failure_signal_handler)
endif()
8 changes: 8 additions & 0 deletions doc/index.rst
@@ -117,6 +117,14 @@ Python

python

Java
--------------------------------------------------------------------------------

.. toctree::
:maxdepth: 2

java

Tutorials
--------------------------------------------------------------------------------

206 changes: 206 additions & 0 deletions doc/java.rst
@@ -0,0 +1,206 @@
.. _java:

********************************************************************
Java
********************************************************************

.. index:: Java, JNI, Scala, Bindings


PDAL provides `Java bindings to use PDAL on the JVM <https://github.com/PDAL/java>`_. They have been released independently from PDAL itself since PDAL 1.7.
Native binaries are prebuilt for Linux and macOS and delivered in a jar, so there is no need
to build PDAL with a special flag or to build the JNI binaries manually.

The project consists of the following modules:

* `pdal-native`_ - packed OS-specific libraries that link PDAL to the JNI proxy classes. The dependency contains bindings for `x86_64-darwin`_ and `x86_64-linux`_; other platforms are not supported yet.
* `pdal`_ - with the core bindings functionality.
* `pdal-scala`_ - a Scala API package that simplifies PDAL Pipeline construction.

Versions
--------------------------------------------------------------------------------

The PDAL JNI major version usually follows PDAL versioning, i.e. `pdal-java 1.8.x`_ was
built and tested against `PDAL 1.8.x`_, and `pdal-java 2.1.x`_ against `PDAL 2.x.x`_.

Using PDAL Java bindings
--------------------------------------------------------------------------------

.. index:: Bindings, Java, Scala

PDAL provides `JNI bindings <https://docs.oracle.com/javase/8/docs/technotes/guides/jni/index.html>`_
that give users access to executing
:ref:`pipeline <pipeline>` instantiations and capturing the results
in `Java`_ interfaces.
This mode of operation is useful if you are looking to have PDAL simply act as
your data format and processing handler.

Users are expected to construct their own PDAL :ref:`pipeline <pipeline>`,
execute it, and retrieve points into Java memory:

.. code-block:: scala

  import io.pdal._

  val json =
    """
      |{
      |  "pipeline":[
      |    {
      |      "filename":"1.2-with-color.las",
      |      "spatialreference":"EPSG:2993"
      |    },
      |    {
      |      "type": "filters.reprojection",
      |      "out_srs": "EPSG:3857"
      |    },
      |    {
      |      "type": "filters.delaunay"
      |    }
      |  ]
      |}
    """.stripMargin

  val pipeline = Pipeline(json)
  pipeline.validate()     // check if our JSON and options were good
  pipeline.setLogLevel(8) // make it really noisy
  pipeline.execute()      // execute the pipeline
  val metadata: String = pipeline.getMetadata()         // retrieve metadata
  val pvs: PointViewIterator = pipeline.getPointViews() // iterator over PointViews
  val pv: PointView = pvs.next()                        // let's take the first PointView

  // Load all points into JVM memory. PointCloud provides operations on
  // PDAL points that are loaded in this case into JVM memory as a single
  // Array[Byte].
  val pointCloud: PointCloud = pv.getPointCloud()
  val x: Double = pointCloud.getDouble(0, DimType.X) // get the point with PointId = 0 and a single dimension

  // In some cases it is not necessary to load everything into JVM memory;
  // it is possible to read only the required points directly from the PointView.
  val y: Double = pv.getDouble(0, DimType.Y)

  // It is also possible to access the triangular mesh generated via PDAL.
  // The output is an Array of Triangles; each Triangle contains PointIds
  // from the PDAL point table.
  val mesh: TriangularMesh = pv.getTriangularMesh()
  val triangles: Array[Triangle] = mesh.asArray

  pv.close()
  pipeline.close()

Using PDAL Scala
--------------------------------------------------------------------------------

The PDAL Scala project introduces a DSL that simplifies PDAL pipeline construction (this builds the same pipeline as in the section above):

.. code-block:: scala

  import io.pdal._
  import io.pdal.pipeline._

  val expression =
    ReadLas("1.2-with-color.las", spatialreference = Some("EPSG:2993")) ~
      FilterReprojection("EPSG:3857") ~
      FilterDelaunay()

  val pipeline = expression.toPipeline
  pipeline.validate()     // check if our JSON and options were good
  pipeline.setLogLevel(8) // make it really noisy
  pipeline.execute()      // execute the pipeline
  val metadata: String = pipeline.getMetadata()         // retrieve metadata
  val pvs: PointViewIterator = pipeline.getPointViews() // iterator over PointViews
  val pv: PointView = pvs.next()                        // let's take the first PointView

  // Load all points into JVM memory. PointCloud provides operations on
  // PDAL points that are loaded in this case into JVM memory as a single
  // Array[Byte].
  val pointCloud: PointCloud = pv.getPointCloud()
  val x: Double = pointCloud.getDouble(0, DimType.X) // get the point with PointId = 0 and a single dimension

  // In some cases it is not necessary to load everything into JVM memory;
  // it is possible to read only the required points directly from the PointView.
  val y: Double = pv.getDouble(0, DimType.Y)

  // It is also possible to access the triangular mesh generated via PDAL.
  // The output is an Array of Triangles; each Triangle contains PointIds
  // from the PDAL point table.
  val mesh: TriangularMesh = pv.getTriangularMesh()
  val triangles: Array[Triangle] = mesh.asArray

  pv.close()
  pipeline.close()

The DSL covers PDAL 2.0.x, but for anything not covered by the
current Scala API you can use the `RawExpr`_ type to build a `Pipeline Expression`_:

.. code-block:: scala

  import io.pdal._
  import io.pdal.pipeline._
  import io.circe.syntax._

  val pipelineWithRawExpr =
    ReadLas("1.2-with-color.las") ~
      RawExpr(Map("type" -> "filters.crop").asJson) ~
      WriteLas("1.2-with-color-out.las")

Installation
................................................................................

.. index:: Install, Java, Scala

PDAL Java artifacts are cross-published for `Scala 2.13`_, `2.12`_ and `2.11`_.
If Scala is not required, a separate artifact without the Scala version
postfix is published as well.

.. code-block:: scala

  // pdal is published to Maven Central, but you can use the following repos in addition
  resolvers ++= Seq(
    Resolver.sonatypeRepo("releases"),
    Resolver.sonatypeRepo("snapshots") // for snapshots
  )

  libraryDependencies ++= Seq(
    "io.pdal" %% "pdal"        % "x.x.x", // core library
    "io.pdal" %  "pdal-native" % "x.x.x", // jni binaries
    "io.pdal" %% "pdal-scala"  % "x.x.x"  // scala core library (if required)
  )

The latest version is: |Maven Central|

.. |Maven Central| image:: https://maven-badges.herokuapp.com/maven-central/io.pdal/pdal/badge.png
:target: https://search.maven.org/search?q=g:io.pdal

There is also an `example SBT PDAL Demo project <https://github.com/PDAL/java/tree/master/examples/pdal-jni>`_ in the
bindings repository that can be used for a quick start.

Compilation
................................................................................

.. index:: Compile, Java, Scala

To compile for development purposes (including the native binaries):

1. Install PDAL (via brew, a unix package manager, building from source, etc.)
2. Build the native libs with `./sbt native/nativeCompile`_ (optional; the binaries are also built during test runs)
3. Run `./sbt core/test`_ to run the PDAL tests

To compile for Java-only development (see the combined invocation sketched below):

1. Provide `$LD_LIBRARY_PATH`_ or `$DYLD_LIBRARY_PATH`_
2. If you don't want to set a global variable, you can pass `-Djava.library.path=<path>`_ to sbt:
   `./sbt -Djava.library.path=<path>`_
3. Set `PDAL_DEPEND_ON_NATIVE=false`_ (to disable the `native` project build)
4. Run `PDAL_DEPEND_ON_NATIVE=false ./sbt`_
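
Put together, a typical Java-only session might look like the following sketch; the library path is illustrative and depends on where your PDAL JNI binaries live.

.. code-block:: bash

  # Skip building the native project and point the JVM at an existing JNI binary.
  PDAL_DEPEND_ON_NATIVE=false ./sbt -Djava.library.path=/usr/local/lib core/test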

If you would like to use your own bindings binary, it is necessary to set `java.library.path`:

.. code-block:: scala

  // Mac OS X example with a manual JNI installation:
  //   cp -f native/target/resource_managed/main/native/x86_64-darwin/libpdaljni.2.1.dylib /usr/local/lib/libpdaljni.2.1.dylib
  // Place the built binary into /usr/local/lib, and pass java.library.path to your JVM.
  javaOptions += "-Djava.library.path=/usr/local/lib"

You can use the `pdal-native` dependency to avoid the steps described above if you don't have the JNI bindings installed.
The dependency contains bindings for `x86_64-darwin`_ and `x86_64-linux`_; other platforms are not supported yet.
20 changes: 16 additions & 4 deletions doc/stages/filters.normal.rst
@@ -23,10 +23,10 @@ stages for more advanced filtering.
The eigenvalue decomposition is performed using Eigen's
`SelfAdjointEigenSolver <https://eigen.tuxfamily.org/dox/classEigen_1_1SelfAdjointEigenSolver.html>`_.

- Normals will be automatically flipped towards the viewpoint to be consistent. By
- default the viewpoint is located at the midpoint of the X and Y extents, and
- 1000 units above the max Z value. Users can override any of the XYZ coordinates,
- or set them all to zero to effectively disable the normal flipping.
+ Normals will be automatically flipped towards positive Z, unless the always_up_
+ flag is set to `false`. Users can optionally set any of the XYZ coordinates to
+ specify a custom viewpoint_ or set them all to zero to effectively disable the
+ normal flipping.

.. note::

@@ -36,6 +36,14 @@ or set them all to zero to effectively disable the normal flipping.
regardless of the always_up_ flag. To disable all normal flipping, do not
provide a viewpoint_ and set `always_up`_ to false.

In addition to always_up_ and viewpoint_, users can run a refinement step (on
by default) that propagates normals using a minimum spanning tree. The
propagated normals can lead to much more consistent results across the dataset.

.. note::

To disable normal propagation, users can set refine_ to `false`.

.. embed::

Example
@@ -73,3 +81,7 @@ _`viewpoint`
_`always_up`
A flag indicating whether or not normals should be inverted only when the Z
component is negative. [Default: true]

_`refine`
A flag indicating whether or not to reorient normals using minimum spanning
tree propagation. [Default: true]
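
As a sketch, a pipeline that computes normals while disabling both the refinement step and the always-up flipping heuristic might look like the following (filenames are illustrative):

.. code-block:: json

  {
    "pipeline":[
      "input.las",
      {
        "type": "filters.normal",
        "always_up": false,
        "refine": false
      },
      "output.las"
    ]
  }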
6 changes: 6 additions & 0 deletions doc/stages/filters.python.rst
@@ -59,6 +59,12 @@ filter and the ``outs`` array represents the points after filtering.
"add_dimension": [ "NewDimensionOne", "NewDimensionTwo", "NewDimensionThree" ]


You can also specify the :ref:`type <types>` of the dimension using an ``=``.
::

"add_dimension": "NewDimensionOne=uint8"


Modification Example
--------------------------------------------------------------------------------

14 changes: 7 additions & 7 deletions doc/stages/readers.hdf.rst
@@ -72,12 +72,13 @@ properly mapped and then outputs a LAS file.
Common Use Cases
----------------

- A possible use case for this driver is reading NASA's ICESAT2 data.
- This example reads the X, Y, and Z coordinates from the ICESAT2
- ATL03 format and converts them into a LAS file.
+ A possible use case for this driver is reading NASA's `ICESat-2 <https://icesat-2.gsfc.nasa.gov/>`__ data.
+ This example reads the X, Y, and Z coordinates from the ICESat-2
+ `ATL03 <https://icesat-2.gsfc.nasa.gov/sites/default/files/page_files/ICESat2_ATL03_ATBD_r002.pdf>`__ format and converts them into a LAS file.

.. note::
-     ICESAT2 data use EPSG:7912.
+     ICESat-2 data use `EPSG:7912 <https://epsg.io/7912>`__. ICESat-2 data products documentation can be found `here <https://icesat-2.gsfc.nasa.gov/science/data-products>`_.


.. code-block:: json
@@ -99,9 +100,8 @@
]
- `ICESAT2 Data products Documentation <https://icesat-2.gsfc.nasa.gov/science/data-products>`_
- ~
- ~
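
As a hedged sketch, a pipeline reading ATL03 photon heights might map the X, Y, and Z dimensions to the corresponding HDF datasets as below; the filename and dataset paths are illustrative, and the ``dimensions`` option spelling is an assumption to verify against the options below:

.. code-block:: json

  {
    "pipeline":[
      {
        "type": "readers.hdf",
        "filename": "ATL03_granule.h5",
        "dimensions": {
          "X": "/gt1l/heights/lon_ph",
          "Y": "/gt1l/heights/lat_ph",
          "Z": "/gt1l/heights/h_ph"
        }
      },
      "output.las"
    ]
  }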
Options
-------

6 changes: 3 additions & 3 deletions doc/stages/writers.gltf.rst
@@ -7,7 +7,7 @@ GLTF is a file format `specification`_ for 3D graphics data.
If a mesh has been generated
for a PDAL point view, the **GLTF Writer** will produce simple output in
the GLTF format. PDAL does not currently support many of the attributes
- that can be found in a GLTF file.
+ that can be found in a GLTF file. This writer creates a *binary* GLTF.

.. _specification: https://www.khronos.org/gltf/

@@ -26,7 +26,7 @@ Example
},
{
"type":"writers.gltf",
"filename":"output.gltf",
"filename":"output.glb",
"red": 0.8,
"metallic": 0.5
}
@@ -36,7 +36,7 @@ Options
-------

filename
- Name of the GLTF file to be written. [Required]
+ Name of the GLTF (.glb) file to be written. [Required]

metallic
The metallic factor of the faces. [Default: 0]
27 changes: 27 additions & 0 deletions doc/stages/writers.tiledb.rst
@@ -66,4 +66,31 @@ filters
JSON array or object of compression filters for either `coords` or `attributes` of the form {coords/attributename : {"compression": name, compression_options: value, ...}} [Optional]


By default, TileDB uses the following set of compression filters for coordinates and attributes:

.. code-block:: json

  {
    "coords":[
      {"compression": "bit-shuffle"},
      {"compression": "gzip", "compression_level": 9}
    ],
    "Intensity":{"compression": "bzip2", "compression_level": 5},
    "ReturnNumber": {"compression": "zstd", "compression_level": 75},
    "NumberOfReturns": {"compression": "zstd", "compression_level": 75},
    "ScanDirectionFlag": {"compression": "bzip2", "compression_level": 5},
    "EdgeOfFlightLine": {"compression": "bzip2", "compression_level": 5},
    "Classification": {"compression": "gzip", "compression_level": 9},
    "ScanAngleRank": {"compression": "bzip2", "compression_level": 5},
    "UserData": {"compression": "gzip", "compression_level": 9},
    "PointSourceId": {"compression": "bzip2"},
    "Red": {"compression": "rle"},
    "Green": {"compression": "rle"},
    "Blue": {"compression": "rle"},
    "GpsTime": [
      {"compression": "bit-shuffle"},
      {"compression": "zstd", "compression_level": 75}
    ]
  }

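As a sketch, overriding the compression filter for a single attribute from a pipeline might look like the following; the ``array_name`` value is illustrative, and the ``filters`` object follows the form described in the option above:

.. code-block:: json

  {
    "pipeline":[
      "input.las",
      {
        "type": "writers.tiledb",
        "array_name": "output_array",
        "filters": {
          "Intensity": {"compression": "zstd", "compression_level": 7}
        }
      }
    ]
  }
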
.. _TileDB: https://tiledb.io
1 change: 0 additions & 1 deletion filters/HeadFilter.hpp
@@ -35,7 +35,6 @@
#pragma once

#include <pdal/Filter.hpp>
- #include <pdal/PointViewIter.hpp>

namespace pdal
{