Merge remote-tracking branch 'origin/master' into issue-3178
abellgithub committed Aug 31, 2020
2 parents 8c89f3b + cbbb898 commit 25162d8
Showing 217 changed files with 24,477 additions and 2,283 deletions.
63 changes: 63 additions & 0 deletions .github/workflows/docs.yml
@@ -0,0 +1,63 @@
name: Docs

on:
  push:
    branches: '*'
  pull_request:
    branches: '*'

jobs:
  docs:
    name: Docs

    runs-on: ubuntu-latest
    strategy:
      fail-fast: true
    container: osgeo/proj-docs

    steps:
    - uses: actions/checkout@v2
    - name: Print versions
      shell: bash -l {0}
      run: |
          python3 --version
          sphinx-build --version
    - name: Lint .rst files
      shell: bash -l {0}
      run: |
          if find . -name '*.rst' | xargs grep -P '\t'; then echo 'Tabs are bad, please use four spaces in .rst files.'; false; fi
      working-directory: ./doc
    - name: Doxygen
      shell: bash -l {0}
      run: |
          make doxygen
      working-directory: ./doc
    - name: HTML
      shell: bash -l {0}
      run: |
          make html
      working-directory: ./doc
    - name: PDF
      shell: bash -l {0}
      run: |
          make latexpdf
      working-directory: ./doc
    - name: Spelling
      shell: bash -l {0}
      run: |
          make spell
      working-directory: ./doc
    - uses: actions/upload-artifact@v2
      with:
        name: PDF
        path: doc/build/latex/PDAL.pdf
    - uses: actions/upload-artifact@v2
      with:
        name: HTML
        path: doc/build/html/*
    - uses: actions/upload-artifact@v2
      with:
        name: Misspelled
        path: doc/build/spelling/output.txt
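The "Lint .rst files" step packs its whole policy into one shell `if`: a non-empty `grep -P '\t'` match runs the message and the trailing `false` fails the step. The same logic can be sketched in plain Python (an illustrative stand-alone helper, not part of the workflow or the repository):

```python
# Sketch of the "Lint .rst files" check: fail when any .rst file contains
# a literal tab character. Mirrors the shell one-liner in the workflow;
# this helper is illustrative only, not part of PDAL or the workflow.
from pathlib import Path

def rst_files_with_tabs(root):
    """Return paths of .rst files under root whose text contains a tab."""
    return [p for p in sorted(Path(root).rglob("*.rst"))
            if "\t" in p.read_text(encoding="utf-8", errors="ignore")]

# In CI, a non-empty result corresponds to the step printing
# "Tabs are bad, please use four spaces in .rst files." and failing.
```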


2 changes: 1 addition & 1 deletion .travis.yml
@@ -15,7 +15,7 @@ script:

after_success: |
  if [[ "$TRAVIS_SECURE_ENV_VARS" == "true" &&
        "$TRAVIS_BRANCH" == "2.1-maintenance" &&
        "$TRAVIS_BRANCH" == "2.2-maintenance" &&
        -n "$encrypted_6a5172b96922_key" ]]
  then
    cd scripts/ci
4 changes: 3 additions & 1 deletion cmake/gtest.cmake
@@ -9,7 +9,9 @@ add_subdirectory(vendor/gtest)

find_package(absl QUIET)
if (absl_FOUND)
    cmake_policy(SET CMP0079 NEW)
    if(${CMAKE_VERSION} VERSION_GREATER_EQUAL "3.13.0")
        cmake_policy(SET CMP0079 NEW)
    endif()
    target_compile_definitions(gtest PUBLIC GTEST_HAS_ABSL=1)
    target_compile_definitions(gtest_main PUBLIC GTEST_HAS_ABSL=1)
    target_link_libraries(gtest PRIVATE absl::algorithm
3 changes: 3 additions & 0 deletions doc/Makefile
@@ -146,6 +146,9 @@ publish: clone
	cp -r $(BUILDDIR)/html/* $(OUTPUTDIR)/
	cd $(OUTPUTDIR) ; git checkout master; git add "*" ; git commit -m "update for $(DATE)" . ; git push

spell:
	$(SPHINXBUILD) -b spelling $(ALLSPHINXOPTS) -d $(BUILDDIR)/doctrees build/spelling

pdf:
	$(SPHINXBUILD) -b pdf $(ALLSPHINXOPTS) $(BUILDDIR)/pdf
	@echo
34 changes: 15 additions & 19 deletions doc/about.rst
@@ -119,7 +119,7 @@ philosophy in a number of important ways:
4. PDAL is coordinated by users with its declarative :ref:`JSON <pipeline>`
syntax. LAStools is coordinated by linking lots of small, specialized
command line utilities together with intricate arguments.
5. PDAL is an open source project, with all of its development activites
5. PDAL is an open source project, with all of its development activities
available online at https://github.com/PDAL/PDAL

.. _about_pcl:
@@ -134,41 +134,37 @@ processing suite for point cloud data. The developer community of the PCL
library is focused on algorithm development, robotic and computer vision, and
real-time laser scanner processing. PDAL can read and write PCL's PCD format.

Greyhound and Entwine
Entwine
................................................................................

.. index:: Greyhound, Entwine
.. index:: Entwine

`Greyhound`_ is an open source software from `Hobu, Inc.`_ that allows clients
to query and stream progressive point cloud data over the network. `Entwine`_
is open source software from Hobu, Inc. that organizes massive point cloud
collections into `Greyhound`_-streamable data services. These two software
projects allow province-scale LiDAR collections to be organized and served
via HTTP clients over the internet. PDAL provides :ref:`readers.greyhound` to
allow users to read data into PDAL processes from that server.

`Entwine`_ is open source software from Hobu, Inc. that organizes massive
point cloud collections into streamable data services. These two software
projects allow province-scale LiDAR collections to be organized and served via
HTTP clients over the internet. PDAL provides :ref:`readers.ept` to allow
users to read data from those `Entwine Point Tile`_ collections that Entwine
produces..

.. _`Entwine Point Tile`: https://entwine.io/entwine-point-tile.html
.. _`Hobu, Inc.`: https://hobu.co

.. _`Entwine`: https://entwine.io
.. _`Greyhound`: http://greyhound.io

plas.io and Potree
Potree
................................................................................

`plas.io`_ is a `WebGL`_ HTML5 point cloud renderer that speaks `ASPRS LAS`_ and
`LASzip`_ compressed LAS. You can find the software for it at plasiojs.io and
https://github.com/hobu/plasio-ui

`Potree`_ is a `WebGL`_ HTML5 point cloud renderer that speaks `ASPRS LAS`_ and
`LASzip`_ compressed LAS. You can find the software at
https://github.com/potree/potree/

.. note::

Both renderers can now consume data from Greyhound. See them in action at
http://speck.ly and http://potree.entwine.io
See Potree in action using the USGS 3DEP AWS Public Dataset at
https://usgs.entwine.io

.. _`plas.io`: http://plas.io
.. _`WebGL`: https://en.wikipedia.org/wiki/WebGL
.. _`Potree`: http://potree.org
.. _`LASzip`: http://laszip.org
@@ -196,7 +192,7 @@ might find useful bits of functionality in them. These other tools include:

.. note::

The `libLAS`_ project is an open source project that pre-dates PDAL, and
The `libLAS`_ project is an open source project that predates PDAL, and
provides some of the processing capabilities provided by PDAL. It is
currently in maintenance mode due to its dependence on LAS, the release of
relevant LAStools capabilities as open source, and the completion of
5 changes: 4 additions & 1 deletion doc/conf.py
@@ -56,7 +56,7 @@ def process_dimensions():
# coming with Sphinx (named 'sphinx.ext.*') or your custom ones.
extensions = ['breathe', 'sphinx.ext.autodoc',
              'sphinx.ext.mathjax', 'sphinx.ext.intersphinx',
              'sphinxcontrib.bibtex', 'embed']
              'sphinxcontrib.bibtex', 'embed','sphinxcontrib.spelling']


# Add any paths that contain templates here, relative to this directory.
@@ -76,6 +76,9 @@ def process_dimensions():
copyright = u'%d' % year


spelling_word_list_filename='spelling_wordlist.txt'


# The version info for the project you're documenting, acts as replacement for
# |version| and |release|, also used in various other places throughout the
# built documents.
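Taken together, the two conf.py additions in this file reduce to the following fragment (a sketch assembled from this diff; the other extension names are exactly those shown in the hunk above):

```python
# Sketch of the sphinxcontrib.spelling wiring added to doc/conf.py in this
# commit. The extension provides the "spelling" builder that the new
# `make spell` target invokes; words listed in spelling_wordlist.txt
# (one per line) are treated as correct and never reported.
extensions = ['breathe', 'sphinx.ext.autodoc',
              'sphinx.ext.mathjax', 'sphinx.ext.intersphinx',
              'sphinxcontrib.bibtex', 'embed', 'sphinxcontrib.spelling']

spelling_word_list_filename = 'spelling_wordlist.txt'
```

Misspellings found by the builder land in `build/spelling/output.txt`, which matches the `doc/build/spelling/output.txt` path the new workflow uploads as the "Misspelled" artifact.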
4 changes: 2 additions & 2 deletions doc/development/metadata.rst
@@ -24,7 +24,7 @@ Metadata nodes are reference types that can be copied cheaply. Metadata nodes
are annotated with the original data type to allow better interpretation of
the data.
For example, when binary data is stored in a base 64-encoded
format, knowing that the data doesn't ulitmately represent a string can allow
format, knowing that the data doesn't ultimately represent a string can allow
algorithms to convert it back to its binary representation when desired.
Similarly, knowing that data is numeric allows it
to be written as a JSON numeric type rather than as a string.
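The base64 point can be made concrete with the standard library (plain-Python illustration, not PDAL's metadata API):

```python
# Why the stored type matters: a base64 string only round-trips back to
# the original binary if the consumer knows the value is encoded binary
# rather than an ordinary string. Plain Python, not PDAL's API.
import base64

raw = bytes([0x00, 0xFF, 0x10, 0x7F])            # arbitrary binary payload
stored = base64.b64encode(raw).decode("ascii")   # the text the node holds

# A consumer that sees a base64Binary annotation can recover the bytes;
# one that treats the value as a plain string keeps "AP8Qfw==" as-is.
recovered = base64.b64decode(stored)
```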
@@ -39,7 +39,7 @@ overloading, but there are instances where this is impossible and the
programmer must call a specific function to set the type of the metadata node.
Binary data that has been converted to a string by base 64 encoding can
be tagged as a such by calling addEncoded(). Programmers can specify the
type of a node explictly by calling addWithType(). Currently supported
type of a node explicitly by calling addWithType(). Currently supported
types are: "boolean", "string", "float", "double", "bounds",
"nonNegativeInteger", "integer", "uuid" and "base64Binary".

6 changes: 3 additions & 3 deletions doc/development/overview.rst
@@ -175,7 +175,7 @@ pipeline.
should occur at this time. If the initialization requires knowledge of
the point table, implement the function that accepts one, otherwise
implement the no-argument version. Whether to place initialization code
at this step or in prepared() or ready() (see below) is a judgement call,
at this step or in prepared() or ready() (see below) is a judgment call,
but detection of errors earlier in the process allows faster termination of
a command. Files opened in this step should also be closed before
returning.
@@ -247,14 +247,14 @@ bool processOne(PointRef& ref)
This method allows processing of a single point. A reader will typically
read a point from an input source. When a reader returns 'false' from
this function, it indicates that there are no more points to be read.
When a filter returns 'false' from this funciton, it indicates
When a filter returns 'false' from this function, it indicates
that the point just processed should be filtered out and not passed
to subsequent stages for processing.

Implementing a Reader
................................................................................

A reader is a stage that takes input from a point clould format supported by
A reader is a stage that takes input from a point cloud format supported by
PDAL and loads points into a point table through a point view.

A reader needs to register or assign those dimensions that it will reference
3 changes: 2 additions & 1 deletion doc/index.rst
@@ -8,7 +8,7 @@ PDAL - Point Data Abstraction Library
:alt: PDAL logo
:align: right

PDAL is a C++ `BSD`_ library for translating and manipulating `point cloud
PDAL is a C++ library for translating and manipulating `point cloud
data`_. It is very much like the `GDAL`_ library which handles raster and
vector data. The :ref:`about` page provides high level overview of the library
and its philosophy. Visit :ref:`readers` and :ref:`writers` to list data
@@ -89,6 +89,7 @@ Drivers
   :glob:

   pipeline
   stages/stages
   stages/readers
   stages/writers
   stages/filters
2 changes: 1 addition & 1 deletion doc/pipeline.rst
Original file line number Diff line number Diff line change
@@ -321,7 +321,7 @@ code at `hag.py`_.

.. seealso::

:ref:`filters.hag` describes using a specific filter to do
:ref:`filters.hag_nn` describes using a specific filter to do
this job in more detail.

.. code-block:: json
2 changes: 1 addition & 1 deletion doc/project/conventions.rst
@@ -84,7 +84,7 @@ Other Conventions
* For public headers from the ./include/pdal directory, use angle brackets:
#include <pdal/Stage.h>

* For private headers (from somehwere in ./src), use quotes:
* For private headers (from somewhere in ./src), use quotes:
#include "Support.hpp"

* Don't #include a file where a simple forward declaration will do.
