How to add new dimensions to already existing data #153
Hi Hobu! I found the ferry method, but cannot seem to get it to work:
All stages of the pipeline work except for the ferry part. Additionally, no warning is given and `pipeline.execute()` just doesn't produce any output, making debugging difficult.
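For context, a minimal `filters.ferry` pipeline of the kind discussed here might look like the sketch below. The file names and the `HeightAboveGround` dimension name are illustrative, not taken from the thread; in ferry's `dimensions` option, `=>Dst` creates `Dst` zero-filled, while `Src=>Dst` copies an existing dimension into `Dst`.

```python
import json

# Sketch of a filters.ferry pipeline (paths and dimension name illustrative).
pipeline_spec = [
    "input.las",
    {
        "type": "filters.ferry",
        # "=>Dim" creates Dim initialized to 0; "Src=>Dim" would copy Src.
        "dimensions": "=>HeightAboveGround",
    },
    {
        "type": "writers.las",
        "filename": "output.las",
        # Without this, writers.las drops non-standard dimensions on output.
        "extra_dims": "all",
    },
]

print(json.dumps(pipeline_spec, indent=2))

# With PDAL and its Python bindings installed, this would run as:
# import pdal
# pipeline = pdal.Pipeline(json.dumps(pipeline_spec))
# count = pipeline.execute()
```

Note the `extra_dims` option on the writer: newly ferried dimensions only survive to disk if the writer is told to keep them.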
Another option is to copy
Also make sure to set
Hey Hobu! Thanks for the info; some further testing resulted in:
If I do anything more, like:
I get these errors when trying to execute:
working, so it could be that there is an issue between later steps. I'll try and test a few things so I can actually provide some useful info on why it doesn't work ;) In the meantime, I'll try your suggestions! Thanks!
Edit: some further testing showed that the entire pipeline runs until the overlay (both the normal ferry as well as your suggestion). On the overlay step, however, the pipeline fails no matter what you enter as a dimension, without providing any message.
Just to close this off: the `extra_dims=all` worked, and the bug was caused by an invalid geometry in the GeoPackage used for the overlay. Could it be useful to add a warning for this to the overlay function?
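For readers hitting the same silent failure: "invalid geometry" here typically means something like a self-intersecting polygon ring. In practice you would repair the GeoPackage with tools such as shapely's `make_valid` or OGR, but as a rough stdlib-only illustration of what the check amounts to, a ring is invalid when two non-adjacent edges cross (the classic "bowtie"):

```python
# Minimal sketch: flag a self-intersecting ring of the kind that can
# silently break an overlay. Coordinates are plain (x, y) tuples.

def _segments_cross(p1, p2, p3, p4):
    """True if segments p1-p2 and p3-p4 properly intersect."""
    def orient(a, b, c):
        v = (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])
        return (v > 0) - (v < 0)
    return (orient(p1, p2, p3) != orient(p1, p2, p4)
            and orient(p3, p4, p1) != orient(p3, p4, p2))

def ring_is_valid(ring):
    """True if no two non-adjacent edges of the closed ring cross."""
    n = len(ring)
    edges = [(ring[i], ring[(i + 1) % n]) for i in range(n)]
    for i in range(n):
        for j in range(i + 2, n):
            if i == 0 and j == n - 1:   # adjacent via the wrap-around edge
                continue
            if _segments_cross(*edges[i], *edges[j]):
                return False
    return True

square = [(0, 0), (1, 0), (1, 1), (0, 1)]   # valid
bowtie = [(0, 0), (1, 1), (1, 0), (0, 1)]   # edges cross: invalid
print(ring_is_valid(square))  # True
print(ring_is_valid(bowtie))  # False
```

This only catches proper edge crossings, which is enough to see why such a polygon gives an overlay nothing sensible to rasterize against.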
If you set
Hi Hobu!
So laspy has a function `add_extra_dim()` to add new dimensions and data to already existing LAS files in Python.
I found this in the PDAL documentation: https://pdal.io/en/latest/dimensions.html which suggests we can at least create new dimensions taken from that list, if they're not yet in use.
However, I couldn't find any documentation on how to add dimensions using pdal-python, the way laspy does it.
Thanks in advance!