Not able to build terrain #3
Hey there,
It came down to increasing the RAM. Hope it may help someone. Thank you for the quick response.
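If the fix was indeed the container's memory limit (an assumption on my part), the relevant Docker flags would look roughly like this; the 16g value is an arbitrary example, not a recommendation:

```shell
# Hypothetical sketch: run the ctb container with a larger memory limit.
# Adjust --memory/--memory-swap to what your host can spare.
docker run --memory=16g --memory-swap=16g \
  -v "${PWD}:/data" -it --name ctb tumgis/ctb-quantized-mesh
```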
Hey there,
Thank you for your suggestion. Unfortunately, now I'm getting the following error when running:

```
root@a1db04a328db:/data# ctb-tile -R -f Mesh -s 21 -e 0 -C -N -o ./tilesets_new2 ./a/a.tif
```

I have used the following steps:
And, as suggested, I have converted the data to WGS84. Please help me with this. Thank you.
Have you tried this approach (as described here)? For large datasets this has always worked for me so far.
Still no luck. The size of the TIFF image was 2.72 GB and the size of the built terrain data is around 150 MB. Does that size look right?
Redo this process for each zoom level. Usually, this is only required for the higher zoom levels.
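The per-level workflow described above can be sketched as a loop; the level range and paths are my assumptions, and the commands are only echoed here as a dry run (pipe the output to `sh` to actually execute them):

```shell
# Dry run: print one ctb-tile invocation per zoom level, highest first,
# so each run only builds the overviews needed for that level.
for level in $(seq 18 -1 0); do
  echo ctb-tile -f Mesh -C -N -s "$level" -e "$level" -o ./tilesets/terrain a.tif
done
```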
Hey @BWibo I have multiple .tif files, which consist of a city. The size of them is 6.3 GB. The following is my pipeline:
Then, unfortunately I'm getting the following error:
Although my dataset is also large, it consists of multiple .tif files, so my situation differs from the solution you listed above. In your solution, the data is a single large .tif file that can be processed per zoom level, which doesn't fit my case. Please give me some hints, thanks :) PS: What about processing lots of .tif files that cover a whole country? Their total size would be up to 1 TB. Oliver
Hey there,
Hey @BWibo, Thanks for your reply. From my understanding, the input .tif dataset in your first step is a single .tif file. Am I correct? Hence, I have to merge multiple .tif files into one single .tif file: Then, I follow your steps. But in step 2, I got the following error:
It seems to be because there are too many tiled .tif files in the folder. Then, I tried to increase the stack size: If I increased the stack size again, e.g. to 131072 KB, it reported: So my zoom level range is just from 0 to 16 right now. Another question is that we need to generate a layer.json at the end according to the command below: Should I just simply run this command? Oliver
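For what it's worth, a common GDAL way to feed many .tif files to a tool as a single dataset, without physically merging them, is a virtual mosaic; whether this is the approach meant in this thread is my assumption:

```shell
# Build a virtual mosaic over all tiles; the .vrt file is a small XML
# index, no pixel data is copied.
gdalbuildvrt mosaic.vrt ./tifs/*.tif

# Then tile from the mosaic as if it were one raster.
ctb-tile -f Mesh -C -N -o ./tilesets/terrain mosaic.vrt
```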
No, you don't have to merge the
Try the
No, I usually create the layer file from the input dataset, not from one of the zoom levels.
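If I read the ctb-quantized-mesh documentation correctly (please verify against the image's README, this is an assumption), layer.json is produced by running ctb-tile with its layer switch against the original input raster:

```shell
# Sketch: generate layer.json from the input dataset rather than from a
# tiled zoom level (-l is the layer switch in the ctb-quantized-mesh fork).
ctb-tile -f Mesh -l -C -N -o ./tilesets/terrain a.tif
```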
Hi @BWibo, thanks for your answers. Everything was going well except building terrain tilesets at the high zoom level 18:
The command was killed, and I guess it may be related to low warp memory? I am not sure.
I don't clearly understand the instruction/explanation above and don't know how to set these two options correctly. My OS is Ubuntu 16.04. Please give me some hints. Thanks again :) Oliver
Hey there, look out for ways to monitor your system resources and see if they overflow when using the tool.
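A few generic ways to watch resources while the tool runs (these are standard Linux/Docker utilities, not something specific to ctb; pick whatever your setup has):

```shell
free -m                             # host memory snapshot, in MB
# docker stats ctb                  # live CPU/mem of the container (run on the host)
dmesg 2>/dev/null | grep -i 'out of memory' || true   # evidence of OOM kills
```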
Hi @BWibo, yes, you are right. The process was forcefully killed by my system and I got the error below:
Moreover, I also observed that CPU and memory usage increased to 3125% and 85.2+%, respectively, then got stuck there, and the process would be killed after a while. Could you please give some advice? Thanks again! Oliver
Apparently you are reaching a hardware memory limit.
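One memory-related knob I know of on the GDAL side (whether it is the limit being hit here is an assumption) is the raster block cache, which can be capped via an environment variable before running the tiler:

```shell
# Sketch: cap GDAL's raster block cache (value interpreted as MB here),
# then tile a single high zoom level.
export GDAL_CACHEMAX=512
ctb-tile -f Mesh -C -N -s 18 -e 18 -o ./tilesets/terrain a.tif
```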
Hi @BWibo, thanks for your advice. I checked the Docker container's hardware configuration:
which indicated that the Hence, I should look into memory optimization with Cesium Terrain Builder, as you mentioned above. More information about my dataset: after tiling my input dataset at level 18, I got 1141140 .tif files totalling 19.8 GB. I remember you were able to Oliver
The file size may increase due to tiling, but I have no experience of to what extent. I did it just the way it is described here: #3 (comment). I worked with 1 m DTMs as well, but for a much larger extent. What seems strange to me is the memory consumption. Just one note:
@predictwise Did you manage to resolve your issue? If yes, how?
Not yet. I have temporarily put it aside to focus on my other work. There must be a certain step that I missed; otherwise, I would be able to build the terrain.
Another option is a problem with the input data. Check your original input
Hi @BWibo, how about I share with you a Google Drive link to my original input
Please post the output of
Hi @BWibo, The following is the output of
The following is the output of
This is the output of
Have you found anything wrong with my data? Oliver
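For reference, the standard GDAL commands for inspecting a raster's metadata and coordinate system (assuming these are the kinds of checks being discussed; the filename is a placeholder) are:

```shell
gdalinfo -stats a.tif    # driver, size, CRS, extent, nodata, band statistics
gdalsrsinfo a.tif        # spatial reference in WKT/PROJ form
```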
I can't spot anything problematic from a quick look at the output.
Hi @BWibo, yes, I also can't find anything problematic in the output. Could you please test my dataset at level 18 on your side? If yes, I will send you a Google Drive dataset link. Thank you very much. Oliver
No, I'm sorry. Taking a closer look into your data or doing the processing for you is something I cannot provide without an official assignment. If you are interested, let me know.
Hello, thanks @BWibo for maintaining this Docker image, and @homme and @ahuarte47 for the work on ctb! Sorry for digging up this issue, but I also have the data overflow issue when working with large datasets, and I have found two different tips for solving it (let's take an example where the generation fails at level 8):
Which one is the correct way to go, and why? Thanks
Hey there, I think there is no right or wrong here; it depends on your dataset. In fact, I don't know exactly what influences at which level For me, the level where to start tiling the
Thanks for your answer;
These overflow issues are raised by GDAL when, for the lower levels, CTB tries to create overviews from the input data. If the input data is big, GDAL throws an exception. I have always avoided this overflow issue for lower levels by using a simplified version of the input data with a lower resolution. I process the original input data with gdal_translate to output a set of rasters with lower resolutions (x2, x4, x8, ...) and then, for each level, use those that do not throw any error, starting from level 0 and moving up, one by one. The higher the levels you are processing, the higher-resolution raster you use.
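@ahuarte47's downsampling approach might look roughly like this; the percentages, filenames, and level-to-raster mapping below are my assumptions, chosen only to illustrate the idea:

```shell
# Produce progressively coarser copies of the input raster.
gdal_translate -outsize 50% 50% input.tif input_x2.tif
gdal_translate -outsize 25% 25% input.tif input_x4.tif
gdal_translate -outsize 12.5% 12.5% input.tif input_x8.tif

# Use the coarsest raster for the lowest zoom levels, e.g.:
ctb-tile -f Mesh -C -N -s 4 -e 0 -o ./tilesets/terrain input_x8.tif
ctb-tile -f Mesh -C -N -s 8 -e 5 -o ./tilesets/terrain input_x4.tif
```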
Ok, thanks for both of your answers 👍 |
@ahuarte47 Thx for the clarification.
I'm using Docker on Windows 10 with the following commands:

```
docker run -v "${PWD}:/data" -it --name ctb tumgis/ctb-quantized-mesh
ctb-tile -f Mesh -C -N -o ./tilesets/terrain/test a.tif
```
Sometimes it fails with the following output:

```
0...10...20...30...40...Killed
```
so I'm using the following command to resume the terrain building process:

```
ctb-tile -f Mesh -R -C -N -o ./tilesets/terrain/test a.tif
```
It gives the following errors at the end:

```
root@e594ebe1722d:/data# ctb-tile -f Mesh -R -C -N -o ./tilesets/terrain/test a.tif
0...10...20...30...40...50...60...70...80...90...ERROR 2: gdalwarpoperation.cpp, 1574: cannot allocate 814172736 bytes
ERROR 1: IReadBlock failed at X offset 0, Y offset 0
ERROR 1: GetBlockRef failed at X block offset 0, Y block offset 0
ERROR 2: gdalwarpoperation.cpp, 1574: cannot allocate 814172736 bytes
ERROR 1: IReadBlock failed at X offset 0, Y offset 0
ERROR 1: GetBlockRef failed at X block offset 0, Y block offset 0
100 - done.
```
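Since `-R` resumes from the tiles already written, the kill-and-resume cycle above can be automated with a simple retry loop. This is only a sketch; it assumes each run makes some forward progress before being killed, otherwise it will loop forever:

```shell
# Keep resuming until ctb-tile exits successfully.
until ctb-tile -f Mesh -R -C -N -o ./tilesets/terrain/test a.tif; do
  echo "ctb-tile was killed, resuming..." >&2
  sleep 5
done
```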