"Killed" #16
Tried with jobs=2 and it got killed at exactly the same place, after writing 1.9GB of data.
There was indeed a memory leak, keeping data between tiles.
FYI, I've been able to finish Africa with a single output file. Took ~80 minutes (with the pending pyosmium optimization) and the output is 6.3 GB.
Phyghtmap used less memory than pyhgtmap does even after this fix. With 64GB RAM I cannot generate the Africa 10m interval (13.9GB output) anymore, not to mention Asia 10m (66GB).
The Africa 10m interval should be around 13.9GB in size (I'm using 1" source data exclusively, ALOS World 3D newest revision, but I guess any 1" data will be similar in memory consumption). 6.3GB means something is wrong if that's for the 10m interval, so I guess that's maybe the 20m interval with 3" data? With the old phyghtmap it was possible (using 1 process) to generate Asia 10m, resulting in a 66GB output file, with 64GB RAM and I think a 6GB temp file on NVMe (though I'm not sure the temp file was actually used, but it was really close on memory and took something like 3.5 days nonstop on an AMD Ryzen 3600).
Actually, right now with 10.6GB written out it already uses 61GB RAM. Something is definitely still wrong, or we will need to use phyghtmap with the first patch to get it running for Asia/China/Russia/South America/Africa on a 64GB RAM machine (I haven't tried increasing virtual memory to, say, 128GB or similar, which may be okay for the use case). That is with jobs=9.
Well, I'm actually using the command you originally shared, so 10m steps. The difference being it's 1" and 3" (but 1" should be favored when available).
Well, there is no viewfinderpanoramas 1" data, and SRTM 1" now requires an account to download, which I guess you haven't set up (not even sure it's implemented). ALOS World 3D was even harder to download; I think I wrote a script based on their website links to fetch the files. But yeah, I know 1" data takes much longer to process and likely uses more memory too. Still, I guess there is a memory leak somewhere else: it really should not hit 64GB of RAM with only 10.8GB of data written out (of an expected 13.9GB total). Maybe pyhgtmap could use a temp file to store the ways, behind a switch?
I've set up an account for SRTM 1", and it works fine.
184 bytes for each way. As the output for Africa has (tens of?) millions of ways, the actual memory usage is HUGE...
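The arithmetic behind that concern can be sketched quickly. A minimal estimate, assuming the 184-byte per-way figure quoted above and taking the roughly 100 million ways mentioned later in the thread as the order of magnitude:

```python
# Rough lower bound on RAM needed to keep every way in memory at once.
BYTES_PER_WAY = 184          # per-way cost quoted in the comment above
N_WAYS = 100_000_000         # order-of-magnitude way count for Africa

total_gb = BYTES_PER_WAY * N_WAYS / 1024**3
print(f"{total_gb:.1f} GB")  # 17.1 GB
```

That is before any Python interpreter overhead or per-process duplication from running multiple jobs, which helps explain how observed usage climbs far higher.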
But how was it possible for the old phyghtmap to use only about 1/6 of the memory? I think the easiest fix is just an option for a temp file. Yes, that would mean 360GB or so for Asia, but that is much easier to have on disk than 360GB of memory (and I'm not sure how well Linux memory management would handle swap of that size). If one is to compute such a huge file, a few hundred GB of temp-file space shouldn't be an issue: storing all those hgt files already takes up huge space, so anyone trying this likely has a big HDD for data storage besides a fast system drive anyhow.
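A temp-file switch along those lines could, in principle, spool serialized ways to disk instead of holding them all in RAM. A minimal sketch of the idea, assuming nothing about pyhgtmap's internals (the `WaySpool` class and its methods are entirely hypothetical, not pyhgtmap's actual API):

```python
import struct
import tempfile

class WaySpool:
    """Hypothetical helper: append ways (lists of node IDs) to a temp
    file, then replay them in insertion order at write-out time.

    Only the file handle stays in memory; each way is stored as a
    4-byte length prefix followed by 8-byte signed node IDs.
    """

    def __init__(self):
        self._file = tempfile.TemporaryFile()

    def append(self, node_ids):
        # Length prefix, then the node IDs as little-endian int64.
        self._file.write(struct.pack("<I", len(node_ids)))
        self._file.write(struct.pack(f"<{len(node_ids)}q", *node_ids))

    def __iter__(self):
        self._file.seek(0)
        while header := self._file.read(4):
            (n,) = struct.unpack("<I", header)
            yield struct.unpack(f"<{n}q", self._file.read(8 * n))

spool = WaySpool()
spool.append([10000001, 10000002, 10000003])
spool.append([10000004, 10000005])
print([len(way) for way in spool])  # [3, 2]
```

The trade-off is exactly the one described above: sequential disk I/O instead of resident memory, which is cheap on any machine with a big enough drive.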
It's not that I actually need this: lately there have been no significant higher-quality updates to 1" source data outside of Europe and North America, and I still have the old pbf output files. I noticed the problem with Europe and then just wanted to give Africa a try, it being several times bigger.
Much more efficient memory-wise than massive lists of tuples, allowing much bigger datasets to be processed with a reasonable amount of memory. Fixes #16
My Africa dataset contains almost 100 million ways. Relying on numpy arrays instead of lists of tuples reduces memory consumption by about a 10x factor (2.6GB consumed at the end of the process):
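The kind of gain described can be illustrated with a small self-contained comparison (the record layout below is a made-up example, not pyhgtmap's actual data structure, and the sizes are indicative only; the 10x figure above refers to the full dataset):

```python
import sys
import numpy as np

n = 1_000_000

# One small fixed-size record per way, held as Python tuples of ints...
as_tuples = [(i, i * 2, i * 2 + 1) for i in range(n)]
tuple_bytes = sys.getsizeof(as_tuples) + sum(
    sys.getsizeof(t) + sum(sys.getsizeof(x) for x in t) for t in as_tuples
)

# ...versus the same records in one contiguous int64 numpy array.
as_array = np.array(as_tuples, dtype=np.int64)

print(tuple_bytes // 1024**2, "MiB as tuples vs",
      as_array.nbytes // 1024**2, "MiB as a numpy array")
```

The array stores only the raw 8-byte integers, while every tuple and every int object carries its own Python object header, which is where the multiple-fold difference comes from.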
Well, I now have another problem with the newest version. There seem to be problems with ordering: when trying to split the output with mkgmap splitter, I get the error: Phyghtmap-produced data never had that problem, and running osmosis on this data to sort it would again fail due to size constraints. I tried whether it's related to a map being south of the equator and used Fiji, which was fine, except for this message, although I don't see a problem with the data output at first glance: hgt file /home/contourlines/hgt/SRTM1v3.0/S20W180.hgt: 3601 x 3601 points, bbox: (-180.00000, -20.00000, -179.00000, -19.00000), checking polygon borders. Not sure what this means; maybe the input data is problematic on that tile? I then tried Ecuador to have a map crossing the equator. Also no problem.
That's a bit strange, as I didn't change the nodes management with the latest change.
Those files crash Osmosis with an out-of-memory error if I try to use it to sort them. The error message is from mkgmap splitter.
BTW, I downloaded a fresh copy of S20W180 from SRTM and the small bug above still happens (this source gives hgt files directly for download, not tif). Also, the download tool crashed when it tried to download with a correct username/password (or maybe it crashed on unzipping): During handling of the above exception, another exception occurred: Traceback (most recent call last):
I will try tomorrow using this version (I think this one must still be close to identical, and I believe it will run under current Ubuntu): or should I even downgrade to this one: At the same time I'm running the most recent version against South America and Asia to see if it splits or not. On both continents I definitely didn't change any input data since the last successful run.
You have a credentials issue. You may delete your
hgt file /home/contourlines/hgt/SRTM1v3.0/N25E003.hgt: 3601 x 3601 points, bbox: (3.00000, 25.00000, 4.00000, 26.00000)
Killed
Not sure why this is happening; the command was:
nice -n 19 pyhgtmap --earthexplorer-user=test --earthexplorer-password=Test --jobs=10 --polygon=/home/contourlines/bounds/africa.poly --step=10 --no-zero-contour --void-range-max=-420 --output-prefix=africa --line-cat=$detail --start-node-id=10000000 --start-way-id=10000000 --source=view1,srtm1,view3,srtm3 --max-nodes-per-way=230 --max-nodes-per-tile=0 --pbf --hgtdir=/home/contourlines/hgt -j16
With around 62GB of available RAM. Did it run out of memory? I will rerun with --jobs=2, but a better error message explaining why it was killed would be great. Europe was killed as well, with the above command.