I am trying to use odjitter with a subset of the São Paulo OD data, and it crashes when I set `--max-per-od 1`. It works fine with `--max-per-od 100` and `--max-per-od 10`. My PC freezes during the run, so it is probably a RAM usage problem -- I have a 6th-gen Core i5 with 8 GB of RAM running Ubuntu 20.04.3 LTS.
Here is a reproducible example (using R):
piggyback::pb_download(file = "zones_sp_center.geojson", repo = "spstreets/OD2017")
piggyback::pb_download(file = "od_sp_center.csv", repo = "spstreets/OD2017")
system("odjitter --od-csv-path ./od_sp_center.csv --zones-path ./zones_sp_center.geojson --max-per-od 1 --output-path result.geojson")
# Scraped 114 zones from ./zones_sp_center.geojson
# Disaggregating OD data
# Killed
Thanks for the bug report! I can confirm the problem -- I managed to run it, producing a 2GB output file, but it took 30GB of RAM, which was right at my laptop's limit. :) The problem is that the tool buffers the entire GeoJSON representation in memory and writes it all at once. There's no reason to do this; I'll work on a fix.
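For reference, here is a rough sketch (not the actual odjitter code, just an illustration using only the Rust standard library) of the streaming approach: instead of building one giant `FeatureCollection` object in memory and serializing it at the end, write the GeoJSON wrapper and then emit each already-serialized feature as it is produced, so peak memory stays constant regardless of output size.

```rust
use std::io::{BufWriter, Result, Write};

/// Stream a GeoJSON FeatureCollection to `out` one feature at a time.
/// `features` yields already-serialized GeoJSON Feature objects, so the
/// full output never has to exist in memory at once.
fn write_feature_collection<W: Write, I: Iterator<Item = String>>(out: W, features: I) -> Result<()> {
    let mut w = BufWriter::new(out);
    w.write_all(b"{\"type\":\"FeatureCollection\",\"features\":[")?;
    for (i, feature) in features.enumerate() {
        if i > 0 {
            // Comma-separate features inside the JSON array.
            w.write_all(b",")?;
        }
        w.write_all(feature.as_bytes())?;
    }
    w.write_all(b"]}")?;
    w.flush()
}

fn main() -> Result<()> {
    // Hypothetical demo: two tiny point features streamed to stdout.
    let feats = vec![
        r#"{"type":"Feature","geometry":{"type":"Point","coordinates":[0.0,0.0]},"properties":{}}"#.to_string(),
        r#"{"type":"Feature","geometry":{"type":"Point","coordinates":[1.0,1.0]},"properties":{}}"#.to_string(),
    ];
    write_feature_collection(std::io::stdout().lock(), feats.into_iter())
}
```

In the real tool the features would come from the disaggregation loop rather than a `Vec`, but the principle is the same: memory use is bounded by one feature at a time plus the write buffer.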