I have a large (in extent, not in size) polygon layer of the German federal state borders in my PostgreSQL database. There is also a GiST index on the geom column:
CREATE INDEX germany_bld_geom_idx
ON spatial_derived.germany_bld
USING gist
(geom);
Loading the layer into R with the sf package is extremely slow, taking about 40 s:
# bld = sf::st_read_db(con, table = c('spatial_derived', 'germany_bld'))
bld = sf::st_read_db(con, query = "SELECT * FROM spatial_derived.germany_bld", geom_column = 'geom')
With the postGIStools package, the same query finishes in about 0.5 s:
bld = postGIStools::get_postgis_query(con, "SELECT * FROM spatial_derived.germany_bld", geom_name="geom")
bld = sf::st_as_sf(bld)
Up to now I have only seen advantages to using sf in R, but in this case other packages are faster. Is st_read_db() still under development? What causes this big difference?
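For reference, a minimal sketch of how the two timings above were compared, assuming a live DBI/RPostgreSQL connection `con` (the connection parameters here are placeholders, and absolute timings will vary with the data and machine):

```r
library(DBI)
library(RPostgreSQL)

# Hypothetical connection -- adjust dbname/host/user to your setup
con <- dbConnect(PostgreSQL(), dbname = "gis", host = "localhost")

# Approach 1: sf::st_read_db() (~40 s in this report)
t_sf <- system.time(
  bld_sf <- sf::st_read_db(
    con,
    query       = "SELECT * FROM spatial_derived.germany_bld",
    geom_column = "geom"
  )
)

# Approach 2: postGIStools::get_postgis_query() + sf::st_as_sf() (~0.5 s)
t_pgt <- system.time({
  bld_raw <- postGIStools::get_postgis_query(
    con,
    "SELECT * FROM spatial_derived.germany_bld",
    geom_name = "geom"
  )
  bld_pgt <- sf::st_as_sf(bld_raw)
})

# Compare elapsed times side by side
print(rbind(sf = t_sf, postGIStools = t_pgt))

dbDisconnect(con)
```

Both calls return the same features, so the difference is purely in how each package fetches and parses the geometries from PostGIS.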
PS: The question is also on: https://gis.stackexchange.com/questions/257576/sfst-read-db-considerably-slower-than-postgistoolsget-postgis-query?noredirect=1#comment406825_257576