Geoserver startup is significantly slower due to chown of GEOSERVER_DATA_DIR and GEOWEBCACHE_CACHE_DIR #515
Comments
I think we need a way to test if GWC is inside the data directory, and then chown only once. As for the flag, how would it still ensure that the correct folder permissions are preserved? Remember, you are now able to pass your uid to preserve local folder permissions inside the data directory.
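That containment check could be expressed with a shell sketch like the following (a hypothetical helper, not the actual `entrypoint.sh` code; the paths and defaults are illustrative):

```shell
#!/usr/bin/env bash
# Sketch: decide whether GEOWEBCACHE_CACHE_DIR needs its own chown, or is
# already covered by the chown of GEOSERVER_DATA_DIR. Defaults below are
# illustrative, not the image's real defaults.
GEOSERVER_DATA_DIR="${GEOSERVER_DATA_DIR:-/opt/geoserver/data_dir}"
GEOWEBCACHE_CACHE_DIR="${GEOWEBCACHE_CACHE_DIR:-${GEOSERVER_DATA_DIR}/gwc}"

# True when $2 equals $1 or lies underneath it (pure string comparison).
is_subpath() {
  case "$2/" in
    "$1"/*) return 0 ;;
    *)      return 1 ;;
  esac
}

if is_subpath "${GEOSERVER_DATA_DIR}" "${GEOWEBCACHE_CACHE_DIR}"; then
  echo "cache dir is inside data dir: chown data dir only"
  # chown -R "${USER_ID}:${GROUP_ID}" "${GEOSERVER_DATA_DIR}"
else
  echo "cache dir is separate: chown both"
  # chown -R "${USER_ID}:${GROUP_ID}" "${GEOSERVER_DATA_DIR}" "${GEOWEBCACHE_CACHE_DIR}"
fi
```

The trailing slash in the comparison matters: it keeps `/data` from being treated as a parent of `/databackup`.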
I would assume this would cause a problem on the first run only. As long as the uid does not change after the first run, we do not need to always re-run the chown.

We have been using your Geoserver image since 2.9.3, thanks for the great image by the way. We have never encountered ownership issues, except at the big upgrade from 2.9.3 to 2.19.0. We had to do this. So it's not about …
I have got a much more elegant solution that I am going to implement.
I will investigate this toggle behaviour.
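The requested toggles might be wired up roughly like this (a sketch using the flag names proposed in the issue; the merged implementation may differ, and the `echo` stands in for the real `chown`):

```shell
#!/usr/bin/env bash
# Sketch of opt-out flags for the recursive chown; defaults preserve the
# current always-chown behaviour. Flag names come from the issue request.
CHOWN_DATA_DIR="${CHOWN_DATA_DIR:-true}"
CHOWN_CACHE_DIR="${CHOWN_CACHE_DIR:-true}"
GEOSERVER_DATA_DIR="${GEOSERVER_DATA_DIR:-/opt/geoserver/data_dir}"
GEOWEBCACHE_CACHE_DIR="${GEOWEBCACHE_CACHE_DIR:-${GEOSERVER_DATA_DIR}/gwc}"

maybe_chown() {  # $1: toggle value, $2: directory
  if [ "$1" = "true" ]; then
    echo "would run: chown -R \${USER_ID}:\${GROUP_ID} $2"
    # chown -R "${USER_ID}:${GROUP_ID}" "$2"
  else
    echo "skipping chown of $2"
  fi
}

maybe_chown "${CHOWN_DATA_DIR}"  "${GEOSERVER_DATA_DIR}"
maybe_chown "${CHOWN_CACHE_DIR}" "${GEOWEBCACHE_CACHE_DIR}"
```

Setting `CHOWN_DATA_DIR=false CHOWN_CACHE_DIR=false` would then skip both recursive walks, so a 229 GB data directory no longer delays startup.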
Geoserver: update to latest version 2.22.2 to get vulnerability fix (#307)

For the vulnerability in `jt-jiffle` < 1.1.22, see https://nvd.nist.gov/vuln/detail/CVE-2022-24816 and GHSA-v92f-jx6p-73rx.

Changed to use the CORS (Cross-Origin Resource Sharing) default config from the image instead of our own. Both are quite similar, so if we can use the default config, future upgrades will be simpler.

The new Geoserver version has `jt-jiffle` 1.1.24; the old one had version 1.1.20:

```
$ docker run -it --rm --entrypoint bash pavics/geoserver:2.22.2-kartoza-build20230226-r5-allow-change-context-root-and-fix-missing-stable-plugins
(Kartoza GeoServer ASCII-art startup banner elided)
root@c3787dccea2d:/geoserver# find / -iname '**jt-jiffle**'
/usr/local/tomcat/webapps/geoserver/WEB-INF/lib/jt-jiffle-language-1.1.24.jar
/usr/local/tomcat/webapps/geoserver/WEB-INF/lib/jt-jiffle-op-1.1.24.jar
root@c3787dccea2d:/geoserver#
```

We used our own custom-built image because the original kartoza image is missing 2 plugins that we use (see kartoza/docker-geoserver#508) and to avoid the excessively slow startup caused by kartoza/docker-geoserver#515.
CORS config difference:

```diff
--- web.xml.old 2023-03-22 16:10:20.000000000 -0400
+++ web.xml.new 2023-03-22 16:10:06.000000000 -0400
   <filter>
     <filter-name>CorsFilter</filter-name>
     <filter-class>org.apache.catalina.filters.CorsFilter</filter-class>
     <init-param>
-      <param-name>cors.allowed.methods</param-name>
-      <param-value>GET,POST,HEAD,OPTIONS,PUT</param-value>
-    </init-param>
-    <init-param>
       <param-name>cors.allowed.origins</param-name>
       <param-value>*</param-value>
     </init-param>
     <init-param>
       <param-name>cors.allowed.headers</param-name>
-      <param-value>Content-Type,X-Requested-With,accept,Origin,Access-Control-Request-Method,Access-Control-Request-Headers,Authorization,Authentication</param-value>
+      <param-value>Content-Type,X-Requested-With,accept,Access-Control-Request-Method,Access-Control-Request-Headers,If-Modified-Since,Range,Origin,Authorization</param-value>
+    </init-param>
+    <init-param>
+      <param-name>cors.exposed.headers</param-name>
+      <param-value>Access-Control-Allow-Origin,Access-Control-Allow-Credentials</param-value>
     </init-param>
   </filter>
```

Missing: `cors.allowed.methods`. New: `cors.exposed.headers`. For `cors.allowed.headers`: missing `Authentication`, new `If-Modified-Since,Range`. Hopefully everything still works with the new CORS config, and future upgrades will be simpler.

The new version is able to start now. Syncing production data to a VM to test for upgrade problems; during the last upgrade there was some problem with the existing data. I highly suggest all organizations test with their existing data before going live.
Tested with the following notebooks; hopefully the CORS changes are effectively exercised there:

* https://github.com/Ouranosinc/pavics-sdi/blob/f4aecf64889f0c8503ea67b59b6558ae18407cf6/docs/source/notebooks/WFS_example.ipynb
* https://github.com/Ouranosinc/pavics-sdi/blob/f4aecf64889f0c8503ea67b59b6558ae18407cf6/docs/source/notebooks/regridding.ipynb
* https://github.com/bird-house/finch/blob/877312d325d4de5c3efcb4f1f75fbe5cd22660d6/docs/source/notebooks/subset.ipynb
* https://github.com/Ouranosinc/raven/blob/0be6d77d71bcaf4546de97b13bafc6724068a73d/docs/source/notebooks/01_Getting_watershed_boundaries.ipynb with `RAVEN_GEO_URL` pointing to another Geoserver (also from this PR) to test CORS (Cross-Origin Resource Sharing)

## Other changes

- Raven: allow customizing the Geoserver it will use. Useful to test the local Geoserver or to run your own Geoserver with your own data. Defaults to the PAVICS Geoserver. Set `RAVEN_GEO_URL` in `env.local` to something like `https://host/geoserver/`.
- env.local.example: change the default Geoserver admin user from 'admin' to 'admingeo'. This only impacts new deployments, when `env.local.example` is instantiated to `env.local`. This avoids confusion with the admin user of Magpie, which is also 'admin'.
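Put together, the corresponding `env.local` fragment might look like this (a sketch: the host is a placeholder, and `GEOSERVER_ADMIN_USER` is a hypothetical variable name; check `env.local.example` for the real one):

```shell
# env.local fragment (sketch). Host is a placeholder.
# Point Raven at a custom Geoserver instead of the default PAVICS one:
export RAVEN_GEO_URL="https://host/geoserver/"
# Hypothetical variable name; see env.local.example for the actual one:
export GEOSERVER_ADMIN_USER="admingeo"
```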
@tlvu Can you test the PR and see if it resolves the issue for you?
What is the bug or the crash?

This line:

docker-geoserver/scripts/entrypoint.sh
Lines 84 to 88 in b8d4388

`GEOSERVER_DATA_DIR` is 229 GB, and `GEOWEBCACHE_CACHE_DIR` is under `GEOSERVER_DATA_DIR`, which basically makes it `chown` twice!

Found when upgrading from 2.19.0 to 2.22.2. That new `chown` was introduced somewhere in 2.21.x, I believe.

Please make it optional to `chown` `GEOSERVER_DATA_DIR` and `GEOWEBCACHE_CACHE_DIR`. Toggle env vars like `CHOWN_DATA_DIR=false` and `CHOWN_CACHE_DIR=false`, similar to `EXISTING_DATA_DIR=false`, would be great.

Steps to reproduce the issue

Have a gigantic `GEOSERVER_DATA_DIR`; see all our Geoserver configs in PR bird-house/birdhouse-deploy#307.

Versions

2.22.2

Additional context

No response
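To get a feel for the startup cost on a smaller scale, one can time a recursive chown over a synthetic tree (a sketch; the file count is illustrative, and chown-ing to your own uid needs no root):

```shell
#!/usr/bin/env bash
# Create many small files and time a recursive chown, to see how the
# per-file cost scales with directory size. A real 229 GB data directory
# makes this dramatically slower on every container start.
dir=$(mktemp -d)
for i in $(seq 1 1000); do
  touch "${dir}/file_${i}"
done
count=$(ls "${dir}" | wc -l)
time chown -R "$(id -u):$(id -g)" "${dir}"
rm -rf "${dir}"
```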