
Importer Timeouts #417

Closed
PureLoneWolf opened this issue Feb 19, 2021 · 9 comments
Labels
bug Something isn't working

Comments

@PureLoneWolf

Hi there

Is there a way to amend the timeout settings when importing the Chowdown zip?
I think it may be down to the images/zip file size.

I have been playing around and got it to import 32 recipes, but the images in that batch were considerably smaller, so the overall file size was under 15 MB and it worked. A 28 MB file imported 22 recipes, then timed out after a while with an "Empty Response" error on the site. A 40 MB file imported 24 recipes before giving the same error.

I am trying to import around 300 recipes, split across multiple categories.

Is it possible to point the app to the folders so that it can grab the images/md files directly?

@vabene1111
Collaborator

Hmm, interesting. When testing I could import the 150 or so example recipes from the chowdown repo without any problems. I guess I will have to rewrite the importer module to run as a background processing task, so the request returns quickly and the import can finish on its own without hitting timeouts.
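That background-task idea could be sketched roughly like this in Python (purely illustrative; `JOBS`, `start_import`, and the polling model are hypothetical names, not Tandoor's actual code):

```python
import threading
import uuid

# In-memory job registry; a real implementation would persist this state.
JOBS = {}

def start_import(import_fn, *args):
    """Run a long import in a background thread and return a job id immediately.

    The HTTP handler can return this id right away, so the client polls for
    status instead of holding the connection open until nginx times out.
    """
    job_id = str(uuid.uuid4())
    JOBS[job_id] = {"status": "running", "error": None}

    def _run():
        try:
            import_fn(*args)
            JOBS[job_id]["status"] = "done"
        except Exception as exc:  # record the failure instead of timing out the request
            JOBS[job_id]["status"] = "failed"
            JOBS[job_id]["error"] = str(exc)

    threading.Thread(target=_run, daemon=True).start()
    return job_id
```

The key design point is that the request's lifetime is decoupled from the import's lifetime, so proxy timeouts stop mattering.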

Sadly I have another feature that I need to finish first, so in the meantime you will either need to split the import (which is annoying, I know) or play around with the nginx config to allow higher timeouts (although I am not 100% sure how to do that without researching). I kinda hoped the limit was a little higher 😂
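For the split workaround, here is a rough Python sketch that cuts an export zip into parts under a size cap (`split_zip` and the 15 MB cap are illustrative; note that a proper Chowdown split should also keep each recipe's `.md` file together with its images, which this naive version does not attempt):

```python
import os
import zipfile

def split_zip(src_path, out_dir, max_bytes=15 * 1024 * 1024):
    """Split src_path into several zips, each roughly under max_bytes.

    Sizes are estimated from uncompressed entry data, so the cap is approximate.
    Returns the list of part file paths.
    """
    os.makedirs(out_dir, exist_ok=True)
    parts = []
    current, size, idx = None, 0, 0
    with zipfile.ZipFile(src_path) as src:
        for info in src.infolist():
            data = src.read(info.filename)
            # Start a new part when the current one would exceed the cap.
            if current is None or size + len(data) > max_bytes:
                if current is not None:
                    current.close()
                idx += 1
                part_path = os.path.join(out_dir, f"part_{idx}.zip")
                current = zipfile.ZipFile(part_path, "w", zipfile.ZIP_DEFLATED)
                parts.append(part_path)
                size = 0
            current.writestr(info, data)
            size += len(data)
        if current is not None:
            current.close()
    return parts
```

Each resulting `part_N.zip` can then be uploaded to the importer separately.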

@vabene1111 vabene1111 added the bug Something isn't working label Feb 19, 2021
@vabene1111 vabene1111 pinned this issue Feb 19, 2021
@kushfest

I found a similar issue in the Nextcloud Cookbook importer. I'm trying to migrate 400+ recipes, which made for a zip file over 100 MB. When I tried to upload it, I'd get hit with a content length error. I figured my file was too large and I'd need to raise nginx's "client_max_body_size" setting.

I found that setting in the Recipes.conf file, but changing it there didn't solve it. I then opened a shell in my nginx_recipes container, added the setting to the http section of /etc/nginx/nginx.conf, and restarted the container. I no longer get the content length error! Without that setting in nginx.conf, it was likely falling back to the 1 MB default. I wonder if there is a way to simplify editing that setting through the interface?
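For reference, the change described above would look roughly like this (a sketch only; `client_max_body_size` is valid in the `http`, `server`, or `location` context, and the `128m` value is just an example sized for large recipe zips):

```nginx
# /etc/nginx/nginx.conf inside the nginx_recipes container
http {
    # nginx defaults to 1m; raise it so large import zips are accepted.
    # A value of 0 disables the check entirely.
    client_max_body_size 128m;

    # ... existing includes and server blocks unchanged ...
}
```

After editing, reload nginx (e.g. `nginx -s reload`) or restart the container for the change to take effect.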

The large zip file is giving me a 500 error now, so I'm still trying to figure that out haha, but a smaller test zip imported fine.

@PureLoneWolf
Author

Unfortunately, I can't locate an nginx.conf inside the Docker container. I have the Recipes.conf, but that doesn't do it.

I tried changing it in the nginx.conf for Unraid, but that didn't work.

Do you think I could add an nginx.conf to /etc inside the Unraid container?

@vabene1111
Collaborator

In the newer deployments the nginx config is mounted in a volume and should be editable.

I definitely need to improve this process, both with regard to upload size and by using some kind of background process.

@PureLoneWolf
Author

I am definitely willing to edit the conf file manually... if I could find it ;) Could you let me know which file to edit (and where it is) inside the Unraid Docker container?

Cheers

@vabene1111
Collaborator

I have no idea where it is in Unraid because that is a special setup that I did not create myself. I don't even know if Unraid uses an additional nginx in front of the main application.

But the config needs to be mounted into the nginx container under /etc/nginx/sites-enabled, so maybe that helps.
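If you run the standard docker-compose deployment, the mount being described would look roughly like this excerpt (the service name and host path here are illustrative, so check your own compose file for the actual values):

```yaml
# docker-compose.yml (excerpt, illustrative)
services:
  nginx_recipes:
    image: nginx:mainline-alpine
    volumes:
      # The site config is mounted so it appears under sites-enabled
      # inside the container and can be edited from the host.
      - ./nginx/conf.d:/etc/nginx/sites-enabled:ro
```

With a mount like this, editing the file on the host and restarting the nginx container is enough; no shell inside the container is needed.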

@vabene1111 vabene1111 changed the title Chowdown Import - Timing Out Importer Timeouts Feb 20, 2021
@nishantbb

> I found a similar issue in the Nextcloud Cookbook importer. [...] I wonder if there is a way to simplify editing that setting through the interface?

Yup, I'm importing ~70 Nextcloud Cookbook recipes. The entire folder was too big, but I got the first half to work by splitting it into quarters. My last two sections are getting 500 errors...

I will play with the NGINX settings.

@vabene1111
Collaborator

If you find settings that work well, let me know. Adjusting the settings might be a good temporary fix. For the long term a proper background task handling system is needed, but I currently don't think I have the time for that.

@vabene1111
Collaborator

Fixed with the next update.

@vabene1111 vabene1111 unpinned this issue Mar 18, 2021