ps2pdf compression timeouts for large pdf files #70
This seems to work great on most of my tested PDF files. Some PDF files (also around 250 pages) cause a timeout in pdfseparate. The problem is that at that point the script doesn't know the page count yet, so the timeout can't be calculated proportionally.
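One way around this chicken-and-egg problem is to query the page count from the document's metadata before splitting it, since reading metadata is cheap even for very large files. A minimal sketch, assuming poppler's `pdfinfo` tool is available (this is not how dangerzone-converter currently does it; the function names are illustrative):

```python
import re
import subprocess


def parse_page_count(pdfinfo_output: str) -> int:
    # pdfinfo (from poppler-utils) prints a line such as "Pages:  243".
    match = re.search(r"^Pages:\s+(\d+)\s*$", pdfinfo_output, re.MULTILINE)
    if match is None:
        raise ValueError("could not find a Pages: line in pdfinfo output")
    return int(match.group(1))


def pdf_page_count(path: str) -> int:
    # Reading metadata is fast even for huge files, so the page count
    # can be known before pdfseparate runs and its timeout scaled.
    out = subprocess.run(
        ["pdfinfo", path], check=True, capture_output=True, text=True
    ).stdout
    return parse_page_count(out)
```

With the count known up front, the pdfseparate timeout can be scaled the same way as the ps2pdf one.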
@lenke182, I encountered similar issues with large PDF documents of roughly 700 pages. I ended up increasing the timeouts (proportional to the page count) and switching from […]. My solution consisted of the following changes:
There are still cases where the GUI stops printing command-line output (10+ MB PDF files)... I'm no Python connoisseur, but in such cases I just watch the contents of the Docker volume folder and copy the generated safe PDF to its final destination once the Docker process exits.
@micahflee, I believe that many will face problems with big PDF files. My main use case for […]
Using dangerzone on a big PDF file with 243 pages results in a timeout during compression in ps2pdf. Everything else seems to work fine. I suggest increasing the timeout or calculating it based on the page count, since smaller PDF files won't take as long to compress as bigger ones.
https://github.com/firstlookmedia/dangerzone-converter/blob/master/scripts/pixels-to-pdf-unpriv#L125
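The suggested proportional timeout could be sketched as follows. This is not the dangerzone-converter code; the `scaled_timeout` helper and the base/per-page constants are illustrative assumptions, and `subprocess.run`'s `timeout` parameter raises `subprocess.TimeoutExpired` when exceeded:

```python
import subprocess


def scaled_timeout(num_pages: int, base: float = 60.0, per_page: float = 2.0) -> float:
    # base and per_page are illustrative values, not taken from the
    # dangerzone-converter source: 60 s of slack plus 2 s per page.
    return base + per_page * num_pages


def compress_pdf(input_pdf: str, output_pdf: str, num_pages: int) -> None:
    # Run ps2pdf with a timeout that grows with the page count, so a
    # 243-page file gets proportionally more time than a 10-page one.
    subprocess.run(
        ["ps2pdf", input_pdf, output_pdf],
        check=True,
        timeout=scaled_timeout(num_pages),
    )
```

For the 243-page file from this report, the sketch above would allow 60 + 2 × 243 = 546 seconds instead of a fixed limit.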