ps2pdf compression timeouts for large pdf files #70

Closed · lenke182 opened this issue Mar 28, 2020 · 3 comments
Labels: bug (Something isn't working), container timeout (Dangerzone Times Out)

Comments

@lenke182

Using dangerzone on a large PDF file with 243 pages results in a timeout during the ps2pdf compression step. Everything else seems to work fine. I suggest increasing the timeout or calculating it from the page count, since smaller PDF files won't take nearly as long to compress as larger ones.

https://github.com/firstlookmedia/dangerzone-converter/blob/master/scripts/pixels-to-pdf-unpriv#L125

Merging 243 pages into a single PDF
Compressing PDF
Error compressing PDF, ps2pdf timed out after 60 seconds
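
For example, something along these lines could work (just a sketch; the per-page factor and the compress_pdf helper are illustrative, not the script's current code):

```python
import subprocess

def compress_pdf(input_pdf, output_pdf, num_pages):
    # Scale the timeout with the document size instead of a flat 60 seconds.
    # The 2-seconds-per-page factor and 60-second floor are guesses.
    timeout = max(60, num_pages * 2)
    try:
        subprocess.run(
            ["ps2pdf", input_pdf, output_pdf],
            timeout=timeout,
            check=True,
        )
    except subprocess.TimeoutExpired:
        print(f"Error compressing PDF, ps2pdf timed out after {timeout} seconds")
        raise
```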
@lenke182 (Author) commented Apr 13, 2020

This seems to work well on most of the PDF files I tested.

Some PDF files (also around 250 pages) cause a timeout in pdfseparate instead. The problem is that at that point the script doesn't know the page count yet, so the timeout can't be calculated proportionally.

https://github.com/firstlookmedia/dangerzone-converter/blob/master/scripts/document-to-pixels-unpriv#L151

Separating document into pages
Error separating document into pages, pdfseparate timed out after 60 seconds

sudo: unable to resolve host 9227720a63d0: Temporary failure in name resolution
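
The page count could in principle be read up front with pdfinfo before pdfseparate runs, for example (sketch only, not what the script currently does; the helper name is made up):

```python
import re
import subprocess

def get_page_count(pdf_filename):
    # Ask pdfinfo for the "Pages:" field; fall back to 1 if it is missing.
    result = subprocess.run(
        ["pdfinfo", pdf_filename],
        capture_output=True, text=True, check=True, timeout=60,
    )
    match = re.search(r"^Pages:\s+(\d+)", result.stdout, re.MULTILINE)
    return int(match.group(1)) if match else 1
```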

@yveszoundi

@lenke182 , I encountered similar issues with large PDF documents of roughly 700 pages. I ended up increasing timeouts (proportional to page count) and switching from pdfseparate to pdftk.

My solution consisted of the following changes (a rough sketch follows at the end of this comment):

  • Global timeout derived from the page count reported by pdfinfo: global_timeout = pdfinfo_page_ct * 30
  • Changes in dangerzone-converter
    • Introduce new dependencies, pdftk and libvips-tools, in the Dockerfile
    • Update all scripts to use global_timeout as the process run timeout
    • In document-to-pixels-unpriv, switch from pdfseparate to pdftk: args = ["pdftk", pdf_filename, "burst", "output", "/tmp/page-%d.pdf"]
    • In document-to-pixels-unpriv, replace pdftocairo with vips: ["vips", "pdfload", pdf_filename, png_filename]. vips appears faster than pdftocairo, but I didn't benchmark it.
  • Changes in the dangerzone UI project
    • Use my custom Docker image by updating container.py and global_common.py
    • In tasks.py, skip image pulling since I'm using my own local Docker image.

There are still cases where the GUI stops printing command-line output (10+ MB PDF files)... I'm no Python connoisseur, but in such cases I just watch the contents of the Docker volume folder and copy the generated safe PDF to its final destination once the Docker process exits.
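
Roughly, the converter-side changes look like this (sketch only; the function name and surrounding wiring are illustrative, while the pdftk/vips argument lists and the *30 factor come from the list above):

```python
import subprocess

def convert_with_global_timeout(pdf_filename, page_count):
    # page_count would come from pdfinfo, as in the first bullet above.
    global_timeout = page_count * 30

    # Split into one PDF per page with pdftk instead of pdfseparate.
    subprocess.run(
        ["pdftk", pdf_filename, "burst", "output", "/tmp/page-%d.pdf"],
        timeout=global_timeout, check=True,
    )

    # Rasterize each page with vips instead of pdftocairo.
    for i in range(1, page_count + 1):
        subprocess.run(
            ["vips", "pdfload", f"/tmp/page-{i}.pdf", f"/tmp/page-{i}.png"],
            timeout=global_timeout, check=True,
        )
```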

@yveszoundi

@micahflee, I believe that many people will run into problems with big PDF files. My main use case for dangerzone is reading IT e-books (often 200+ pages). I'm only testing on an iMac; my other system runs Qubes OS.
