Very large images
Very large images require a lot of memory to be displayed and processed, and trying to load them in a browser can trigger browser bugs or make your desktop unresponsive. You have several options if you want to download such large images:
- Use dezoomify-rs, a command-line desktop application that can dezoom even very large images.
- Limit the size of the image: have dezoomify scale the image down so that it fits within your browser's limits.
- Use the dezoomify node application: this solution requires you to download tools, but works even with very large images (your computer still has to have enough RAM to store all the pixels of the full image, though).
- Make dezoomify request tiles through a proxy: this works right from the usual dezoomify website, but makes the image load slower, increases the load on our server, and may still fail on very large images.
You can also choose to use something other than dezoomify. For example, there is a shell script that can download tiles and assemble them.
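As a rough illustration of the tile-download-and-assemble approach (this is a made-up sketch, not the shell script mentioned above; `BASE_URL`, `COLS` and `ROWS` are hypothetical examples to adapt to the real tile server):

```shell
# Illustrative sketch only -- NOT the shell script referenced above.
# Prints the curl commands for a hypothetical 4x3 tile grid; drop the
# "echo" to actually download, then stitch the tiles with ImageMagick.
BASE_URL="https://example.com/tiles"
COLS=4
ROWS=3
for y in $(seq 0 $((ROWS - 1))); do
  for x in $(seq 0 $((COLS - 1))); do
    echo "curl -sO ${BASE_URL}/tile_${x}_${y}.jpg"
  done
done
# montage tile_*_*.jpg -tile "${COLS}x${ROWS}" -geometry +0+0 full_image.jpg
```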
You can have dezoomify generate a smaller image, in order to work around your browser's memory limitations.
Before creating the canvas, dezoomify checks whether its area (width × height, in pixels) is larger than `UI.MAX_CANVAS_AREA`. If so, it scales the image down. The default value for the maximum area is 268,435,456, which should work in most browsers. If your browser requires scaling images down even further, you can set the value in your browser console with, for instance:

`UI.MAX_CANVAS_AREA = 16777216`
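To illustrate the arithmetic behind the scaling (this is a sketch, not dezoomify's actual source; `fitToArea` is a made-up name): each dimension is multiplied by the square root of the ratio between the maximum area and the image's area.

```javascript
// Sketch of the scaling step: if width * height exceeds the maximum
// canvas area, shrink both dimensions by sqrt(maxArea / area) so that
// the scaled area fits under the limit. (fitToArea is a made-up name.)
const MAX_CANVAS_AREA = 268435456; // dezoomify's default limit

function fitToArea(width, height, maxArea = MAX_CANVAS_AREA) {
  const area = width * height;
  if (area <= maxArea) return { width, height, scale: 1 };
  const scale = Math.sqrt(maxArea / area);
  return {
    width: Math.floor(width * scale),
    height: Math.floor(height * scale),
    scale,
  };
}

// A 100000 x 100000 image gets scaled to roughly 16383 x 16383.
console.log(fitToArea(100000, 100000));
```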
There are two options for using dezoomify through a terminal: the dezoomify-rs desktop application, and the dezoomify node application found in this repository.

To make dezoomify request tiles through a proxy, open your browser console before starting the dezoomification, and type:
ZoomManager.proxy_tiles = "proxy.php";
The tiles will load slower, but in the end you will be offered a Save image button that lets you save the image to your computer.
Browsers forbid scripts from accessing images if the server they come from doesn't explicitly allow it. This means that by default, dezoomify can't access the images, and can't offer a Save image button. The line above makes dezoomify download the tiles through a proxy (proxy.php) that explicitly allows the script to access them.
This is slower, because the images are not downloaded directly. If you want it to be faster, you can use your own proxy. A fast proxy is available in this repository, at node-app/proxy.js. You can launch it with node, and then use `http://127.0.0.1:8181/` instead of `proxy.php`.