Use vips to improve speed? #1109
What speed, exactly, do you want to improve?
I have no specific use case. This is just a general request. According to the spreadsheet in the above link, vips is very fast.
Vips uses a different approach and therefore consumes much less memory, but it can't be significantly faster because of this. The real reason is that vips does its processing in parallel. There is also an experimental branch of Pillow with OpenMP if you need it.
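To illustrate the parallelism point in general terms (this is not vips or the OpenMP branch, just a stdlib sketch of the idea): split the image into horizontal strips and process each strip on its own worker. Note that in pure Python the GIL prevents a real speedup for CPU-bound loops like this; C libraries such as vips gain because their worker threads run native code with the GIL released.

```python
from concurrent.futures import ThreadPoolExecutor

def invert_strip(strip):
    """Invert one horizontal strip of 8-bit pixel rows."""
    return [[255 - px for px in row] for row in strip]

def invert_parallel(image, workers=4):
    """Split the image (a list of rows) into strips and process them
    concurrently, then stitch the results back together in order.

    Libraries like vips do the same thing with tiles in C, where the
    threads actually run in parallel.
    """
    step = max(1, len(image) // workers)
    strips = [image[i:i + step] for i in range(0, len(image), step)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        out = []
        for strip in pool.map(invert_strip, strips):  # preserves order
            out.extend(strip)
    return out

img = [[x % 256 for x in range(8)] for _ in range(8)]
assert invert_parallel(img) == [[255 - px for px in row] for row in img]
```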
First, there are Python bindings for vips, so if anyone wants to use it, they can. Second, they're testing PIL 1.1.7 with bilinear scaling, which has recently been replaced with faster and better code. Third, I'm not sure I really buy their memory methodology.
@wiredfool what do you mean with "I'm not sure I really buy their memory methodology"?
Counting memory usage by summing RSS in multithreaded programs risks double-counting memory that's actually shared. It could be interpreted as an upper bound. (Though less changes behind ps's back there than when calculating processor usage percentage.)
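For reference, this is the kind of per-process figure such benchmarks sum — a minimal sketch using the stdlib `resource` module (POSIX-only, so not available on Windows). The caveat from the comment above applies: pages shared between processes show up in each process's RSS, so summing these values over a multiprocess pipeline can only be read as an upper bound.

```python
import resource  # POSIX-only; raises ImportError on Windows
import sys

def peak_rss_bytes():
    """Peak resident set size of the current process, in bytes.

    ru_maxrss is reported in kilobytes on Linux but in bytes on
    macOS, so normalise per platform. Shared pages (e.g. mapped
    libraries, copy-on-write forks) are counted once per process
    that maps them, which is why summing RSS across processes
    overstates total physical memory use.
    """
    peak = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
    if sys.platform == "darwin":
        return peak          # already bytes on macOS
    return peak * 1024       # kilobytes elsewhere (Linux)

print(peak_rss_bytes())
```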
Thank you for your reply. I am not a memory expert; I guess you are right. I don't want to start a childish A vs. B debate. I am a user of the PIL API. A lot of people use PIL, but vips does not seem to be as widespread. I just want to know why the Pillow developers don't use vips. What are the reasons?
It's a separate project, with different goals. Pillow's goal isn't really to be the fastest, rather to be correct and increasingly complete across a range of Pythons and platforms. I think it would be far easier to write a layer on top of VIPS exposing that part of the Pillow API that it implements than to rehost Pillow on VIPS.
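The "layer on top" idea is essentially the adapter pattern. A minimal sketch of the shape it might take, with an invented in-memory stand-in for the vips handle (the names `VipsImageStub` and `VipsBackedImage` are hypothetical, not real pyvips or Pillow API; a real adapter would delegate to a `pyvips.Image`):

```python
class VipsImageStub:
    """Hypothetical stand-in for a vips image handle."""
    def __init__(self, width, height):
        self.width, self.height = width, height

    def resize_to(self, width, height):
        # Real code would call into the vips resize operation here.
        return VipsImageStub(width, height)


class VipsBackedImage:
    """Exposes a small slice of a Pillow-like Image API, delegating
    the actual work to the vips-style backend underneath."""
    def __init__(self, backend):
        self._backend = backend

    @property
    def size(self):
        # Pillow reports size as a (width, height) tuple.
        return (self._backend.width, self._backend.height)

    def resize(self, size):
        w, h = size
        return VipsBackedImage(self._backend.resize_to(w, h))


im = VipsBackedImage(VipsImageStub(640, 480))
assert im.resize((320, 240)).size == (320, 240)
```

Callers written against the Pillow-style interface would not need to know which backend does the pixel work, which is what makes this direction easier than rehosting Pillow on VIPS.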
If I were to code a wrapper around vips to expose the Pillow API, would you use it? If not, why would you prefer Pillow over wrapped vips?
Please ask on Stack Overflow if you don't get an answer here.
@aclark4life I don't have a specific problem. I see that there are several image processing libraries for Python. To me this looks like reinventing the wheel. I am curious why the Pillow developers don't use a library which is, according to its docs, very fast. On Stack Overflow, a question like this would get downvoted and closed in seconds.
@guettli Ah, indeed. OK, in that case the answer is "yes". If you were to integrate VIPS with Pillow in a PR then we'd definitely consider accepting it. I suspect you are getting a lukewarm reception because the core team is mostly concerned with fixing bugs and getting releases out the door. Enhancements are a luxury for us and we mostly rely on community contributions to implement them.
By the way, I asked the VIPS author to re-run with Pillow 2.7.0, which shows some improvement:
[benchmark results image]
@hugovk Wow, interesting right? |
The speed improvement looks good. Thank you for sharing. |
VIPS is a library for image processing.
According to the docs, it is fast and lightweight:
http://www.vips.ecs.soton.ac.uk/index.php?title=Speed_and_Memory_Use
What about using vips to improve the speed of Pillow?