Bad performance? #263
Comments
Have you tried disabling the blur?
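A minimal sketch of what that test could look like, assuming the page applies the blur via a CSS class and uses the 0.4.x-era `onrendered` callback API; the `menu-blur` class and `#overlay` element are hypothetical names, not taken from the actual site:

```js
// Hypothetical sketch: check whether the CSS blur is the bottleneck by turning
// it off before rendering. The "menu-blur" class and the #overlay element are
// assumed names, not taken from the actual site.
var body = document.body;
body.classList.remove('menu-blur');            // temporarily disable the blur

html2canvas(body, {
  // html2canvas 0.4.x used an onrendered callback rather than a Promise
  onrendered: function (canvas) {
    body.classList.add('menu-blur');           // restore the blur afterwards
    document.getElementById('overlay').appendChild(canvas);
  }
});
```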
The blur is processed after the screenshot is generated and it's GPU-accelerated, so the problem is not the blur. Here is a screenshot of the JS timeline when I click the button: http://puu.sh/4blyf/4f65a59c05.png These 985 ms (and sometimes more) come from your script.
From right before calling [...] with this code in there: [...]
It runs [...], so the problem is somewhere in your application.
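The stripped snippet above isn't recoverable, but the measurement it describes amounts to timestamping either side of the html2canvas call. A hedged reconstruction of that kind of check (not the original code):

```js
// Hypothetical reconstruction of the kind of measurement described above:
// timestamp right before the call and again inside the callback, so the
// library's share of the delay can be separated from the rest of the app.
var start = performance.now();

html2canvas(document.body, {
  onrendered: function (canvas) {
    console.log('html2canvas took ' +
                (performance.now() - start).toFixed(1) + ' ms');
    // Anything slow after this point (applying the blur, inserting the canvas,
    // CSS transitions, ...) is the application's cost, not the library's.
  }
});
```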
Fixed: christianparpart/xzero-web@312fb20 It's still a bit laggy, but I guess that's normal ;) Thanks for helping,
Would be faster if you just used [...]
My tests show that JPEG at 90% quality is faster than lossless PNG. It doesn't work when I apply the blur filter to the body, and the exit effect (when I click the close button) is impossible to create without the plugin. Thanks anyway ;)
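For context, the JPEG-versus-PNG comparison mentioned here comes down to the type and quality arguments of `canvas.toDataURL`. A small sketch of how the two exports could be timed (the surrounding call shape is an assumption):

```js
// Sketch comparing exports of the rendered canvas: lossless PNG vs JPEG at 90% quality.
function timeExport(canvas, type, quality) {
  var start = performance.now();
  var dataUrl = canvas.toDataURL(type, quality);
  console.log(type + ': ' + (performance.now() - start).toFixed(1) +
              ' ms, ' + dataUrl.length + ' chars');
  return dataUrl;
}

html2canvas(document.body, {
  onrendered: function (canvas) {
    timeExport(canvas, 'image/png');        // lossless, larger, usually slower to encode
    timeExport(canvas, 'image/jpeg', 0.9);  // 90% quality, typically faster
  }
});
```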
Alright :)
Googling "html2canvas slow" turns up several results, and yet this has never been properly addressed by the author, at least through looking into culprit code and understanding why it takes so long and what (if anything) can be done about it, and sharing those details with the community in the interests of a fix. |
@ArcaneEngineer I am not sure where you gathered that it isn't clear where the bulk of the time for creating a render goes. The reason it is slow is very clear; that doesn't mean it can be properly addressed. First, unless you want html2canvas to manipulate almost every single DOM node in your document, the whole DOM has to be cloned into a separate window context where the library can freely mutate it without touching the original, which by itself can already take ~100 ms. Parsing every DOM node/pseudo-element, its computed CSS values, and its position on the document takes some time as well, but the real performance killer is measuring/rendering text. There are no proper low-level APIs for measuring text wrapping, or even for rendering with letter spacing, which means that letter-based rendering may require measuring the bounds of every single letter in the document. You could start parsing font files manually and calculating glyph sizes yourself, but then you would get more issues about why the library is half a MB in size and why the author hasn't properly addressed that.
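To illustrate the text-measurement cost described above (a rough illustration, not html2canvas's actual implementation): without a low-level API, measuring wrapped text means asking the layout engine for the bounds of each character, for example via `Range.getBoundingClientRect`, which forces layout work per letter.

```js
// Rough illustration (not html2canvas's actual implementation) of why
// per-letter measurement is expensive: one Range plus one
// getBoundingClientRect call per character forces layout work for every letter.
function measureLetters(textNode) {
  var rects = [];
  for (var i = 0; i < textNode.length; i++) {
    var range = document.createRange();
    range.setStart(textNode, i);
    range.setEnd(textNode, i + 1);
    rects.push(range.getBoundingClientRect()); // bounds of one character
  }
  return rects; // thousands of characters => thousands of layout queries
}
```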
So the software rendering is the real killer here... is this primarily a per-pixel cost? I have a minimal test case set up with a single [...]. If that sounds normal, then I can narrow down my options: optimise your code, or process on the back end.
For those stumbling upon this thread years later, consider using html-to-image instead |
html-to-image is good, but it will load all fonts. |
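For reference, a minimal html-to-image usage sketch (the element ID is a hypothetical name); whether font embedding can be skipped depends on the library version and its options, so check its documentation:

```js
// Minimal html-to-image sketch; "capture-me" is a hypothetical element ID.
import { toPng } from 'html-to-image';

const node = document.getElementById('capture-me');

toPng(node)
  .then(function (dataUrl) {
    var img = new Image();
    img.src = dataUrl;               // the rendered subtree as a PNG data URL
    document.body.appendChild(img);
  })
  .catch(function (err) {
    console.error('html-to-image failed:', err);
  });
```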
Hello, I found your repository 2 days ago and integrated it into an open-source web server's website.
When I click on the menu in the header, the rendering is slow. Sometimes the browser (Chrome in my case) randomly crashes after I click the button.
Is this normal?
URL:
http://xzero.io
Thanks,