
iTerm IIP support #43

Merged: 32 commits, Feb 22, 2023
Conversation

@jerch (Owner) commented Feb 4, 2023

Supported formats:

  • PNG
  • JPEG
  • GIF (non animated)

Other formats are possible but not yet a goal (theoretically only limited by browser support). The same goes for animated GIF or APNG - we currently have no way to output animated image content, since we do canvas scraping driven by xterm.js' render cycle. Maybe in a later incarnation...

TODO:

  • compare behavior with iTerm
  • faster alternative to atob + image - done with custom b64 decoder + createImageBitmap
  • ctor options + limits
  • pull image type + dimensions early (see the type/size sketch after this list)
  • fix ImageStorage to support other data sources than canvas - added direct ImageBitmap support
  • test cases:
    • header parser
    • base64 decoder
    • image type deduction + size
  • high-level API tests
  • docs in readme
  • document base64 style
  • fix demo ctor broken #42
  • remove png-ts dependency
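
For illustration only - this is not the addon's actual header parser, just a minimal sketch of how the image type and dimensions could be pulled early from the first decoded bytes (names and structure are made up for the example):

   // Sketch: sniff the image type from magic bytes and read PNG dimensions
   // from the IHDR chunk. Illustrative only, error handling omitted.
   type ImageType = 'png' | 'jpeg' | 'gif' | 'unknown';

   function guessImageType(d: Uint8Array): ImageType {
     if (d.length >= 4 && d[0] === 0x89 && d[1] === 0x50 && d[2] === 0x4E && d[3] === 0x47) {
       return 'png';   // \x89 P N G
     }
     if (d.length >= 3 && d[0] === 0xFF && d[1] === 0xD8 && d[2] === 0xFF) {
       return 'jpeg';  // SOI marker
     }
     if (d.length >= 3 && d[0] === 0x47 && d[1] === 0x49 && d[2] === 0x46) {
       return 'gif';   // "GIF87a" / "GIF89a"
     }
     return 'unknown';
   }

   // PNG stores width/height big-endian at fixed offsets in the IHDR chunk;
   // JPEG and GIF need their own header walk, which is omitted here.
   function pngSize(d: Uint8Array): { width: number; height: number } {
     const view = new DataView(d.buffer, d.byteOffset, d.byteLength);
     return { width: view.getUint32(16), height: view.getUint32(20) };
   }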

Issues:

  • b64 decoder + blob creation is blocking (~70ms for a 25MB PNG); several ways to fix this:
    • make the b64 decoder streaming, spreading the work across chunks - a single invocation is <1ms
    • use a wasm b64 decoder - decoding speedup ~3.8x (~7x with SIMD)
    • web workers to the rescue again? - nope
  • drawImage on canvas for resizing is blocking - work around with a worker / offscreen canvas? - solved by applying the resize metrics directly on bitmap creation (not for Safari; see the resize sketch after this list)
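
The resize sketch mentioned above - not the addon's code, just the general pattern of letting the browser apply the resize metrics while creating the bitmap instead of doing a blocking drawImage afterwards (the target size calculation is omitted):

   // Let createImageBitmap apply the resize during decode instead of a
   // blocking drawImage onto a canvas afterwards. The resize options are
   // not available in Safari, which is why a fallback path is still needed.
   async function decodeResized(
     bytes: Uint8Array,
     mime: string,
     targetWidth: number,
     targetHeight: number
   ): Promise<ImageBitmap> {
     const blob = new Blob([bytes], { type: mime });
     return createImageBitmap(blob, {
       resizeWidth: targetWidth,
       resizeHeight: targetHeight,
       resizeQuality: 'high'
     });
   }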

@jerch (Owner, Author) commented Feb 5, 2023

With the last commit we gain 3-6 times faster processing speed by avoiding string + GC pressure; atob also seems to be much slower than the custom base64 decoder. Things now get usable with typical web-optimized image sizes, but we are not there yet - for bigger images the blocking of the base64 decoder + blob creation on the main thread is a clear showstopper (~70ms for a 25MB PNG file).
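
For reference, the core idea of such a decoder in minimal scalar form - this is a simplified sketch, not the addon's actual Base64Decoder, and error handling for invalid input is omitted:

   // Map base64 characters to 6-bit values via a lookup table and write the
   // decoded bytes straight into a Uint8Array, avoiding atob's intermediate
   // binary string and the GC pressure it causes.
   const MAP = new Uint8Array(256).fill(255);
   const ALPHABET = 'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/';
   for (let i = 0; i < ALPHABET.length; ++i) MAP[ALPHABET.charCodeAt(i)] = i;

   function decodeBase64(input: Uint8Array): Uint8Array {
     // strip '=' padding
     let len = input.length;
     while (len > 0 && input[len - 1] === 0x3D) len--;
     const out = new Uint8Array((len * 3) >> 2);
     let o = 0;
     let i = 0;
     // full 4-char groups -> 3 bytes
     for (; i + 4 <= len; i += 4) {
       const n = (MAP[input[i]] << 18) | (MAP[input[i + 1]] << 12)
               | (MAP[input[i + 2]] << 6) | MAP[input[i + 3]];
       out[o++] = n >> 16;
       out[o++] = (n >> 8) & 0xFF;
       out[o++] = n & 0xFF;
     }
     // trailing 2 or 3 chars -> 1 or 2 bytes
     if (i + 2 <= len) {
       const n = (MAP[input[i]] << 18) | (MAP[input[i + 1]] << 12)
               | ((i + 2 < len ? MAP[input[i + 2]] : 0) << 6);
       out[o++] = n >> 16;
       if (i + 2 < len) out[o++] = (n >> 8) & 0xFF;
     }
     return out;
   }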

@jerch (Owner, Author) commented Feb 8, 2023

Seems the new base64 decoder is quite speedy:

   Context "addons/xterm-addon-image/out/base64.benchmark.js"
      Context "Base64"
         Context "Node - Buffer"
            Case "decode - 256" : 100 runs - average throughput: 28.05 MB/s
            Case "decode - 4096" : 100 runs - average throughput: 241.21 MB/s
            Case "decode - 65536" : 100 runs - average throughput: 525.54 MB/s
            Case "decode - 1048576" : 100 runs - average throughput: 559.42 MB/s
         Context "Base64Decoder"
            Case "decode - 256" : 100 runs - average throughput: 39.17 MB/s
            Case "decode - 4096" : 100 runs - average throughput: 439.40 MB/s
            Case "decode - 65536" : 100 runs - average throughput: 1226.31 MB/s
            Case "decode - 1048576" : 100 runs - average throughput: 1392.19 MB/s

@jerch (Owner, Author) commented Feb 13, 2023

Some promising results with base64 decoding in SIMD (needs nodejs v16 to run):

   Context "addons/xterm-addon-image/out/base64.benchmark.js"
      Context "Base64"
         Context "Node - Buffer"
            Case "decode - 256" : 100 runs - average throughput: 37.66 MB/s
            Case "decode - 4096" : 100 runs - average throughput: 286.23 MB/s
            Case "decode - 65536" : 100 runs - average throughput: 520.20 MB/s
            Case "decode - 1048576" : 100 runs - average throughput: 580.90 MB/s
         Context "Base64Decoder"
            Case "decode - 256" : 100 runs - average throughput: 43.50 MB/s
            Case "decode - 4096" : 100 runs - average throughput: 532.61 MB/s
            Case "decode - 65536" : 100 runs - average throughput: 1325.31 MB/s
            Case "decode - 1048576" : 100 runs - average throughput: 2704.22 MB/s

Not sure yet whether to go down that path, as it creates friction with Safari (which still needs the scalar version).

Edit:
Will stick with the scalar version for now, as it is still fast enough and we have workarounds in place for all the other Safari shortcomings. Still, I might have to drop Safari support in the future, as it turns more and more into a big hindrance to better functionality:

  • missing offscreen canvas support - no offloading of the image handling is possible because of this
  • missing createImageBitmap support - the shim creates bad DOM/main thread blocking and bloats the source (see the fallback sketch below the list)
  • missing wasm SIMD support - very ugly to work around, with bad performance and bloated sources
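
For the createImageBitmap point, the generic feature-detect/fallback pattern looks roughly like this (illustrative sketch, not the addon's actual shim):

   // Prefer createImageBitmap where available, otherwise decode via an <img>
   // element and an object URL on the main thread.
   async function toDrawable(blob: Blob): Promise<ImageBitmap | HTMLImageElement> {
     if (typeof createImageBitmap === 'function') {
       return createImageBitmap(blob);
     }
     const url = URL.createObjectURL(blob);
     try {
       const img = new Image();
       img.src = url;
       await img.decode();  // resolves once the image is decoded
       return img;
     } finally {
       URL.revokeObjectURL(url);
     }
   }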

In the long run all these issues will be a dealbreaker for Safari; they had better get these things supported, or Safari will die as a target platform for many web developers who need performant solutions. I really don't get why Apple neglects its once superior engine and turns it into the new ugly kid.

@jerch merged commit 4489c03 into master on Feb 22, 2023
Successfully merging this pull request may close: demo ctor broken (#42)