
Video feed broken in iOS 14 #244

Closed
pearcemichal opened this issue Sep 23, 2020 · 21 comments

Comments

@pearcemichal

I have not seen much about this, so I thought I would start a topic here. I have tested this both in QuaggaJS and Quagga2 with the same results.

After updating a device to iOS 14, the video feed appears as a thin line up the center of the display:
[screenshot: video feed rendered as a thin vertical line]

I also tried looking at the examples on quagga's site, and it does the same thing for most of them. However, the "Scan barcode to input-field" does seem to display the video correctly, though I wasn't able to get it to actually scan anything.

I will continue to test things and update this as I find anything.

@github-actions

Thank you for filing an issue! Please be patient. :-)

@ericblade
Owner

hmm. "Scan barcode to input-field"? Which example/demo is that?

I'll see if I can borrow an iThing that has 14 on it, but I don't really have a dev environment well suited for debugging things in iOS at the moment.

My gut feeling is that it might be necessary to add some width styling to the video element.

@pearcemichal
Author

pearcemichal commented Sep 23, 2020

It's one of the 1.0 beta examples here.

It opens the video as a full-screen overlay instead of as an inline video element, and I was actually able to get it to work on iOS by copying the source scripts (the ones being loaded, not the source code written on the page, which does not work).

Of course, while this works on iOS 14, it does not work on android. I get a notification stating that the camera is being used, but there is no visible camera feed.

EDIT: I forgot to mention, I did try adding width styling to the video element, and various other things on the page, but nothing I did made a difference as far as I can tell.

@reppard

reppard commented Sep 23, 2020

Similar issue here after the iOS 14 update. The camera appears to be very zoomed in, and the viewport seems smaller, not honoring the aspect ratio and orientation. Happy to help debug this issue.
[screenshot: zoomed-in camera feed in a small viewport]

@ericblade
Owner

The camera appears to be very zoomed in, and the viewport seems smaller, not honoring the aspect ratio and orientation.

Not sure that there's anything that we can do on our side if that's the case, it might be possible that there's actually some breakage in Safari's camera handling. I don't know if iOS devices report multiple cameras, like many Android devices do, that might be a thing? Perhaps it has different logical cameras at different aspect ratios, etc, and after the update, it's selecting a different camera by default?

I forgot to mention, I did try adding width styling to the video element, and other various things on the page

Worrisome. Normally, when I'm debugging things in my app that uses Quagga, I try to get it to do what I want in a regular desktop browser first, then load it to my local server and hit it with my cell phone. I don't know of any way to get iOS or Android to allow camera access to a dev server, but a desktop browser assumes that localhost is secure and allows it.

So, what I would probably start with doing, is putting down a div element on the page, setting a border around it in css, getting it to look like it's supposed to, and then passing { inputStream: { target: ThatVideoElement } } into the Quagga config. That should create the imagebuffer and drawingbuffer canvases as children of that element, which might shed some light on it. As well, if you create two canvas elements, one with an id of "imgBuffer" and one with an id of "drawingBuffer", then Quagga shouldn't do any modifying of the document at all, and that might also shed some light.
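A minimal sketch of that suggestion, for anyone following along (the `#scanner-target` id and the `ean_reader` choice are made up for illustration; `inputStream.target` is the relevant setting):

```javascript
// Hypothetical container in the page: <div id="scanner-target"></div>,
// styled with a visible CSS border so you can see where Quagga lands.
const config = {
    inputStream: {
        type: 'LiveStream',
        // Quagga appends its video/canvas children inside this element:
        target: document.querySelector('#scanner-target'),
        constraints: { facingMode: 'environment' },
    },
    decoder: { readers: ['ean_reader'] },
};
// Then: Quagga.init(config, err => { if (!err) Quagga.start(); });
```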

@harrybin

harrybin commented Sep 24, 2020

We face the same issue.
It seems like the aspect ratio can't be calculated properly since the iOS 14 update.
For some reason it seems to assume a width of 32768 instead of the real one, as iOS 13 reported.

Even if I set fixed values for width and height on the container, as well as on the video and canvas elements, I get really strange behavior on iOS 14:
[screenshot: side-by-side comparison]
The left side is Chrome on desktop and the right side is iOS 14.

container: {
      // transformOrigin: 'top left',
      // transform: `scale(50%)`,
      background: 'red',
      position: 'relative',
      maxWidth: videoWidth,
      minWidth: videoWidth,
      maxHeight: videoHeight,
      minHeight: videoHeight,
    },
    viewport: {
      position: 'relative',
      maxWidth: videoWidth,
      minWidth: videoWidth,
      maxHeight: videoHeight,
      minHeight: videoHeight,
    },
    video: {
      background:
        'linear-gradient(90deg, #fff 2px, transparent 90%) center, linear-gradient(#fff 2px, transparent 1%) center, green',
      position: 'relative',
      maxWidth: videoWidth,
      minWidth: videoWidth,
      maxHeight: videoHeight,
      minHeight: videoHeight,
    },
    canvas: {
      background:
        'linear-gradient(90deg, #fff 2px, transparent 90%) center, linear-gradient(#fff 2px, transparent 1%) center, blue',
      position: 'absolute',
      top: 0,
      left: 0,
      opacity: 0.4,
      maxWidth: videoWidth,
      minWidth: videoWidth,
      maxHeight: videoHeight,
      minHeight: videoHeight,
    },

for

 <div id="BarcodeScannContainer" className={classes.container} hidden={!mediaStreamActive}>
        <div id="interactive" className={clsx('viewport', classes.viewport)}>
          <video
            ref={videoRef}
            id="scannerVideoElem"
            className={clsx('videoCamera', classes.video)}
            autoPlay={true}
            preload="auto"
            src=""
            muted={true}
            playsInline={true}
          />
          <canvas className={clsx('drawingBuffer', classes.canvas)} />
        </div>
      </div>

@harrybin

harrybin commented Sep 24, 2020

So, what I would probably start with doing, is putting down a div element on the page, setting a border around it in css, getting it to look like it's supposed to, and then passing { inputStream: { target: ThatVideoElement } } into the Quagga config. That should create the imagebuffer and drawingbuffer canvases as children of that element, which might shed some light on it. As well, if you create two canvas elements, one with an id of "imgBuffer" and one with an id of "drawingBuffer", then Quagga shouldn't do any modifying of the document at all, and that might also shed some light.

I tried your suggestion:

      <div id="BarcodeScannContainer" className={classes.container} hidden={!mediaStreamActive}>
        <div id="interactive" ref={scannerRef} className={clsx('viewport', classes.viewport)}>         
          <canvas className={clsx('imgBuffer', classes.video)} />
          <canvas className={clsx('drawingBuffer', classes.canvas)} />
        </div>
      </div>

with these styles:

container: {
      // transformOrigin: 'top left',
      // transform: `scale(50%)`,
      background: 'red',
      position: 'relative',
      maxWidth: videoWidth,
      minWidth: videoWidth,
      maxHeight: videoHeight,
      minHeight: videoHeight,
    },
    viewport: {
      borderStyle: 'dotted',
      borderWidth: 'medium',
      position: 'relative',
      maxWidth: videoWidth,
      minWidth: videoWidth,
      maxHeight: videoHeight,
      minHeight: videoHeight,
    },
    video: {
      borderStyle: 'solid',
      borderWidth: 'medium',
      background:
        'linear-gradient(90deg, #fff 2px, transparent 90%) center, linear-gradient(#fff 2px, transparent 1%) center, green',
      position: 'absolute',
      top: 0,
      left: 0,
      maxWidth: videoWidth,
      minWidth: videoWidth,
      maxHeight: videoHeight,
      minHeight: videoHeight,
    },
    canvas: {
      borderStyle: 'dotted',
      borderWidth: 'thick',
      background:
        'linear-gradient(90deg, #fff 2px, transparent 90%) center, linear-gradient(#fff 2px, transparent 1%) center, blue',
      position: 'absolute',
      top: 0,
      left: 0,
      opacity: 0.4,
      maxWidth: videoWidth,
      minWidth: videoWidth,
      maxHeight: videoHeight,
      minHeight: videoHeight,
    },

The resulting HTML on the iPhone with iOS 14 is:
[screenshot: generated HTML on iOS 14]
The video tag is generated on the left (the red border is not from my CSS; I added it to the screenshot), just like in Chrome, but with the wrong dimensions, as you can see in the image.
On Chrome it looks correct for this code:
[screenshot: the same markup rendered in Chrome]

Any ideas how to work around this iOS14 issue?

@ericblade
Owner

To be clear in my understanding, is Quagga involved in either of those, or is it just HTML/CSS ?
If Quagga is not involved, then we've got a clear case of Safari doing something very different from Chrome. If Quagga is, then I think we need to trace out if Quagga is causing the problem somehow in Quagga.init().

Umm... right, I did think that there was supposed to be a video element as well as the canvases, but I didn't find it. It's been a while since I rewrote the init code.

So, I wonder if it might be possible to see the problem occurring in Safari on a Mac? It might be possible if it does occur on desktop, to be able to apply some adjustments in the page on the fly to see if there's something that can be adjusted to solve it? Or maybe that can be done in whatever debugger you were using above?

I'm here trying to get a handle on what might be a cause of a problem that I can't even see, since I don't have a modern iOS or macOS device, so I'm sorry if this goes slowly!

@harrybin

Hello Eric,
thanks for coming back on this.
Quagga is definitely involved. The video tag generated by Quagga can't be influenced from outside.
This video tag takes on strange dimensions on the iPhone (iOS 14) in portrait mode. Interestingly, it looks fine when turning the phone into landscape mode with all the fixed CSS values I used in my last tests. But either way, no barcode is detected on iOS 14.
My last screenshot doesn't show it, but the camera picture is shown in the imgBuffer canvas as well as in the video tag.
The picture inside that thin video-tag bar looks fine, but the picture in the canvas looks scaled and distorted. Maybe that's the reason why no barcode is detected.
I was not able to make the video tag take the correct dimensions.
So I tried other wrappers; for zxing I found react-webcam-barcode-scanner, which works out of the box.
I miss quagga2's configuration flexibility, but it works with iOS 14 and also supports QR codes in addition to barcodes, which I will need in the near future too. Maybe you can find some hints in that implementation as to why quagga stops working with iOS 14.
Regarding debugging an iOS device: I'm using "remotedebug_ios_webkit_adapter" (see: https://www.outsystems.com/blog/posts/how-to-troubleshoot-ios-on-windows/). It's not 100% stable and sometimes loses the connection when the phone turns off the display, but it beats being forced to use a Mac. (Hint: for iOS 13 I needed to use port 9000, and for iOS 14, port 9221.)

@ericblade
Owner

OK, so, when you are creating a LiveStream in Quagga, it will search for an element id'd "video", and if that's not found, then it will document.createElement('video') and viewport.appendChild(video)

So, if there's a specific element you want to use, giving it <video id="video" ...> should work, and if not, you can target it with CSS like video { border: 1px solid red; }
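The lookup described above can be sketched like this (a paraphrase for illustration, not Quagga's literal source):

```javascript
// Reuse an existing element with id "video" if the page provides one;
// otherwise create a <video> and append it to the viewport element,
// which is roughly what the LiveStream setup described above does.
function getOrCreateVideo(viewport) {
    let video = document.getElementById('video');
    if (!video) {
        video = document.createElement('video');
        video.id = 'video';
        viewport.appendChild(video);
    }
    return video;
}
```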

Sure, Quagga can't read QR codes by itself right now, but it does have the (experimental) ability to pass an image onto other code that can :) I don't know if this actually works in live mode yet, but it does for individual images https://github.com/ericblade/quagga2-reader-qr

I'm seeing this at https://firt.dev/ios-14:

[screenshot from firt.dev/ios-14]

I wonder if this is somehow related.

@lotusorrose

lotusorrose commented Oct 5, 2020

I tried both iOS 13 and iOS 14; on iOS 14 the line below throws "Unable to play video stream. Is webcam working?" even though a thin line of preview is shown. If I change the CSS for the video element to 100% width, the preview does show up full screen, but it is so zoomed in that it's unusable (even as a human I cannot make out anything). The same code does not throw any error on iOS 13.

Quagga.init(this.state, function(err) {
    if (err) {
        console.log(err);
        App.attachListeners();
        return;
    }
    App.attachListeners();
    App.checkCapabilities();
    Quagga.start();
});

I would be happy to help debug this further, as I managed to get hold of two devices, one on iOS 13 and one on iOS 14.

@ericblade
Owner

@lotusorrose would love the assist there. I don't really have access to any iOS devices. If you need any help finding your way around, or figuring out what something means or is doing, let me know.

That error is thrown by camera_access.ts, function waitForVideo, inner function checkVideo(). initCamera() calls waitForVideo() before returning. waitForVideo checks that the given video element's width and height are both > 10, then resolves; otherwise it fires again in 500 ms, up to 10 times. If all of that fails, it throws.

This is in a section of code that Christoph probably wrote a long long time ago, and he's not been available for questions for a really long time.. so, i'm not entirely sure why that check does what it does, but I can certainly see why it's throwing a failure eventually.
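The behaviour described above can be sketched roughly like this (a paraphrase, not Quagga's actual source; the function names mirror the description):

```javascript
// Does the video element report plausible dimensions yet? On iOS 14 with
// certain constraints the reported size can be bogus (a sliver-thin feed),
// so this check may never pass and the timeout below eventually throws.
function videoLooksReady(video) {
    return video.videoWidth > 10 && video.videoHeight > 10;
}

// Poll the element, resolving once it looks ready, rejecting after
// 10 attempts spaced 500 ms apart.
function waitForVideo(video, { attempts = 10, intervalMs = 500 } = {}) {
    return new Promise((resolve, reject) => {
        const check = (remaining) => {
            if (videoLooksReady(video)) return resolve(video);
            if (remaining <= 0) {
                return reject(new Error('Unable to play video stream. Is webcam working?'));
            }
            setTimeout(() => check(remaining - 1), intervalMs);
        };
        check(attempts);
    });
}
```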

@ericblade
Owner

Hey all -- we may have some sort of a way to handle this. I was starting to think about it, but I hadn't quite figured out what the problem would be: serratus/quaggaJS#444 (comment)

@lotusorrose @reppard @harrybin @pearcemichal

Could you post your constraints? I have a feeling that iOS is giving us a totally unexpected result when we request the camera with certain constraints.

@lotusorrose

lotusorrose commented Oct 7, 2020

Thanks @ericblade for the link. I have the same aspectRatio as mentioned there, and yes, if I comment out the aspect ratio line below, it works as before. The constraint is as follows:

constraints: {
    width: { min: 1024 },
    height: { min: 786 },
    aspectRatio: { min: 1, max: 100 }, // commenting out this line makes it work on iOS 14, iOS 13, and Android devices
    facingMode: "environment" // or user
}

@ericblade
Owner

ericblade commented Oct 7, 2020

This is making me wonder if iOS 14 is somehow supplying a 1 aspect ratio when you ask for it.

If this is the case -- would anyone want to try doing a bunch of getUserMedia() requests with varying aspectRatio settings, and see what happens? -- I'm not sure if this qualifies as a bug or a feature. If it's actually giving a 100 or otherwise ridiculously weird aspectRatio, that could be a feature where it's adapting the image to actually fit the request, but it's kind of a bug when it gives you a correct response to a sort of crazy request?
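For anyone willing to run that experiment, here is a hedged sketch to paste into a browser console on the device itself (`probeAspectRatios` is a made-up name, not Quagga API; it uses only standard getUserMedia/getSettings calls):

```javascript
// Request the camera with varying aspectRatio maximums and log what the
// browser actually grants, to see where iOS 14 starts behaving strangely.
async function probeAspectRatios(maxima = [1, 1.5, 1.78, 2, 100]) {
    for (const max of maxima) {
        const stream = await navigator.mediaDevices.getUserMedia({
            video: { facingMode: 'environment', aspectRatio: { min: 1, max } },
        });
        // getSettings() reports what the track is really using.
        const { width, height, aspectRatio } = stream.getVideoTracks()[0].getSettings();
        console.log({ requestedMax: max, width, height, aspectRatio });
        // Release the camera before the next request.
        stream.getTracks().forEach((t) => t.stop());
    }
}
```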

@ericblade
Owner

I was just reviewing my thoughts on the matter, and any relevant code, and it occurred to me that I was thinking incorrectly yesterday -- 1 isn't a totally bonkers aspect ratio, but anything higher than 2 really is, as aspect ratio is normally going to be a float of (width / height). So, a sane range would be min: 1, max: 2 . This is also what is used in the current example/live_w_locator .

Could someone try with aspectRatio set to min: 1 max: 2 and see if this also solves the issue? I'm not sure where people picked up max: 100 setting, I don't see it in any of the examples that are in the current source.
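For reference, a hedged version of the constraints with that saner range (the width/height minimums here are illustrative, not a recommendation):

```javascript
constraints: {
    width: { min: 640 },
    height: { min: 480 },
    aspectRatio: { min: 1, max: 2 }, // instead of { min: 1, max: 100 }
    facingMode: "environment"
}
```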

@reppard

reppard commented Oct 9, 2020

Could someone try with aspectRatio set to min: 1 max: 2 and see if this also solves the issue? I'm not sure where people picked up max: 100 setting, I don't see it in any of the examples that are in the current source.

@ericblade this got my stream working again in iOS 14.0.1 in a test environment. The styling is a bit off but that may be on my side. This looks promising!

As for max: 100, I too had this as my default max. I must have pulled it from some docs or an example somewhere; I don't recall where it came from though.

@ericblade
Owner

The styling is a bit off but that may be on my side.

I'm having a suspicion that it's doing some kind of dynamic resizing to fit the request, so perhaps we need people to be very explicit in their constraints -- rather than "give me something within these parameters", we might need to supply "give me the exact aspect ratio I want". Though I think these things may well vary between users and applications... I suspect that for full screen on a phone, you're going to want to pass it either width/height, or aspectRatio along with width/height.

It might be good for someone to spend some time noting what happens in mobile browsers when you request various things (i.e., what do I get if I ask for a camera at 1920x1080 but my display is 1080x1920, or smaller/larger than that?). Mobile Chrome and Safari seem to handle the camera quite differently from their desktop versions -- I'm fairly certain that on mobile, Chrome resizes your element to fit the camera's dimensions, whereas on desktop, everything stays exactly the dimensions you specify and the camera gets clipped. At least, there's something wildly different about how it operates on desktop vs mobile, as my app gets very different views between my desktop and my cell phone when I'm trying to tweak its layout.

I had to have pulled that from some docs or example somewhere. I don't recall where it came from though

I'll have to search Christoph's repo as well, maybe even do a Google/GitHub search across the whole site... this is an unfortunate change, in that I don't think it's exactly a bug. :|

Thanks for the response!

@lotusorrose

I was just reviewing my thoughts on the matter, and any relevant code, and it occurred to me that I was thinking incorrectly yesterday -- 1 isn't a totally bonkers aspect ratio, but anything higher than 2 really is, as aspect ratio is normally going to be a float of (width / height). So, a sane range would be min: 1, max: 2 . This is also what is used in the current example/live_w_locator .

Could someone try with aspectRatio set to min: 1 max: 2 and see if this also solves the issue? I'm not sure where people picked up max: 100 setting, I don't see it in any of the examples that are in the current source.

I tried both removing the line and changing it; both seem to work.

BTW, this aspectRatio comes from https://serratus.github.io/quaggaJS/examples/live_w_locator.js

@ericblade
Owner

Thanks for finding that. I'll submit a request to that repo, though I don't know whether it'll update the live sample -- I don't know if that's autobuilt somehow. We might never be rid of this problem :-D

@ericblade
Owner

I'm going to assume that we've fixed this, and no changes are necessary in the library itself?
