
Use <video> as display sink via canvas.captureStream #521

Closed · bvibber opened this issue May 27, 2019 · 10 comments

@bvibber
Owner

bvibber commented May 27, 2019

In addition to MSE-based experiments, I'm experimenting with a WebRTC-style MediaStream as a medium for getting video frames and audio samples into a real native <video> element. This uses the <canvas>'s captureStream() method to obtain a suitable MediaStream of frames from the canvas's backing buffer, which is piped into the video's srcObject.
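A minimal sketch of that pipeline, assuming `canvas` and `video` are the relevant elements (illustrative, not ogv.js's actual code):

```js
// Capture the canvas at 0 fps so frames are pushed only when we ask for them,
// then attach the resulting live stream to a native <video> element.
const stream = canvas.captureStream(0); // 0 = no automatic frame capture
video.srcObject = stream;
video.play();
```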

This appears as a live stream to the <video>, so ogv.js is still responsible for A/V sync etc; when drawing a frame we call requestFrame() on the stream (or track, depending on browser) to push the current frame onto the video stream, so it doesn't have to sample at a fixed rate.
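Continuing the sketch above, the per-frame push might look like this; the fallback covers browsers that expose requestFrame() on the stream rather than the track:

```js
// Push exactly one frame into the capture stream, at ogv.js's own pace.
function pushFrame() {
  const [track] = stream.getVideoTracks();
  if (typeof track.requestFrame === 'function') {
    track.requestFrame();   // per spec: CanvasCaptureMediaStreamTrack
  } else if (typeof stream.requestFrame === 'function') {
    stream.requestFrame();  // older Firefox put it on the stream itself
  }
}
```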

Unfortunately, while this mirrors cleanly and without visible delay, the browser's default controls reflect that it's a live stream, so they won't show the expected seek bar, duration/current time, etc. However, it seems possible to monkeypatch the HTMLVideoElement object to pass through to ogv.js's internals, which should allow custom controls to work even if the <video> is passed directly as the primary element (instead of sitting inside the <ogvjs> wrapper). Need to decide on the best way to do this.
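A hedged sketch of the monkeypatching idea, reusing `video` from the earlier sketch; `player`, `getDuration()`, `getTime()`, and `seekTo()` are hypothetical stand-ins for ogv.js internals, not confirmed APIs:

```js
// Own properties defined on the element instance shadow the
// HTMLMediaElement prototype accessors, so controls reading
// video.duration / video.currentTime get ogv.js's state instead.
Object.defineProperties(video, {
  duration: {
    configurable: true,
    get: () => player.getDuration(),      // hypothetical internal
  },
  currentTime: {
    configurable: true,
    get: () => player.getTime(),          // hypothetical internal
    set: (t) => { player.seekTo(t); },    // hypothetical internal
  },
});
```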

For audio, it should be possible to combine this with an audio track obtained from the AudioContext, creating a stream with both video and audio tracks. Audio is untested so far.
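A later comment confirms a MediaStreamAudioDestinationNode works for this; a sketch, assuming `audioContext` is the Web Audio context ogv.js plays through and `outputNode` is its final node (both names illustrative):

```js
// Route the audio graph into a stream destination, then merge its track
// into the canvas capture stream from the earlier sketch.
const audioDestination = audioContext.createMediaStreamDestination();
outputNode.connect(audioDestination);

const [audioTrack] = audioDestination.stream.getAudioTracks();
stream.addTrack(audioTrack);
```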

Note that all this works in:

  • Firefox
  • Chrome
  • Chromium-based Edge dev build
  • Safari (desktop)
  • Safari (iOS Simulator)

and fails in:

  • Safari (iOS device) -- seems to be an internal bug in WebKit: https://bugs.webkit.org/show_bug.cgi?id=181663 / rdar://problem/51150406
  • pre-Chromium Edge -- no support for captureStream() on <canvas>. Will be obsoleted sooner or later by the Chromium-based Edge, so not a big deal.
@bvibber
Owner Author

bvibber commented May 28, 2019

Merging in an audio track via a MediaStreamAudioDestinationNode seems to work in Firefox, Chrome/dev-Edge, and iOS Safari but doesn't play the audio on desktop Safari.

(So amusingly, currently desktop Safari gets video but no audio, and iOS Safari gets audio but no video. Put two devices together and you're set. ;) )

It's probably another bug in WebKit; I'll make a simpler test case and report upstream.

@bvibber
Owner Author

bvibber commented Jun 6, 2019

Mac sound issue imported to radar as rdar://problem/51290997

@bvibber
Owner Author

bvibber commented Jun 28, 2019

There's a provisional test mode for this; it has to be enabled explicitly. It appears as "wasm <video>" in the demo.

@guest271314

> Unfortunately, while this mirrors cleanly and without visible delay, the browser's default controls reflect that it's a live stream, so they won't show the expected seek bar, duration/current time, etc.

ts-ebml (https://github.com/legokichi/ts-ebml) can be used to set the duration of the entire video, or of time slices of it, given an input Blob: input => encode as Matroska or WebM => set duration => play in ogv.js.
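A sketch of that pattern, following the ts-ebml README (not verified against the current release):

```js
import { Decoder, Reader, tools } from 'ts-ebml';

// Rewrite a recorded WebM/Matroska Blob so its metadata carries a
// duration and seek cues.
async function injectMetadata(blob) {
  const decoder = new Decoder();
  const reader = new Reader();
  reader.logging = false;
  const buffer = await blob.arrayBuffer();
  decoder.decode(buffer).forEach((elm) => reader.read(elm));
  reader.stop();
  const refinedMetadata = tools.makeMetadataSeekable(
    reader.metadatas, reader.duration, reader.cues);
  const body = buffer.slice(reader.metadataSize);
  return new Blob([refinedMetadata, body], { type: blob.type });
}
```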

mkvmerge can also be used to set the duration of Matroska and WebM files.

To an appreciable degree, MediaSource can be used to display (or get) currentTime and to provide seeking functionality: "sequence" mode can be used, which sets currentTime "automatically", as opposed to "segments" mode, which requires managing timestampOffset.
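A minimal sketch of the "sequence" mode setup (codec string illustrative):

```js
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', () => {
  const sourceBuffer = mediaSource.addSourceBuffer('video/webm; codecs="vp8,opus"');
  // "sequence" assigns contiguous timestamps to appended chunks automatically,
  // so currentTime advances without manual timestampOffset bookkeeping.
  sourceBuffer.mode = 'sequence';
  // sourceBuffer.appendBuffer(chunk) for each recorded chunk as it arrives...
});
```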

@guest271314

This should provide a rough draft, in code, of the concept described in this issue: https://next.plnkr.co/edit/vQjbBo

@bvibber
Owner Author

bvibber commented Sep 7, 2019

I'm not sure this would apply; in this case the <video> displays a live MediaStream of the <canvas> and Web Audio output, so the browser never sees any of the WebM metadata (which already includes a duration).

@guest271314

Is this issue trying to solve the case of getting and displaying the current time, and seeking, during a live MediaStream set as srcObject on an HTML <video> element?

If that is the case, one workaround would be to delay the output by N seconds: pipe the input media through MediaRecorder (or another file writer) in N-second chunks to set initial metadata on each file (Chrome/Chromium does not set the duration of WebM files output by MediaRecorder), then stream the chunks (originating from canvas.captureStream() or any other API that produces a MediaStreamTrack) to a <video> element with a MediaSource set as its src, which will provide seekable controls (FWIW see guest271314/MediaFragmentRecorder#8). Since a native <video> element is what this issue describes: as long as the <video> stays at a single resolution it can be recorded in Chrome/Chromium, though currently the tab will crash if variable-resolution frames exist in the track and .captureStream() is called.
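A rough, self-contained sketch of that workaround, assuming `liveStream` is the live MediaStream (e.g. from canvas.captureStream()) and `video` is the playback element:

```js
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', () => {
  const sb = mediaSource.addSourceBuffer('video/webm; codecs="vp8,opus"');
  sb.mode = 'sequence';

  // SourceBuffer rejects appends while updating, so queue the chunks.
  const queue = [];
  sb.addEventListener('updateend', () => {
    if (queue.length) sb.appendBuffer(queue.shift());
  });

  const recorder = new MediaRecorder(liveStream);
  recorder.ondataavailable = async (e) => {
    const buf = await e.data.arrayBuffer();
    if (sb.updating || queue.length) queue.push(buf);
    else sb.appendBuffer(buf);
  };
  recorder.start(5000); // emit a chunk roughly every 5 seconds (the N above)
});
```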

If that is not what this issue is attempting to resolve, kindly advise.

@guest271314

The concept of this code (https://plnkr.co/edit/Axkb8s?p=info) is to create an infinite live MediaStream, initialized with a stream of silence and #000000 frames, where any MediaStreamTrack from <canvas|HTMLMediaElement>.captureStream(), getUserMedia(), getDisplayMedia(), RTCRtpSender, or MediaStreamTrackAudioSourceNode can replace either or both the audio and video tracks. Ultimately, any time slice of the previously played, or to-be-played, MediaStream can be recorded to a discrete WebM or Matroska file with variable resolution encoded into the video, capable of playback (including resizing of the variable-resolution frames) in both Chrome and Firefox, ideally without using any code not shipped with the browser, and using the same code for each browser. It still has issues, mainly due to the requirement of being able to play variable-width and variable-height frames in an HTML <video> element. At one point I created a version using the Web Animations API, which provides a means to use the animation's timeline to play the video forwards or backwards and to adjust the playback rate; the caveat is the need to get all of the images from the video(s) first, as it is not currently possible to dynamically set keyframes of an animation. Unfortunately, I lost the versions of that code.
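A hedged sketch of the placeholder-stream setup (not the plunker's exact code):

```js
// A black canvas track plus a silent audio track form an "infinite" live
// MediaStream whose tracks can later be swapped for real ones.
const canvas = document.createElement('canvas');
canvas.width = 640;
canvas.height = 360;
const ctx = canvas.getContext('2d');
ctx.fillStyle = '#000000';
// Repaint periodically so the capture keeps producing frames.
setInterval(() => ctx.fillRect(0, 0, canvas.width, canvas.height), 100);

const audioContext = new AudioContext();
// A destination node with nothing connected to it yields silence.
const silence = audioContext.createMediaStreamDestination();

const placeholder = new MediaStream([
  ...canvas.captureStream().getVideoTracks(),
  ...silence.stream.getAudioTracks(),
]);
// Swap in real tracks later, e.g.:
// placeholder.removeTrack(placeholder.getVideoTracks()[0]);
// placeholder.addTrack(realVideoTrack);
```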

@bvibber
Owner Author

bvibber commented Jan 26, 2022

This mode ran into enough underlying platform bugs in the browsers that I'm taking it out. ;_;

Was good to experiment with though!

bvibber closed this as completed Jan 26, 2022