MediaElement and SyntheticMediaElement #8129
Comments
Tagging Remotion @JonnyBurger @Iamshankhadeep and GSAP @jackdoyle @PeterDaveHello
Thanks for raising this! I think this might benefit from going through https://wicg.io/ or equivalent to get some help with fleshing out the proposal a bit more. cc @whatwg/media
I'm interested in this idea also; I'd like to sync CSS & Lottie animations with an audio/video file. Currently exploring using VTT as a unified timed RPC listing to trigger commands. Folks at Mux, Inc. are taking a different approach to a similar problem; they are abstracting …
Hi, thanks for the ping @tomByrer. I don't totally understand what the result of the proposal would be, but it does sound related to what we're doing with media-chrome, which is a set of media UI elements that can work with any HTML element (native or custom) that exposes the same API as the native media elements. We have a growing list of custom media elements, including wrappers for the YouTube player and HLS.js. Many of them simply extend a custom-video-element class we built, while others start from scratch. Also, under video-dev/media-ui-extensions we have early proposals for extensions of the media element API for common needs like quality rendition switching and ad UIs. I gave a related talk at Demuxed. Happy to chat more if there's interest.
@heff Definitely related! One concrete difference is that … The discussion at muxinc/media-chrome#182 gets more to the heart of the difference, particularly this comment:
In my use case, I am making "videos" out of DOM manipulation (example), and I need a `MediaElement` that does not correspond to an actual media file. You could also use this to control multiple `<audio>`/`<video>` elements at once.

Basically, it's a pattern for general-purpose imperative animation. Like the Web Animations API, it isn't inherently tied to a scrubber bar interface (although in most applications it will be). Unlike the Web Animations API, which can only animate CSS properties, this can be used to e.g. sync up a THREE.js scene to a scrubber bar.
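A minimal sketch of that last point, assuming the usual three.js setup. Here `playback` is an ordinary `<audio>` element, but any object exposing the same `currentTime`/`timeupdate` API would work the same way:

```ts
import * as THREE from "three";

// Scene contents are placeholders; the point is that rendering is a pure
// function of playback.currentTime, so seeking and rate changes "just work".
const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(75, 16 / 9, 0.1, 1000);
camera.position.z = 5;

const cube = new THREE.Mesh(
  new THREE.BoxGeometry(),
  new THREE.MeshBasicMaterial({ color: 0x44aa88 })
);
scene.add(cube);

const renderer = new THREE.WebGLRenderer();
document.body.appendChild(renderer.domElement);

// Stand-in for whatever drives the timeline (native or synthetic).
const playback = document.querySelector("audio")!;

playback.addEventListener("timeupdate", () => {
  cube.rotation.y = playback.currentTime * Math.PI; // half a turn per second
  renderer.render(scene, camera);
});
```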
I think it makes sense to focus on an interface for just the media state/control API. Video can get more complicated with element attributes and child nodes (track, source). I see how it can make sense to break that out of the requirement of being an element so it could work in other contexts like Node. On that note, is it not a Media "Element" interface then?
Yeah, the logic was …
Yeah, maybe one of those. Probably not worth worrying about naming until this gets a little further.
Introduction
The `<video>` element was one of the most revolutionary new features of HTML5, and a key part of "Web 2.0". Today, the web platform has become powerful enough to create videos. Libraries such as GSAP, Liqvid, and Remotion allow developers to create seekable animations and even full-length videos using JavaScript, just as we did with Flash in the days of yore. (Disclaimer: I am the author of Liqvid.) Such "videos" are just DOM manipulation synced up to an audio track (which is still a normal audio file) and a scrubber bar; in particular, they can be interactive, which is impossible with `.mp4` videos.

This proposal standardizes the behavior which is common to GSAP's `Timeline`, Liqvid's `Playback`, Remotion's `PlayerRef`, and other libraries. It defines one new interface, `MediaElement`, and one new class, `SyntheticMediaElement`. The desiderata are:

1. the existing `HTMLMediaElement` interface must implement `MediaElement`
2. `SyntheticMediaElement` must implement `MediaElement`
3. the initial specification of `SyntheticMediaElement` should do no more than implement `MediaElement`, in order to make the proposal easy to adopt.

In other words, `SyntheticMediaElement` implements a subset of the functionality of `<audio>`/`<video>` elements. It has a current time, a playback rate, and a duration, and can be played, paused, and seeked. The choice of which properties/events to include is dictated by experience: the Liqvid plugin suite is compatible with all three of GSAP/Liqvid/Remotion, and this proposal is a less-kludgy version of the `@lqv/playback` interface that those plugins are built around.

Details
The `MediaElement` interface includes the following properties of `HTMLMediaElement`:

- `currentTime`
- `duration`
- `muted`
- `pause()`
- `paused`
- `play()`
- `playbackRate`
- `seeking`
- `volume`
It also supports `addEventListener` and `removeEventListener` with the following event types:

- `durationchange`
- `ended`
- `pause`
- `play`
- `playing`
- `ratechange`
- `seeked`
- `seeking`
- `timeupdate`
- `volumechange`
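Putting the two lists together, a rough TypeScript sketch of the interface (the `readonly` modifiers, the `play()` return type, and the narrowed event-name union are assumptions borrowed from `HTMLMediaElement`, not part of the proposal text):

```ts
type MediaElementEventType =
  | "durationchange" | "ended" | "pause" | "play" | "playing"
  | "ratechange" | "seeked" | "seeking" | "timeupdate" | "volumechange";

interface MediaElement {
  currentTime: number;
  readonly duration: number;
  muted: boolean;
  readonly paused: boolean;
  playbackRate: number;
  readonly seeking: boolean;
  volume: number;

  play(): Promise<void>;
  pause(): void;

  addEventListener(
    type: MediaElementEventType,
    listener: (event: Event) => void
  ): void;
  removeEventListener(
    type: MediaElementEventType,
    listener: (event: Event) => void
  ): void;
}
```

With this shape, an `HTMLMediaElement` already satisfies the interface structurally, which is the point of Desideratum 1.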
The `SyntheticMediaElement` class implements `MediaElement`.
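To make that concrete, here is a deliberately incomplete sketch of what such a class could look like, using the `MediaElement` interface sketched above. The `requestAnimationFrame` clock and the choice of `EventTarget` as a base class are illustrative, not part of the proposal:

```ts
// Sketch only: advances a clock while playing and fires a subset of the
// MediaElement events (play, playing, pause, timeupdate, ended).
class SyntheticMediaElement extends EventTarget implements MediaElement {
  currentTime = 0;
  duration: number;
  muted = false;
  paused = true;
  playbackRate = 1;
  seeking = false;
  volume = 1;

  #lastTick = 0;
  #raf = 0;

  constructor(duration: number) {
    super();
    this.duration = duration;
  }

  play(): Promise<void> {
    if (this.paused) {
      this.paused = false;
      this.dispatchEvent(new Event("play"));
      this.#lastTick = performance.now();
      this.#raf = requestAnimationFrame(this.#tick);
      this.dispatchEvent(new Event("playing"));
    }
    return Promise.resolve();
  }

  pause(): void {
    if (!this.paused) {
      this.paused = true;
      cancelAnimationFrame(this.#raf);
      this.dispatchEvent(new Event("pause"));
    }
  }

  // Advance the clock each animation frame and emit timeupdate/ended.
  #tick = (now: number): void => {
    this.currentTime = Math.min(
      this.duration,
      this.currentTime + ((now - this.#lastTick) / 1000) * this.playbackRate
    );
    this.#lastTick = now;
    this.dispatchEvent(new Event("timeupdate"));
    if (this.currentTime >= this.duration) {
      this.paused = true;
      this.dispatchEvent(new Event("ended"));
    } else {
      this.#raf = requestAnimationFrame(this.#tick);
    }
  };
}
```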
Polyfill
Polyfill: mjs, types, source.
This polyfill is based on Liqvid's `Playback` class. However, due to design errors, that class does not currently implement `MediaElement` as defined above (it measures `currentTime` in milliseconds rather than seconds, and some of the event names are different).
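For instance, the millisecond mismatch could be papered over with a thin adapter while the polyfill is updated. This is a hypothetical sketch, not part of the polyfill, and it does not handle the differing event names:

```ts
// Wrap an object whose currentTime is in milliseconds so that reads and
// writes go through in seconds, as MediaElement requires.
function secondsView<T extends { currentTime: number }>(playback: T): T {
  return new Proxy(playback, {
    get(target, prop, receiver) {
      if (prop === "currentTime") return target.currentTime / 1000;
      return Reflect.get(target, prop, receiver);
    },
    set(target, prop, value, receiver) {
      if (prop === "currentTime") {
        target.currentTime = (value as number) * 1000;
        return true;
      }
      return Reflect.set(target, prop, value, receiver);
    },
  });
}
```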
Enhancements
All three reference libraries implement additional functionality beyond the `MediaElement` interface defined above. We have not included these in the proposal, since they violate Desiderata 1 and/or 3. However, they are useful to keep in mind.

- GSAP and Liqvid support giving string names to specific times or intervals.
- GSAP and Remotion allow nesting of `Timeline`s/`Sequence`s. Relatedly, Liqvid and Remotion allow ordinary `<audio>`/`<video>` elements to be controlled by the "synthetic" playback. In the future, both of these could be implemented by some sort of `adopt()` method on `SyntheticMediaElement` (a sketch of the manual wiring such a method would replace follows this list).
- Liqvid allows a `Playback` to control an `AnimationTimeline`. In the future, this could be added to `SyntheticMediaElement`.
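As a rough illustration of the second item, this is approximately the wiring that a hypothetical `adopt()` would subsume. The `syncVideo` helper is invented for illustration; drift correction, seeking state, and teardown are all omitted:

```ts
// Keep an ordinary <video> following a MediaElement (as sketched in the
// Details section) by mirroring its state changes.
function syncVideo(playback: MediaElement, video: HTMLVideoElement): void {
  playback.addEventListener("play", () => void video.play());
  playback.addEventListener("pause", () => video.pause());
  playback.addEventListener("ratechange", () => {
    video.playbackRate = playback.playbackRate;
  });
  playback.addEventListener("seeked", () => {
    video.currentTime = playback.currentTime;
  });
  playback.addEventListener("volumechange", () => {
    video.volume = playback.volume;
    video.muted = playback.muted;
  });
}
```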