
Support <source>s in <video>s (use videoEl.src in a-assets and calculateVideoCacheHash) #3173

Closed
cvan opened this issue Oct 17, 2017 · 25 comments


@cvan
Contributor

cvan commented Oct 17, 2017

Description:

The calculateVideoCacheHash helper function doesn't take into account videos with <source>s (instead of a defined src attribute on the parent <video> element).

Example:

        <video id="video"
               autoplay loop crossorigin="anonymous">
             <source src="https://ucarecdn.com/fadab25d-0b3a-45f7-8ef5-85318e92a261/"></source>
        </video>

vs.

        <video id="video" src="https://ucarecdn.com/fadab25d-0b3a-45f7-8ef5-85318e92a261/"
               autoplay loop crossorigin="anonymous">
             <source ></source>
        </video>

/cc @machenmusik


@cvan
Contributor Author

cvan commented Oct 17, 2017

Digging a bit deeper, I found that AFRAME.systems.material.prototype.loadTexture also checks only the 'src' attribute of a <video> element, logging a console warning when it's absent.

I assume this can be fixed by changing the calculateVideoCacheHash method to use <video>.src instead of <video>.getAttribute('src'), and then checking for src.src here before throwing a warning.
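For illustration only, a minimal sketch of the kind of check this implies - getVideoSrc is a hypothetical helper, and the actual calculateVideoCacheHash / loadTexture code differs in the details:

// Hypothetical helper, not A-Frame's actual code: prefer the element's
// resolved URL so a <video> using <source> children still yields a value.
function getVideoSrc (videoEl) {
  // videoEl.src reflects the src attribute; currentSrc reflects whichever
  // <source> child the browser has selected once loading begins.
  return videoEl.src || videoEl.currentSrc || '';
}

// e.g. in loadTexture, only warn when neither form of src is present:
// if (!getVideoSrc(videoEl)) { /* warn about the missing source */ }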

There are two cases I can think of where the wrong video could be used from a cache hit.

Obviously, the following goes against loading best practices, but a developer could inadvertently do this when preloading <video>s without ids defined:

<a-assets>
  <video><source src="video1.mp4"></source></video>
  <video><source src="video2.mp4"></source></video>
</a-assets>
<a-sky src="video1.mp4"></a-sky>

The other case, also uncommon: a developer could define more than one <video> element in the scene's <a-assets> that mistakenly share the same id attribute value:

<a-assets>
  <video id="video">
    <source src="video1.mp4"></source>
  </video>
  <video id="video">
    <source src="video2.mp4"></source>
  </video>
</a-assets>
<a-sky src="#video"></a-sky>

@cvan changed the title from "Have calculateVideoCacheHash use videoEl.src to support <source>s" to "Support <source>s in <video>s (use videoEl.src in a-assets and calculateVideoCacheHash)" on Oct 17, 2017
@cvan
Contributor Author

cvan commented Oct 17, 2017

On a related note (though it could be handled in a separate issue), we should also change this line in mediaElementLoaded:

  if (!el.hasAttribute('autoplay') && el.getAttribute('preload') !== 'auto') {

to this:

  if (!el.autoplay && el.preload !== 'auto') {

@ngokevin
Member

I wouldn't support source; I'd prescribe using a format supported by the modern VR browsers.

@ngokevin
Member

Given how many areas we would have to touch, I don't think it's worth the maintenance cost.

@cvan
Contributor Author

cvan commented Oct 17, 2017

This is not some legacy way of defining videos. This is the best practice for modern browsers. I didn't file this as some “backwards compatibility” issue - not at all. There is real value in supporting <source>. Serving different codecs is not just an edge case. /cc @machenmusik @donmccurdy

@cvan reopened this Oct 17, 2017
@ngokevin
Member

There's a tradeoff between what's ideal and what's practical to maintain. "Best practices" become different when it comes to WebVR. Is there a large, common use case for needing different codecs in VR? I don't think it's difficult to serve a single format supported by the VR-capable browsers, versus converting a video to multiple formats. If we support this, any time A-Frame code touches video, we then need to consider not breaking this.

We can continue to discuss, but let's leave this closed unless the project maintainers decide otherwise. It's more a matter of weighing priorities and maintenance cost for the project against what's ideal in a vacuum.

@dmarcos
Member

dmarcos commented Oct 17, 2017

Can we find examples of WebVR needing / using different codecs?

@dmarcos
Member

dmarcos commented Oct 17, 2017

I would defer this until we find specific use cases.

@ngokevin
Member

ngokevin commented Oct 17, 2017

Just thought of this: it seems possible to handle in a component / external code as well. Things that are possible without A-Frame intervention are generally a lower priority. For anyone running across this, something like...

AFRAME.registerComponent('multivideo', {
  schema: {
    urls: {type: 'array'}
  },

  init: function () {
    var v = document.createElement('video');
    var urls = this.data.urls;
    v.crossOrigin = 'anonymous';
    // 1. Loop over the URLs.
    for (var i = 0; i < urls.length; i++) {
      // 2. Check if the video can play the type with `v.canPlayType`
      //    (MIME type naively inferred from the file extension).
      var type = 'video/' + urls[i].split('?')[0].split('.').pop();
      // 3. Keep looping until a video is playable.
      if (!v.canPlayType(type)) { continue; }
      // 4. Set the material src to that video.
      v.src = urls[i];
      this.el.setAttribute('material', 'src', v);
      break;
    }
  }
});

// Alternatively, do this as a content script to get preloading working.
// Even possible to register a custom element for `<a-assets>` to handle this automatically.

@cvan
Contributor Author

cvan commented Oct 17, 2017

An example today: A-Blast and Puzzle Rain require OGG audio files in the Oculus Browser, Samsung Internet, and Chromium, while MP3 files are used in other browsers. This is currently hardcoded in JavaScript, causing the incorrect assets to all be loaded initially in <a-assets> and then the correct ones to be fetched from JS calls within the application logic.

Another use case is streaming video. Right now, @machenmusik has a PR for HLS. Typically you'd define one <source> for HLS, one for MPEG-DASH, and fall back to H.264 (MP4). Browsers already handle this feature detection based on the MIME type and codec passed in <source type>.
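For reference, the markup pattern being described looks roughly like this (file names and MIME types are illustrative; native playback of HLS and MPEG-DASH via <source> still varies by browser):

<video id="stream" autoplay loop crossorigin="anonymous">
  <source src="stream.m3u8" type="application/vnd.apple.mpegurl">
  <source src="stream.mpd" type="application/dash+xml">
  <source src="fallback.mp4" type="video/mp4">
</video>

The browser walks the list in order and picks the first <source> whose type it thinks it can play.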

@ngokevin
Member

I can't find uses of MP3 in A-Blast or Puzzle Rain (which is not A-Frame, so there's no a-assets). They use OGG. Either way, it can be worked around. Similarly, streaming video should be handled outside A-Frame, IMO.

@machenmusik
Contributor

So can you guys clarify whether video playback actually doesn't work in these scenarios, or whether it's just preload and/or caching?

@cvan
Contributor Author

cvan commented Oct 18, 2017

Because of how <a-assets> and the material src loading work, you can't do this:

<video id="video" crossorigin="anonymous">
  <source src="video.mp4"></source>
</video>

No matter where you put it on the page, you won't be able to have A-Frame use a <video> with a <source> specified. It just doesn't load, because of the aforementioned places that check explicitly for <video>.getAttribute('src').

Any third-party component you write won't fix this without a bunch of monkeypatching. And even then, I'm not sure it'll work because of the loading order of the systems and components.

The caching and preloading are additional issues that also prevent loading and can produce cache hits for the wrong <video> elements.

Sorry if that wasn't clear. Thanks for asking.

@machenmusik
Contributor

I do understand the viewpoint of @dmarcos and @ngokevin: if it can be done without explicit core support, it is useful to let external components provide the functionality unless/until there is a clear candidate to integrate into core.

My understanding, correct me if I'm wrong, is that those playing video on the web are often using something like video.js or other player frameworks. So maybe the use case to consider is, can A-Frame support the video element usage of the most common player frameworks? If not, then there are presumably gaps worth addressing in core.

@ngokevin
Member

I think it's simpler to prescribe not using a video player framework; it adds a lot more uncertainty. It's entirely possible to build a component or code that selects and sets a suitable video out of a list of video URLs. Not using <source> specifically, but that's OK.

@cvan
Contributor Author

cvan commented Oct 18, 2017

Not using <source> specifically, but that's OK.

Why create a new DSL and deviate from how media sources are loaded in the <video> (and <audio>) elements natively supported in all browsers?

In my earlier comments, I found the spots that need adjustment, and there is a clear solution of just searching, replacing, and adding tests.

What's unreasonable about this request?

How are developers going to know that this doesn't work? Responding on Stack Overflow/Slack with a link to a different custom component and interface they have to learn (with its own caveats that don't match 1:1 with browsers' native <video>)? Pointing developers to this now-closed issue? Adding a new item to the FAQ? How sustainable is that approach?

@machenmusik
Contributor

I think it's simpler to prescribe not using a video player framework; it adds a lot more uncertainty.

IMO, that's a little like prescribing not using frameworks atop WebGL. Simpler, but perhaps less useful.

@ngokevin
Member

ngokevin commented Oct 18, 2017

A-Frame doesn't carry everything over from the 2D Web. As mentioned, there is long-term maintenance cost in code that is already complex. It's also awkward since it won't support the inline URL style, which is why a third-party component is preferred. It's not unreasonable, but neither are the maintenance concerns. Any third-party component that interfaces with the video element will then need to account for <source>. Documentation and online resources will continue to guide developers.

IMO, that's a little like prescribing not using frameworks atop WebGL. Simpler, but perhaps less useful.

"why frameworks at all?" sort of becomes reductio ad absurdium. The point of the simplicity was of using one framework versus two different frameworks.

@machenmusik
Contributor

Yes, my point is that useful frameworks distinguish themselves over time, and to stipulate that you can't use them makes the overall solution less useful. I think we are all hoping that any of the WebVR frameworks proves as useful as THREE has for WebGL, or as others have for video.

@machenmusik
Contributor

machenmusik commented Oct 18, 2017

The basis for video support in A-Frame is (a) the THREE standard and flat shaders accepting a src texture, and (b) helper code that lets one specify the media element and have the combination of A-Frame and THREE take care of the rest.

Supporting source elements requires no work for (a), AFAIK, and some very specific but clear changes @cvan identified for (b) - several of which are actually cleaner, like using properties directly rather than going through getAttribute.

Is there a specific objection? I think most of the complexity around video comes from (1) user gesture requirements on mobile, which A-Frame can't solve; (2) unsupported codecs and formats, which the source element was specifically designed to mitigate; and (3) browser quirks, especially Safari and iOS, which we are considering dropping.

Why not support the intended solution for (2) as @cvan asks? The browsers will already have done the heavy lifting.

@ngokevin
Member

Ah, well if it's just swapping getAttribute calls to .src calls, I think it's fine. Is that all we need? If so, I overestimated the effort given the amount of content in the proposal.

@machenmusik
Contributor

From
https://github.com/aframevr/aframe/blob/master/src/core/propertyTypes.js#L87

// Pass through media elements. If we have the elements, we don't have to call
// three.js loaders which would re-request the assets.

Maybe we should just take the media elements, which the browsers cache as they see fit or as instructed (and thus the three.js loaders never re-request). Inline video URLs would then need to be specified as video elements, so it becomes simple to answer the question of how to control the video. Widespread usage of the inline forms of a-video and a-videosphere may be the root of many FAQs on video.
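To make that concrete, here's a hedged example of the element-based form being suggested (assuming <source> support lands; file names are illustrative). The <video> lives in <a-assets> and is referenced by selector, so the browser owns source selection, buffering, and caching:

<a-assets>
  <video id="vid" autoplay loop crossorigin="anonymous">
    <source src="video.webm" type="video/webm">
    <source src="video.mp4" type="video/mp4">
  </video>
</a-assets>
<a-videosphere src="#vid"></a-videosphere>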

@machenmusik
Contributor

@cvan can you confirm it's just cleaning up getAttribute / hasAttribute usage?

@ngokevin I share concerns over complexity for caching etc., and also over unwitting preloading of giant video files that exacerbates memory constraints. Do you think we should take specific steps to avoid that?

@machenmusik
Contributor

Let's move from abstract to concrete discussion on #3176. It's only slightly more than getAttribute / hasAttribute - have a look.

ngokevin pushed a commit that referenced this issue Oct 19, 2017
* modify video to use source element instead of src attribute

* add test for video with source element instead of src attribute

* modify to use anonymous video with source element instead of with id and src attributes

* support video with source element(s) rather than src attribute

* fix pre-existing preload usage per discussion on PR

* editorial changes per discussion on PR

* add warning when audio/video asset has neither src nor srcObject

* remove querySelector per discussion on PR; use childElementCount as a weak check for child source elements
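For anyone skimming the last bullet, a rough illustration of what such a "weak check" could look like - hasMediaSource is hypothetical, not the PR's actual code:

// Hypothetical illustration only: treat a media element as having a source
// if it has a src, a srcObject, or any child elements (e.g. <source>).
function hasMediaSource (mediaEl) {
  return !!(mediaEl.src || mediaEl.srcObject || mediaEl.childElementCount > 0);
}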