# Windows 10 Media Playback for Unity
Media Playback plugin for Unity on Windows 10.
It gives you access to a broad range of media playback features:
- Local files, progressive and Adaptive Streaming (HLS, DASH) playback
- Regular and 360 videos, stereoscopic (3D) and monoscopic
- All formats, codecs and media containers supported by Windows 10
- Ambisonic Audio (see below)
MediaPlaybackUnity is a Unity project covering 3 key scenarios:
- Regular video playback with Adaptive Streaming, MediaPlayback.unity scene
- 360 video playback (stereo/mono), MediaPlayback360.unity scene
- 360 video playback (stereo/mono) on Skybox, Skybox360Stereo.unity scene

There is also a demo scene showing how to play video files from so-called "known folders" on Universal Windows Platform (the Video library in this case). The scene is in the MediaPlaybackUnity\Assets\Scenes\UWPTest folder.
Supported Unity versions:
## How to use
- Download MediaPlaybackDemo release package or open MediaPlaybackUnity project from a cloned repo.
- Look at how MediaPlayback.unity and MediaPlayback360.unity are structured. If you just want to play a video in your scene, use the Playback and MediaPlaybackRunner components.
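The steps above can be sketched in a minimal script. This is an assumption-laden sketch, not the repo's own sample: the `MediaPlayer` namespace, the `Play` method name, and the example URL are all assumptions — check Playback.cs in the project for the exact API.

```csharp
using UnityEngine;
using MediaPlayer; // assumed namespace of the Playback component; verify in Playback.cs

// Attach next to a Playback component whose targetRenderer is set in the Inspector.
public class PlayOnStart : MonoBehaviour
{
    // Hypothetical URL; any local path, progressive URL, or HLS/DASH manifest should work.
    public string videoUri = "https://example.com/stream.m3u8";

    void Start()
    {
        var playback = GetComponent<Playback>();
        // Play() is assumed here; the actual method name/signature may differ.
        playback.Play(videoUri);
    }
}
```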
## How to build
The Unity project already includes all prebuilt plugin binaries. To build the plugin yourself, use Visual Studio 2017 with the Windows Desktop, Universal Windows Platform, and C++ toolsets installed. It also requires the Windows 10 Fall Creators Update SDK or later.
- Open MediaPlayback/MediaPlayback.sln
- Build Desktop and UWP projects
After a successful build, *MediaPlayback\Unity\MediaPlayback* contains all the files Unity requires. The CopyMediaPlaybackDLLsToUnityProject.cmd script copies the plugin binaries into the Unity project's Plugins folder.
## Properties and events
- Renderer targetRenderer - Renderer component of the object the frames will be rendered to. If null (none), the other parameters are ignored - you are expected to handle texture changes in a TextureUpdated event handler.
- string targetRendererTextureName - Texture to update on the Target Renderer (must be the material shader's variable name). If empty and targetRenderer is not null, mainTexture is updated.
- string isStereoShaderParameterName - If material's shader has a variable that handles stereoscopic vs monoscopic video, put its name here (must be a float, 0 - monoscopic, 1 - stereoscopic).
- bool forceStereo - If true, the material's shader will be forced to render frames as stereoscopic (assuming isStereoShaderParameterName is not empty)
- bool forceStationaryXROnPlayback - If true, switches to the XR Stationary tracking mode and resets the rotation when video playback starts. Once playback stops, switches back to RoomScale if that mode was active before playback.
- bool isStereo - true if current video is detected as stereoscopic by its metadata (ST3D box). forceStereo doesn't affect this property
- bool hardware4KDecodingSupported - true if hardware video decoding is supported for resolutions 4K and higher
- uint currentPlaybackTextureWidth / currentPlaybackTextureHeight - current frame resolution
- Texture2D currentVideoTexture - current video texture
- PlaybackState State - current playback state
- TextureUpdated (object sender, Texture2D newVideoTexture, bool isStereoscopic) - video texture has been updated. isStereoscopic is true if either the video is stereoscopic by its metadata, or forceStereo is true
- PlaybackStateChanged (object sender, ChangedEventArgs args) - playback state has been changed
- PlaybackFailed (object sender, long hresult) - playback failed
- SubtitleItemEntered (object sender, string subtitlesTrackId, string textCueId, string language, string textLines) - text subtitle cue entered (must be shown)
- SubtitleItemExited (object sender, string subtitlesTrackId, string textCueId) - text subtitle cue exited (must be hidden)
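The events above can be consumed from a small MonoBehaviour. The sketch below assumes the events are standard C# events subscribed with `+=` and uses the delegate signatures listed above; the manual-texture-assignment body illustrates the case where targetRenderer is null.

```csharp
using UnityEngine;

public class PlaybackEvents : MonoBehaviour
{
    public Playback playback; // assign in the Inspector

    void OnEnable()
    {
        // Assumed: TextureUpdated / PlaybackFailed are C# events on Playback.
        playback.TextureUpdated += OnTextureUpdated;
        playback.PlaybackFailed += OnPlaybackFailed;
    }

    void OnDisable()
    {
        playback.TextureUpdated -= OnTextureUpdated;
        playback.PlaybackFailed -= OnPlaybackFailed;
    }

    void OnTextureUpdated(object sender, Texture2D newVideoTexture, bool isStereoscopic)
    {
        // With targetRenderer set to null, apply the video texture yourself.
        GetComponent<Renderer>().material.mainTexture = newVideoTexture;
    }

    void OnPlaybackFailed(object sender, long hresult)
    {
        Debug.LogErrorFormat("Playback failed, HRESULT: 0x{0:X}", hresult);
    }
}
```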
## Rendering stereoscopic videos
The plugin now detects stereoscopic videos based on their metadata (ST3D box). Once ST3D is detected, MediaPlayer handles the stereoscopic frame, and the plugin renders all frames to the video texture in an over/under layout.
360VideoShader and 360VideoSkyboxShader are based on Unity's SkyboxPanoramicShader. They currently don't support 180-degree videos.
If you want to render to the Skybox, handle the TextureUpdated event on the Playback object. Look at the MediaPlaybackUnity/Assets/MediaPlayback/Scripts/MediaSkybox.cs script. There is a sample scene for Skybox rendering in MediaPlaybackUnity/Assets/Scenes. The video texture is Y-flipped; make sure your shader handles this.
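A Skybox handler along these lines is a reasonable starting point. This is a sketch, not a copy of MediaSkybox.cs: the event subscription style is assumed, and `skyboxMaterial` is expected to use 360VideoSkyboxShader (which should compensate for the Y-flip).

```csharp
using UnityEngine;

public class SkyboxVideo : MonoBehaviour
{
    public Playback playback;       // assign in the Inspector
    public Material skyboxMaterial; // material using 360VideoSkyboxShader

    void OnEnable()  { playback.TextureUpdated += OnTextureUpdated; }
    void OnDisable() { playback.TextureUpdated -= OnTextureUpdated; }

    void OnTextureUpdated(object sender, Texture2D newVideoTexture, bool isStereoscopic)
    {
        // The texture arrives Y-flipped; the skybox shader is expected to compensate.
        skyboxMaterial.mainTexture = newVideoTexture;
        RenderSettings.skybox = skyboxMaterial;
    }
}
```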
When rendering stereoscopic videos, the plugin converts stereoscopic frames to over/under frame layout, so the video texture always comes to the shader as over/under frame.
180-degree videos and single-frame cubemaps usually carry no corresponding metadata, so if you want to support them, you must handle them in your custom shaders based on your own custom metadata.
## Ambisonic Audio
The underlying MediaPlayer API handles Ambisonic Audio only when the SA3D metadata box is present, as per the Spatial Audio RFC. You don't need to set any options or perform any initialization to use Ambisonic Audio. If your video file or stream has an SA3D box, MediaPlayer will handle the Ambisonic audio stream, spatializing and binauralizing it while tracking the headset rotation internally.
When preparing your content for Adaptive Streaming, note that GPAC MP4Box, one of the most popular multimedia packaging tools, currently doesn't preserve the SA3D metadata box, so msft-mahoward made some changes to it. Until his work is merged into the main GPAC repo, please use his fork, or another one with all recent changes from the main GPAC repo and msft-mahoward's work merged in.