
[Feature]: Support of a Timing Object for Synchronized Multi-Angle Video Playback #3678

Open
coofzilla opened this issue Apr 15, 2024 · 5 comments

@coofzilla
Contributor

coofzilla commented Apr 15, 2024

Description

Implement a feature in react-native-video to allow multiple video instances to be controlled by a single set of controls, ensuring synchronized playback across all instances.

Why is it needed?

There is a growing demand for multi-angle video content where users can view the same scene from different perspectives in real-time. Content creators and app developers need a seamless way to synchronize multiple video streams to provide an immersive viewing experience. This feature is crucial for applications in sports, events, and security, where multi-perspective playback adds value to the user experience.

Possible implementation

Integrate a timing object model, similar to the one described at WebTiming, which will allow the react-native-video component to synchronize with a master timing source. This will enable a VStack (or similar layout structure) to house multiple react-native-video instances that can be controlled by a unified timing mechanism.

Code sample

import React from 'react';
// Hypothetical components for illustration; TimingControl is the proposed shared timing source.
import { VStack, VideoPlayer, TimingControl } from './components';

const videos = [
  { uri: 'video1.mp4' },
  { uri: 'video2.mp4' },
  { uri: 'video3.mp4' }
];

export default function SynchronizedVideos() {
  const timingControl = new TimingControl(); // Handles timing across all video players

  // These would be wired to the app's transport controls:
  const playAll = () => timingControl.play();
  const pauseAll = () => timingControl.pause();
  const seekAll = (time) => timingControl.seek(time);

  return (
    <VStack>
      {videos.map((video, index) => (
        <VideoPlayer
          key={index}
          source={video}
          timingControl={timingControl} // Bind to the shared timing control
        />
      ))}
    </VStack>
  );
}
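For context, a minimal timing object in the spirit of the W3C Timing Object draft could look like the sketch below. The class and its injectable clock parameter are illustrative, not part of react-native-video: the key idea is that every consumer extrapolates the current position from a shared (position, velocity, timestamp) vector instead of polling a master player.

```javascript
// Minimal sketch of a timing object (illustrative names, not a real API).
class TimingObject {
  constructor(clock = () => Date.now() / 1000) {
    this.clock = clock; // seconds; injectable so tests can use a fake clock
    this.vector = { position: 0, velocity: 0, timestamp: this.clock() };
  }
  // Current position is extrapolated from the last known vector, so every
  // consumer computes the same timeline value without asking a master player.
  query() {
    const now = this.clock();
    const { position, velocity, timestamp } = this.vector;
    return { position: position + velocity * (now - timestamp), velocity };
  }
  update({ position, velocity }) {
    const q = this.query();
    this.vector = {
      position: position !== undefined ? position : q.position,
      velocity: velocity !== undefined ? velocity : q.velocity,
      timestamp: this.clock(),
    };
  }
  play()  { this.update({ velocity: 1 }); }
  pause() { this.update({ velocity: 0 }); }
  seek(position) { this.update({ position }); }
}
```

Each player would then steer itself toward `query().position` rather than being commanded individually, which is what removes the fan-out latency between players.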

Example: [screenshot]

@freeboub
Collaborator

freeboub commented Apr 22, 2024

Hello, the idea is well understood on my side. I do have a few questions, though (please, let's focus on Android first, I know this platform better):
I expect you already start playing multiple live streams at the same time.
Q1: In that case (the initial case), is the synchronization OK?
See the following doc for the multiple timelines available on Android: https://developer.android.com/media/media3/exoplayer/live-streaming
Q2: If it is not correct, did you try to explicitly change the buffer config?

Q3 / proposal: Regarding the API (thank you for the suggestion), can I suggest something like this?

import React, { useRef } from 'react';
import { SynchronizedVideo, Video } from './components';

export default function MyComponent({ videos }) {
  const timingControl = useRef(null); // Handles timing across video players

  const playAll = () => timingControl.current?.play();
  const pauseAll = () => timingControl.current?.pause();
  const seekAll = (time) => timingControl.current?.seek(time);

  return (
    <SynchronizedVideo ref={timingControl}>
      {videos.map((video, index) => (
        <Video
          key={index}
          source={video}
        />
      ))}
    </SynchronizedVideo>
  );
}

The new SynchronizedVideo component would then dispatch actions to all child Video components.
This is a proposal; I am not sure of the technical feasibility.
The idea is to be able to easily replace <Video with <SynchronizedVideo without any big rework on the app side.
We could also consider moving / duplicating all <Video props onto <SynchronizedVideo, as it doesn't make sense to have different configurations. (Thinking again: not all props, since source has to differ per player, TBC.)
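The dispatch described above could look something like this internally. This is only a sketch: PlayerDispatcher is a hypothetical helper, with resume/pause/seek chosen to mirror react-native-video's imperative method names.

```javascript
// Hypothetical fan-out logic a SynchronizedVideo container could expose
// through its ref: every command is broadcast to all child player refs.
class PlayerDispatcher {
  constructor() { this.refs = []; }
  register(ref) { this.refs.push(ref); }
  // Fan a single command out to every registered, still-mounted player.
  dispatch(method, ...args) {
    this.refs.forEach(ref => {
      const player = ref.current;
      if (player && typeof player[method] === "function") player[method](...args);
    });
  }
  play()  { this.dispatch("resume"); }
  pause() { this.dispatch("pause"); }
  seek(time) { this.dispatch("seek", time); }
}
```

Note that this alone does not solve the drift problem: the fan-out loop still issues the commands sequentially, so players start a few milliseconds apart.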

Q4: Do you have any other requirements on DRM, playback window (I think you have a live window, as you are talking about seek), or additional input you can share?

Q5: If you find some docs on how to synchronize multiple videos with ExoPlayer, I am interested to review them (I did not find any for now).

@coofzilla
Contributor Author

hello there! Just wanted to give a quick response to let you know that I'm working up a response to these questions. I have to discuss with my team and then I'll post back here 🥳 Thank you for the feedback!

@coofzilla
Contributor Author

coofzilla commented Apr 30, 2024

Alrighty, spoke to my team and got some answers :)

Q1: Synchronization in Initial Tests

We have attempted to synchronize multiple video instances (both live and VOD), but we encounter slight discrepancies in start times even when we loop through each reference and command ref.current.resume(). These discrepancies, albeit small, are noticeable and increase with the number of players. This has been a persistent issue regardless of the content type.

Q2: Buffer Configuration

I have experimented with different buffer configurations; however, even optimized settings do not fully eliminate the millisecond delays between player synchronizations. If there are methods to execute playback commands simultaneously across all instances at precisely the same time, I would be interested in exploring them.
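For reference, react-native-video exposes Android buffer tuning through the bufferConfig prop. Below is a sketch of the kind of configuration that can be experimented with; the values are illustrative, not recommendations:

```javascript
// Illustrative buffer tuning (Android / ExoPlayer path). Lowering
// bufferForPlaybackMs gets players starting sooner, but in practice it does
// not by itself remove millisecond-level start offsets between players.
<Video
  source={{ uri: 'video1.m3u8' }}
  bufferConfig={{
    minBufferMs: 15000,
    maxBufferMs: 30000,
    bufferForPlaybackMs: 500,
    bufferForPlaybackAfterRebufferMs: 1000,
  }}
/>
```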

Q3: API Proposal

The API proposal you've suggested with SynchronizedVideo sounds promising as it offers a flexible and intuitive way to handle multiple video instances with a single reference, facilitating various layouts.

Q4: Focus Shift and Requirements

We've decided to initially focus on VOD synchronization. Our VOD content will be delivered via HLS. Concerning video duration synchronization, leveraging Unix epoch times for alignment seems viable as suggested by the ExoPlayer documentation.

For us, we'll ensure on the backend that the duration of each video source is the same; so I'd like to hear your thoughts on how we could handle implementers whose source durations do not match, or where there are discrepancies in the source timelines.
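One simple policy for the mismatched-duration case (purely a sketch; sharedWindow and clampSeek are hypothetical helpers) would be to treat the shortest source as the shared timeline and clamp every seek into that window, so no player is ever asked to seek past its own end:

```javascript
// The shared timeline is bounded by the shortest source.
function sharedWindow(durations) {
  if (durations.length === 0) return 0;
  return Math.min(...durations);
}

// Clamp a requested seek target into the shared window.
function clampSeek(time, durations) {
  return Math.max(0, Math.min(time, sharedWindow(durations)));
}
```

Other policies are possible too, e.g. aligning sources on epoch timestamps as the ExoPlayer live-streaming docs suggest; this is just the least surprising default.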

Q5: Documentation

I've found some things, but they are for iOS.

https://developer.apple.com/documentation/avfoundation/sample_buffer_playback

This seems more concerned with syncing media with animations, but I figured I'd share it anyway: https://developer.apple.com/documentation/avfoundation/avsynchronizedlayer

Additional Input and Collaboration

I have some experimental code that might be useful as we develop this feature. I am open to sharing this and discussing further to refine the implementation. Please let me know how you would prefer to proceed if that interests you.

Looking forward to your thoughts and any further suggestions you might have 🧑‍🔧

@coofzilla
Contributor Author

    playerRefs.current.forEach(ref => {
      if (ref.current) {
        isPlaying ? ref.current.pause() : ref.current.resume();
      }
    });

This kind of shows the core issue: even if they are the exact same video with the exact same duration, playing them in a loop like this leaves them offset from one another.

@coofzilla
Contributor Author

Started messing around with a synchronization function that I'm calling in onProgress, and it's a bit tighter:

  const synchronizePlayers = (currentTimes: { [key: string]: number }) => {
    const maxTime = Math.max(...Object.values(currentTimes));
    const minTime = Math.min(...Object.values(currentTimes));

    if (maxTime - minTime > TOLERANCE) {
      playerRefs.current.forEach(ref => {
        if (ref.current) {
          ref.current.seek(maxTime);
        }
      });
    }
    setCurrentTime(maxTime);
  };

The main problem with this is the visual stutter when synchronizing the players.
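One possible way around the stutter (an untested sketch) is to nudge a drifting player's playback rate instead of seeking it, so it catches up or falls back gradually with no visible jump. Here drift is playerTime minus the reference time, and the returned value would feed react-native-video's rate prop; the tolerance and nudge sizes are illustrative:

```javascript
// Rate-based drift correction: speed a lagging player up slightly and slow a
// leading one down, instead of issuing a hard seek.
function rateFor(drift, tolerance = 0.05, nudge = 0.05) {
  if (drift > tolerance) return 1 - nudge;   // player is ahead: slow down
  if (drift < -tolerance) return 1 + nudge;  // player is behind: speed up
  return 1;                                  // within tolerance: normal speed
}
```

A hard seek could then be kept only as a fallback for drift too large for rate nudging to close quickly.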
