
Support for PIP mode in iOS #237

Open
pooja816 opened this issue Mar 9, 2018 · 70 comments

@pooja816

pooja816 commented Mar 9, 2018

Support for PIP mode in iOS

Please add support for PIP mode in iOS

Expected Behavior

When a video call is in progress, I also want to be able to access other features of the app at the same time, e.g. like a WhatsApp video call.

Actual Behavior

When I try to access other features of the app during a call, I have to disconnect the call first.

Video iOS SDK

1.3.4 via CocoaPods

Xcode

9.2

iOS Version

10.0

iOS Device

iPhone 7

@ceaglest
Contributor

ceaglest commented Mar 9, 2018

Hey @pooja816,

As far as I'm aware there is no way to use Picture-in-Picture support with arbitrary video content (such as content drawn by TVIVideoView). If you want PiP-like functionality, you would need to add your own support for it at the application level (some sort of draggable UIView which floats on top of your ViewControllers), and you would not be able to display video after backgrounding the app.
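For example, a very rough sketch of that kind of app-level floating view (plain UIKit, nothing Twilio-specific; you would drop whatever TVIVideoView your app already uses inside it):

import UIKit

// A minimal draggable container that floats above the rest of the UI.
class FloatingVideoView: UIView {
    override init(frame: CGRect) {
        super.init(frame: frame)
        layer.cornerRadius = 8
        clipsToBounds = true
        addGestureRecognizer(UIPanGestureRecognizer(target: self, action: #selector(handlePan(_:))))
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    @objc private func handlePan(_ gesture: UIPanGestureRecognizer) {
        guard let superview = superview else { return }
        let translation = gesture.translation(in: superview)
        center = CGPoint(x: center.x + translation.x, y: center.y + translation.y)
        gesture.setTranslation(.zero, in: superview)
    }
}

// Usage: add it to the key window and put a video view inside it, e.g.
//   let floating = FloatingVideoView(frame: CGRect(x: 16, y: 60, width: 120, height: 160))
//   floating.addSubview(remoteVideoView) // a TVIVideoView rendering the remote track
//   UIApplication.shared.keyWindow?.addSubview(floating)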

For reference, here are Apple's docs on AVPictureInPictureController.

I hope this helps, even if it's not the answer that you were looking for.

Best,
Chris Eagleston

@ceaglest
Contributor

ceaglest commented Mar 9, 2018

Thinking about your question a little more, there is no reason why the TwilioVideo objects (like a Room) need to be tied to a specific ViewController.

If your goal is just to do more than one thing at a time inside your application (setting aside backgrounding concerns), then you could write a model controller class which is responsible for managing the usage of TwilioVideo and coordinating with your UIViewController(s) to display content and respond to user interactions. Our sample code uses a single ViewController for simplicity, but there is nothing preventing your app from organizing and managing the SDK objects differently.
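A bare-bones sketch of what I mean (the names are illustrative, not from our sample code, and the connect/delegate logic is omitted):

import TwilioVideo

// Owns the TwilioVideo objects for the lifetime of the call so that they
// outlive any single view controller.
class CallManager {
    static let shared = CallManager()

    var room: TVIRoom?
    var camera: TVICameraCapturer?
    var localVideoTrack: TVILocalVideoTrack?
    var localAudioTrack: TVILocalAudioTrack?

    // Whichever view controller is currently presenting the call UI calls these
    // to attach/detach its renderers without owning the tracks themselves.
    func attachLocalPreview(_ renderer: TVIVideoView) {
        localVideoTrack?.addRenderer(renderer)
    }

    func detachLocalPreview(_ renderer: TVIVideoView) {
        localVideoTrack?.removeRenderer(renderer)
    }

    func setVideoPaused(_ paused: Bool) {
        localVideoTrack?.isEnabled = !paused
    }
}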

@ceaglest
Contributor

Hey @pooja816,

I never heard back from you. Are you looking for more comprehensive sample code, or AVPictureInPictureController support so that you can play video outside of an app?

Best,
Chris

@pooja816
Author

pooja816 commented Mar 15, 2018

Hi @ceaglest
Thanks.
I am trying a model controller class which is responsible for managing the usage of TwilioVideo and coordinating with my UIViewController(s) to display content.

@pooja816
Author

pooja816 commented Mar 20, 2018

Hi @ceaglest
I tried to manage it through a model controller, and when the view is dismissed the call functionality works fine. But adding PiP-mode support so I can navigate the application during a call (some kind of draggable view) is a little complex.

Is there any way to add the view to the status bar (UI like when our app moves to the background and the call continues)?

Please help me as I am stuck on this.

@ceaglest
Contributor

ceaglest commented Mar 20, 2018

Hi @pooja816,

Is there any way to add the view to the status bar (UI like when our app moves to the background and the call continues)?

Not that I'm aware of; you shouldn't interact with the status bar directly. Apple's PiP support allows playback in the background as long as you use AVPlayer. I don't believe the status bar is part of your key UIWindow; it's something that SpringBoard or UIKit provides.

I tried to manage it through a model controller, and when the view is dismissed the call functionality works fine. But adding PiP-mode support so I can navigate the application during a call (some kind of draggable view) is a little complex.

I don't have an example to share at the moment, but we do plan on offering something more complex which will use multiple view controllers in the future.

I'll be in touch once we have a concrete example.

Best,
Chris

@pooja816
Author

Hi @ceaglest
When two users are connected in a room, the call works properly, but I am stuck on the following:

  1. When one user accesses other features of the app during the call, how do I pause the video? If I do localVideoTrack?.isEnabled = false;, then a black screen appears for the other user.
  2. After accessing other features of the app, when I come back to the video call screen, the remote video is not rendered. I am not getting the addedVideoTrack callback of TVIParticipantDelegate, so the remote video is not rendered.

@pooja816
Author

pooja816 commented May 7, 2018

Hi @ceaglest
Any update on this?

@ceaglest
Contributor

ceaglest commented May 7, 2018

Hi @pooja816,

Sorry for the long delay, somehow I missed your last response.

When one user accesses other features of the app during the call, how do I pause the video? If I do localVideoTrack?.isEnabled = false;, then a black screen appears for the other user.

What kind of Room are you using? Disabling a LocalVideoTrack will cause black frames to be sent in a Peer-to-Peer Room, but in a Group Room you should get the final frame and no black frames when pausing.

After accessing other features of the app, when I come back to the video call screen, the remote video is not rendered. I am not getting the addedVideoTrack callback of TVIParticipantDelegate, so the remote video is not rendered.

Can you provide more information here? What is the sequence of events from our SDK? Did you get a track removed callback? How do you manage your renderer when pushing / popping more view controllers?

Best,
Chris

@pooja816
Author

pooja816 commented May 9, 2018

Hi @ceaglest

What kind of Room are you using? Disabling a LocalVideoTrack will cause black frames to be sent in a Peer-to-Peer Room, but in a Group Room you should get the final frame and no black frames when pausing.

Yes, I am using a Peer-to-Peer Room. How do I create a Group Room?

Did you get a track removed callback?

No, I am not getting a callback for track removed. How do I remove a track?

How do you manage your renderer when pushing / popping more view controllers?

When the call starts for the first time, the room is created and the participant is added to the room. When the call connects, the addedVideoTrack callback is received and the remote video of the other participant is rendered. On popping the video call view, I am only disabling the localVideoTrack:
localVideoTrack?.isEnabled = false;

And when I move to the video call screen again, I am initialising the localVideoTrack using the following code:

func startPreview() {
    if PlatformUtils.isSimulator {
        return
    }

    // Preview our local camera track in the local video preview view.
    camera = TVICameraCapturer(source: .frontCamera, delegate: self)
    localVideoTrack = TVILocalVideoTrack.init(capturer: camera!)
    if (localVideoTrack == nil) {
        logMessage(messageText: "Failed to create video track")
    } else {
        // Add renderer to video track for local preview
        localVideoTrack!.addRenderer(self.previewView)

        logMessage(messageText: "Video track created")
    }
}

@piyushtank
Contributor

@pooja816 Thanks for responding with the information. Chris is on vacation for the rest of the week; in the meantime, let me try to help solve the problem:

Yes, I am using a Peer-to-Peer Room. How do I create a Group Room?

Here is the REST API documentation on how to create a Group Room. You can use Type=group while creating the Room.

No, I am not getting a callback for track removed. How do I remove a track?

You can remove a video track by calling [localParticipant unpublishVideoTrack:localVideoTrack]; however, you may not want to remove the video track for your use case. When you move outside of the video screen, you can call localVideoTrack?.isEnabled = false. Now, assuming you have not removed the renderer from the video track, when you move back to the call screen, you should set it back to true: localVideoTrack?.isEnabled = true.
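For example, something along these lines in your call screen's view controller (a rough sketch; it assumes the track is kept in a shared object that outlives the view controller, called CallManager here as a placeholder):

// In the view controller that shows the call UI.
override func viewWillDisappear(_ animated: Bool) {
    super.viewWillDisappear(animated)
    // Stop sending camera frames while other screens are shown.
    CallManager.shared.localVideoTrack?.isEnabled = false
}

override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)
    // Resume sending frames; the renderer was never removed, so video resumes.
    CallManager.shared.localVideoTrack?.isEnabled = true
}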

Let me know if you have any questions.

@pooja816
Author

Hi @piyushtank
Thanks for your reply.
When the call starts for the first time, the video call works properly. But when the user comes back to the video calling screen after accessing other features of the app, the remote video is not shown.
Here is the demo of what I have tried in the VideoCallKitQuickStart project.

@piyushtank
Contributor

@pooja816 Thanks for sharing the code. I will try it out and let you know my findings.

@piyushtank
Contributor

@pooja816 I tried the sample code. I noticed you have added ViewController1 as the root view controller and embedded the view controllers in a navigation controller.

When you press "New Call" and connect to a Room, ViewController2 gets pushed on top of ViewController1. When you are connected to a Room, it displays two buttons, "Back" and "Settings". All okay up to this point.

Now, you can observe that the call works as expected when coming back from the Settings screen, but it does not give the expected results when returning via ViewController1. Here is the reason: when you press the "Back" button, the existing ViewController2 gets destroyed. And since ViewController2 owns the Room, the Room and all tracks get destroyed, and the app gets disconnected from the Room. So, when you come back to the call screen by pressing the "Continue" button on ViewController1, the app creates a new instance of ViewController2 and pushes it on top.

On the other hand, when you press the "Settings" button, ViewController2 does not get destroyed and the SettingsViewController gets pushed on top. So while coming back to ViewController2 from the SettingsViewController, the call remains connected because it is using the existing ViewController2.

This is iOS UIKit behavior; please see [this](https://developer.apple.com/documentation/uikit/uinavigationcontroller) for more information.

Also, you should not post tokens or any other secret keys when sharing code in a public place.

Please let me know if you have any questions.

@pooja816
Author

Hi @piyushtank

When you press the "Back" button, the existing ViewController2 gets destroyed. And since ViewController2 owns Room, Room and all tracks get destroyed, and it gets disconnected from the Room.

Yes, when I press back, ViewController2 gets destroyed. But the Room is owned by CallManager, and when coming back again to ViewController2, I am getting the same instance of the Room (CallManager.shared.room).

Please guide me on how to access other features of the app during a video call so that, when coming back to the video call screen, everything works fine.

@piyushtank
Contributor

@pooja816 Correct, I also noticed that the Room is stored in CallManager. Unfortunately, the SDK does not allow resetting room.delegate, so you will be required to store the previous ViewController and reuse it.

I have fixed the app so that you can access other features of the app during a video call and come back to the video call screen. See this.
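The gist of the change is roughly this (illustrative only; the identifiers come from your sample project):

// In CallManager, keep a strong reference so ViewController2 (the Room's delegate)
// is not deallocated when it is popped off the navigation stack:
//     var callViewController: ViewController2?

// In ViewController1, when the user taps "Continue":
func presentCallScreen() {
    let callVC = CallManager.shared.callViewController
        ?? (storyboard!.instantiateViewController(withIdentifier: "ViewController2") as! ViewController2)
    CallManager.shared.callViewController = callVC
    navigationController?.pushViewController(callVC, animated: true)
}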

Let me know if you have any questions.

@pooja816
Author

@piyushtank Thanks. It is working fine.
But is it good to save the viewController instance?

@KrisConrad

Sorry to dig up an old issue, but is there any way, or future plans, to support AVPictureInPictureController? While a custom in app PIP mode is doable, I was hoping to leverage AVPictureInPictureController so the user is able to navigate outside of the app and keep their video call in PIP mode.

@ceaglest
Contributor

ceaglest commented Oct 2, 2018

Hi @KrisConrad,

We don't have a solution to this one at the moment.

The problem is that AVPictureInPictureController only works with AVPlayerLayer, and we can't provide our content directly to AVPlayer to participate in this process. One thought is: could we make a TVIVideoRenderer that serves video content to AVPlayer via AVURLAsset? I don't know if it's possible to package up I420 or NV12 frames into a format which is streamable over a network without actually encoding as H.264, etc.

Ideally, Apple would extend AVPictureInPictureController to support other layer classes, like AVSampleBufferDisplayLayer.

Regards,
Chris

@ceaglest
Contributor

ceaglest commented Apr 6, 2020

Closing, as supporting PiP isn't on our roadmap for 2020. If Apple offers improved support for AVPictureInPictureController in iPadOS 14 then we may revisit this issue.

@ceaglest ceaglest closed this as completed Apr 6, 2020
@tmm1

tmm1 commented Jun 22, 2020

I reported FB7747223 to apple (on feedbackassistant.apple.com):

AVPictureInPictureController does not work with AVSampleBufferDisplayLayer
I have an iOS app that uses AVSampleBufferDisplayLayer for playback. I would like to integrate the new PIP apis on iOS 14, but I cannot use PIP because AVPictureInPictureController only accepts AVPlayerLayer

@ceaglest
Contributor

I reported FB7747223 to apple (on feedbackassistant.apple.com):

Thanks! If Apple accepts my lab appointment for tomorrow I will mention your feedback.

@mcorner

mcorner commented Jun 23, 2020

@ceaglest Does iOS 14 change anything here? Or does it just bring the iPad functionality to the iPhone? We would be interested in putting video calls into PiP.

@ceaglest
Contributor

ceaglest commented Jun 23, 2020

@mcorner It looks like PiP functionality is coming to the iPhone but with the same constraint of the content being an AVPlayerLayer.

I will be in the AVFoundation lab later today to discuss it with Apple. If you constrain a Room to use H.264, then any of the video content could in theory be played by an AVPlayer if it were remuxed to a transport stream and provided to an AVAsset with a custom protocol. It's just in-memory copies, but the delay might be too much for this technique to work.

Edit: I am speaking about private Video APIs where I have access to the decrypted H.264 NALUs before they are decoded to a CVPixelBuffer. There aren't public APIs at this moment, but imagine a Renderer that receives CMSampleBuffers.

@ceaglest ceaglest reopened this Jun 23, 2020
@21-Hidetaka-Ko

@tmm1 @ceaglest
Were you able to solve the problem here and implement PIP mode? I am having the same problem.

@tmm1

tmm1 commented Jun 22, 2021

It is apparently unimplemented in tvOS 15 Beta 1.

@21-Hidetaka-Ko

21-Hidetaka-Ko commented Jun 22, 2021

@tmm1
Thanks.
Does this mean that we won't be able to implement it this fall when iOS 15 is released?
Do you think it will be ready for implementation this fall, and is there any way to confirm this?

@21-Hidetaka-Ko

21-Hidetaka-Ko commented Jun 22, 2021

Most of the discussion about Picture in Picture is about video playback services like YouTube and Netflix, but I'm more interested in Picture in Picture mode for calling apps like FaceTime.

There are three points I'm interested in.
1. Is it possible to implement Picture in Picture mode in calling apps before iOS 15?
2. Is it likely to be possible to implement Picture in Picture mode in calling apps after iOS 15 ships this fall?
3. Is it possible to mute my own voice or someone else's voice while the call screen is in Picture in Picture mode? With FaceTime, you can't toggle mute in Picture in Picture mode; you need to jump back to FaceTime.

@21-Hidetaka-Ko

If Twitch and Netflix can support Picture in Picture mode, but other video calling apps can't, is it because they're not allowed by Apple? Or is it just a lack of resources?

@Derad6709

If Twitch and Netflix can support Picture in Picture mode, but other video calling apps can't, is it because they're not allowed by Apple? Or is it just a lack of resources?

You can check some Apple documentation on it here!

@ceaglest
Contributor

ceaglest commented Jun 22, 2021

Hi,

Seeing if I can answer some questions since I attended the AVKit lab during WWDC.

If Twitch and Netflix can support Picture in Picture mode, but other video calling apps can't, is it because they're not allowed by Apple? Or is it just a lack of resources?

You can check some Apple documentation on it here!

@Derad6709 is correct, you should take a look at the docs for iOS 15.0-beta1. In previous versions of iOS use of the camera in a multi-tasking environment was allowed but the entitlement was not documented anywhere. Apps like Netflix could use AVPlayer in PiP to play streaming video but even in the best possible scenario the delay was ~1s or more making it infeasible for live conferencing.

There are three points I'm interested in.
1. Is it possible to implement Picture in Picture mode in calling apps before iOS 15?

It looks like the AVPictureInPictureVideoCallViewController API for PiP is only available in iOS 15 but multi-tasking with the camera has been available as an undocumented API / entitlement on iPadOS 14.

2. Is it likely to be possible to implement Picture in Picture mode in calling apps after iOS 15 ships this fall?

Yes, but the approach you use will depend on if your app is using PiP for playback only (which won't require a new entitlement), or while running the camera.

3. Is it possible to mute my own voice or someone else's voice while the call screen is in Picture in Picture mode? With FaceTime, you can't toggle mute in Picture in Picture mode; you need to jump back to FaceTime.

Since you provide the UI with AVPictureInPictureVideoCallViewController it should be possible to have a mute button. I don't think it will work with the AVSampleBufferDisplayLayer version of the API which is for playback only.
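For anyone who wants to experiment with the beta, the basic wiring looks roughly like this (a sketch only; sourceView and remoteVideoView are placeholders for whatever hosts and renders your remote track, and keeping the camera running while in PiP still requires the multitasking camera access entitlement):

import AVKit
import UIKit

// iOS 15+: host the call UI in the system-provided video-call PiP view controller.
@available(iOS 15.0, *)
func makeVideoCallPiPController(sourceView: UIView, remoteVideoView: UIView) -> AVPictureInPictureController? {
    guard AVPictureInPictureController.isPictureInPictureSupported() else { return nil }

    let callViewController = AVPictureInPictureVideoCallViewController()
    callViewController.preferredContentSize = CGSize(width: 1080, height: 1920)
    callViewController.view.addSubview(remoteVideoView)
    remoteVideoView.frame = callViewController.view.bounds
    remoteVideoView.autoresizingMask = [.flexibleWidth, .flexibleHeight]

    let contentSource = AVPictureInPictureController.ContentSource(
        activeVideoCallSourceView: sourceView,
        contentViewController: callViewController)

    let pipController = AVPictureInPictureController(contentSource: contentSource)
    pipController.canStartPictureInPictureAutomaticallyFromInline = true
    return pipController
}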

Best,
Chris

@21-Hidetaka-Ko

21-Hidetaka-Ko commented Jun 23, 2021

@Derad6709 @ceaglest
Thank you for the info.

Yes, but the approach you use will depend on if your app is using PiP for playback only (which won't require a new entitlement), or while running the camera.

It looks like the AVPictureInPictureVideoCallViewController API for PiP is only available in iOS 15 but multi-tasking with the camera has been available as an undocumented API / entitlement on iPadOS 14.

Do you think it would be difficult to implement on iOS 15 or earlier (e.g. iOS 14)?

Apps like Netflix could use AVPlayer in PiP to play streaming video but even in the best possible scenario the delay was ~1s or more making it infeasible for live conferencing.

I'm sure you're right.

Is it the video that is delayed? Does the audio itself also have a delay?
So that was the issue. Do you think that implementing it in Picture in Picture mode is not suitable for the nature of a call service (where no lag is important)?

Yes, but the approach you use will depend on if your app is using PiP for playback only (which won't require a new entitlement), or while running the camera.

I would like to know what the approach would be in our case. Our case is not a video playback service, but a calling service. Since it is a call, the screen is basically never stopped; it is always live. In this case, which approach applies? Is it the latter?

@Derad6709

Derad6709 commented Jun 23, 2021

Multitasking camera documentation (including usage of the camera in background while pip?) is here.


@21-Hidetaka-Ko

@Derad6709 @ceaglest
Thanks for the info.
I've read here that third-party apps cannot implement this, but when building a video calling service using an SDK like Twilio, is it possible to implement PiP mode?
Also, is it possible to implement PiP in a call without video instead of a video call?

@21-Hidetaka-Ko

@ceaglest
Picture-in-picture can't be implemented in a call without video, right?

@fukemy

fukemy commented Nov 26, 2021

Any news on this?
https://developer.apple.com/documentation/avkit/adopting_picture_in_picture_for_video_calls?changes=_1

@pawansharmaAccolite

@pooja816 Were you able to do that?

@piyushtank
Contributor

Hi all,

Thanks for your patience, and all the suggestions.

We have just shipped TwilioVideo 5.3 with Picture-in-Picture support. Here is the changelog. Try it out and let us know if you have any questions.

Best,
Piyush

@4brunu

4brunu commented Nov 9, 2022

Hi @piyushtank,
First of all, thanks for making PiP available in Twilio Video, this is a functionality that we have been excited about for some time 🙂

PiP for video calls was introduced in iOS 15 by using the entitlement com.apple.developer.avfoundation.multitasking-camera-access.
The problem with PiP in iOS 15 is it requires asking Apple's approval to use it.
That's not ideal, especially if you manage multiple apps in different organisations, which is my case.

Since iOS 16, PiP is generally available via the property AVCaptureSession.isMultitaskingCameraAccessEnabled.

This means that, starting with iOS 16, we no longer need Apple's approval to use PiP, so I would like to use PiP only on iOS 16 and above.

Would it be possible for Twilio Video to expose a boolean to enable/disable PiP by setting AVCaptureSession.isMultitaskingCameraAccessEnabled in the AVCaptureSession that is used internally?

Thanks

@fukemy

fukemy commented Nov 9, 2022

Hi, I tried this with my iPhone on iOS 16, but isMultitaskingCameraAccessSupported returns false. Does it only support iPad?

@piyushtank
Contributor

@4brunu Thanks! The TwilioVideo SDK doesn't do anything with PiP and multi-tasking enablement; that's the app's responsibility. TwilioVideo provides a renderer which can be used to render video frames in a PiP view controller. Our video app demonstrates how to use the PiP renderer in an app - https://github.com/twilio/twilio-video-app-ios. So, as of now, we don't see a need for adding an API for AVCaptureSession.isMultitaskingCameraAccessEnabled.

@fukemy iPhones running iOS 16 support Picture-in-Picture.

@4brunu

4brunu commented Nov 9, 2022

Hi @piyushtank, thanks for the fast response.

I understand what you are saying, but there are two ways of enabling PiP:

  • One is using the entitlement com.apple.developer.avfoundation.multitasking-camera-access, sending a request to Apple asking for authorisation to use this entitlement, and with luck they will approve it and let you use it. If they don't, PiP won't work at all.

  • The other option is to use the new API available on iOS 16, by setting AVCaptureSession.isMultitaskingCameraAccessEnabled = true. This way it doesn't require approval from Apple; it works right away. The problem is that TwilioVideo uses AVCaptureSession internally and doesn't expose the AVCaptureSession instance outside of the SDK (and I understand why it's not exposed), so we can't activate PiP the new official way. This is why I was suggesting allowing some way to pass a configuration to the TwilioVideo SDK that would internally be applied to the internal AVCaptureSession instance.

Honestly, having to ask Apple for permission to use PiP is a bit cumbersome.

@fukemy

fukemy commented Nov 10, 2022

Hi @piyushtank, please tell me if my code is wrong. Tested with an iPhone 11 Pro Max on iOS 16:

if (@available(iOS 16.0, *)) {
    [self.capturer.captureSession beginConfiguration];
    if ([self.capturer.captureSession isMultitaskingCameraAccessSupported]) {
        [self.capturer.captureSession setMultitaskingCameraAccessEnabled:YES];
        NSLog(@"setMultitaskingCameraAccessEnabled success");
    } else {
        NSLog(@"isMultitaskingCameraAccessEnabled not supported");
    }
    [self.capturer.captureSession commitConfiguration];
} else {
    NSLog(@"isMultitaskingCameraAccessEnabled not supported because IOS < 16");
}

here is my log:
2022-11-10 01:51:54.060894+0700 ABCApp[70459:7233238] isMultitaskingCameraAccessEnabled not supported

@piyushtank
Contributor

piyushtank commented Nov 10, 2022

@4brunu Thanks for the suggestions. We can add a new API for AVCaptureSession.isMultitaskingCameraAccessEnabled; it's just that we can't commit to a timeline, as we are all working on our media engine (WebRTC) upgrade. If we get a chance we will implement this sooner, and I will keep you posted. Also, I believe that, like FaceTime, you should still be able to render the remote participant in the PiP view controller using our new renderer. Let me know if you have any questions.

@fukemy Sounds like you have a custom capturer implementation. I am not sure why isMultitaskingCameraAccessSupported would return false on iOS 16. Try out our video app.

@Chandlerdea

Chandlerdea commented Jan 5, 2023

I ran the video app on my iPhone 13 mini, and it seems like multitasking camera functionality is not supported. When I background the application during a video call, the local video track is paused and doesn't resume until the app is brought back into the foreground (I've uploaded a screen recording). There is no AVCaptureSession to modify in the project, so it seems we cannot use AVCaptureSession.isMultitaskingCameraAccessEnabled until the Twilio SDK gives us a way to modify that property.

@piyushtank
Contributor

@Chandlerdea Thanks for reaching out. We are planning to add support in the short term, most likely this quarter.
We will update the ticket.

@piyushtank
Contributor

Update:

I wanted to reach out to confirm that the suggested enhancement is under development. While developing it, we found that, even though AVCaptureSession.isMultitaskingCameraAccessSupported is available on iOS 16+ and iPadOS 16+, multitasking camera access is supported only on iPads that support Stage Manager with an extended display. We have posted a question to the Apple developer forum to get more clarification on this. Let us know if you have any questions.

Thanks!

@jayPbriskstar

@piyushtank

I have implemented the video call in PiP, but once I go to the background the video is not shown. How can I continue the video call in the background for now?

@arunmbriskstar

@piyushtank

We have implemented Twilio as a major feature of our app, but due to this issue we are stuck at a very crucial stage.
Is there any support available, or something we can do to overcome this issue?

Please let us know.
