I've implemented screen sharing using WebRTC in my React Native app, but I'm running into an issue where audio from other apps, such as YouTube, is not captured during screen broadcasting.

You appear to be requesting screen-share video (getDisplayMedia) but microphone-derived audio (getUserMedia). Instead, request both in a single call: getDisplayMedia({ video: true, audio: true }).

Note that the availability of video and audio for screen sharing differs by (1) browser, (2) OS, and (3) what the user ends up sharing (tab, window, or screen). For example, Chrome does not currently support screen sharing on Android. Such issues belong in the browser's bug tracker rather than here, though.

Below is the React Native code snippet I'm using for screen sharing:
import { useRef, useState } from 'react';

// Holds the screen-share MediaStream and the producers for its tracks
const screenShareStreamRef = useRef(null);
const [screenShareStream, setScreenShareStream] = useState(null);
const screenShareVideoProducer = useRef(null);
const screenShareAudioProducer = useRef(null);
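Following the suggestion above, the stream these refs hold should come from a single getDisplayMedia call that requests both video and audio. A minimal sketch, not the app's actual code (the function names are hypothetical, and `mediaDevices` is passed in so the same code works with either the browser API or react-native-webrtc's implementation):

```javascript
// Request video AND audio in the same getDisplayMedia call. Pairing
// getDisplayMedia video with getUserMedia audio yields only microphone
// audio, never other apps' audio.
function displayMediaConstraints() {
  return { video: true, audio: true };
}

async function startScreenShare(mediaDevices) {
  // The returned stream has a video track and, where the platform
  // supports audio capture, an audio track as well.
  return mediaDevices.getDisplayMedia(displayMediaConstraints());
}
```

Whether the audio track actually appears still depends on the platform and on what the user chooses to share, as noted above.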
For both iOS and Android, I've followed the setup described in this guide: Link to the setup guide
Additionally, for iOS, I've added the necessary files to the project, as mentioned in the guide: Link to the files
I'm looking for insights or solutions on how to capture audio from other apps during screen sharing. Any help would be appreciated. Thank you!
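One way to check whether app audio was actually captured is to inspect the stream's tracks before creating the video and audio producers above. A hypothetical sketch (splitTracksByKind is not part of the original code; the track objects mirror the MediaStreamTrack shape that react-native-webrtc implements):

```javascript
// Split a stream's tracks by kind so the video and audio producers can be
// created separately, and so a missing audio track is detected early.
function splitTracksByKind(tracks) {
  return {
    video: tracks.filter((t) => t.kind === 'video'),
    audio: tracks.filter((t) => t.kind === 'audio'),
  };
}

// Usage sketch: the tracks would come from stream.getTracks(); if `audio`
// is empty here, the capture itself returned no app audio, so there is
// nothing for an audio producer to send.
```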