voice breaking when in speaker mode in android 9 #1554

Open
yadavmurari111 opened this issue Apr 24, 2024 · 6 comments

Expected Behavior:

Audio should not break up while in speaker mode.

Observed Behavior:

Audio breaks up intermittently when speaker mode is turned on.

Steps to reproduce the issue:

Implement video call functionality in a React Native project.

Establish a connection between two Android devices, one of which runs Android 9. Audio is perfectly audible on both ends through the earpiece, but when speaker mode is turned on, the audio breaks up intermittently on the Android 9 device.

Platform Information

  • React Native Version: ^0.73.5
  • WebRTC Module Version: ^118.0.3
  • Platform OS + Version: Android 9

saghul commented Apr 24, 2024

Can you reproduce this consistently? Across multiple devices?

@yadavmurari111 (Author)

This is the scenario my friend and I encountered while testing, @saghul.
We can hear each other over the loudspeaker, but there is one issue we found while testing: when my device, which runs Android 9, is connected to a Wi-Fi network, I cannot hear my friend's voice clearly (it breaks up badly), while my friend hears my voice clearly on his Android 11 device. When I switch my device to mobile data, everything works fine on both sides.

  peerConnection.current.addEventListener('connectionstatechange', e => {
    console.log(
      'Peer Connection : Connection State Change ->',
      peerConnection.current.connectionState + ' ' + callerId,
    );
    if (peerConnection.current.connectionState === 'connected') {
      setTimeout(() => {
        // start
        inCallManager.setSpeakerphoneOn(true);
        inCallManager.setForceSpeakerphoneOn(true);
      }, 600);
    }
  });

After debugging, I found that while my device is connected to the Wi-Fi network during a call, the connectionstatechange event frequently flips between "connected" and "disconnected". Can you tell me why this could happen? It does not happen when connected via mobile data (a stable connection).
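One way to confirm the flapping (and to avoid re-forcing the speakerphone on every reconnect, which can itself cause an audible glitch) is to count the state transitions. A minimal sketch in plain JavaScript (a hypothetical helper, not part of react-native-webrtc) that you would feed from the connectionstatechange listener:

```javascript
// Tracks connectionstatechange flapping: counts how many times the
// connection drops out of 'connected', and reports whether a given
// 'connected' event is the FIRST one (so the speaker route is forced
// only once, not on every reconnect).
class ConnectionMonitor {
  constructor() {
    this.drops = 0;           // connected -> disconnected/failed transitions
    this.everConnected = false;
    this.lastState = 'new';
  }

  // Call with peerConnection.current.connectionState from the listener.
  // Returns true only on the first transition into 'connected'.
  onStateChange(state) {
    if (state === 'connected') {
      const firstTime = !this.everConnected;
      this.everConnected = true;
      this.lastState = state;
      return firstTime;
    }
    if (
      this.lastState === 'connected' &&
      (state === 'disconnected' || state === 'failed')
    ) {
      this.drops += 1; // evidence of the flapping seen on Wi-Fi
    }
    this.lastState = state;
    return false;
  }
}
```

With this in place, setSpeakerphoneOn would run only when onStateChange returns true, and a non-zero drops count after a call gives a concrete measure of the Wi-Fi instability.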

Code for reference (for the signaling server I have used the Firebase Realtime Database):


import React, {useEffect, useRef, useState} from 'react';
import {Alert, Button} from 'react-native';
import {
  RTCIceCandidate,
  RTCPeerConnection,
  RTCSessionDescription,
  RTCView,
  mediaDevices,
} from 'react-native-webrtc';
import {Text, View} from 'tamagui';
import inCallManager from 'react-native-incall-manager';
import OutgoingCallScreen from '../../components/WebRTC/OutgoingCallScreen';
import IncomingCallScreen from '../../components/WebRTC/IncomingCallScreen';
import JoinScreen from '../../components/WebRTC/JoinScreen';
import database from '@react-native-firebase/database';
import {callStatus} from '../../constants';

function WebRTCScreen() {
  // Stream of local user
  const [localStream, setlocalStream] = useState(null);

  // When a call is connected, the video stream from the receiver is appended to this state in the stream
  const [remoteStream, setRemoteStream] = useState(null);

  const [type, setType] = useState('JOIN');

  // We'll store a random 6-digit callerId which represents this user and can be referred to by another connected user.

  const [callerId] = useState(
    Math.floor(100000 + Math.random() * 900000).toString(),
  );
  const otherUserId = useRef(null);
  const [localMicOn, setlocalMicOn] = useState(true);
  const [speaker, setSpeaker] = useState(true);
  const [localWebcamOn, setlocalWebcamOn] = useState(true);

  // This creates a WebRTC peer connection, which will be used to set local/remote descriptions and offers.
  const peerConnection = useRef(
    new RTCPeerConnection({
      iceServers: [
        {
          urls: 'stun:stun.l.google.com:19302',
        },
        {
          urls: 'stun:stun1.l.google.com:19302',
        },
        {
          urls: 'stun:stun2.l.google.com:19302',
        },
        {
          url: 'turn:global.turn.twilio.com:3478?transport=udp',
          username: 'xxxxxxxxxxxxxxxx',
          urls: 'turn:global.turn.twilio.com:3478?transport=udp',
          credential: 'xxxxxxxxxxxxxxxx',
        },
        {
          url: 'turn:global.turn.twilio.com:3478?transport=tcp',
          username: 'xxxxxxxxxxxxxxxx',
          urls: 'turn:global.turn.twilio.com:3478?transport=tcp',
          credential: 'xxxxxxxxxxxxxxxx',
        },
        {
          url: 'turn:global.turn.twilio.com:443?transport=tcp',
          username: 'xxxxxxxxxxxxxxxx',
          urls: 'turn:global.turn.twilio.com:443?transport=tcp',
          credential: 'xxxxxxxxxxxxxxxx',
        },
      ],
      iceCandidatePoolSize: 2,
    }),
  );

  //  Establishing a webRTC call

  let remoteRTCMessage = useRef(null);

  useEffect(() => {
    inCallManager.start();
    //This event occurs whenever any peer wishes to establish a call with you.
    database()
      .ref(`/users/${callerId}`)
      .on('value', snapshot => {
        console.log('User data: ', snapshot.val());
        let data = snapshot?.val();

        // whenever someone makes a request to connect
        if (data?.status == callStatus.newCall) {
          remoteRTCMessage.current = data.rtcMessage;
          otherUserId.current = data.callerId;
          setType('INCOMING_CALL');
          // processAccept();
          peerConnection.current.setRemoteDescription(
            new RTCSessionDescription(remoteRTCMessage.current),
          );
        } else if (data?.status == callStatus.accept) {
          console.log('data callAnswered: ', data);

          remoteRTCMessage.current = data.rtcMessage;
          peerConnection.current.setRemoteDescription(
            new RTCSessionDescription(remoteRTCMessage.current),
          );

          setType('WEBRTC_ROOM');
        }
      });

    // whenever a new offer-side ICE candidate arrives => triggered on the receiver side
    database()
      .ref(`/ICEcandidate_offer/${callerId}`)
      .on('value', snapshot => {
        console.log('ICEcandidate data: ', snapshot.val());
        let data = snapshot.val();
        if (!data) {
          return;
        }

        let message = data.rtcMessage;
        console.log(
          'ICEcandidate called: ',
          peerConnection.current.remoteDescription,
        );
        if (peerConnection.current.remoteDescription == null) {
          console.log('its null go back:');
          // return;
        }
        if (peerConnection.current) {
          peerConnection?.current
            .addIceCandidate(
              new RTCIceCandidate({
                candidate: message.candidate,
                sdpMid: message.id,
                sdpMLineIndex: message.label,
              }),
            )
            .then(data => {
              console.log('SUCCESS offer: ', callerId);
            })
            .catch(err => {
              console.log('Error offer side', err);
            });
        }
      });
    // ICEcandidate answer => caller side trigger
    database()
      .ref(`/ICEcandidate_ans/${callerId}`)
      .on('value', snapshot => {
        console.log('ICEcandidate data: ', snapshot.val());
        let data = snapshot.val();
        if (!data) {
          return;
        }

        let message = data.rtcMessage;
        console.log(
          'ICEcandidate called: ',
          peerConnection.current.remoteDescription,
        );
        if (peerConnection.current.remoteDescription == null) {
          console.log('its null go back:');
          // return;
        }
        if (peerConnection.current) {
          peerConnection?.current
            .addIceCandidate(
              new RTCIceCandidate({
                candidate: message.candidate,
                sdpMid: message.id,
                sdpMLineIndex: message.label,
              }),
            )
            .then(data => {
              console.log('SUCCESS ans: ', callerId);
            })
            .catch(err => {
              console.log('Error answer side', err);
            });
        }
      });

    let isFront = false;

    // The MediaDevices interface allows you to access connected media inputs such as cameras and microphones. We ask the user for permission to access those media inputs by invoking the mediaDevices.getUserMedia() method.
    console.log('mediaDevices: ', mediaDevices);
    mediaDevices.enumerateDevices().then(sourceInfos => {
      let videoSourceId;
      for (let i = 0; i < sourceInfos.length; i++) {
        const sourceInfo = sourceInfos[i];
        if (
          sourceInfo.kind == 'videoinput' &&
          sourceInfo.facing == (isFront ? 'user' : 'environment')
        ) {
          videoSourceId = sourceInfo.deviceId;
        }
      }

      mediaDevices
        .getUserMedia({
          audio: true,
          video: {
            mandatory: {
              minWidth: 500, // Provide your own width, height and frame rate here
              minHeight: 300,
              minFrameRate: 30,
            },
            facingMode: isFront ? 'user' : 'environment',
            optional: videoSourceId ? [{sourceId: videoSourceId}] : [],
          },
        })
        .then(stream => {
          // Got stream!
          console.log('stream: ', stream);
          setlocalStream(stream);

          // setup stream listening
          //   peerConnection.current.addStream(stream);

          stream.getTracks().forEach(track => {
            peerConnection.current.addTrack(track, stream);
          });
        })
        .catch(error => {
          // Log error
          console.log('err stream: ', error);
        });
    });

    peerConnection.current.onaddstream = event => {
      setRemoteStream(event.stream);
    };

    peerConnection.current.addEventListener('connectionstatechange', e => {
      console.log(
        'Peer Connection : Connection State Change ->',
        peerConnection.current.connectionState + ' ' + callerId,
      );
      if (peerConnection.current.connectionState === 'connected') {
        setTimeout(() => {
          // start
          inCallManager.setSpeakerphoneOn(true);
          inCallManager.setForceSpeakerphoneOn(true);
        }, 600);
      }
    });

    return () => {
      database().ref('/users').off();
      database().ref('/ICEcandidate_offer').off();
      database().ref('/ICEcandidate_ans').off();
      inCallManager.stop();
    };
  }, []);

  // to make a call (caller side)
  async function processCall() {
    peerConnection.current.onicecandidate = async event => {
      console.log(
        'iceconnectionstatechange: processCall',
        peerConnection.current.connectionState,
      );
      if (event.candidate) {
        const data = {
          calleeId: otherUserId.current,
          rtcMessage: {
            label: event.candidate.sdpMLineIndex,
            id: event.candidate.sdpMid,
            candidate: event.candidate.candidate,
          },
        };
        try {
          await database()
            .ref('/ICEcandidate_offer/' + data.calleeId)
            .set({
              data: callerId,
              rtcMessage: data.rtcMessage,
            });
        } catch (error) {
          console.log('error: ', error);
        }
      }
    };
    // add ice candidate event
    let sessionConstraints = {
      mandatory: {
        OfferToReceiveAudio: true,
        OfferToReceiveVideo: true,
        VoiceActivityDetection: true,
      },
    };

    const sessionDescription = await peerConnection.current.createOffer(
      sessionConstraints,
    );
    await peerConnection.current.setLocalDescription(sessionDescription);
    console.log('otherUserId: ', otherUserId);
    console.log('sessionDescription: ', sessionDescription);
    sendCall({
      calleeId: otherUserId.current,
      rtcMessage: sessionDescription,
    });
  }

  // to accept the call (receiver side)
  async function processAccept() {
    peerConnection.current.onicecandidate = async event => {
      console.log(
        'peerConnection.current.connectionState: accept',
        peerConnection.current.connectionState,
      );

      if (event.candidate) {
        const data = {
          calleeId: otherUserId.current,
          rtcMessage: {
            label: event.candidate.sdpMLineIndex,
            id: event.candidate.sdpMid,
            candidate: event.candidate.candidate,
          },
        };
        try {
          await database()
            .ref('/ICEcandidate_ans/' + otherUserId.current)
            .set({
              data: callerId,
              rtcMessage: data.rtcMessage,
            });
        } catch (error) {
          console.log('error: ', error);
        }
      }
    };

    console.log('remoteRTCMessage.current: ', remoteRTCMessage.current);

    const sessionDescription = await peerConnection.current.createAnswer();
    console.log('sessionDescription: processAccept', sessionDescription);

    await peerConnection.current.setLocalDescription(sessionDescription);
    answerCall({
      callerId: otherUserId.current,
      rtcMessage: sessionDescription,
    });
  }

  function answerCall(data) {
    database()
      .ref('/users/' + data.callerId)
      .set({
        callee: callerId,
        rtcMessage: data.rtcMessage,
        status: callStatus.accept,
      })
      .then(() => {
        console.log('Data set. sendCall');
        setType('WEBRTC_ROOM');
      });
  }

  function sendCall(data) {
    // socket.emit('call', data);
    database()
      .ref('/users/' + callerId)
      .set({...data, status: callStatus.call})
      .then(() => console.log('Data set. sendCall'));

    database()
      .ref('/users/' + data.calleeId)
      .set({
        callerId: callerId,
        rtcMessage: data.rtcMessage,
        status: callStatus.newCall,
      })
      .then(() => console.log('Data set. sendCall receiver side'));
  }

  // Function to enable/disable Mic
  function toggleMic() {
    localMicOn ? setlocalMicOn(false) : setlocalMicOn(true);
    localStream.getAudioTracks().forEach(track => {
      console.log('track audio: ', track);
      localMicOn ? (track.enabled = false) : (track.enabled = true);
    });
  }

  function toggleSpeaker() {
    inCallManager.setForceSpeakerphoneOn(!speaker);
    setSpeaker(!speaker);
  }

  // Function to leave the call (Destroys webRTC connection)
  function leave() {
    peerConnection.current.close();
    setlocalStream(null);
    setType('JOIN');
  }
  const WebrtcRoomScreen = () => (
    <View
      style={{
        flex: 1,
        backgroundColor: '#050A0E',
        paddingHorizontal: 12,
        paddingVertical: 12,
      }}>
      {localStream ? (
        <RTCView
          objectFit={'cover'}
          style={{flex: 1, backgroundColor: '#050A0E'}}
          streamURL={localStream.toURL()}
        />
      ) : null}
      {remoteStream ? (
        <RTCView
          objectFit={'cover'}
          style={{
            flex: 1,
            backgroundColor: '#050A0E',
            marginTop: 8,
          }}
          streamURL={remoteStream.toURL()}
        />
      ) : null}
      <View
        style={{
          marginVertical: 12,
          flexDirection: 'row',
          justifyContent: 'space-evenly',
        }}>
        <Button
          title="CallEnd"
          onPress={() => {
            leave();
          }}
        />
        <Button
          title={localMicOn ? 'Mic ON' : 'Mic OFF'}
          onPress={() => {
            toggleMic();
          }}
        />
        <Button
          title={speaker ? 'Speaker ON' : 'Speaker OFF'}
          onPress={() => {
            toggleSpeaker();
          }}
        />
      </View>
    </View>
  );

  switch (type) {
    case 'JOIN':
      return (
        <JoinScreen
          callerId={callerId}
          setType={setType}
          otherUserId={otherUserId}
          processCall={processCall}
        />
      );
    case 'INCOMING_CALL':
      return (
        <IncomingCallScreen
          processAccept={processAccept}
          setType={setType}
          callerId={callerId}
          otherUserId={otherUserId}
        />
      );
    case 'OUTGOING_CALL':
      return (
        <OutgoingCallScreen
          callerId={callerId}
          setType={setType}
          otherUserId={otherUserId}
        />
      );
    case 'WEBRTC_ROOM':
      return WebrtcRoomScreen();
    default:
      return null;
  }
}

export default WebRTCScreen;
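Unrelated to the Wi-Fi symptom, but worth flagging in the code above: both ICE candidate listeners log when remoteDescription is null and then call addIceCandidate anyway (the early return is commented out), and adding a candidate before the remote description is set can fail. A common pattern is to buffer candidates until the remote description has been applied; a minimal sketch, assuming only that the peer connection exposes a promise-returning addIceCandidate:

```javascript
// Buffers remote ICE candidates until setRemoteDescription has run,
// instead of calling addIceCandidate against a null remote description.
class CandidateQueue {
  constructor(pc) {
    this.pc = pc;        // the RTCPeerConnection (anything with addIceCandidate)
    this.pending = [];
    this.ready = false;
  }

  // Call right after setRemoteDescription(...) resolves: flushes the queue.
  async markRemoteDescriptionSet() {
    this.ready = true;
    const queued = this.pending.splice(0);
    for (const c of queued) {
      await this.pc.addIceCandidate(c);
    }
  }

  // Call from the Firebase listeners instead of addIceCandidate directly.
  async add(candidate) {
    if (!this.ready) {
      this.pending.push(candidate); // hold until remote description is in place
      return;
    }
    await this.pc.addIceCandidate(candidate);
  }
}
```

This does not explain the Wi-Fi breakup on its own, but it removes one source of failed addIceCandidate calls from the signaling path.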

@yadavmurari111 (Author)

@saghul we have tested on Android 11 and Android 13 devices and it works fine; in our testing the issue only happens on Android 9.


saghul commented Apr 25, 2024

If it works on other devices, I don't see what could be specific to Android 9 that would cause what you report.

Did you test multiple Android 9 devices?

@yadavmurari111 (Author)

@saghul yes, we have tested on multiple Android 9 devices and the issue is consistently there.
Note: the issue only happens when connected to a Wi-Fi network; we do not see it on a mobile data connection.
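One way to narrow down whether the Wi-Fi network path itself is at fault is to force all media through the TURN server using the standard iceTransportPolicy: 'relay' option of the RTCPeerConnection configuration. A sketch reusing the Twilio TURN entries from the snippet above (credentials are still placeholders, as in the original): if the audio stops breaking in this mode, the problem is the direct Wi-Fi path rather than the device or the codec.

```javascript
// Relay-only RTCConfiguration: with iceTransportPolicy 'relay', ICE ignores
// host and server-reflexive candidates and uses only TURN-relayed pairs,
// taking the direct Wi-Fi path out of the picture for testing.
const relayOnlyConfig = {
  iceServers: [
    {
      urls: 'turn:global.turn.twilio.com:3478?transport=udp',
      username: 'xxxxxxxxxxxxxxxx',
      credential: 'xxxxxxxxxxxxxxxx',
    },
    {
      urls: 'turn:global.turn.twilio.com:443?transport=tcp',
      username: 'xxxxxxxxxxxxxxxx',
      credential: 'xxxxxxxxxxxxxxxx',
    },
  ],
  iceTransportPolicy: 'relay',
};
```

Passing this object to new RTCPeerConnection(relayOnlyConfig) in place of the original configuration is enough for an A/B test; it is a diagnostic setting, not a recommended production default.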


saghul commented May 3, 2024

That is really odd. There is nothing in the media engine that would cause different audio codec behavior depending on the network type.
