feat: start remote track as muted and unmute on first packet receive #32
santhoshvai merged 3 commits into master
Conversation
Remote tracks now start muted and fire unmute when first media data arrives, matching the W3C WebPlatformTest expectation that `remoteTrack.muted` is true inside `ontrack`.

- Video: `mutedState` starts true; the first decoded frame fires unmute immediately (no 3s wait); the existing periodic timer continues to detect mid-call stalls.
- Audio (parity gap): a new `AudioTrackAdapter` fires unmute on the first decoded PCM buffer via `AudioTrackSink` (Android) and `RTCAudioRenderer` (iOS). Note: only the initial mute → unmute transition is detectable; subsequent stalls cannot be observed from the sink because Android's audio render path and iOS NetEq synthesize silence / PLC frames when RTP stops. Documented inline.
- JS: `MediaStreamTrack._muted` defaults to true for remote tracks; `_setMutedInternal` now guards against duplicate events per mediacapture-main "set a track's muted state". A pending-mute buffer in `RTCPeerConnection` absorbs native events that arrive before the JS track is constructed in `setRemoteDescription` (fast/loopback races).
This reverts commit 3957df7.
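The pending-mute buffer described above can be sketched generically. This is an illustrative Java model only: the actual implementation is TypeScript in `src/RTCPeerConnection.ts`, and the names `PendingMuteBuffer`, `bufferEvent`, and `drain` are hypothetical.

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch of the pending-mute buffer (real code lives in
// src/RTCPeerConnection.ts). Native mute events that arrive before the
// JS track exists are parked by trackId and replayed once the track is
// constructed during setRemoteDescription.
class PendingMuteBuffer {
    private final Map<String, Boolean> pending = new HashMap<>();

    /** Native event arrived but no track object exists yet: remember it. */
    void bufferEvent(String trackId, boolean muted) {
        pending.put(trackId, muted); // last observed state wins
    }

    /** Track was just constructed: drain any buffered state for it. */
    Boolean drain(String trackId) {
        return pending.remove(trackId); // null if nothing was buffered
    }
}
```

Draining with `remove()` rather than `get()` ensures an event is replayed at most once per track.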
📝 Walkthrough

This pull request adds audio track adapter functionality across Android and iOS to detect when remote audio tracks first receive data and emit unmute events. It also updates remote track initialization to start in a muted state and adds pending mute state buffering in TypeScript to handle race conditions.
Sequence Diagram

```mermaid
sequenceDiagram
    actor Native as Native Layer
    participant Adapter as Audio/Video Adapter
    participant Track as MediaStreamTrack
    participant App as App/Handler
    Native->>Native: Remote track added
    Native->>Adapter: register adapter/renderer
    Adapter->>Adapter: initialize with muted=true
    Note over Adapter: waiting for first data...
    Native->>Adapter: onData/renderBuffer called
    Adapter->>Adapter: first data received?
    alt First data
        Adapter->>Adapter: set muted=false
        Adapter->>Track: emit mediaStreamTrackMuteChanged<br/>(muted: false)
        Track->>App: mute/unmute event
    else Subsequent data
        Adapter->>Adapter: ignore (one-shot only)
    end
```
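The one-shot "first data received" transition in the diagram can be sketched in Java. The class and method names here are hypothetical, not the module's actual types; it only illustrates the atomic one-shot pattern.

```java
import java.util.concurrent.atomic.AtomicBoolean;

// Hypothetical sketch of the one-shot unmute transition shown above.
class FirstDataUnmuteSink {
    private final AtomicBoolean muted = new AtomicBoolean(true);

    /** Called for every decoded buffer/frame; fires the callback exactly once. */
    void onData(Runnable emitUnmute) {
        // compareAndSet guarantees only the first caller observes the
        // true -> false transition, even under concurrent delivery.
        if (muted.compareAndSet(true, false)) {
            emitUnmute.run();
        }
    }

    boolean isMuted() {
        return muted.get();
    }
}
```

Subsequent data is ignored by construction: once the flag is false, `compareAndSet(true, false)` can never succeed again.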
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~45 minutes
🚥 Pre-merge checks: ✅ 2 passed | ❌ 1 failed
❌ Failed checks (1 warning)
✅ Passed checks (2 passed)
Actionable comments posted: 6
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
android/src/main/java/com/oney/WebRTCModule/VideoTrackAdapter.java (1)
104-121: ⚠️ Potential issue | 🟠 Major

Prevent duplicate mute events via atomic state transitions and add a disposed check.

The `volatile` boolean check/set is not atomic: both `onFrame()` (line 118) and the timer task (line 142) can observe `mutedState == true`, transition it to `false`, and both emit unmute events. Use `AtomicBoolean` with `compareAndSet()` to ensure only one thread wins the state transition. Also, `onFrame()` should check `disposed` before processing (unlike the current code at line 114), matching the pattern in `VideoDimensionDetectorImpl.onFrame()` (line 193).

Suggested direction
```diff
 import java.util.concurrent.atomic.AtomicInteger;
+import java.util.concurrent.atomic.AtomicBoolean;
@@
-    private volatile boolean mutedState = true;
+    private final AtomicBoolean mutedState = new AtomicBoolean(true);
@@
     @Override
     public void onFrame(VideoFrame frame) {
+        if (disposed) {
+            return;
+        }
         // incrementAndGet() == 1 is the atomic "first frame" check — fire
         // unmute immediately instead of waiting up to INITIAL_MUTE_DELAY
         // for the periodic timer.
-        if (frameCounter.incrementAndGet() == 1 && mutedState) {
-            mutedState = false;
+        if (frameCounter.incrementAndGet() == 1 && mutedState.compareAndSet(true, false)) {
             emitMuteEvent(false);
         }
     }
@@
-        if (isMuted != mutedState) {
-            mutedState = isMuted;
+        boolean previousMuted = mutedState.get();
+        if (isMuted != previousMuted && mutedState.compareAndSet(previousMuted, isMuted)) {
             emitMuteEvent(isMuted);
         }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@android/src/main/java/com/oney/WebRTCModule/VideoTrackAdapter.java` around lines 104 - 121, TrackMuteUnmuteImpl uses a volatile boolean mutedState which allows both onFrame() and the timer task to observe true and each emit an unmute; replace mutedState with an AtomicBoolean and perform the transition using compareAndSet(true, false) so only one thread emits emitMuteEvent(false) after frameCounter.incrementAndGet() == 1, and also add an early return in onFrame() that checks the disposed flag (same pattern as VideoDimensionDetectorImpl.onFrame()) before doing any work to avoid processing after disposal.
🧹 Nitpick comments (1)
src/RTCPeerConnection.ts (1)
93-96: Pending mute buffer is not cleared on `close()`.

`_pendingMuteStates` is drained only when a matching remote track is created in `setRemoteDescription`. If native emits `mediaStreamTrackMuteChanged` for a track whose JS counterpart is never constructed (e.g. PC torn down before sRD completes, or trackId never appears in a subsequent `newTransceivers` batch), entries remain in the map for the lifetime of the `RTCPeerConnection` instance. Consider clearing the map in `close()` alongside the existing cleanup, so dangling entries don't keep `trackId` strings alive after the PC is closed.

♻️ Proposed cleanup in close()
```diff
 close(): void {
     log.debug(`${this._pcId} close`);

     if (this.connectionState === 'closed') {
         return;
     }

     WebRTCModule.peerConnectionClose(this._pcId);

     // Mark transceivers as stopped.
     this._transceivers.forEach(({ transceiver }) => {
         transceiver._setStopped();
     });
+
+    this._pendingMuteStates.clear();
 }
```

Also applies to: 180-180, 382-393, 874-879
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/RTCPeerConnection.ts` around lines 93 - 96, The _pendingMuteStates map can retain trackId strings if native mute events arrive for tracks that never get JS tracks; modify the RTCPeerConnection.close() method to clear or reset _pendingMuteStates as part of its existing cleanup (similar to how other internal maps/collections are drained), ensuring any dangling entries are removed so GC can reclaim trackId strings; also verify consistency with setRemoteDescription and any other teardown paths that currently drain the map (e.g., ensure you don't double-use the map after clearing).
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@android/src/main/java/com/oney/WebRTCModule/AudioTrackAdapter.java`:
- Around line 43-64: The sinks HashMap is accessed concurrently by addAdapter
and removeAdapter (called on different threads), causing a race; replace the
unsynchronized Map with a thread-safe structure or guard accesses: change the
sinks field to a ConcurrentHashMap<String, FirstDataUnmuteSink> (or synchronize
all accesses to sinks) and update any related code that references sinks
accordingly (ensure addAdapter, removeAdapter and any other methods use the new
concurrent map or respect the synchronization to avoid
ConcurrentModificationException and data corruption).
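The fix the comment describes can be sketched as a small registry class. `AudioSinkRegistry` and `SinkHandle` are illustrative names, not the module's actual types; the point is that `ConcurrentHashMap` makes cross-thread add/remove safe without external locking.

```java
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;

// Hypothetical sketch of a thread-safe sink registry, as suggested above.
class AudioSinkRegistry {
    // ConcurrentHashMap lets addAdapter/removeAdapter run on different
    // threads without ConcurrentModificationException or lost updates.
    private final ConcurrentMap<String, SinkHandle> sinks = new ConcurrentHashMap<>();

    void addAdapter(String trackId, SinkHandle sink) {
        // putIfAbsent avoids silently replacing (and leaking) a sink
        // already registered for the same track id.
        sinks.putIfAbsent(trackId, sink);
    }

    SinkHandle removeAdapter(String trackId) {
        // remove() returns the previous value so the caller can detach it.
        return sinks.remove(trackId);
    }

    int size() {
        return sinks.size();
    }
}

class SinkHandle {}
```

Compound check-then-act sequences on the map should still use the atomic single-call forms (`putIfAbsent`, `remove`, `compute`) rather than separate `containsKey` + `put`.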
In `@ios/RCTWebRTC/WebRTCModule`+AudioTrackAdapter.h:
- Around line 2-11: The header is missing the RTCAudioTrack type definition used
by the RTCPeerConnection (AudioTrackAdapter) category; add an import for
RTCAudioTrack (e.g. `#import <WebRTC/RTCAudioTrack.h>`) to the top of
WebRTCModule+AudioTrackAdapter.h so the method signatures
-addAudioTrackAdapter:(RTCAudioTrack *)track and
-removeAudioTrackAdapter:(RTCAudioTrack *)track compile correctly.
In `@ios/RCTWebRTC/WebRTCModule`+AudioTrackAdapter.m:
- Around line 85-112: addAudioTrackAdapter: and removeAudioTrackAdapter:
currently assume self.audioTrackAdapters is non-nil which can silently no-op and
leak renderers; ensure the backing NSMutableDictionary is lazily initialized or
asserted non-nil before use (e.g. initialize self.audioTrackAdapters if nil at
start of addAudioTrackAdapter: or implement a getter that lazy-inits the
dictionary), so setObject: and removeObjectForKey: always operate on a real
dictionary; reference audioTrackAdapters, addAudioTrackAdapter:,
removeAudioTrackAdapter:, and FirstBufferUnmuteRenderer (and consider
peerConnectionInit initialization consistency) when making the change.
- Around line 76-83: The audioTrackAdapters NSMutableDictionary is accessed from
different threads (addAudioTrackAdapter: on self.workerQueue and
removeAudioTrackAdapter: called from peerConnectionDispose:) causing a race; fix
by serializing all access to the dictionary—either add an
NSLock/dispatch_semaphore and wrap every read/write in audioTrackAdapters,
setAudioTrackAdapters:, addAudioTrackAdapter:, removeAudioTrackAdapter: (and any
other accessors) with that lock, or ensure all accesses are dispatched to the
same serial queue (e.g., use self.workerQueue for all getter/setter and mutation
calls) so mutations cannot run concurrently and the race is eliminated.
In `@ios/RCTWebRTC/WebRTCModule`+VideoTrackAdapter.m:
- Around line 115-122: renderFrame: currently updates _muted and calls
emitMuteEvent: from the WebRTC render thread without checking _disposed or
synchronizing with the timer/main-queue handler; change it to bail out if
_disposed is set and serialize state transitions (either by making _muted an
atomic flag or dispatching the mute/unmute logic to the same serial queue used
by the timer handler), and only call emitMuteEvent: after confirming !_disposed;
ensure the corresponding timer event handler (dispatch_source_set_event_handler)
and dispose method also use the same synchronization/atomic checks so
reads/writes of _muted and _disposed are race-free and no emits occur after
dispose.
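The serial-queue approach this comment describes for the Objective-C renderer can be modeled in Java with a single-thread executor: every mute/dispose transition runs on one thread, so no emit can race with, or happen after, disposal. All names here are illustrative; the real code would use a GCD serial queue in `WebRTCModule+VideoTrackAdapter.m`.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

// Java analogue of serializing _muted/_disposed access on one queue.
class SerializedMuteState {
    private final ExecutorService queue = Executors.newSingleThreadExecutor();
    private boolean muted = true;    // confined to the executor thread
    private boolean disposed = false;

    void onFrame(Runnable emitUnmute) {
        queue.execute(() -> {
            if (disposed || !muted) {
                return; // no emits after dispose; one-shot transition
            }
            muted = false;
            emitUnmute.run();
        });
    }

    void dispose() {
        queue.execute(() -> disposed = true);
        queue.shutdown(); // queued work still drains, nothing new is accepted
    }

    /** Test helper: wait for all queued transitions to finish. */
    boolean awaitIdle() {
        queue.shutdown();
        try {
            return queue.awaitTermination(5, TimeUnit.SECONDS);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            return false;
        }
    }
}
```

Because both fields are only ever touched on the executor thread, no atomics or locks are needed; the queue itself provides the happens-before ordering.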
In `@macos/RCTWebRTC.xcodeproj/project.pbxproj`:
- Around line 40-41: The macOS Xcode project contains PBXFileReference entries
for WebRTCModule+AudioTrackAdapter.h and WebRTCModule+AudioTrackAdapter.m in the
RCTWebRTC group but those source files only exist under ios/RCTWebRTC/, causing
macOS build failures; either move the two files into the shared apple/RCTWebRTC
directory (so the existing references resolve) or remove the PBXFileReference
entries for "WebRTCModule+AudioTrackAdapter.h" and
"WebRTCModule+AudioTrackAdapter.m" from the macOS project.pbxproj (and any
references at the other mentioned locations) so the macOS target no longer
expects those files.
---
Outside diff comments:
In `@android/src/main/java/com/oney/WebRTCModule/VideoTrackAdapter.java`:
- Around line 104-121: TrackMuteUnmuteImpl uses a volatile boolean mutedState
which allows both onFrame() and the timer task to observe true and each emit an
unmute; replace mutedState with an AtomicBoolean and perform the transition
using compareAndSet(true, false) so only one thread emits emitMuteEvent(false)
after frameCounter.incrementAndGet() == 1, and also add an early return in
onFrame() that checks the disposed flag (same pattern as
VideoDimensionDetectorImpl.onFrame()) before doing any work to avoid processing
after disposal.
---
Nitpick comments:
In `@src/RTCPeerConnection.ts`:
- Around line 93-96: The _pendingMuteStates map can retain trackId strings if
native mute events arrive for tracks that never get JS tracks; modify the
RTCPeerConnection.close() method to clear or reset _pendingMuteStates as part of
its existing cleanup (similar to how other internal maps/collections are
drained), ensuring any dangling entries are removed so GC can reclaim trackId
strings; also verify consistency with setRemoteDescription and any other
teardown paths that currently drain the map (e.g., ensure you don't double-use
the map after clearing).
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: 1fde1d94-9463-485d-8c4c-2686bd76da00
📒 Files selected for processing (11)
- android/src/main/java/com/oney/WebRTCModule/AudioTrackAdapter.java
- android/src/main/java/com/oney/WebRTCModule/PeerConnectionObserver.java
- android/src/main/java/com/oney/WebRTCModule/VideoTrackAdapter.java
- ios/RCTWebRTC.xcodeproj/project.pbxproj
- ios/RCTWebRTC/WebRTCModule+AudioTrackAdapter.h
- ios/RCTWebRTC/WebRTCModule+AudioTrackAdapter.m
- ios/RCTWebRTC/WebRTCModule+RTCPeerConnection.m
- ios/RCTWebRTC/WebRTCModule+VideoTrackAdapter.m
- macos/RCTWebRTC.xcodeproj/project.pbxproj
- src/MediaStreamTrack.ts
- src/RTCPeerConnection.ts
Aligns remote track mute/unmute events with the W3C MediaStreamTrack spec. Remote tracks now start `muted=true` and fire unmute only when media data actually flows.

Demo

0420.mp4