This repository has been archived by the owner on Mar 22, 2022. It is now read-only.

Choppy video playback in production Unity application #101

Open
eanders-ms opened this issue Oct 15, 2019 · 11 comments
Labels
bug Something isn't working

Comments

@eanders-ms
Contributor

When I play a WebRTC video stream in a standalone test scene, playback is very smooth. But in our production app, playback is choppy and regularly halts for a second or more. Our app has a lot going on in it, but runs at a consistently high frame rate. Do you think the WebRTC worker threads could be getting starved? I'd like to try bumping their priority up. Do you have advice on how to best accomplish this? Would it make sense to add a field to PeerConnectionConfiguration to specify a thread priority that could be passed on to Google's native library?

I'm running with a locally built Microsoft.MixedReality.WebRTC.Native.dll, Release/x64 configuration.
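For reference, a minimal sketch of how a worker thread's priority could be raised on Windows (the platform in question). `RaiseCurrentThreadPriority` is a hypothetical helper, not a MixedReality-WebRTC API; the native library would have to call something like it from inside each WebRTC worker thread:

```cpp
#ifdef _WIN32
#include <windows.h>
#endif

// Hypothetical helper, not part of MixedReality-WebRTC: raise the
// calling thread's scheduling priority. Returns true if the platform
// call succeeded.
bool RaiseCurrentThreadPriority() {
#ifdef _WIN32
    // THREAD_PRIORITY_ABOVE_NORMAL is a modest bump; real-time
    // priority classes can starve the rest of the app and are best
    // avoided for media worker threads.
    return SetThreadPriority(GetCurrentThread(),
                             THREAD_PRIORITY_ABOVE_NORMAL) != 0;
#else
    // Other platforms would use pthread_setschedparam or similar;
    // stubbed out here since the report is Windows desktop.
    return false;
#endif
}
```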

@djee-ms
Member

djee-ms commented Oct 16, 2019

Are you using H.264? On what device?
We have some ongoing issues with the H.264 encoder code and other parts of the Media Foundation pipeline; we observed blockiness and freezes too, and we have several teams involved to root cause and fix that. If this is what you are hitting, and it sounds like it is, then I don't think bumping the WebRTC worker thread priority will help much.

@eanders-ms
Contributor Author

Running on Windows 10 desktop. Here's the offer. I think it's using VP8:

v=0
o=- 1455468138398065781 2 IN IP4 127.0.0.1
s=-
t=0 0
a=group:BUNDLE 0 1 2
a=msid-semantic: WMS
m=audio 9 UDP/TLS/RTP/SAVPF 111 103 104 9 102 0 8 106 105 13 110 112 113 126
c=IN IP4 0.0.0.0
a=rtcp:9 IN IP4 0.0.0.0
a=ice-ufrag:aMmX
a=ice-pwd:oCSmb7IJTaa2aEn67NBKWIi6
a=ice-options:trickle
a=fingerprint:sha-256 F9:60:82:FC:93:DA:F0:01:5C:DE:B8:9F:B9:18:68:DA:A3:A3:58:79:2D:02:6C:66:19:2E:9F:DD:F2:2A:20:1A
a=setup:active
a=mid:0
a=extmap:1 urn:ietf:params:rtp-hdrext:ssrc-audio-level
a=extmap:3 urn:ietf:params:rtp-hdrext:sdes:mid
a=recvonly
a=rtcp-mux
a=rtpmap:111 opus/48000/2
a=rtcp-fb:111 transport-cc
a=fmtp:111 minptime=10;useinbandfec=1
a=rtpmap:103 ISAC/16000
a=rtpmap:104 ISAC/32000
a=rtpmap:9 G722/8000
a=rtpmap:102 ILBC/8000
a=rtpmap:0 PCMU/8000
a=rtpmap:8 PCMA/8000
a=rtpmap:106 CN/32000
a=rtpmap:105 CN/16000
a=rtpmap:13 CN/8000
a=rtpmap:110 telephone-event/48000
a=rtpmap:112 telephone-event/32000
a=rtpmap:113 telephone-event/16000
a=rtpmap:126 telephone-event/8000
m=video 9 UDP/TLS/RTP/SAVPF 96 97 98 99 127 124 125
c=IN IP4 0.0.0.0
a=rtcp:9 IN IP4 0.0.0.0
a=ice-ufrag:aMmX
a=ice-pwd:oCSmb7IJTaa2aEn67NBKWIi6
a=ice-options:trickle
a=fingerprint:sha-256 F9:60:82:FC:93:DA:F0:01:5C:DE:B8:9F:B9:18:68:DA:A3:A3:58:79:2D:02:6C:66:19:2E:9F:DD:F2:2A:20:1A
a=setup:active
a=mid:1
a=extmap:14 urn:ietf:params:rtp-hdrext:toffset
a=extmap:13 http://www.webrtc.org/experiments/rtp-hdrext/abs-send-time
a=extmap:12 urn:3gpp:video-orientation
a=extmap:2 http://www.ietf.org/id/draft-holmer-rmcat-transport-wide-cc-extensions-01
a=extmap:11 http://www.webrtc.org/experiments/rtp-hdrext/playout-delay
a=extmap:6 http://www.webrtc.org/experiments/rtp-hdrext/video-content-type
a=extmap:7 http://www.webrtc.org/experiments/rtp-hdrext/video-timing
a=extmap:8 http://tools.ietf.org/html/draft-ietf-avtext-framemarking-07
a=extmap:3 urn:ietf:params:rtp-hdrext:sdes:mid
a=recvonly
a=rtcp-mux
a=rtcp-rsize
a=rtpmap:96 VP8/90000
a=rtcp-fb:96 goog-remb
a=rtcp-fb:96 transport-cc
a=rtcp-fb:96 ccm fir
a=rtcp-fb:96 nack
a=rtcp-fb:96 nack pli
a=rtpmap:97 rtx/90000
a=fmtp:97 apt=96
a=rtpmap:98 VP9/90000
a=rtcp-fb:98 goog-remb
a=rtcp-fb:98 transport-cc
a=rtcp-fb:98 ccm fir
a=rtcp-fb:98 nack
a=rtcp-fb:98 nack pli
a=fmtp:98 x-google-profile-id=0
a=rtpmap:99 rtx/90000
a=fmtp:99 apt=98
a=rtpmap:127 red/90000
a=rtpmap:124 rtx/90000
a=fmtp:124 apt=127
a=rtpmap:125 ulpfec/90000
m=application 9 DTLS/SCTP 5000
c=IN IP4 0.0.0.0
b=AS:30
a=ice-ufrag:aMmX
a=ice-pwd:oCSmb7IJTaa2aEn67NBKWIi6
a=ice-options:trickle
a=fingerprint:sha-256 F9:60:82:FC:93:DA:F0:01:5C:DE:B8:9F:B9:18:68:DA:A3:A3:58:79:2D:02:6C:66:19:2E:9F:DD:F2:2A:20:1A
a=setup:active
a=mid:2
a=sctpmap:5000 webrtc-datachannel 1024
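The preferred video codec can be read straight out of an offer like the one above: the first payload type on the `m=video` line, resolved through its `a=rtpmap:` entry. A quick illustrative parser (not part of the library):

```cpp
#include <map>
#include <sstream>
#include <string>

// Illustrative helper: given an SDP blob, return the codec name mapped
// to the first payload type on the m=video line, i.e. the sender's
// preferred video codec ("VP8" for the offer above).
std::string PreferredVideoCodec(const std::string& sdp) {
    std::istringstream in(sdp);
    std::string line, firstPt;
    std::map<std::string, std::string> rtpmap;  // payload type -> codec
    bool inVideo = false;
    while (std::getline(in, line)) {
        if (!line.empty() && line.back() == '\r') line.pop_back();
        if (line.rfind("m=", 0) == 0) {
            inVideo = line.rfind("m=video", 0) == 0;
            if (inVideo) {
                // "m=video 9 UDP/TLS/RTP/SAVPF 96 97 ..." -> the first
                // payload type is the fourth whitespace-separated token.
                std::istringstream ms(line);
                std::string tok;
                for (int i = 0; i < 4 && ms >> tok; ++i) {}
                firstPt = tok;
            }
        } else if (inVideo && line.rfind("a=rtpmap:", 0) == 0) {
            // "a=rtpmap:96 VP8/90000" -> map "96" to "VP8".
            std::string rest = line.substr(9);
            auto space = rest.find(' ');
            auto slash = rest.find('/', space);
            rtpmap[rest.substr(0, space)] =
                rest.substr(space + 1, slash - space - 1);
        }
    }
    auto it = rtpmap.find(firstPt);
    return it == rtpmap.end() ? "" : it->second;
}
```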

@djee-ms
Member

djee-ms commented Oct 16, 2019

Yes it's using VP8.

m=video 9 UDP/TLS/RTP/SAVPF 96 97 98 99 127 124 125
a=rtpmap:96 VP8/90000

No idea then. It does sound like your CPU is overloaded. You can try changing the thread priority and see if that helps. Unfortunately this kind of setting is too advanced and too specific to expose in MixedReality-WebRTC; we are aiming at something easy for developers to use, and fiddling with thread priorities is out of scope. Not to mention it is very subtle to get right, and would likely be platform- and device-dependent, so too much of a maintenance burden here. I would recommend profiling the WebRTC threads, and possibly switching to hardware-accelerated video if CPU load is the main issue. Or work around it with lower resolutions and/or framerates. A less CPU-intensive codec might help too (so not VP8/VP9).

@eanders-ms
Contributor Author

Thanks for the info. Our app has a component similar to the WebRTC plugin: a streaming mp3 player. Its playback is smooth and performant under similar conditions, so I'm hopeful this is solvable at our current CPU load. That said, simply boosting a thread's priority rarely solves this kind of problem, though it might provide insight.

@tjhappy

tjhappy commented Oct 17, 2019

Why does the SDP not have any ssrc data?
a=ssrc:xxx

@djee-ms
Member

djee-ms commented Oct 17, 2019

Not sure. @eanders-ms are your audio and video feeds unsynchronized on purpose? I just checked, and I typically have some a=ssrc lines in my SDP offers when testing locally.

@eirikhollis

I have a=ssrc on the first peer connection I create, but not on the subsequent ones.

@djee-ms djee-ms added the need info More information is needed from the author to answer label Oct 17, 2019
@eanders-ms
Contributor Author

a=ssrc is out of my area of knowledge. What is its significance? I am still learning all the things about WebRTC. Audio and video seem to be in sync insofar as I've been able to test. On past projects I've had to explicitly write code to sync video to audio (for streaming mp4 -- not quite the same thing). Does WebRTC sync the streams at a lower level or will I need to handle it at the application level?

@djee-ms
Member

djee-ms commented Oct 21, 2019

As far as I know (I am not an expert on this either), a=ssrc advertises some tracks as being grouped together for synchronization purposes at the RTP level, generally audio and video tracks from the same source. This synchronizes the tracks in the sense that they share the same timings in their RTP packets, but I think it is up to the receiver to actually perform the synchronized playback.

See RFC 3550:

Synchronization source (SSRC): The source of a stream of RTP
packets, identified by a 32-bit numeric SSRC identifier carried in
the RTP header so as not to be dependent upon the network address.
All packets from a synchronization source form part of the same
timing and sequence number space, so a receiver groups packets by
synchronization source for playback. Examples of synchronization
sources include the sender of a stream of packets derived from a
signal source such as a microphone or a camera, or an RTP mixer
(see below).
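Concretely, the SSRC is just a 32-bit field at a fixed offset in every RTP header; a minimal sketch of extracting it from a raw packet:

```cpp
#include <cstddef>
#include <cstdint>

// Extract the 32-bit SSRC from a raw RTP packet (RFC 3550, section 5.1).
// The fixed RTP header is 12 bytes; the SSRC occupies bytes 8..11 in
// network (big-endian) byte order. Returns 0 for packets too short to
// carry a full header.
uint32_t RtpSsrc(const uint8_t* packet, size_t len) {
    if (len < 12) return 0;  // not a valid RTP packet
    return (uint32_t(packet[8]) << 24) | (uint32_t(packet[9]) << 16) |
           (uint32_t(packet[10]) << 8) | uint32_t(packet[11]);
}
```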

@djee-ms
Member

djee-ms commented Nov 8, 2019

@eanders-ms any update on that issue? Did you manage to try boosting the thread priority and observe any effect on the video? What about testing with other codecs like VP9 if possible, as a comparison point?

@eanders-ms
Contributor Author

eanders-ms commented Nov 8, 2019

I ended up writing a native Unity plugin to render the frames using low-level graphics API calls (DirectX, OpenGL). It hooks into the native webrtc peer connection's video frame callback to queue the frames, then Unity triggers the DLL to render them from its graphics thread. It worked; I see smooth playback now even when there's lots of other CPU and GPU activity happening.

It's only been tested on Windows desktop x86/64 so far. I'm currently working on getting it to build on Android. UWP after that.

Here's my fork with the addition of the native rendering DLL, in case it can help others: https://github.com/AltspaceVR/MixedReality-WebRTC/tree/eanders/native-rendering

It doesn't have a sample scene yet, so no example showing how to hook the native renderer to the peer connection. But it's just a few lines of C#. Anyone interested in trying it out should ping me here.

Note this is very much a work in progress at this point.

Edited to add: The native renderer only supports remote video and YUV 420 frame format at the moment.
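The hand-off described above (the WebRTC callback thread producing frames, Unity's graphics thread consuming them) can be sketched as a small bounded queue that drops stale frames so playback never falls behind. The names here are illustrative, not the fork's actual types:

```cpp
#include <cstdint>
#include <deque>
#include <mutex>
#include <optional>
#include <vector>

// Illustrative types: a decoded YUV 420 frame and a bounded hand-off
// queue between the video frame callback and the render thread.
struct Frame {
    std::vector<uint8_t> yuv;
    int width;
    int height;
};

class FrameQueue {
public:
    explicit FrameQueue(size_t capacity) : capacity_(capacity) {}

    // Called from the WebRTC video frame callback: keep at most
    // `capacity_` frames, dropping the oldest under load.
    void Push(Frame frame) {
        std::lock_guard<std::mutex> lock(mutex_);
        if (frames_.size() == capacity_) frames_.pop_front();
        frames_.push_back(std::move(frame));
    }

    // Called from the graphics thread each render tick: take the
    // newest frame, if any, and discard older ones as stale.
    std::optional<Frame> PopLatest() {
        std::lock_guard<std::mutex> lock(mutex_);
        if (frames_.empty()) return std::nullopt;
        Frame f = std::move(frames_.back());
        frames_.clear();
        return f;
    }

private:
    std::mutex mutex_;
    std::deque<Frame> frames_;
    size_t capacity_;
};
```

Dropping rather than blocking is the key design choice: a render thread should always get the freshest frame immediately, and a callback thread should never stall waiting on the renderer.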

@djee-ms djee-ms added bug Something isn't working and removed need info More information is needed from the author to answer labels Dec 19, 2019