Choppy video playback in production Unity application #101
Comments
Are you using H.264? On what device?
Running on Windows 10 desktop. I checked the SDP offer; I think it's using VP8.
Yes it's using VP8.
No idea then. It does sound like your CPU is overloaded. You can try changing the thread priority and see if that helps. Unfortunately this kind of setting is much too advanced and specific to put in MixedReality-WebRTC; we are aiming at something easy to use for developers, and fiddling with thread priorities is out of scope. Not to mention it is very subtle to get right, and would likely be platform- and device-dependent, so too much of a burden to maintain here. I would recommend profiling the WebRTC threads, and possibly switching to hardware-accelerated video if CPU load is the main issue. Or work around it with lower resolutions and/or framerates. Maybe a less CPU-intensive codec could help too (so not VP8/VP9).
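For what it's worth, the OS side of "changing the thread priority" looks roughly like the sketch below. This is only illustrative: MixedReality-WebRTC does not expose its worker threads, so getting this function to actually run *on* the right WebRTC thread is the hard part, and the function assumes it is called from the thread you want to boost.

```cpp
// Sketch only: the OS calls involved in raising the calling thread's
// priority. Hooking this into the WebRTC worker threads is left open;
// the library does not expose them.
#ifdef _WIN32
#include <windows.h>

bool RaiseCurrentThreadPriority() {
    // ABOVE_NORMAL is a modest bump; TIME_CRITICAL can starve the rest
    // of the app and is almost never what you want for a codec thread.
    return SetThreadPriority(GetCurrentThread(),
                             THREAD_PRIORITY_ABOVE_NORMAL) != 0;
}
#else
#include <pthread.h>

bool RaiseCurrentThreadPriority() {
    // Under the default SCHED_OTHER policy the static priority must be
    // 0; real-time policies (SCHED_RR/SCHED_FIFO) typically require
    // elevated privileges, so this branch is effectively a placeholder.
    sched_param param{};
    param.sched_priority = 0;
    return pthread_setschedparam(pthread_self(), SCHED_OTHER, &param) == 0;
}
#endif
```

In practice, profiling to confirm which thread is actually starved should come before any priority tweak.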
Thanks for the info. Our app has something similar to the webrtc plugin: a streaming mp3 player. Its playback is smooth and performant under similar conditions, so I'm hopeful this is solvable given the current CPU load. That said, simply boosting a thread's priority rarely solves this kind of thing, but it might provide insight.
Why does the SDP not have ssrc data?
Not sure. @eanders-ms are your audio and video feeds not synchronized on purpose? I just checked and I typically have some a=ssrc lines.
I have a=ssrc on the first peer connection I create, but not the subsequent ones.
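For comparison, when SSRC signaling is present, the media section of the offer/answer carries `a=ssrc` attribute lines as defined by RFC 5576 (all values below are illustrative, not taken from this app's SDP):

```
m=video 9 UDP/TLS/RTP/SAVPF 96
a=ssrc:314159 cname:user@example.invalid
a=ssrc:314159 msid:example_stream_id example_track_id
```

The `cname` item is what RTCP uses to associate sources for lip-sync, which is why a missing `a=ssrc` block can correlate with unsynchronized audio and video.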
As far as I know, but I am not an expert on those either. See RFC 3550.
@eanders-ms any update on that issue? Did you manage to try boosting the thread priority and observe any effect on the video? What about testing with other codecs like VP9 if possible, as a comparison point? |
I ended up writing a native Unity plugin to render the frames using low-level graphics API calls (DirectX, OpenGL). It hooks into the native webrtc peer connection's video frame callback to queue the frames, then Unity triggers the DLL to render them from its graphics thread.

It worked; I see smooth playback now even when there's lots of other CPU and GPU activity happening. It's only been tested on Windows desktop x86/64 so far. I'm currently working on getting it to build on Android. UWP after that.

Here's my fork with the addition of the native rendering DLL, in case it can help others: https://github.com/AltspaceVR/MixedReality-WebRTC/tree/eanders/native-rendering It doesn't have a sample scene yet, so no example showing how to hook the native renderer to the peer connection. But it's just a few lines of C#. Anyone interested in trying it out should ping me here. Note this is very much a work in progress at this point.

Edited to add: The native renderer only supports remote video and YUV 420 frame format at the moment.
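The queue-and-drain hand-off described above (WebRTC callback thread produces frames, Unity's render thread consumes them) can be sketched as follows. Names, the frame struct, and the max-depth policy are illustrative, not the fork's actual code:

```cpp
// Sketch of the frame hand-off between the WebRTC video-frame callback
// thread (producer) and Unity's render thread (consumer). In a real
// plugin, Pop() would run inside the UnityRenderingEvent callback
// triggered from C# via GL.IssuePluginEvent.
#include <cstdint>
#include <deque>
#include <mutex>
#include <optional>
#include <vector>

struct I420Frame {                 // minimal YUV 4:2:0 frame holder
    int width = 0, height = 0;
    std::vector<uint8_t> y, u, v;  // separate planes
};

class FrameQueue {
public:
    // Called on the WebRTC frame-callback thread.
    void Push(I420Frame frame) {
        std::lock_guard<std::mutex> lock(mutex_);
        if (queue_.size() >= kMaxDepth)   // drop oldest to bound latency
            queue_.pop_front();
        queue_.push_back(std::move(frame));
    }

    // Called on Unity's render thread; empty queue means "reuse the
    // previous texture contents" rather than blocking the renderer.
    std::optional<I420Frame> Pop() {
        std::lock_guard<std::mutex> lock(mutex_);
        if (queue_.empty()) return std::nullopt;
        I420Frame frame = std::move(queue_.front());
        queue_.pop_front();
        return frame;
    }

private:
    static constexpr std::size_t kMaxDepth = 3;
    std::mutex mutex_;
    std::deque<I420Frame> queue_;
};
```

Bounding the queue depth and dropping the oldest frame keeps end-to-end latency from growing whenever the render thread stalls behind the incoming frame rate.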
When I play a WebRTC video stream in a standalone test scene, playback is very smooth. But in our production app, playback is choppy and regularly halts for a second or more. Our app has a lot going on in it, but runs at a consistently high frame rate. Do you think the WebRTC worker threads could be getting starved? I'd like to try bumping their priority up. Do you have advice on how to best accomplish this? Would it make sense to add a field to PeerConnectionConfiguration to specify a thread priority that could be passed on to Google's native library?
I'm running with a locally built Microsoft.MixedReality.WebRTC.Native.dll, Release/x64 configuration.