How to reduce latency? #21
@dogtownmedia Let me try to answer your question by first asking what you mean by latency. If you mean the amount of time from a frame being captured to the time it gets rendered in the console video playback screen, then we can dissect it into two very separate parts: 1) the producer-side ingestion path, and 2) the console playback path.
Let's discuss 2) first. Console playback IS NOT real-time playback - it's more of an HLS/MPEG-DASH style of playback, based on fragments. So the fragment duration is an inherent latency (the fragment needs to be ingested first). Then, since the console plays back only from persistent storage, each fragment must first be stored and indexed, then de-indexed and retrieved from storage, adapted/trans-packaged to MPEG and sent to the console for playback. On the console side, many browser MSE implementations have an inherent buffer, which also adds to the latency.
If you want to lower the latency of the console playback, then one option is to configure the media pipeline to produce shorter fragment durations. If you intend to have "real-time" playback, then the console playback is not very suitable - we are working on providing easy means for real-time playback on the consumer side. However, if you implemented your own consumer which uses the Parser Library to get the elementary stream and send it to some playback engine, then we should look at the latency of that playback engine. Depending on the playback engine, the decoder/renderer might have its own buffering requirements, which would add to the overall latency. Please help us understand your scenario better.
Hi @MushMal, sorry for the vague question. I'm starting to understand how things work now. Could you tell me how to configure the media pipeline to produce shorter fragment durations? Thanks, Rob.
Hi Rob,

Can you describe what your media pipeline looks like? Is it a GStreamer pipeline? If so, take a look at the sample GStreamer application included in the SDK. There is a key-frame interval value you can modify:

```c
g_object_set(G_OBJECT(data.encoder), "bframes", 0, "key-int-max", 45, "bitrate", 512, NULL);
```

This tells the encoder to generate an IDR frame every 45th frame. The frame rate is configured as 30 fps, which means that in this case it will generate fragments of 1.5 seconds duration.
That's exactly what I was after, thanks.
How would we put less data in each Kinesis fragment but send fragments more frequently, to get lower latency?
I see these stream variables but I'm unsure which ones to adjust.
Thanks