
Backpressure/buffering for hardware decoders? #27

Closed
mfoltzgoogle opened this issue Sep 18, 2019 · 5 comments
Labels
editorial changes to wording, grammar, etc that don't modify the intended behavior

Comments

@mfoltzgoogle
Contributor

The Streams spec assumes you can get some backpressure signal from the implementation of a TransformStream.

But a hardware codec may not give you that much control over what happens after frame data is passed into the codec API. It will typically decode the frame data immediately and render it into a GPU frame buffer.

So the high-level feedback is: the spec (at a future level of maturity) should map the behavior of an abstract codec onto the behavior of a TransformStream, and allow codec implementations that run in immediate mode (without buffering). (Or require implementations to do this buffering internally.)
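To illustrate the mapping being suggested, here is a hedged sketch of how a codec could be wrapped in a TransformStream so that the stream's queuing strategy supplies the backpressure signal. `decodeChunk` and `makeDecoderStream` are illustrative names, not part of any spec; a real hardware codec would sit behind `decodeChunk`.

```javascript
// Sketch (assumption, not spec text): wrap a codec in a TransformStream
// so that Streams backpressure bounds the number of chunks in flight.
// `decodeChunk` stands in for a real codec API.
function makeDecoderStream(decodeChunk, maxInFlight = 4) {
  return new TransformStream(
    {
      async transform(chunk, controller) {
        // An immediate-mode codec decodes as soon as data arrives; the
        // stream's high-water marks are what throttle the producer.
        controller.enqueue(await decodeChunk(chunk));
      },
    },
    new CountQueuingStrategy({ highWaterMark: maxInFlight }),
    new CountQueuingStrategy({ highWaterMark: maxInFlight })
  );
}
```

With this shape, a producer that awaits `writer.write(...)` is automatically slowed down once `maxInFlight` chunks are queued, which is the backpressure behavior the Streams spec assumes.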

@steveanton
Contributor

I think we'd require the implementation to do the buffering internally. I suspect browsers have general frameworks for implementing Streams that can do this (the Streams spec describes it).

@pthatcherg could we get a "spec language" label?

@pthatcherg pthatcherg added the editorial changes to wording, grammar, etc that don't modify the intended behavior label Sep 23, 2019
@sandersdan
Contributor

My current implementation estimates the number of frames buffered internally by the codec (assuming one input chunk = one output chunk), and tries to keep internal buffered frames + output buffered frames below a constant.

It'll need more tuning to be production-ready, but it seems workable.
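The bookkeeping behind this strategy can be sketched as a small counter class. This is only an illustration of the estimate described above (one input chunk = one output chunk, keep internal plus output buffered frames under a constant); all names here are hypothetical, not from the actual implementation.

```javascript
// Hypothetical sketch of the frame-budget estimate described above.
class FrameBudget {
  constructor(maxFrames) {
    this.maxFrames = maxFrames;
    this.inFlight = 0; // chunks sent to the codec with no output yet
    this.buffered = 0; // decoded frames not yet consumed downstream
  }
  // Only feed the codec while the estimated total stays under budget.
  canSend() {
    return this.inFlight + this.buffered < this.maxFrames;
  }
  onSend() { this.inFlight += 1; }
  onOutput() { this.inFlight -= 1; this.buffered += 1; } // codec emitted a frame
  onConsumed() { this.buffered -= 1; } // downstream took a frame
}
```

The "one chunk in, one frame out" assumption is what lets `inFlight` approximate the codec's internal buffering without any direct signal from the hardware.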

@padenot
Collaborator

padenot commented May 5, 2021

I believe this is handled via (e.g.) the decodeQueueSize attribute (and its encode counterpart, of course). @sandersdan, do you feel this addresses the concerns expressed here?

@sandersdan
Contributor

sandersdan commented May 5, 2021

Yes, decodeQueueSize is adequate, and was designed to fit well with the streams backpressure model.

There is more we could do (a few different statistics, an event when space becomes available), but decodeQueueSize seems fine for V1.
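As a rough sketch of how a caller could throttle against decodeQueueSize: feed chunks only while the queue is below a budget, yielding until space frees up. Polling here is a stand-in for a dedicated readiness signal (the kind of event mentioned above); the decoder object and `pumpChunks` are illustrative, and in a browser the decoder would be a real WebCodecs decoder exposing `decodeQueueSize`.

```javascript
// Hedged sketch (not spec text): throttle decode() calls using the
// decodeQueueSize attribute. Polling via setTimeout stands in for a
// real "space available" signal.
async function pumpChunks(decoder, chunks, maxQueue = 4) {
  for (const chunk of chunks) {
    while (decoder.decodeQueueSize >= maxQueue) {
      // Yield to the event loop until the codec drains some work.
      await new Promise((resolve) => setTimeout(resolve, 0));
    }
    decoder.decode(chunk);
  }
}
```

The point of the sketch is that decodeQueueSize alone is enough to implement backpressure, even if an event for queue drain would make the waiting loop cleaner than polling.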

@chcunningham
Collaborator

For new queuing stats/events, I vote to track them separately should requests for those arise.

Otherwise seems like discussion here is concluded. Please re-open if I've overlooked anything.


6 participants