
Streaming video #10

Open
madCoder24 opened this issue Jan 23, 2018 · 5 comments

Comments

@madCoder24

First, great library, and thank you for writing it! Using OpenMAX from C++ is painful, even if it is a necessary evil at times.

Have you tried this with streaming H.264 video data? My use case for this library would be the streaming-video part of a wireless meeting project: I send out the individual NAL units of encoded data until the entire frame has been sent, and then the frame is rendered. Would this project be able to handle that kind of data flow? If so, is there an example?

Thank you.

@jean343
Owner

jean343 commented Jan 23, 2018

Absolutely, this library would be perfect for it.
I have another, similar C++ project where a Pi can be used for remote desktop: https://github.com/jean343/RPI-GPU-rdpClient
I have to agree that while it works, it's quite annoying to write it in C++ and have to recompile it all the time!

For your project, I would start from this example:
https://github.com/jean343/Node-OpenMAX/blob/master/examples/ts/SimpleVideoDecoderRender.ts

All it does is read buffers from a file and pipe them to the decoder and renderer. You could simply make a Node network stream and pipe your NAL buffers into the decoder. To start, pipe an H.264 file through a network stream to test.
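In sketch form, that pipeline looks like the following. `ByteCounter` is only a stand-in for the decoder end of the pipe, not the library's API; any `Readable` (an `fs.createReadStream` on an H.264 file, a `net.Socket`, or the in-memory source below) can feed it:

```typescript
import { Readable, Writable } from "stream";

// Stand-in sink for the library's decoder/renderer stream
// (hypothetical; see SimpleVideoDecoderRender.ts for the real one).
class ByteCounter extends Writable {
  bytes = 0;
  _write(chunk: Buffer, _enc: BufferEncoding, cb: (err?: Error | null) => void) {
    this.bytes += chunk.length; // a real decoder would consume NAL data here
    cb();
  }
}

// Any Readable works as the source: fs.createReadStream("video.h264"),
// a net.Socket, or this in-memory stand-in.
const source = Readable.from([Buffer.alloc(4096), Buffer.alloc(4096)]);
const sink = new ByteCounter();
source.pipe(sink);
sink.on("finish", () => console.log(`received ${sink.bytes} bytes`));
```

Swapping the source for a network stream is the only change needed to go from file playback to streaming.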

Let me know how it goes!

@madCoder24
Author

I discovered that project earlier and used it as an example to build what I currently have. But I am now rebuilding the receiving side, because the C++ OpenMAX library is just hard to work with, so discovering your library is a great sight to see.

My project uses UDP and, maybe I am misunderstanding, but how do you pipe from a socket? I need to listen for the 'message' event. Do you mean build up the buffer inside that event handler and then hand it to the VideoDecoder and VideoRenderer?

@jean343
Owner

jean343 commented Jan 26, 2018

Thanks so much for your interest.

In the past, I have used TCP exclusively, as I like to have long key-frame intervals. You could definitely use UDP, but you will need shorter key-frame intervals, and possibly a buffer to keep the packets ordered.
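A reorder buffer of that kind could be sketched like this (my own illustration, not part of the library): each packet carries a sequence number, and payloads are only released once they are contiguous.

```typescript
// Minimal reorder buffer: holds out-of-order packets and releases
// them strictly in sequence-number order.
class ReorderBuffer {
  private next = 0;
  private pending = new Map<number, Buffer>();

  // Returns the payloads that are now deliverable, in order.
  push(seq: number, payload: Buffer): Buffer[] {
    this.pending.set(seq, payload);
    const out: Buffer[] = [];
    while (this.pending.has(this.next)) {
      out.push(this.pending.get(this.next)!);
      this.pending.delete(this.next);
      this.next++;
    }
    return out;
  }
}

const buf = new ReorderBuffer();
console.log(buf.push(1, Buffer.from("b")).length); // 0: still waiting for seq 0
console.log(buf.push(0, Buffer.from("a")).length); // 2: releases seq 0 and 1
```

A real implementation would also drop packets that arrive after their frame has already been decoded, rather than waiting forever.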

Yes, listen for the 'message' event, build the buffer, and push it to your stream. My library will then take care of sending it to the decoder.

You can easily build a UDP pipe with a readable stream and push messages into it, or you could try:
https://github.com/wankdanker/node-datagram-stream
https://www.npmjs.com/package/udp-stream
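For reference, a hand-rolled version of such a UDP pipe, using only Node's `dgram` and `stream` modules, might look like this (`toReadable` is an illustrative name, not part of any of the libraries above):

```typescript
import * as dgram from "dgram";
import { Readable } from "stream";

// Wrap a UDP socket's 'message' events in a Readable stream.
function toReadable(socket: dgram.Socket): Readable {
  const stream = new Readable({ read() {} }); // push-driven; read() is a no-op
  socket.on("message", (msg: Buffer) => stream.push(msg));
  socket.on("close", () => stream.push(null)); // signal end of stream
  return stream;
}

// Usage sketch: toReadable(socket).pipe(decoder), where `decoder`
// is the library's VideoDecoder stream and `socket` is bound to your port.
```

This is essentially what the two packages linked above do, minus their handling of message framing.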

@madCoder24
Author

Sure, that makes sense. Currently, I am sending an object that contains an individual NAL unit and a flag that tells me whether the end of the current frame has been reached. If it has, the buffer containing the NAL units is sent to the decoder. That has been working for me so far.
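In sketch form, that flow looks roughly like this (`FrameAssembler` is just an illustrative name for what my receiving side does):

```typescript
// Accumulate NAL units until the end-of-frame flag arrives,
// then emit one Buffer containing the whole frame.
class FrameAssembler {
  private nals: Buffer[] = [];

  // Returns the complete frame when `endOfFrame` is set, else null.
  push(nal: Buffer, endOfFrame: boolean): Buffer | null {
    this.nals.push(nal);
    if (!endOfFrame) return null;
    const frame = Buffer.concat(this.nals);
    this.nals = [];
    return frame;
  }
}

const asm = new FrameAssembler();
asm.push(Buffer.from([0, 0, 0, 1, 0x67]), false); // SPS, frame not done yet
const frame = asm.push(Buffer.from([0, 0, 0, 1, 0x65]), true); // slice ends the frame
// `frame` now holds both NAL units and can be written to the decoder.
```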

Thanks for the advice and package recommendations. I will let you know how it goes.

@jean343
Owner

jean343 commented Jan 26, 2018

The RPI library is smart enough to assemble NAL units; you can pass a buffer containing an incomplete NAL unit and it will be OK.
Of course, that is prone to error when packets are re-ordered, but over TCP it works perfectly.
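As background on why that works: in an H.264 Annex-B byte stream, NAL units are delimited by start codes (00 00 01 or 00 00 00 01), so unit boundaries can be recovered from arbitrarily chunked buffers. A rough sketch of such a splitter (my own illustration, not the library's code):

```typescript
// Split an Annex-B buffer into NAL unit payloads by scanning for
// the 00 00 01 start code (a preceding extra 00 makes it 4 bytes).
function splitNals(data: Buffer): Buffer[] {
  const starts: number[] = [];
  for (let i = 0; i + 3 <= data.length; i++) {
    if (data[i] === 0 && data[i + 1] === 0 && data[i + 2] === 1) {
      starts.push(i + 3); // payload begins after the start code
      i += 2;
    }
  }
  const nals: Buffer[] = [];
  for (let j = 0; j < starts.length; j++) {
    // A NAL ends where the next start code begins, or at end of buffer.
    let end = j + 1 < starts.length ? starts[j + 1] - 3 : data.length;
    if (j + 1 < starts.length && data[end - 1] === 0) end--; // 4-byte start code
    nals.push(data.subarray(starts[j], end));
  }
  return nals;
}
```

A decoder doing this internally only has to buffer bytes until the next start code shows up, which is why incomplete NAL units in a chunk are harmless as long as the byte order is preserved.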
