
WebRTC stream instead of camera feed? #28

Open
wardhanster opened this issue Dec 23, 2017 · 6 comments

@wardhanster

Is there possibly a way to achieve that? I was able to get a demo up with TensorFlow, but the frame rates are really low, and in my research this module popped up. Amazing work, guys!
Because the detection API of the TF demo is exposed over an HTTP POST request, I have to resort to traditional polling for object detection, which works but has serious framerate issues.
I am not sure how to do this over a socket, or whether TF is efficient enough to meet the requirements.

This brings me to my next question: is there a way I can use the WebRTC API in the browser to stream to a local server running node-yolo?

@OrKoN (Contributor) commented Dec 23, 2017

Hi @wardhanster,

the easiest way to implement streaming is to use ffmpeg. What we tried is streaming results to an ffmpeg server, as shown here: https://github.com/moovel/node-yolo/blob/master/test/darknetImageFfmpegTest.js

The ffmpeg server (ffserver) lets users access and watch the stream in the browser. You could probably write your own frontend that connects to the ffserver. See the docs about the service: https://www.ffmpeg.org/ffserver.html
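
To make that concrete, here is a minimal sketch of the ffmpeg side of that pipeline, assuming 640x480 raw bgr24 frames at 25 fps and an ffserver feed at http://localhost:8090/feed1.ffm (all of these values are assumptions; see the linked test for the actual node-yolo wiring):

```js
// Sketch: pipe raw BGR frames into ffmpeg, which pushes them to an
// ffserver feed that browsers can then watch.
const { spawn } = require('child_process');

const ffmpeg = spawn('ffmpeg', [
  '-f', 'rawvideo',     // input is raw, headerless video
  '-pix_fmt', 'bgr24',  // 3 bytes per pixel: blue, green, red
  '-s', '640x480',      // must match the frame size of the source
  '-r', '25',           // input frame rate
  '-i', '-',            // read the frames from stdin
  'http://localhost:8090/feed1.ffm', // ffserver feed URL (assumed)
]);

ffmpeg.stderr.pipe(process.stderr); // ffmpeg logs to stderr

// Call this with each annotated frame (a Buffer of 640 * 480 * 3 bytes),
// e.g. from wherever your code receives detection-annotated frames.
function pushFrame(frameBuffer) {
  ffmpeg.stdin.write(frameBuffer);
}
```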

I am not familiar enough with WebRTC to give any other hints.

@wardhanster (Author)

Oh, okay, I will try that approach. I am not familiar with a lot of the things you mentioned here.
Also, WebRTC is extremely easy to implement on a local machine, something that might interest you in the future.
I will try to implement this with WebRTC and create a PR if things go well.

@tanmoyAtb

How can I use ffserver to process video from a webcam and stream it at the same time?

I tried reading the ffserver docs, but I just can't seem to decode the modified chunks.

@OrKoN (Contributor) commented Dec 27, 2017

@tanmoy12 ffmpeg is capable of transforming raw chunks into a suitable format. See https://github.com/moovel/node-yolo/blob/master/test/darknetTest.js#L17: the input is raw video in bgr24, and ffmpeg transforms it to the output format. Instead of the filename, one can define a URL for an ffmpeg server, which can then broadcast the stream.
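
To illustrate the last point, only the final ffmpeg argument decides whether the encoded result goes to a local file or to a streaming server (the URL and feed name below are assumptions and must match your ffserver setup):

```js
// Shared input options: raw bgr24 frames on stdin (frame size and rate assumed).
const baseArgs = ['-f', 'rawvideo', '-pix_fmt', 'bgr24',
                  '-s', '640x480', '-r', '25', '-i', '-'];

const toFile   = [...baseArgs, 'detected.mp4'];                    // write a local file
const toServer = [...baseArgs, 'http://localhost:8090/feed1.ffm']; // push to ffserver
```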

@tanmoyAtb

I tried defining a URL like

`localhost:8080`

in place of

`detected.mp4`

but I get an invalid error.

Do I have to define another child process for ffserver?

@OrKoN (Contributor) commented Dec 27, 2017

@tanmoy12 ffserver is a separate process/tool. Also, you need to configure it separately.
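
For example, a minimal ffserver.conf could look like this (a sketch only: the port, feed name, and stream settings are assumptions, not the project's configuration). Note that ffmpeg then needs the full feed URL, e.g. http://localhost:8090/feed1.ffm, rather than a bare localhost:8080:

```
# Minimal ffserver.conf sketch (all values are assumptions)
HTTPPort 8090
HTTPBindAddress 0.0.0.0
MaxClients 100
MaxBandwidth 10000

# Feed that ffmpeg writes into
<Feed feed1.ffm>
  File /tmp/feed1.ffm
  FileMaxSize 200M
</Feed>

# Stream that browsers watch, e.g. http://localhost:8090/live.mjpg
<Stream live.mjpg>
  Feed feed1.ffm
  Format mpjpeg
  VideoFrameRate 25
  VideoSize 640x480
  NoAudio
</Stream>
```

Start it with `ffserver -f ffserver.conf` (either manually or spawned as its own child process from Node), then point ffmpeg's output at the feed URL.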
