This issue was moved to a discussion.

You can continue the conversation there. Go to discussion →


Hey, had a few questions around publishing a stream #20

Closed
ajay-collab opened this issue Jul 1, 2020 · 3 comments
Labels
question Further information is requested

Comments

@ajay-collab

You use ffmpeg as your publisher; I've been looking for a way to publish images as my stream. Basically I want a process where I stream video from a file or network, capture the frames, do some processing on them, and publish them to the server. Could you please let me know how to achieve this?

@aler9 aler9 added the question Further information is requested label Jul 3, 2020
@aler9
Member

aler9 commented Jul 3, 2020

Hi, anything published to the server must be encoded in a streaming format; video and audio formats (H264, Ogg, MP3, AAC) are streaming formats, while image formats are not, so if you want to publish images you have to convert them to video.
In my opinion there are two ways to achieve this:

  • output video directly from your script, which is easier. There are a lot of libraries that can build videos from image buffers and send them to an RTSP server; the most common is OpenCV, which has bindings for C++ and Python
  • convert images to video with an external program, like ffmpeg. This is more difficult because you have to produce images at a constant rate. A simple Google search is enough to find how to perform this task (https://stackoverflow.com/questions/24961127/how-to-create-a-video-from-images-with-ffmpeg)

Hope this helps
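To make the first option more concrete, here is a hypothetical sketch (not from the thread itself) of piping raw OpenCV frames into an ffmpeg subprocess that publishes over RTSP. The helper name, stream path, and encoder settings are all assumptions:

```python
# Hypothetical sketch: publish processed frames by piping raw BGR frames
# into an ffmpeg subprocess that encodes and pushes them over RTSP.
import subprocess

def build_ffmpeg_cmd(width, height, fps, url):
    """Build an ffmpeg argv that reads raw BGR24 frames from stdin,
    encodes them with H264, and publishes them to an RTSP URL."""
    return [
        "ffmpeg",
        "-f", "rawvideo",        # input: raw frames on stdin
        "-pix_fmt", "bgr24",     # OpenCV's default channel order
        "-s", f"{width}x{height}",
        "-r", str(fps),          # input frame rate
        "-i", "-",               # read from stdin
        "-c:v", "libx264",
        "-preset", "ultrafast",  # favor low latency over compression
        "-f", "rtsp",
        url,
    ]

# Usage sketch (requires ffmpeg and a running RTSP server):
# proc = subprocess.Popen(
#     build_ffmpeg_cmd(640, 480, 25, "rtsp://localhost:8554/mystream"),
#     stdin=subprocess.PIPE)
# for frame in frames:               # frame: numpy array from cv2
#     proc.stdin.write(frame.tobytes())
```

As the comment above notes, the hard part is feeding frames at a constant rate; ffmpeg expects them to arrive at the declared fps.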

@ajay-collab
Author

ajay-collab commented Jul 3, 2020

Hmm, the problem is that I would have OpenCV read frames from an RTSP URI, and then I'm not sure how to encode them so that I can push them to this server. Right now my only option is to push images, not video. Any ideas?

@aler9
Member

aler9 commented Jul 18, 2020

Hello, there's now a new option to push frames into rtsp-simple-server, which consists of using GStreamer in this way:
opencv --> rtspclientsink element of gstreamer --> rtsp-simple-server

You can start from an OpenCV-to-GStreamer example, like this one:
https://stackoverflow.com/questions/37339184/how-to-write-opencv-mat-to-gstreamer-pipeline

Then replace

mpegtsmux ! udpsink host=localhost port=9999

with

rtspclientsink location=rtsp://localhost:8554/mystream

and after a few tries it should work.
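As a rough sketch of the substitution above, the GStreamer launch string could be assembled like this and handed to cv2.VideoWriter with the CAP_GSTREAMER backend. The element chain and stream URL here are assumptions for illustration, not a tested pipeline:

```python
# Hypothetical sketch: a GStreamer pipeline string ending in rtspclientsink,
# suitable for OpenCV's VideoWriter (when OpenCV is built with GStreamer).
def build_gst_pipeline(url):
    """Pipeline that takes frames from appsrc, H264-encodes them,
    and pushes them to an RTSP server with rtspclientsink."""
    return (
        "appsrc ! videoconvert ! x264enc tune=zerolatency ! "
        f"rtspclientsink location={url}"
    )

# Usage sketch (requires OpenCV built with GStreamer support and a
# running rtsp-simple-server):
# import cv2
# writer = cv2.VideoWriter(
#     build_gst_pipeline("rtsp://localhost:8554/mystream"),
#     cv2.CAP_GSTREAMER, 0, 25.0, (640, 480))
# writer.write(frame)                # frame: numpy array from cv2
```

Compared with the udpsink example linked above, only the tail of the pipeline changes: mpegtsmux/udpsink is replaced by rtspclientsink pointing at the server.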

@aler9 aler9 closed this as completed Jul 18, 2020
@aler9 aler9 reopened this Jul 18, 2020
@aler9 aler9 closed this as completed Mar 30, 2021
@bluenviron bluenviron locked and limited conversation to collaborators Mar 30, 2021

