Present mjpeg stream instead of h264 #56
The starting point is GetProfiles and GetProfile. They return a Video Profile, which ties together three other data structures. So you will need to adapt the VideoEncoderConfiguration to describe some MJPEG capabilities, and then you will need to sort out the RTSP server so it streams MJPEG video.
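For orientation, in ONVIF terms an MJPEG stream is advertised with the encoding value "JPEG" in the VideoEncoderConfiguration. A minimal sketch of what that structure could look like, assuming field names from the ONVIF ver10 schema (the interface and values here are illustrative, not rpos's actual types):

```typescript
// Illustrative subset of the ONVIF VideoEncoderConfiguration structure.
interface VideoEncoderConfiguration {
  token: string;
  Name: string;
  UseCount: number;
  Encoding: 'JPEG' | 'MPEG4' | 'H264'; // ONVIF uses "JPEG" for MJPEG streams
  Resolution: { Width: number; Height: number };
  Quality: number;
  RateControl: {
    FrameRateLimit: number;
    EncodingInterval: number;
    BitrateLimit: number; // kbps
  };
  SessionTimeout: string; // ISO 8601 duration, e.g. 'PT60S'
}

// Hypothetical MJPEG configuration that GetProfile could return.
const mjpegConfig: VideoEncoderConfiguration = {
  token: 'encoder_config_token',
  Name: 'MJPEG encoder',
  UseCount: 1,
  Encoding: 'JPEG',
  Resolution: { Width: 1280, Height: 720 },
  Quality: 5,
  RateControl: { FrameRateLimit: 25, EncodingInterval: 1, BitrateLimit: 8192 },
  SessionTimeout: 'PT60S',
};
```

A client issuing GetProfiles would then see this configuration attached to the returned profile and know to request an MJPEG stream URI.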
It would be really nice if we could implement H264 and MJPEG together. The problem currently is that we use the v4l2 driver for the Raspberry Pi cameras and that driver can only deliver h264 or mjpeg (and not both). But a starting point is probably a flag in rposConfig.json for H264 or MJPEG and then running one codec or the other.
Thanks, I will look into it.
On Wed, Jan 16, 2019 at 1:30 PM Roger Hardiman wrote:

It would be really nice if we could implement H264 and MJPEG together. The problem currently is that we use the v4l2 driver for the Raspberry Pi cameras and that driver can only deliver h264 or mjpeg (and not both).

It would be nice if we could get raw YUV from the camera, feed it into both a H264 encoder and a MJPEG encoder, and stream both of those. It looks like that may be possible building on work from @Schwaneberg (https://github.com/Schwaneberg) to use GStreamer as the video pipeline. I believe there are some GStreamer plugins to access the Pi hardware H264 encoding, so performance with HD could be maintained.

This could also get us a JPEG Snapshot (another ONVIF command), which is currently implemented by an ffmpeg hack.

But a starting point is probably a flag in rposConfig.json for H264 or MJPEG and then running one codec or the other.
Hello! Yes, it is possible to encode the raw stream with hardware-accelerated encoders in GStreamer, but I do not recommend this. I tried it once and the quality is lower compared to the rpicamsrc encoder. The delay is also much higher (from ~180 ms to >500 ms).

So I recommend disabling the v4l2 interface and using the native rpicamsrc instead. Then we can switch between MJPEG, RAW and H264 just by rebuilding the GStreamer pipeline.

I agree that introducing a field for the codec would be a good starting point. Let's call it "Codec" with supported values "raw", "mjpeg" and "h264". OK?
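A sketch of how such a "Codec" field could drive the pipeline rebuild, assuming a `Codec` value read from rposConfig.json and rpicamsrc-based GStreamer launch strings (the caps and payloader elements shown are illustrative, not rpos's actual pipeline):

```typescript
// Hypothetical setting as it might appear in rposConfig.json:
//   { "Codec": "mjpeg" }
type Codec = 'raw' | 'mjpeg' | 'h264';

// Map the configured codec to an rpicamsrc-based GStreamer launch string.
function buildPipeline(codec: Codec, width = 1280, height = 720, fps = 25): string {
  const src = 'rpicamsrc name=src';
  switch (codec) {
    case 'h264':
      return `${src} ! video/x-h264,width=${width},height=${height},framerate=${fps}/1 ` +
             `! h264parse ! rtph264pay name=pay0 pt=96`;
    case 'mjpeg':
      return `${src} ! image/jpeg,width=${width},height=${height},framerate=${fps}/1 ` +
             `! jpegparse ! rtpjpegpay name=pay0 pt=26`;
    case 'raw':
    default:
      return `${src} ! video/x-raw,format=I420,width=${width},height=${height},framerate=${fps}/1 ` +
             `! rtpvrawpay name=pay0 pt=96`;
  }
}
```

Switching codecs would then amount to tearing down the RTSP media pipeline and rebuilding it with the new launch string, as described above.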
I have a project where I would like to present the video as MJPEG over HTTP. I know the information on how to do this is in the ONVIF standard, but it is a lot to go through.
Perhaps someone here could point out some directions or fill in some additional info?
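For the HTTP side of the question above: MJPEG over HTTP is conventionally served as a `multipart/x-mixed-replace` response, where each part is one JPEG frame. A minimal Node sketch, assuming an arbitrary boundary token and a caller-supplied frame source (this framing convention is a de facto browser standard, not something the ONVIF spec defines; ONVIF only covers handing out the stream URI):

```typescript
import * as http from 'http';

const BOUNDARY = 'mjpegframe'; // arbitrary boundary token

// Wrap one JPEG image as a single part of a multipart/x-mixed-replace stream.
function wrapJpegFrame(jpeg: Buffer): Buffer {
  const header =
    `--${BOUNDARY}\r\n` +
    `Content-Type: image/jpeg\r\n` +
    `Content-Length: ${jpeg.length}\r\n\r\n`;
  return Buffer.concat([Buffer.from(header), jpeg, Buffer.from('\r\n')]);
}

// Minimal HTTP endpoint: push frames from the encoder to each client.
function serveMjpeg(getFrame: () => Buffer, port: number): http.Server {
  const server = http.createServer((req, res) => {
    res.writeHead(200, {
      'Content-Type': `multipart/x-mixed-replace; boundary=${BOUNDARY}`,
      'Cache-Control': 'no-cache',
      Connection: 'close',
    });
    const timer = setInterval(() => res.write(wrapJpegFrame(getFrame())), 40); // ~25 fps
    req.on('close', () => clearInterval(timer));
  });
  return server.listen(port);
}
```

A browser pointed at such an endpoint renders the parts as a live image; the frame source itself would come from whatever MJPEG pipeline the camera side produces.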