GStreamer not working for IMX519 #15

Open
JoeAYeung opened this issue Feb 14, 2022 · 8 comments

@JoeAYeung

I am using an RPi3 and followed the instructions to install the driver using the install script:
https://github.com/ArduCAM/Arducam-Pivariety-V4L2-Driver/releases/tag/install_script

Only libcamera-still -t 0 worked

Opening /dev/video0 with VLC did not give me any image.
But more importantly, GStreamer doesn't work.

I noticed there are newer versions of libcamera than the one referred to in the install_script,
at https://github.com/ArduCAM/Arducam-Pivariety-V4L2-Driver/releases

Should the install_script be updated? Or is my problem unrelated to the libcamera version?

I am using RPi OS with the 5.10.63-v7+ kernel. My /boot/config.txt contains:

[all]
gpu_mem=256
dtoverlay=imx519

Below is the actual GStreamer error output:

pi@rpi3-cam:~ $ GST_DEBUG=libcamerasrc:7 gst-launch-1.0 libcamerasrc ! xvimagesink
Setting pipeline to PAUSED ...
0:00:00.202239381  2038  0x12cf220 DEBUG           libcamerasrc gstlibcamerasrc.cpp:205:gst_libcamera_src_open:<libcamerasrc0> Opening camera device ...
[0:04:43.893669457] [2038]  INFO Camera camera_manager.cpp:293 libcamera v0.0.0
[0:04:43.928294680] [2040]  WARN CameraSensorProperties camera_sensor_properties.cpp:141 No static properties available for 'imx519'
[0:04:43.928669217] [2040]  WARN CameraSensorProperties camera_sensor_properties.cpp:143 Please consider updating the camera sensor properties database
[0:04:44.318342750] [2040] ERROR DelayedControls delayed_controls.cpp:87 Delay request for control id 0x009a090a but control is not exposed by device /dev/video0
0:00:00.633106104  2038  0x12cf220 INFO            libcamerasrc gstlibcamerasrc.cpp:240:gst_libcamera_src_open:<libcamerasrc0> Using camera '/base/soc/i2c0mux/i2c@1/imx519@1a'
Pipeline is live and does not need PREROLL ...
0:00:00.634094088  2038  0x12e8438 DEBUG           libcamerasrc gstlibcamerasrc.cpp:358:gst_libcamera_src_task_enter:<libcamerasrc0> Streaming thread has started
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
[0:04:44.334526334] [2043]  INFO Camera camera.cpp:937 configuring streams: (0) 1280x1080-YUV420
[0:04:44.335359940] [2040]  INFO RPI raspberrypi.cpp:122 Mode: 4656x3496 fmt RG10 Score: 3002.98 (best 3002.98)
[0:04:44.335621559] [2040]  INFO RPI raspberrypi.cpp:122 Mode: 3840x2160 fmt RG10 Score: 2632.22 (best 2632.22)
[0:04:44.335740936] [2040]  INFO RPI raspberrypi.cpp:122 Mode: 2328x1748 fmt RG10 Score: 1983.98 (best 1983.98)
[0:04:44.335849792] [2040]  INFO RPI raspberrypi.cpp:122 Mode: 1920x1080 fmt RG10 Score: 1882.22 (best 1882.22)
[0:04:44.335959950] [2040]  INFO RPI raspberrypi.cpp:122 Mode: 1280x720 fmt RG10 Score: 2442.22 (best 1882.22)
[0:04:44.336082243] [2040]  INFO RPI raspberrypi.cpp:122 Mode: 4656x3496 fmt pRAA Score: 2502.98 (best 1882.22)
[0:04:44.336190839] [2040]  INFO RPI raspberrypi.cpp:122 Mode: 3840x2160 fmt pRAA Score: 2132.22 (best 1882.22)
[0:04:44.336300372] [2040]  INFO RPI raspberrypi.cpp:122 Mode: 2328x1748 fmt pRAA Score: 1483.98 (best 1483.98)
[0:04:44.336407144] [2040]  INFO RPI raspberrypi.cpp:122 Mode: 1920x1080 fmt pRAA Score: 1382.22 (best 1382.22)
[0:04:44.336513344] [2040]  INFO RPI raspberrypi.cpp:122 Mode: 1280x720 fmt pRAA Score: 1942.22 (best 1382.22)
[0:04:44.337058665] [2040]  INFO RPI raspberrypi.cpp:624 Sensor: /base/soc/i2c0mux/i2c@1/imx519@1a - Selected mode: 1920x1080-pRAA
[0:04:44.418481901] [2040]  INFO RPISTREAM rpi_stream.cpp:122 No buffers available for ISP Output0
[0:04:44.418649925] [2040]  INFO RPISTREAM rpi_stream.cpp:122 No buffers available for ISP Output0
[0:04:44.418744926] [2040]  INFO RPISTREAM rpi_stream.cpp:122 No buffers available for ISP Output0
0:00:00.908856926  2038  0x12e8438 TRACE           libcamerasrc gstlibcamerasrc.cpp:299:gst_libcamera_src_task_run:<libcamerasrc0> Requesting buffers
[0:04:44.624975091] [2040]  INFO RPI raspberrypi.cpp:1731 Dropping frame at the request of the IPA (6 left)
[0:04:44.660382149] [2040]  INFO RPI raspberrypi.cpp:1731 Dropping frame at the request of the IPA (5 left)
[0:04:44.695495452] [2040]  INFO RPI raspberrypi.cpp:1731 Dropping frame at the request of the IPA (4 left)
[0:04:44.725437060] [2040]  INFO RPI raspberrypi.cpp:1731 Dropping frame at the request of the IPA (3 left)
[0:04:44.757822143] [2040]  INFO RPI raspberrypi.cpp:1731 Dropping frame at the request of the IPA (2 left)
[0:04:44.792536430] [2040]  INFO RPI raspberrypi.cpp:1731 Dropping frame at the request of the IPA (1 left)
[0:04:44.826894877] [2040]  INFO RPI raspberrypi.cpp:1731 Dropping frame at the request of the IPA (0 left)
0:00:01.178490779  2038 0x73b16400 DEBUG           libcamerasrc gstlibcamerasrc.cpp:156:requestCompleted:<libcamerasrc0> buffers are ready
0:00:01.178975526  2038  0x12e8438 TRACE           libcamerasrc gstlibcamerasrc.cpp:299:gst_libcamera_src_task_run:<libcamerasrc0> Requesting buffers
0:00:01.179304177  2038  0x12e8438 WARN            libcamerasrc gstlibcamerasrc.cpp:323:gst_libcamera_src_task_run:<libcamerasrc0> error: Internal data stream error.
0:00:01.179356209  2038  0x12e8438 WARN            libcamerasrc gstlibcamerasrc.cpp:323:gst_libcamera_src_task_run:<libcamerasrc0> error: streaming stopped, reason not-negotiated (-4)
ERROR: from element /GstPipeline:pipeline0/GstLibcameraSrc:libcamerasrc0: Internal data stream error.
Additional debug info:
../src/gstreamer/gstlibcamerasrc.cpp(323): gst_libcamera_src_task_run (): /GstPipeline:pipeline0/GstLibcameraSrc:libcamerasrc0:
streaming stopped, reason not-negotiated (-4)
0:00:01.179813820  2038  0x12e8438 DEBUG           libcamerasrc gstlibcamerasrc.cpp:490:gst_libcamera_src_task_leave:<libcamerasrc0> Streaming thread is about to stop
Execution ended after 0:00:00.545425249
Setting pipeline to NULL ...
0:00:01.195103900  2038 0x73b16400 DEBUG           libcamerasrc gstlibcamerasrc.cpp:156:requestCompleted:<libcamerasrc0> buffers are ready
0:00:01.195234423  2038 0x73b16400 DEBUG           libcamerasrc gstlibcamerasrc.cpp:164:requestCompleted:<libcamerasrc0> Request was cancelled
0:00:01.259459894  2038  0x12cf220 DEBUG           libcamerasrc gstlibcamerasrc.cpp:508:gst_libcamera_src_close:<libcamerasrc0> Releasing resources
Freeing pipeline ...
@kbingham

The gstreamer element is lacking some functionality at the moment. I hope it will get resolved soon, but the contributors who were working on adding the required support have not completed it, so it hasn't made it into libcamera yet.

Meanwhile, you can avoid that limitation by explicitly stating the required capabilities, or using a videoconvert element.

Try putting in the videoconvert element, which will help fix up the missing caps:

gst-launch-1.0 libcamerasrc ! videoconvert ! xvimagesink

or experiment with setting the required caps directly. In particular, it may specifically require colorimetry and framerate to be set explicitly:

gst-launch-1.0 libcamerasrc ! 'video/x-raw,width=1920,height=1080,colorimetry=bt709,framerate=30/1' ! autovideosink

or encode with something like:

gst-launch-1.0 libcamerasrc ! video/x-raw, colorimetry=bt709,format=NV12,interlace-mode=progressive,width=1280,height=720,framerate=30/1 ! videoconvert ! v4l2h264enc ! filesink location=captured.h264

I'm really keen to see the gstreamer element become more useful, so let me know what you discover.

@JoeAYeung

Thanks @kbingham !!
videoconvert did the trick!!
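
For reference, the full command that now works here is the one suggested above:

gst-launch-1.0 libcamerasrc ! videoconvert ! xvimagesink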

@JoeAYeung

Hi @kbingham ,

Have you seen this error message from the last command?
How much GPU memory should I reserve?

../sys/v4l2/gstv4l2videoenc.c(828): gst_v4l2_video_enc_handle_frame (): /GstPipeline:pipeline0/v4l2h264enc:v4l2h264enc0:
Maybe be due to not enough memory or failing driver

@kbingham

I believe the v4l2 h264 encoder will be providing memory from the CPU, not the GPU. So if you have a large gpu_mem=256, then it is taking memory /away/ from the pipeline.

Try reducing gpu_mem from 256. I've also seen reports that you might get better results if you ensure the vc4-kms-v3d overlay is enabled:

#14 (comment)

 # Enable DRM VC4 V3D driver
dtoverlay=vc4-kms-v3d
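
Putting that together with your existing imx519 overlay, a /boot/config.txt sketch might look something like this (the gpu_mem value here is only a guessed starting point, not a tested number):

# Reduce GPU memory so more is left for the CPU side of the pipeline (tune as needed)
gpu_mem=128
# Enable DRM VC4 V3D driver
dtoverlay=vc4-kms-v3d
dtoverlay=imx519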

@kbingham

Oh, and also: if you remove the videoconvert element you'll get better performance, as videoconvert can cause CPU format conversions. But that is the part that currently requires getting the magic incantation for the gstreamer caps correct:
video/x-raw,colorimetry=bt709,format=NV12,interlace-mode=progressive,width=1280,height=720,framerate=30/1
is just an example, and may need more checking.
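
For example, a sketch of your earlier encode pipeline with the videoconvert dropped (untested here, so the caps may still need tweaking):

gst-launch-1.0 libcamerasrc ! video/x-raw,colorimetry=bt709,format=NV12,interlace-mode=progressive,width=1280,height=720,framerate=30/1 ! v4l2h264enc ! filesink location=captured.h264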

@adrien-n

adrien-n commented Apr 1, 2022

I believe the error above is not the actual error being faced. I haven't kept the link, but there are topics about this on the RPi forums (and I had the problem myself). Basically, IIRC, recent-ish kernels make it possible for gstreamer to try to encode using other H264 profiles, but the hardware cannot actually handle them, or not at the corresponding bitrates.

The following caps helped me:

v4l2h264enc ! 'video/x-h264,level=(string)4'

The whole command is:

gst-launch-1.0 libcamerasrc ! video/x-raw, colorimetry=bt709,format=NV12,interlace-mode=progressive,width=1920,height=1080,framerate=30/1 ! v4l2h264enc ! 'video/x-h264,level=(string)4' ! filesink location=captured.h264

Well, except that my pi0 only goes up to 720p. Not sure I can get 1080p on that hardware (a pi0 2 should arrive tomorrow), or maybe I already did. I'm having contradictory memories. :D
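
For reference, the 720p version of that command would just swap the resolution:

gst-launch-1.0 libcamerasrc ! video/x-raw,colorimetry=bt709,format=NV12,interlace-mode=progressive,width=1280,height=720,framerate=30/1 ! v4l2h264enc ! 'video/x-h264,level=(string)4' ! filesink location=captured.h264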

@kbingham

kbingham commented Apr 1, 2022

Yes, unfortunately on the RPi it is not sufficient to simply place the v4l2h264enc element without the level caps after it. Perhaps that can be fixed by RPi or gstreamer at some point. But if it helps anyone, I've been developing on an IMX519 lately and testing the autofocus features in libcamera.

I use the following pipelines to stream the captured video from my RPi to my desktop

gst-launch-1.0 -vvv libcamerasrc ! video/x-raw,colorimetry=bt709,format=NV12,interlace-mode=progressive,width=1280,height=720,framerate=30/1 ! v4l2h264enc extra-controls=controls,video_bitrate_mode=0,video_bitrate=1000000,repeat_sequence_header=1 ! video/x-h264,profile=high,level=(string)4.2 ! h264parse ! rtph264pay ! udpsink host=monstersaurus port=11264

Note: 'monstersaurus' is the destination to send the video to; it's my PC's hostname.

You should not need to add a videoconvert element, and should avoid doing so, as it can cause software format conversions.

To receive the video on my PC I use

gst-launch-1.0 -v udpsrc port=11264 caps = 'application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96' ! rtph264depay ! decodebin ! videoconvert ! autovideosink

@adrien-n

adrien-n commented Apr 1, 2022

Thanks for the command-line, I never get these completely right. :D

I've edited my post to remove the videoconvert. I actually thought I wasn't using it anymore. I guess I didn't notice it because it doesn't change CPU usage and probably does nothing here. Still, better to avoid it.

Thanks for the autofocus work, that will be awesome!

By the way, I've gotten 1080p working on my rpi0 v1. I had to increase gpu_mem from 32 to 64. My /boot/config.txt now contains the following:

dtoverlay=vc4-kms-v3d,cma-32
max_framebuffers=1
gpu_mem=64
