
Libcamera - proper 60fps support missing for rpi HQ camera #30

Closed
Consti10 opened this issue Nov 1, 2022 · 36 comments

Comments

@Consti10 commented Nov 1, 2022

Our goal is to use both the 1280x720@60 and 640x480@90 fps modes of the default RPi HQ camera with libcamera.

720p60 and 640x480@90 both work flawlessly in mmal land (we use gst-rpicamsrc for that, but that is effectively the same as using hello-video or similar).

However, these two modes seem to still be missing in libcamera, or at least not implemented as well as in mmal.

Example 1: running libcamera-hello at 720p60:

sudo libcamera-hello --width 1280 --height 720 --framerate 60
Made DRM preview window
[0:17:28.096152832] [8914] INFO Camera camera_manager.cpp:293 libcamera v0.0.1+21-7c855784
[0:17:28.124998782] [8915] INFO RPI raspberrypi.cpp:1414 Registered camera /base/soc/i2c0mux/i2c@1/imx477@1a to Unicam device /dev/media4 and ISP device /dev/media0
[0:17:28.125688685] [8914] INFO Camera camera.cpp:1026 configuring streams: (0) 2028x1140-YUV420
[0:17:28.126073275] [8915] INFO RPI raspberrypi.cpp:800 Sensor: /base/soc/i2c0mux/i2c@1/imx477@1a - Selected sensor format: 2028x1080-SBGGR12_1X12 - Selected unicam format: 2028x1080-pBCC
This shows that the sensor format actually selected is 2028x1080, and there is no way for the RPi ISP to handle that much data at 60fps. Alternatively, the data is cropped before going through the ISP, but that would be a much worse solution than what mmal had.
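The back-of-the-envelope arithmetic behind that claim can be sketched in Python (the formula is just width × height × fps; the mode resolutions come from this thread):

```python
# Rough data-rate arithmetic: raw pixels per second the Unicam/ISP path
# would have to move for a given sensor format and framerate.
def mpix_per_sec(width, height, fps):
    return width * height * fps / 1e6

# The 2028x1080 mode libcamera selected vs the faster 1332x990 mode, at 60fps.
print(round(mpix_per_sec(2028, 1080, 60), 1))  # 131.4 MPix/s
print(round(mpix_per_sec(1332, 990, 60), 1))   # 79.1 MPix/s
```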

Same for 640x480@90:
sudo libcamera-hello --width 640 --height 480 --framerate 90
Made DRM preview window
[0:20:57.802223521] [8949] INFO Camera camera_manager.cpp:293 libcamera v0.0.1+21-7c855784
[0:20:57.830808341] [8950] INFO RPI raspberrypi.cpp:1414 Registered camera /base/soc/i2c0mux/i2c@1/imx477@1a to Unicam device /dev/media4 and ISP device /dev/media0
[0:20:57.831491227] [8949] INFO Camera camera.cpp:1026 configuring streams: (0) 2026x1520-YUV420
[0:20:57.831845004] [8950] INFO RPI raspberrypi.cpp:800 Sensor: /base/soc/i2c0mux/i2c@1/imx477@1a - Selected sensor format: 2028x1520-SBGGR12_1X12 - Selected unicam format: 2028x1520-pBCC

I can also observe that when running a gstreamer pipeline with either 720p or 640x480, the output framerate never exceeds 30fps.
Example pipeline:

libcamerasrc camera-name=/base/soc/i2c0mux/i2c@1/imx477@1a ! capsfilter caps=video/x-raw,width=1280,height=720,format=NV12,framerate=60/1 ! v4l2convert ! v4l2h264enc extra-controls="controls,repeat_sequence_header=1,h264_profile=1,h264_level=11,video_bitrate=5000000,h264_i_frame_period=30,h264_minimum_qp_value=10" ! video/x-h264,level=(string)4 ! queue ! h264parse config-interval=-1 ! rtph264pay mtu=1024 ! udpsink host=127.0.0.1 port=5600

and the log also shows me
INFO RPI raspberrypi.cpp:800 Sensor: /base/soc/i2c0mux/i2c@1/imx477@1a - Selected sensor format: 2028x1080-SBGGR12_1X12 - Selected unicam format: 2028x1080-pBCC

Am I correct that 720p60 and 640x480@90 have never been properly implemented in libcamera on the RPi, and are therefore still only available from mmal land?

@kbingham (Collaborator) commented Nov 1, 2022

Looking at that - I'm afraid the issue is likely that framerate support in gstreamer still isn't merged. There are patches available that need testing, review and integration.

If you can rebuild libcamera on your platform, try adding these two patches and retest.

https://patchwork.libcamera.org/project/libcamera/list/?series=3487

@Consti10 (Author) commented Nov 1, 2022

Thanks for the info, that is good to know. It doesn't explain why libcamera-hello selects an inappropriate sensor format for these framerates/resolutions though, right?

@Consti10 (Author) commented Nov 1, 2022

It should at least select 1332 × 990 for 720p and 640x480, right ?

https://www.raspberrypi.com/documentation/accessories/camera.html

(And ideally, 1332 × 990 would be binned instead of cropped. Not sure if anyone has bothered to bring those modes out of the Broadcom blob into the Linux sensor driver(s), though.)

@kbingham (Collaborator) commented Nov 1, 2022

I'll leave the libcamera-hello parts to RPi - I only know the gstreamer side, I'm afraid. I pinged the developers working on framerate support in gstreamer earlier today, so hopefully it will progress. I've wanted it in for a while too.

@kbingham (Collaborator) commented Nov 1, 2022

(Which actually, if you could test, and report back - would help progress that)

@Consti10 (Author) commented Nov 1, 2022

I did look into the libcamera rpi code.

V4L2SubdeviceFormat findBestFormat(const SensorFormats &formatsMap, const Size &req, unsigned int bitDepth)

Looks like the framerate is not even accounted for when selecting the "best" sensor format? It only takes resolution and bit depth into account.
But even just using the resolution, I'd think the algorithm should already return 1332 × 990 for 720p and 640x480.
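To illustrate what a framerate-aware variant of that selection could look like, here is a hypothetical Python sketch. The mode list is the IMX477's (as reported later in this thread by libcamera-hello --list-cameras); the scoring rule and names are invented for illustration, and deliberately ignore bit depth, which the real findBestFormat() does weigh:

```python
# Hypothetical sketch of framerate-aware sensor mode selection.
# Not libcamera's actual findBestFormat(); names and weights are invented.

# (width, height, bit_depth, max_fps) for the IMX477
MODES = [
    (1332, 990, 10, 120.05),
    (2028, 1080, 12, 50.03),
    (2028, 1520, 12, 40.01),
    (4056, 3040, 12, 10.0),
]

def score(mode, req_w, req_h, req_fps):
    w, h, depth, max_fps = mode
    # Reject modes that are too small or too slow for the request.
    if w < req_w or h < req_h or max_fps < req_fps:
        return float("inf")
    # Otherwise prefer the mode closest to the requested resolution.
    return (w - req_w) + (h - req_h)

def find_best_mode(req_w, req_h, req_fps):
    return min(MODES, key=lambda m: score(m, req_w, req_h, req_fps))

print(find_best_mode(1280, 720, 60))  # -> the 1332x990 mode, the only one fast enough
print(find_best_mode(640, 480, 90))   # -> also 1332x990
```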

@Consti10 (Author) commented Nov 1, 2022

I'll leave the libcamera-hello parts to RPi - I only know the gstreamer side I'm afraid. I have pinged the developers working on framerate support in gstreamer earlier today, so hopefully it would progress. I've wanted it in for a while too.

Do you know where the "format selection" code for gst libcamerasrc can be found? Or is it just the same as here in libcamera? In that case, I don't think the patches would change anything on an RPi with the HQ camera.

Since the HQ camera's 2028 × 1520 mode maxes out at 40fps (at least according to the doc here: https://www.raspberrypi.com/documentation/accessories/camera.html ).

@6by9 (Collaborator) commented Nov 1, 2022

Do not use sudo for libcamera commands - it really shouldn't be needed and is a very bad habit to get into.

Yes, the mode selection algorithm is always going to have some conditions under which it is sub-optimal. I thought there had been discussions about including framerate and that it had been rejected, but I'm not directly involved.

libcamera-vid --width 1280 --height 720 --framerate 60 --mode 1332:990:10:P --save-pts foo.txt
will save the timestamps of the captured frames in foo.txt. I've just done that and got deltas of 16.66ms, or 60fps.
libcamera-vid --width 1280 --height 720 --framerate 50 --mode 2028:1080:12:P --save-pts foo.txt
has given me deltas of 19.99ms, so almost spot on 50fps.

libcamera-hello --list-cameras will tell you the limits for each mode as advertised by the kernel driver.
1332x990 - max 120.05fps
2028x1080 - max 50.03fps
2028x1520 - max 40.01fps
4056x3040 - max 10fps
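The --save-pts output is easy to sanity-check with a few lines of Python. This assumes one presentation timestamp in milliseconds per line of foo.txt (synthetic data stands in for a real capture here):

```python
# Compute frame deltas and average fps from a list of presentation
# timestamps in milliseconds (the values libcamera-vid --save-pts writes).
def fps_from_pts(timestamps_ms):
    deltas = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    avg_delta = sum(deltas) / len(deltas)
    return 1000.0 / avg_delta, deltas

# A perfect 60fps capture has ~16.66ms between frames.
pts = [i * 16.666 for i in range(61)]
fps, deltas = fps_from_pts(pts)
print(round(fps, 1))  # 60.0
```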

@Consti10 (Author) commented Nov 1, 2022

Do not use sudo for libcamera commands - it really shouldn't be needed and is a very bad habit to get into.

Yes the mode selection algorithm is always going to have some conditions that are sub-optimal. I thought there had been discussions over including framerate and it had been rejected, but I'm not directly involved.

libcamera-vid --width 1280 --height 720 --framerate 60 --mode 1332:990:10:P --save-pts foo.txt
will save the timestamps of the captured frames in foo.txt. I've just done that and got deltas of 16.66ms, or 60fps.
libcamera-vid --width 1280 --height 720 --framerate 50 --mode 2028:1080:12:P --save-pts foo.txt
has given me deltas of 19.99ms, so almost spot on 50fps.

libcamera-hello --list-cameras will tell you the limits for each mode as advertised by the kernel driver.
1332x990 - max 120.05fps
2028x1080 - max 50.03fps
2028x1520 - max 40.01fps
4056x3040 - max 10fps

Thanks for these helpful tips; while I cannot get rid of the sudo yet, both tools you posted really help.

I can confirm that manually specifying the sensor resolution "works", i.e. I get 16.6ms frame deltas.
I can also confirm that not manually specifying the sensor resolution (but setting 720p60) gives frame deltas of 19.99ms.

I understand that the mode selection is not easy, but silently ignoring the fps doesn't sound like a good idea to me ;)

@Consti10 (Author) commented Nov 1, 2022

This is probably a gst-rpicamsrc issue, but nonetheless important here: for some reason, with gst-rpicamsrc I am getting about 15fps (according to my gstreamer fpsdisplaysink running on another x86 PC).

Tx pipeline:
sudo gst-launch-1.0 libcamerasrc camera-name=/base/soc/i2c0mux/i2c@1/imx477@1a ! capsfilter caps=video/x-raw,width=1280,height=720,format=NV12,framerate=60/1 ! v4l2convert ! v4l2h264enc extra-controls="controls,repeat_sequence_header=1,h264_profile=1,h264_level=11,video_bitrate=5000000,h264_i_frame_period=30,h264_minimum_qp_value=10" ! "video/x-h264,level=(string)4" ! queue ! h264parse config-interval=-1 ! rtph264pay mtu=1024 ! udpsink host=10.42.0.1 port=5600

Rx pipeline:
gst-launch-1.0 -v udpsrc port=5600 caps = "application/x-rtp, media=(string)video, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! decodebin ! videoconvert ! fpsdisplaysink

~15fps (my decoder is well capable of 60fps and also gst doesn't report dropped frames)

Quite far away from either 30fps, 50fps or the actually wanted 60fps ;)

@kbingham (Collaborator) commented Nov 2, 2022

Add "interlace-mode=(string)progressive" to your capsfilter after libcamerasrc, and remove the v4l2convert. You don't want to give gstreamer the opportunity to do any software processing of the image.

Otherwise, if you haven't applied the patches I mentioned above, you simply can't specify the framerate through the gstlibcamerasrc yet (you can put the filter on, but it won't take effect) - so I would guess that the stream is running at 15fps perhaps because you're in a dark room, and the AGC is choosing to extend the exposure time to get a brighter image.
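The arithmetic behind that guess: the exposure time puts a floor on the frame interval, so an AGC that stretches exposure to roughly 66ms caps the stream at about 15fps. A quick check (illustrative numbers only):

```python
# The frame interval is the inverse of the framerate, so a long exposure
# in a dark room directly caps the achievable fps.
def max_fps_for_exposure(exposure_ms):
    return 1000.0 / exposure_ms

print(round(max_fps_for_exposure(66.6), 1))  # ~15.0 fps
print(round(max_fps_for_exposure(33.3), 1))  # ~30.0 fps
```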

@Consti10 (Author) commented Nov 2, 2022

Add "interlace-mode=(string)progressive" to your capsfilter after libcamerasrc, and remove the v4l2convert. You don't want to give gstreamer the opportunity to do any software processing of the image.

Otherwise, if you haven't applied the patches I mentioned above, you simply can't specify the framerate through the gstlibcamerasrc yet (you can put the filter on, but it won't take effect) - so I would guess that the stream is running at 15fps perhaps because you're in a dark room, and the AGC is choosing to extend the exposure time to get a brighter image.

Thanks for the info. I can confirm I was in a dark room, and when I re-ran the pipeline above in a brighter environment I got the (expected) 30fps. (I was just surprised by the 15fps, since I expected at least 30fps - knowing that setting the fps won't take effect, but that libcamera defaults to 30fps.) Your explanation makes total sense ;)

Yeah, we've been wanting to get rid of the v4l2convert for ages, but it just doesn't work. E.g. the following pipeline
sudo gst-launch-1.0 libcamerasrc -vvv camera-name=/base/soc/i2c0mux/i2c@1/imx477@1a ! capsfilter caps="video/x-raw,width=1280,height=720,format=NV12,framerate=60/1,interlace-mode=(string)progressive" ! v4l2h264enc extra-controls="controls,repeat_sequence_header=1,h264_profile=1,h264_level=11,video_bitrate=5000000,h264_i_frame_period=30,h264_minimum_qp_value=10" ! "video/x-h264,level=(string)4" ! queue ! h264parse config-interval=-1 ! rtph264pay mtu=1024 ! udpsink host=10.42.0.1 port=5600

shows:
Setting pipeline to PAUSED ...
[0:20:45.230789618] [2355] INFO Camera camera_manager.cpp:293 libcamera v0.0.1+21-7c855784
[0:20:45.258419463] [2357] INFO RPI raspberrypi.cpp:1414 Registered camera /base/soc/i2c0mux/i2c@1/imx477@1a to Unicam device /dev/media4 and ISP device /dev/media1
Pipeline is live and does not need PREROLL ...
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstLibcameraSrc:libcamerasrc0.GstLibcameraPad:src: caps = video/x-raw, format=(string)NV12, width=(int)1280, height=(int)720, colorimetry=(string)2:4:5:4
[0:20:45.263414841] [2361] INFO Camera camera.cpp:1026 configuring streams: (0) 1280x720-NV12
[0:20:45.263834800] [2357] INFO RPI raspberrypi.cpp:800 Sensor: /base/soc/i2c0mux/i2c@1/imx477@1a - Selected sensor format: 2028x1080-SBGGR12_1X12 - Selected unicam format: 2028x1080-pBCC
ERROR: from element /GstPipeline:pipeline0/GstLibcameraSrc:libcamerasrc0: Internal data stream error.
Additional debug info:
../src/gstreamer/gstlibcamerasrc.cpp(311): processRequest (): /GstPipeline:pipeline0/GstLibcameraSrc:libcamerasrc0: streaming stopped, reason not-negotiated (-4)
Execution ended after 0:00:00.835618054
Setting pipeline to NULL ...
Freeing pipeline ...

@Consti10 (Author) commented Nov 2, 2022

Since we already build gstreamer ourselves to keep gst-rpicamsrc as an option for our users (switching between the non-legacy and legacy camera stacks), I've asked the responsible dev to apply the patches. I'll let you know if they work.

@Consti10 (Author) commented Nov 2, 2022

-mode 1332:990:10:P

Do you know if it is already possible to set the sensor mode (as shown by 6by9 with libcamera-vid) with gst libcamerasrc ?
Because without that I am afraid we won't be able to do 720p60fps with gstreamer libcamerasrc anyways.

@kbingham (Collaborator) commented Nov 2, 2022

-mode 1332:990:10:P

Do you know if it is already possible to set the sensor mode (as shown by 6by9 with libcamera-vid) with gst libcamerasrc ? Because without that I am afraid we won't be able to do 720p60fps with gstreamer libcamerasrc anyways.

I'm not sure; I suspect that explicitly setting a raw sensor mode is not yet implemented in the gstreamer gstlibcamerasrc and will need explicit investigation. If you are willing to do this, let us know and we'll help.

@kbingham (Collaborator) commented Nov 2, 2022

Yeah we've been wanting to get rid of the v4l2convert for ages, but it just doesn't work.

Note that currently the gstreamer element for libcamera might need explicit specification of the following properties:

  • colorimetry
  • framerate
  • interlace-mode

Try adding the colorimetry in there too.

capsfilter caps="video/x-raw,width=1280,height=720,format=NV12,framerate=60/1,interlace-mode=(string)progressive,colorimetry=bt709"

I believe I have tested the example at:

https://github.com/kbingham/linux-cameras/wiki/GStreamer-use-cases-with-libcamera#rtp-streaming

@Consti10 (Author) commented Nov 2, 2022

I can confirm that setting both the interlace mode and colorimetry allows you to get rid of v4l2convert. Thanks !

@Consti10 (Author) commented Nov 2, 2022

Quick question: is it safe to install libcamera from https://git.libcamera.org/libcamera/libcamera.git and use it on an RPi? Or should we build and install from here?

@kbingham (Collaborator) commented Nov 2, 2022

I see RPi have a couple of patches on top of libcamera at the moment. To maintain existing behaviour, I think you should probably continue to use this tree for the time being.

@Consti10 (Author) commented Nov 2, 2022

To test the linked patches, I've performed the following steps:

  1. clone and install libcamera from raspberrypi/libcamera (main)

  2. use gst-launch-1.0 libcamerasrc camera-name=/base/soc/i2c0mux/i2c@1/imx477@1a ! capsfilter caps="video/x-raw,width=1280,height=720,format=NV12,framerate=30/1,interlace-mode=(string)progressive,colorimetry=bt709" ! v4l2h264enc extra-controls="controls,repeat_sequence_header=1,h264_profile=1,h264_level=11,video_bitrate=5000000,h264_i_frame_period=30,h264_minimum_qp_value=10" ! "video/x-h264,level=(string)4" ! queue ! h264parse config-interval=-1 ! rtph264pay mtu=1024 ! udpsink host=10.42.0.1 port=5600
    measure the fps at receiver: ~15fps
    => expected

  3. apply 2 linked patches

  4. build and install again

  5. Execute above pipeline

Expected result: measure 30fps at receiver
Actual result: measured ~15fps

So I can't validate that the patches work, at least not yet (I will double check).

Fresh & latest Raspbian; removed libcamera-dev and libcamera-apps, installed gstreamer to compile.
Linux raspberrypi 5.15.61-v7l+

@Consti10 (Author) commented Nov 2, 2022

Full log (forgive me the sudo ;) )
openhd@raspberrypi:~/libcamera $ sudo gst-launch-1.0 libcamerasrc -vvv camera-name=/base/soc/i2c0mux/i2c@1/imx477@1a ! capsfilter caps="video/x-raw,width=1280,height=720,format=NV12,framerate=30/1,interlace-mode=(string)progressive,colorimetry=bt709" ! v4l2h264enc extra-controls="controls,repeat_sequence_header=1,h264_profile=1,h264_level=11,video_bitrate=5000000,h264_i_frame_period=30,h264_minimum_qp_value=10" ! "video/x-h264,level=(string)4" ! queue ! h264parse config-interval=-1 ! rtph264pay mtu=1024 ! udpsink host=10.42.0.1 port=5600
Setting pipeline to PAUSED ...
[0:42:00.594449767] [8722] INFO Camera camera_manager.cpp:293 libcamera v0.0.0+3866-0c55e522
[0:42:00.623239339] [8723] INFO RPI raspberrypi.cpp:1374 Registered camera /base/soc/i2c0mux/i2c@1/imx477@1a to Unicam device /dev/media4 and ISP device /dev/media1
Pipeline is live and does not need PREROLL ...
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstLibcameraSrc:libcamerasrc0.GstLibcameraPad:src: caps = video/x-raw, format=(string)NV12, width=(int)1280, height=(int)720
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw, width=(int)1280, height=(int)720, format=(string)NV12, framerate=(fraction)30/1, interlace-mode=(string)progressive, colorimetry=(string)bt709
/GstPipeline:pipeline0/v4l2h264enc:v4l2h264enc0.GstPad:src: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, level=(string)4, profile=(string)baseline, width=(int)1280, height=(int)720, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1, interlace-mode=(string)progressive, colorimetry=(string)bt709
/GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:src: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, level=(string)4, profile=(string)baseline, width=(int)1280, height=(int)720, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1, interlace-mode=(string)progressive, colorimetry=(string)bt709
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:sink: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, level=(string)4, profile=(string)baseline, width=(int)1280, height=(int)720, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1, interlace-mode=(string)progressive, colorimetry=(string)bt709
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:src: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, level=(string)4, profile=(string)baseline, width=(int)1280, height=(int)720, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1, interlace-mode=(string)progressive, colorimetry=(string)bt709
/GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:sink: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, level=(string)4, profile=(string)baseline, width=(int)1280, height=(int)720, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1, interlace-mode=(string)progressive, colorimetry=(string)bt709
Redistribute latency...
/GstPipeline:pipeline0/v4l2h264enc:v4l2h264enc0.GstPad:sink: caps = video/x-raw, width=(int)1280, height=(int)720, format=(string)NV12, framerate=(fraction)30/1, interlace-mode=(string)progressive, colorimetry=(string)bt709
[0:42:00.680455141] [8727] INFO Camera camera.cpp:1035 configuring streams: (0) 1280x720-NV12
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw, format=(string)NV12, width=(int)1280, height=(int)720
[0:42:00.680932168] [8723] INFO RPI raspberrypi.cpp:761 Sensor: /base/soc/i2c0mux/i2c@1/imx477@1a - Selected sensor format: 2028x1080-SBGGR12_1X12 - Selected unicam format: 2028x1080-pBCC
/GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:src: caps = video/x-h264, stream-format=(string)avc, alignment=(string)au, level=(string)4, profile=(string)baseline, width=(int)1280, height=(int)720, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1, interlace-mode=(string)progressive, colorimetry=(string)bt709, chroma-format=(string)4:2:0, bit-depth-luma=(uint)8, bit-depth-chroma=(uint)8, parsed=(boolean)true, codec_data=(buffer)01428028ffe100232742802895a014016e84000003000400000300f38a8000989600017d79bdee01e244d401000528ce025c80
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0.GstPad:src: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, packetization-mode=(string)1, profile-level-id=(string)428028, sprop-parameter-sets=(string)"J0KAKJWgFAFuhAAAAwAEAAADAPOKgACYlgABfXm97gHiRNQ\=\,KM4CXIA\=", payload=(int)96, ssrc=(uint)1540806756, timestamp-offset=(uint)2327953635, seqnum-offset=(uint)29751, a-framerate=(string)30
/GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, packetization-mode=(string)1, profile-level-id=(string)428028, sprop-parameter-sets=(string)"J0KAKJWgFAFuhAAAAwAEAAADAPOKgACYlgABfXm97gHiRNQ\=\,KM4CXIA\=", payload=(int)96, ssrc=(uint)1540806756, timestamp-offset=(uint)2327953635, seqnum-offset=(uint)29751, a-framerate=(string)30
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0.GstPad:sink: caps = video/x-h264, stream-format=(string)avc, alignment=(string)au, level=(string)4, profile=(string)baseline, width=(int)1280, height=(int)720, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1, interlace-mode=(string)progressive, colorimetry=(string)bt709, chroma-format=(string)4:2:0, bit-depth-luma=(uint)8, bit-depth-chroma=(uint)8, parsed=(boolean)true, codec_data=(buffer)01428028ffe100232742802895a014016e84000003000400000300f38a8000989600017d79bdee01e244d401000528ce025c80
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0: timestamp = 2328035608
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0: seqnum = 29751
^Chandling interrupt.
Interrupt: Stopping pipeline ...
Execution ended after 0:00:11.775799274
Setting pipeline to NULL ...
Freeing pipeline ...

@Consti10 (Author) commented Nov 2, 2022

I've also tried setting 50fps in a bright environment - I get 30fps in this case.

@Consti10 (Author) commented Nov 2, 2022

Actually, I am not sure if I did everything correctly - I've uninstalled the (patched) libcamera build using sudo ninja -C build uninstall, but gst-inspect-1.0 still shows me libcamerasrc.
Any ideas?

@Consti10 (Author) commented Nov 2, 2022

Yeah, okay, my mistake - apparently you have to set export GST_PLUGIN_PATH=$(pwd)/build/src/gstreamer so that gstreamer actually uses the newly built plugin (I suspect libcamerasrc also comes with the default RPi OS gstreamer installation).

Now I can report -

  1. Setting 720p50fps -> measured 50fps output
  2. Setting 720p30fps -> measured 30fps output (even in low light)
  3. Setting 720p60fps ->
    ERROR: from element /GstPipeline:pipeline0/GstLibcameraSrc:libcamerasrc0: Internal data stream error. Additional debug info: ../src/gstreamer/gstlibcamerasrc.cpp(312): processRequest (): /GstPipeline:pipeline0/GstLibcameraSrc:libcamerasrc0: streaming stopped, reason not-negotiated (-4)
    So 1), 2) and 3) pretty much validate the patches - setting the framerate from the gstreamer side now works; 3) is just the result of what we found out about the sensor mode selection.

In my humble opinion, the current Raspberry Pi sensor mode selection code is suboptimal. After all, hello_video/mmal always had the "feature" of selecting a sensor mode that can fulfill the given framerate request, if there is one. The current behaviour is much less intuitive than the previous common mmal solutions, and it doesn't even log a warning - an inexperienced user can set 720p60 (for example), and libcamera will silently output 50fps instead.

It is also not compatible with libcamerasrc (unless setting the sensor mode is added there).

@naushir (Collaborator) commented Nov 4, 2022

In my humble opinion, the current Raspberry Pi sensor mode selection code is suboptimal. After all, hello_video/mmal always had the "feature" of selecting a sensor mode that can fulfill the given framerate request, if there is one. The current behaviour is much less intuitive than the previous common mmal solutions, and it doesn't even log a warning - an inexperienced user can set 720p60 (for example), and libcamera will silently output 50fps instead.

There is no direct API in libcamera to allow sensor mode selection. The system selects a sensor mode based on what output resolution was requested. Again, there is no way to factor framerate into this selection routine with the libcamera API. As an alternative, libcamera-apps have added the --mode and --viewfinder-mode command line arguments to manually override the sensor mode selection with a user choice. This is equivalent to the --md argument in the legacy mmal camera stack.

@Consti10 (Author) commented Nov 5, 2022

In my humble opinion, the current Raspberry Pi sensor mode selection code is suboptimal. After all, hello_video/mmal always had the "feature" of selecting a sensor mode that can fulfill the given framerate request, if there is one. The current behaviour is much less intuitive than the previous common mmal solutions, and it doesn't even log a warning - an inexperienced user can set 720p60 (for example), and libcamera will silently output 50fps instead.

There is no direct API in libcamera to allow sensor mode selection. The system selects a sensor mode based on what output resolution was requested. Again, there is no way to factor framerate into this selection routine with the libcamera API. As an alternative, libcamera-apps have added the --mode and --viewfinder-mode command line arguments to manually override the sensor mode selection with a user choice. This is equivalent to the --md argument in the legacy mmal camera stack.

Since a resolution is always tied to a maximum (and sometimes also a minimum) framerate, this sounds like a less than ideal design to me. Either libcamera (as a library) should expose the functionality to query all sensor modes and then select a specific one, offloading the selection completely to the (library) user, or it should allow setting a specific resolution@framerate (for video) and then figure out the right sensor mode for that request.

@Consti10 (Author) commented Nov 5, 2022

I think it would be possible to work around this issue specifically here (with the HQ camera), though. What about modifying the (RPi) sensor mode selection code to select 1332 × 990 instead of 2028x1080 when the user selects 720p? I.e. implement the following algorithm:
take the "smallest" sensor mode that is equal to or greater than the requested resolution. I can only see benefits from this default behaviour, no downsides.
For example: less load on the CSI, perhaps less load on the ISP/memory (depending on where cropping happens, if there is any), and most importantly, in the ideal scenario lower-resolution modes are not cropped but binned on the CMOS, resulting in better image quality.
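For clarity, the proposed rule can be sketched in a few lines of Python (mode list from libcamera-hello --list-cameras earlier in the thread; this is an illustration of the suggestion, not existing libcamera code):

```python
# Sketch of the proposed rule: pick the smallest sensor mode that is at
# least as large as the requested resolution.
SENSOR_MODES = [(1332, 990), (2028, 1080), (2028, 1520), (4056, 3040)]

def smallest_covering_mode(req_w, req_h):
    candidates = [(w, h) for w, h in SENSOR_MODES if w >= req_w and h >= req_h]
    return min(candidates, key=lambda m: m[0] * m[1])

print(smallest_covering_mode(1280, 720))  # (1332, 990)
print(smallest_covering_mode(640, 480))   # (1332, 990)
```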

@Consti10 (Author) commented Nov 5, 2022

I am wondering - should merge requests concerning code in libcamera/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp be done here (since it is rpi-specific) ?

@Consti10 (Author) commented Nov 5, 2022

Some logs on this topic: requesting 720p without specifying the sensor mode, I get:

[0:03:01.857414251] [880] INFO Camera camera.cpp:1026 configuring streams: (0) 1280x720-YUV420
...
[0:03:01.857923710] [881] DEBUG RPI raspberrypi.cpp:167 Format: 1332x990 fmt SRGGB10 Score: 2377.47 (best 2377.47)
[0:03:01.857977636] [881] DEBUG RPI raspberrypi.cpp:167 Format: 2028x1080 fmt SRGGB12 Score: 314.5 (best 314.5)
[0:03:01.858014802] [881] DEBUG RPI raspberrypi.cpp:167 Format: 2028x1520 fmt SRGGB12 Score: 1717.7 (best 314.5)
[0:03:01.858050691] [881] DEBUG RPI raspberrypi.cpp:167 Format: 4056x3040 fmt SRGGB12 Score: 2604.7 (best 314.5)
[0:03:01.858220930] [881] INFO RPI raspberrypi.cpp:800 Sensor: /base/soc/i2c0mux/i2c@1/imx477@1a - Selected sensor format: 2028x1080-SBGGR12_1X12 - Selected unicam format: 2028x1080-pBCC

It is not documented in the code whether a higher or lower score is better - I'd assume lower? But how can 2028x1080 achieve a better score for a 720p request than 1332x990?

@naushir (Collaborator) commented Nov 7, 2022

I am wondering - should merge requests concerning code in libcamera/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp be done here (since it is rpi-specific) ?

Merge requests for any code in the libcamera tree (e.g. the Raspberry Pi pipeline handler in this case) should be done through the libcamera dev mailing list. You can find the instructions here.

@naushir (Collaborator) commented Nov 7, 2022

I think it would be possible to work around this issue specifically here (with the HQ camera), though. What about modifying the (RPi) sensor mode selection code to select 1332 × 990 instead of 2028x1080 when the user selects 720p? I.e. implement the following algorithm: take the "smallest" sensor mode that is equal to or greater than the requested resolution. I can only see benefits from this default behaviour, no downsides. For example: less load on the CSI, perhaps less load on the ISP/memory (depending on where cropping happens, if there is any), and most importantly, in the ideal scenario lower-resolution modes are not cropped but binned on the CMOS, resulting in better image quality.

Unfortunately, this will cause a regression. The 1332x990 mode does not actually use binning; rather, it scales in the Bayer domain. This causes a significant loss of image quality compared with direct binning. The reason for using scaling over binning is to achieve a faster framerate readout, because of limitations in the sensor electronics.

One approach to make things easier would be for libcamera-apps to essentially duplicate what the pipeline handler does for mode selection, but have framerate accounted for as well, assuming it was provided in the command line. This way, you only need to specify --framerate 120 on the command line, and libcamera-apps will choose the mode that matches the requested 120fps, regardless of output resolution.

@Consti10 (Author) commented Nov 7, 2022

I am wondering - should merge requests concerning code in libcamera/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp be done here (since it is rpi-specific) ?

Merge requests for any code in the libcamera tree (e.g. the Raspberry Pi pipeline handler in this case) should be done through the libcamera dev mailing list. You can find the instructions here.

Thanks for letting me know. I don't think I'll invest the time to properly go through this hassle - in my opinion, the sensor mode selection code is flawed, but we now have a simple workaround.
https://github.com/OpenHD/libcamera/pull/2

@Consti10 (Author) commented Nov 7, 2022

I think it would be possible to work around this issue specifically here (with the HQ camera), though. What about modifying the (RPi) sensor mode selection code to select 1332 × 990 instead of 2028x1080 when the user selects 720p? I.e. implement the following algorithm: take the "smallest" sensor mode that is equal to or greater than the requested resolution. I can only see benefits from this default behaviour, no downsides. For example: less load on the CSI, perhaps less load on the ISP/memory (depending on where cropping happens, if there is any), and most importantly, in the ideal scenario lower-resolution modes are not cropped but binned on the CMOS, resulting in better image quality.

Unfortunately, this will cause a regression. The 1332x990 mode does not actually use binning; rather, it scales in the Bayer domain. This causes a significant loss of image quality compared with direct binning. The reason for using scaling over binning is to achieve a faster framerate readout because of limitations in the sensor electronics.

One approach to make things easier would be for libcamera-apps to essentially duplicate what the pipeline handler does for mode selection, but have framerate accounted for as well, assuming it was provided in the command line. This way, you only need to specify --framerate 120 on the command line, and libcamera-apps will choose the mode that matches the requested 120fps, regardless of output resolution.

I don't think that's true. The data needs to be cropped anyway before going through the ISP, and eliminating those wasted pixels as early as possible should be the standard approach. The only point one could perhaps make in this specific case is that the 2028x1080 mode provides 12bpp instead of the 10bpp of 1332x990. But from my testing, this 12bpp is just libcamera's default parameter for some reason. I'm not sure whether the ISP actually makes use of 12bpp in video and/or whether that makes a difference in quality.

Also, it's sad that there is no pixel binning in the open-source imx477 driver for 720p. I'm quite sure mmal had it. But I know how stubborn those vendors can be in regard to IP.

@Consti10
Author

Consti10 commented Nov 8, 2022

I think it would be possible to work around this issue specifically here (with the HQ camera) though. What about modifying the (rpi) sensor mode selection code to select 1332 × 990 instead of 2028x1080 when the user selects 720p? I.e., implement the following algorithm: take the "smallest" sensor mode that is equal to or greater than the requested resolution. I can only see benefits from this default behaviour, no downsides. For example: less load on the CSI, perhaps less load on the ISP / memory (depending on where cropping happens, if there is any), and most importantly, in the ideal scenario lower-resolution modes are not cropped but binned on the CMOS, resulting in better image quality.

Unfortunately, this will cause a regression. The 1332x990 mode does not actually use binning; rather, it scales in the Bayer domain. This causes a significant loss of image quality compared with direct binning. The reason for using scaling over binning is to achieve a faster framerate readout because of limitations in the sensor electronics.

One approach to make things easier would be for libcamera-apps to essentially duplicate what the pipeline handler does for mode selection, but have framerate accounted for as well, assuming it was provided in the command line. This way, you only need to specify --framerate 120 on the command line, and libcamera-apps will choose the mode that matches the requested 120fps, regardless of output resolution.

actually, I just had a look at the source code - 1332x990 is binned and cropped
https://github.com/raspberrypi/linux/blob/rpi-5.15.y/drivers/media/i2c/imx477.c#L999

@naushir
Collaborator

naushir commented Nov 8, 2022

actually, I just had a look at the source code - 1332x990 is binned and cropped https://github.com/raspberrypi/linux/blob/rpi-5.15.y/drivers/media/i2c/imx477.c#L999

That comment is wrong; the fast-fps mode definitely uses scaling together with cropping.

@naushir
Collaborator

naushir commented Nov 8, 2022

Seeing that you have an acceptable solution in your fork, I'll close this issue down now.

The change at raspberrypi/rpicam-apps#403 will also work by providing a similar mode selection directly in libcamera-apps.

@naushir naushir closed this as completed Nov 8, 2022