can't record full-frame in mjpeg on pi zero v1.3 with camera v2.1 #315

Closed
chconnor opened this Issue Aug 8, 2016 · 32 comments


chconnor commented Aug 8, 2016

Hello -- My understanding is that full-frame (3280x2464) H.264 is not possible (without 2x2 binning). I thus tried mjpeg, but I can only record a few frames before the python module appears to lock up, requiring ctrl-Z and kill %1 and leaving the camera inaccessible until reboot.

At the pi forums, 6by9 suggested that this was a known thing, and suggested that I check for an existing issue, which I did not find: "I remember this now from discussion with waveform80 ... The buffer size is too low and ends up stalling the codec and then bad things happen. ... it'll need a rebuild of picamera to get it working vaguely reliably."

The code I'm testing with:

#!/usr/bin/python3
import picamera
with picamera.PiCamera(sensor_mode=2) as camera:
    camera.resolution = (3280, 2464)
    camera.framerate = 15
    try:
        camera.start_recording("mytest.mjpeg", format='mjpeg', quality=70)
        camera.wait_recording(10)
    finally:
        camera.stop_recording()

This results in a 2-3MB mjpeg that does not play in vlc (first frame only -- not sure whether vlc handles mjpeg anyway), and when transcoded it turns out to contain only a few frames. CPU continues to be used and the process never exits. I've tried various gpu_mem settings to no avail.

Happy to test/tweak anything else that may be useful. It'd be great for security applications if the pi zero could record full frames at a reasonable rate (which to me is >=15 fps). Thanks!

6by9 commented Aug 8, 2016

It's https://github.com/waveform80/picamera/blob/master/picamera/mmalobj.py#L760 that is going to be the problem. The default buffer size off video_encode is 64kB. The MJPEG codec is pretty shonky and will stall badly if it can't get rid of the output data. Disable whilst it still has output data queued and it is game over :-(

Ideally for 1080P we want 768kB or so to be able to take an entire frame in one buffer (the same optimisation would be useful on H264), and preferably a few more buffers to keep data flowing cleanly.
The other option would be to disable MMAL_PARAMETER_CAPTURE on the camera video port first which throws an End Of Stream (EOS) down the pipe, and then wait for the EOS to appear at the output. That way you can guarantee that the pipe is clear.

chconnor commented Aug 8, 2016

(Just for the record, 1080p h.264 actually works fine for me on the pi zero, as does the 2x2 binning at full frame (mode 4). At least... so far... It's just the 3280x2464 mjpeg at 15fps that I can't make happen. Maybe that's all obvious but just wanted to be clear. I'm editing the issue to clarify as well.)

6by9 commented Aug 8, 2016

H264 doesn't stall because the codec architecture is different, designed to work more efficiently with the hardware.
However, my comment stands: having only one 64kB buffer for shuffling data back and forth to the GPU is suboptimal.
I really ought to fix that for raspivid too - currently it uses 256kB for MJPEG, and buffer_size_recommended otherwise. A couple of extra buffers wouldn't go amiss, particularly with motion vectors thrown in.
The V4L2 spec doesn't allow devices to fragment the output data, and I think assumes a horrendous worst case of 8bits/pixel, so almost 2MB for a 1080P frame.
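For illustration, that worst case can be checked in a couple of lines (a sketch only; the 8 bits/pixel figure is the V4L2 assumption described above, not a measured value):

```python
# V4L2's assumed worst case: 8 bits of compressed output per pixel,
# i.e. one byte per pixel for a 1080p frame.
width, height = 1920, 1080
worst_case_bytes = width * height * 8 // 8
print(worst_case_bytes)                    # 2073600
print(round(worst_case_bytes / 2**20, 2))  # 1.98 MiB, i.e. "almost 2MB"
```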

chconnor commented Aug 8, 2016

Well, the project is obviously in capable hands and I will await whatever developments with interest and appreciation. :-)

Owner

waveform80 commented Aug 8, 2016

Hmm, picamera should have a workaround for the MJPEG buffer size issue ... which I thought was in MMALPort.commit ... now, where did I put it ...

Ah, here it is, down in MMALPort.enable. So, picamera ought to be using 512kB for the buffer but that obviously needs boosting a bit. Not sure why I stuck that fix in enable rather than commit though ... I'll have to dig out the prior changeset when I tweak it and make sure I understand what I'm doing!

The other option would be to disable MMAL_PARAMETER_CAPTURE on the camera video port first which throws an End Of Stream (EOS) down the pipe, and then wait for the EOS to appear at the output. That way you can guarantee that the pipe is clear.

Hmm, at the moment the PiVideoEncoder class does disable MMAL_PARAMETER_CAPTURE on the camera video port, but I don't think it waits for EOS. The callback does check for EOS and terminates if it sees it, but at the moment the main thread disables capture then immediately disables the output port.

I should try tweaking that to see what happens if we just wait for EOS (I vaguely recall trying this at some point in the past and having some issue with it, but I can't recall what it was - oh well, worth having a play!)

@waveform80 waveform80 added the bug label Aug 8, 2016

@waveform80 waveform80 self-assigned this Aug 8, 2016

6by9 commented Aug 9, 2016

Thanks @waveform80 - I knew we'd had the discussion over buffer size, but couldn't see it.
It sounds like it's worth me double checking raspivid before you spend too much time checking PiCamera, as it may be a firmware issue (hoping not) - I'll try to get that done tonight. I might try adding a ctrl-c handler for raspivid too and wait for EOS, but today is likely to be a long day.

chconnor commented Nov 15, 2016

Hey -- was just curious if there was any movement or interest on this? No pressure, just curious. :-) It'd be cool to have full-res MJPEG.

6by9 commented Nov 15, 2016

Admittedly I'm on a Pi2, but using raspivid -fps 15 -w 3280 -h 2464 -cd MJPEG -o file.mjpg seems to work OK for recording the full resolution.
I do have the equivalent of gpu_mem set to 256MB, and that may well be the key here - full resolution obviously requires large chunks of memory. Whilst running the above command I only have 80MB reported as free from sudo vcdbg reloc:

pi@raspberrypi:~ $ sudo vcdbg reloc

Relocatable heap version 4 found at 0x30000000
total space allocated is 220M, with 220M relocatable, 0 legacy and 0 offline
0 legacy blocks of size 2359296

free list at 0x37139020
80M free memory in 3 free block(s)
largest free block is 80M bytes

0x30000000: free 80M

That would imply that gpu_mem needs to be set to at least 176MB, and I'd suggest playing safe at 192MB as PiCamera allocates memory slightly differently to raspivid.

Anyone fancy giving that a try with PiCamera?
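To make the arithmetic behind that recommendation explicit (a sketch only, mirroring the numbers above; actual usage will differ between raspivid and PiCamera):

```python
# gpu_mem estimate from the vcdbg output above: with gpu_mem=256MB,
# 80MB remained free while recording, so roughly 176MB was in use.
gpu_mem_mb = 256
free_mb = 80
in_use_mb = gpu_mem_mb - free_mb
print(in_use_mb)       # 176 -> suggested minimum gpu_mem
print(in_use_mb + 16)  # 192 -> the "play safe" figure with headroom
```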

chconnor commented Nov 15, 2016

Thanks! -- just for the record, I tried a variety of memory sizes and via Python and raspivid, and nothing worked for me (on the pi zero) at the high resolution. Thread here: https://www.raspberrypi.org/forums/viewtopic.php?f=63&t=156458&p=1021146

...but I'll make sure I'm on the latest version and will try your exact raspivid command to see if it works.

chconnor commented Nov 16, 2016

Hmmm, doesn't seem to be working for me on the pi zero (your raspivid line). It seems to work at first, but the file only has one actual frame in it (in VLC, anyway), as reported above. And my python test code (see first post in this issue) still hangs.

After my python test hung, I hit ctrl-Z and kill %1, then ran "vcdbg reloc". I'll try to attach the result in case it's useful (github is giving me trouble).

Let me know if there's anything I can do to help.

chconnor commented Nov 16, 2016

I can't attach a .txt file for some reason, so here's the debug output:

Relocatable heap version 4 found at 0x10000000
total space allocated is 236M, with 236M relocatable, 0 legacy and 0 offline
0 legacy blocks of size 2359296

free list at 0x185ee8c0
106M free memory in 5 free block(s)
largest free block is 102M bytes

0x10000000: free 102M
[  97] 0x165fd580: used  12M (refcount 1 lock count 0, size 12182016, align   32, data 0x165fd5a0, d1rual) 'mjpg conv buf'
0x1719b7c0: free 4.0M
[  93] 0x1759d500: used 3.8M (refcount 1 lock count 0, size  3940352, align 4096, data 0x1759e000, d1ruAl) 'video_encodeRIL: subsample pool'
[  94] 0x17960520: used 3.8M (refcount 1 lock count 0, size  3940352, align 4096, data 0x17961000, d1rual) 'video_encodeRIL: subsample pool'
[  78] 0x17d23540: used 3.8M (refcount 1 lock count 0, size  3940352, align 4096, data 0x17d24000, d1rual) 'video_encodeRIL: subsample pool'
[  79] 0x180e6560: used 3.8M (refcount 1 lock count 0, size  3940352, align 4096, data 0x180e7000, d1rual) 'video_encodeRIL: subsample pool'
0x184a9580: free 1.9K
[  12] 0x184a9d00: used  192 (refcount 1 lock count 0, size      128, align   32, data 0x184a9d20, d1rual) 'ril.video_splitter:in:0(OPQV)'
[  11] 0x184a9dc0: used  192 (refcount 1 lock count 0, size      128, align   32, data 0x184a9de0, d1rual) 'ril.video_splitter:in:0(OPQV)'
[  63] 0x184a9e80: used  192 (refcount 1 lock count 0, size      128, align   32, data 0x184a9ea0, d1rual) 'ril.video_splitter:in:0(OPQV)'
[  67] 0x184a9f40: used  192 (refcount 1 lock count 0, size      128, align   32, data 0x184a9f60, d1rual) 'ril.video_splitter:in:0(OPQV)'
[  66] 0x184aa000: used  192 (refcount 1 lock count 0, size      128, align   32, data 0x184aa020, d1rual) 'ril.video_splitter:in:0(OPQV)'
[  62] 0x184aa0c0: used  192 (refcount 1 lock count 0, size      128, align   32, data 0x184aa0e0, d1rual) 'ril.video_splitter:in:0(OPQV)'
[  61] 0x184aa180: used  192 (refcount 1 lock count 0, size      128, align   32, data 0x184aa1a0, d1rual) 'ril.video_splitter:in:0(OPQV)'
[  83] 0x184aa240: used  16K (refcount 1 lock count 0, size    16384, align   32, data 0x184aa260, d0rual) 'stabilise current image'
[  82] 0x184ae280: used 3.1K (refcount 1 lock count 0, size     3072, align   32, data 0x184ae2a0, d0rual) 'stabilise ref_ss'
[  16] 0x184aeec0: used  48K (refcount 1 lock count 0, size    49152, align   32, data 0x184aeee0, d0rual) 'stabilise ref'
[  20] 0x184baf00: used 3.1K (refcount 1 lock count 0, size     3072, align   32, data 0x184baf20, d0rual) 'stabilise ref_ss'
[  55] 0x184bbb40: used  48K (refcount 1 lock count 0, size    49152, align   32, data 0x184bbb60, d0rual) 'stabilise ref'
[  38] 0x184c7b80: used 438K (refcount 1 lock count 0, size   448256, align   32, data 0x184c7ba0, d0ruAl) 'drc lp'
[  23] 0x185352c0: used  576 (refcount 1 lock count 0, size      512, align   32, data 0x185352e0, d0ruAl) 'drc in last lvl'
[  24] 0x18535500: used 1.6K (refcount 1 lock count 0, size     1536, align   32, data 0x18535520, d0ruAl) 'drc out'
[  25] 0x18535b40: used 1.6K (refcount 1 lock count 0, size     1536, align   32, data 0x18535b60, d0ruAl) 'drc in'
[  89] 0x18536180: used  55K (refcount 1 lock count 0, size    56576, align   32, data 0x185361a0, d0ruAl) 'drc out'
[   8] 0x18543ec0: used 215K (refcount 1 lock count 0, size   219648, align   32, data 0x18543ee0, d0ruAl) 'drc out'
[  45] 0x18579900: used 215K (refcount 1 lock count 0, size   219648, align   32, data 0x18579920, d0ruAl) 'drc in'
[  75] 0x185af340: used  52K (refcount 1 lock count 0, size    53504, align    4, data 0x185af360, d1rual) 'AWB State (makesum_direct)'
[  76] 0x185bc480: used  18K (refcount 1 lock count 0, size    18432, align    4, data 0x185bc4a0, d1rual) 'AWB State (CCMs)'
[  40] 0x185c0cc0: used  18K (refcount 1 lock count 0, size    18432, align    4, data 0x185c0ce0, d1rual) 'AWB State (CCMs)'
[  53] 0x185c5500: used  18K (refcount 1 lock count 0, size    18432, align    4, data 0x185c5520, d1rual) 'AWB State (CCMs)'
0x185c9d40: free 64
[  44] 0x185c9d80: used  192 (refcount 1 lock count 0, size      128, align   32, data 0x185c9da0, d1rual) 'ril.video_splitter:in:0(OPQV)'
[  71] 0x185c9e40: used 4.4K (refcount 1 lock count 0, size     4480, align   32, data 0x185c9e60, d0ruAl) 'drc out'
[  86] 0x185cb000: used 4.4K (refcount 1 lock count 0, size     4480, align   32, data 0x185cb020, d0ruAl) 'drc in'
[  87] 0x185cc1c0: used  15K (refcount 1 lock count 0, size    14976, align   32, data 0x185cc1e0, d0ruAl) 'drc out'
[  90] 0x185cfc80: used  25K (refcount 1 lock count 0, size    25100, align    4, data 0x185cfca0, d0rual) 'ls table 4'
[  91] 0x185d5ee0: used  25K (refcount 1 lock count 0, size    25100, align    4, data 0x185d5f00, d0rual) 'ls table 3'
[  36] 0x185dc140: used  25K (refcount 1 lock count 0, size    25100, align    4, data 0x185dc160, d0rual) 'ls table 2'
[  54] 0x185e23a0: used  25K (refcount 1 lock count 0, size    25100, align    4, data 0x185e23c0, d0rual) 'ls table 1'
[  74] 0x185e8600: used  25K (refcount 1 lock count 0, size    25100, align    4, data 0x185e8620, d0rual) 'ls table 0'
0x185ee860: free 128
[  85] 0x185ee8e0: used  192 (refcount 1 lock count 0, size      128, align   32, data 0x185ee900, d1rual) 'ril.video_splitter:in:0(OPQV)'
[  84] 0x185ee9a0: used  192 (refcount 1 lock count 0, size      128, align   32, data 0x185ee9c0, d1rual) 'ril.video_splitter:in:0(OPQV)'
[  88] 0x185eea60: used  15K (refcount 1 lock count 0, size    14976, align   32, data 0x185eea80, d0ruAl) 'drc in'
[  47] 0x185f2520: used  55K (refcount 1 lock count 0, size    56576, align   32, data 0x185f2540, d0ruAl) 'drc in'
[  17] 0x18600260: used  192 (refcount 1 lock count 0, size      128, align   32, data 0x18600280, d1rual) 'null_sink:in:0(OPQV)'
[   7] 0x18600320: used  192 (refcount 1 lock count 0, size      128, align   32, data 0x18600340, d1rual) 'null_sink:in:0(OPQV)'
[   6] 0x186003e0: used  192 (refcount 1 lock count 0, size      128, align   32, data 0x18600400, d1rual) 'null_sink:in:0(OPQV)'
[  48] 0x186004a0: used  192 (refcount 1 lock count 0, size      128, align   32, data 0x186004c0, d1rual) 'null_sink:in:0(OPQV)'
[  43] 0x18600560: used  192 (refcount 1 lock count 0, size      128, align   32, data 0x18600580, d1rual) 'null_sink:in:0(OPQV)'
[  72] 0x18600620: used  192 (refcount 1 lock count 0, size      128, align   32, data 0x18600640, d1rual) 'null_sink:in:0(OPQV)'
[  77] 0x186006e0: used  192 (refcount 1 lock count 0, size      128, align   32, data 0x18600700, d1rual) 'null_sink:in:0(OPQV)'
[  65] 0x186007a0: used  192 (refcount 1 lock count 0, size      128, align   32, data 0x186007c0, d1rual) 'null_sink:in:0(OPQV)'
[  64] 0x18600860: used  192 (refcount 1 lock count 0, size      128, align   32, data 0x18600880, d1rual) 'null_sink:in:0(OPQV)'
[  73] 0x18600920: used  192 (refcount 1 lock count 0, size      128, align   32, data 0x18600940, d1rual) 'null_sink:in:0(OPQV)'
[  42] 0x186009e0: used 4.3K (refcount 1 lock count 0, size     4378, align    4, data 0x18600a00, d0rual) 'distortion table'
[  39] 0x18601b40: used 135K (refcount 1 lock count 0, size   138240, align    4, data 0x18601b60, d0rual) 'isp sw stage buffers'
[  19] 0x18623780: used 271K (refcount 1 lock count 0, size   277512, align    4, data 0x186237a0, d0rual) 'concurrent mdata'
[  58] 0x186673e0: used 4.3K (refcount 1 lock count 0, size     4378, align    4, data 0x18667400, d0rual) 'distortion table'
[  10] 0x18668540: used  25K (refcount 1 lock count 0, size    25100, align    4, data 0x18668560, d0rual) 'ls table isp'
[  52] 0x1866e7a0: used  25K (refcount 1 lock count 0, size    25100, align    4, data 0x1866e7c0, d0rual) 'ls table tx1'
[  51] 0x18674a00: used  25K (refcount 1 lock count 0, size    25100, align    4, data 0x18674a20, d0rual) 'ls table tx0'
[  30] 0x1867ac60: used  25K (refcount 1 lock count 0, size    25100, align    4, data 0x1867ac80, d0rual) 'ls table 1'
[  29] 0x18680ec0: used  25K (refcount 1 lock count 0, size    25100, align    4, data 0x18680ee0, d0rual) 'ls table 0'
[  18] 0x18687120: used  52K (refcount 1 lock count 0, size    53504, align    4, data 0x18687140, d1rual) 'AWB State (makesum_direct)'
[  26] 0x18694260: used  18K (refcount 1 lock count 0, size    18432, align    4, data 0x18694280, d1rual) 'AWB State (CCMs)'
[  27] 0x18698aa0: used  18K (refcount 1 lock count 0, size    18432, align    4, data 0x18698ac0, d1rual) 'AWB State (CCMs)'
[  28] 0x1869d2e0: used  18K (refcount 1 lock count 0, size    18432, align    4, data 0x1869d300, d1rual) 'AWB State (CCMs)'
[  57] 0x186a1b20: used 135K (refcount 1 lock count 0, size   138240, align    4, data 0x186a1b40, d0rual) 'isp sw stage buffers'
[  35] 0x186c3760: used 271K (refcount 1 lock count 0, size   277512, align    4, data 0x186c3780, d0rual) 'concurrent mdata'
[  50] 0x187073c0: used 9.8M (refcount 1 lock count 1, size 10289152, align 4096, data 0x18708000, d1rual) 'cameraRIL:input pool'
[  68] 0x190d83e0: used 9.8M (refcount 1 lock count 1, size 10289152, align 4096, data 0x190d9000, d1rual) 'cameraRIL:input pool'
[  14] 0x19aa9400: used 9.8M (refcount 1 lock count 1, size 10289152, align 4096, data 0x19aaa000, d1rual) 'cameraRIL:input pool'
[  31] 0x1a47a420: used 3.8M (refcount 1 lock count 0, size  3986624, align 4096, data 0x1a47b000, d1ruAl) 'cameraRIL: lowres pool'
[  32] 0x1a848900: used 3.8M (refcount 1 lock count 1, size  3986624, align 4096, data 0x1a849000, d1rual) 'cameraRIL: lowres pool'
[  33] 0x1ac16de0: used 3.8M (refcount 1 lock count 0, size  3986624, align 4096, data 0x1ac17000, d1ruAl) 'cameraRIL: lowres pool'
[  34] 0x1afe52c0: used  12M (refcount 1 lock count 0, size 12228288, align 4096, data 0x1afe6000, d1ruAl) 'cameraRIL: hires stills pool'
[  21] 0x1bb8f9a0: used  16M (refcount 1 lock count 0, size 16553152, align 4096, data 0x1bb90000, d1rual) 'cameraRIL: hires vid pool'
[  70] 0x1cb59e80: used  16M (refcount 1 lock count 0, size 16553152, align 4096, data 0x1cb5a000, d1rual) 'cameraRIL: hires vid pool'
[  69] 0x1db24360: used  16M (refcount 1 lock count 0, size 16553152, align 4096, data 0x1db25000, d1rual) 'cameraRIL: hires vid pool'
[  15] 0x1eaee840: used  832 (refcount 1 lock count 0, size      740, align   32, data 0x1eaee860, d1rual) 'ril mem ril.camera-0'
[  22] 0x1eaeeb80: used  832 (refcount 1 lock count 0, size      740, align   32, data 0x1eaeeba0, d1rual) 'ril mem ril.camera-0'
[  41] 0x1eaeeec0: used  832 (refcount 1 lock count 0, size      740, align   32, data 0x1eaeeee0, d1rual) 'ril mem ril.camera-0'
[  46] 0x1eaef200: used  832 (refcount 1 lock count 0, size      740, align   32, data 0x1eaef220, d1rual) 'ril mem ril.camera-0'
[   4] 0x1eaef540: used  576 (refcount 1 lock count 0, size      512, align    4, data 0x1eaef560, d0rual) 'ILCS VC buffer pool'
[   3] 0x1eaef780: used 1.0M (refcount 1 lock count 8, size  1091584, align 4096, data 0x1eaf0000, d3rual) 'ARM FB'
[   2] 0x1ebfafa0: used  16K (refcount 1 lock count 0, size    16384, align   32, data 0x1ebfafc0, d0ruAl) 'audioplus_tmp_buf'
[   1] 0x1ebfefe0: used 4.0K (refcount 1 lock count 0, size        0, align 4096, data 0x1ebff000, d1rual) 'camera fast alloc arena'
small allocs not requested

6by9 commented Nov 16, 2016

VLC is broken when it comes to MJPEG - raspberrypi/userland#345. It sees the JFIF header in our bitstream and incorrectly identifies it as a single JPEG rather than MJPEG.
Try adding the --demux=avformat flag to the vlc command line and it should identify it correctly as MJPEG, or mux it into a container using avconv first.
Mplayer on my Ubuntu laptop generally seems happy to play them.

Just checking with the file I captured yesterday, and there is something off. It looks like if the bitrate is too low, the encoder needs to drop the Q factor below its minimum and things fail.
raspivid -md 2 -fps 15 -w 3264 -h 2448 -cd MJPEG -o wibble.mjpg -t 10000 -b 25000000

NB: in camera.start_recording("mytest.mjpeg", format='mjpeg', quality=70), quality does nothing for MJPEG, and not a huge amount for H264 (it appears to apply a very odd constraint). bitrate is the much more significant parameter.

chconnor commented Nov 16, 2016

Thanks 6by9 -- that vlc fix works great. Some interesting test results here...

Test 1: The second raspivid command works to record the full-res clips. (But I'm curious why you specified 3264x2448 instead of 3280x2464, which I understand to be mode 2?)

But the posterization and blocking is so rough on the full-res files that if I simply scale a full-res h.264 video (1640x1232 with 2x2 binning, and at only 15mbps) to the same size, the image quality is roughly equivalent.

Test 2: the python code still hangs -- 'top' shows no activity, and I have to ctrl-Z / kill %1 to get out of it (and the camera is borked for further use). A partial mjpg file is recorded with only a few frames in it, but they are full resolution and look great. This is what I'm currently doing:

import picamera

with picamera.PiCamera(sensor_mode=2) as camera:
    camera.resolution = (3280, 2464)
    camera.framerate = 15

    try:
        camera.start_recording("mytest.mjpg", format='mjpeg', bitrate=25000000)
        camera.wait_recording(10)
    finally:
        camera.stop_recording()

Here are some sample images extracted from the videos:

It's tantalizingly close! :-)

6by9 commented Nov 17, 2016

Resolution is just a number - the system will crop/scale almost anything to anything. 3264x2448 is the normal "standard" for 4:3 8MPix.

Just doing the maths: even at 25Mbit/s and 15fps, you're looking at ~203kB per frame for an 8MPix (12MB) image. That's a pretty high compression factor, hence the compression artefacts. MJPEG is a hopelessly inefficient codec compared to H264.
MJPEG should allow you to request a still higher bitrate (although raspivid clips it to a max of 25Mbit/s, and you'll struggle to save it to SD card), or you can reduce the frame rate, but you really need to assign a few more bits to each frame if you want to improve the quality.
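Working those numbers through (a sketch of the arithmetic above; the 12MB figure assumes a YUV420 frame at 1.5 bytes/pixel):

```python
bitrate = 25_000_000  # 25 Mbit/s
fps = 15
bytes_per_frame = bitrate // (8 * fps)
print(bytes_per_frame)  # 208333 -> roughly 203 kB per frame

raw_frame = 3264 * 2448 * 3 // 2     # 8MPix YUV420 frame, ~12MB
print(raw_frame)                     # 11985408
print(raw_frame // bytes_per_frame)  # ~57x compression per frame
```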

chconnor commented Nov 17, 2016

Thanks -- that makes sense, but the part I don't understand is why the python results above look so much better than the raspivid results (i.e. little/no compression artifacts, no posterization) despite being assigned the same bitrate -- and why the python code also hangs and borks the camera...?

I had given up on mjpg, assuming the bitrate was the cause, but then I saw the (partial) video from the python code, and it looked so beautiful that it made me think it was possible to make it work after all... is it perhaps recording at too high a bitrate (ignoring the specified 25mbps), and that's what causes it to hang/fail?

6by9 commented Nov 17, 2016

The codec always starts at quality 7 and then nudges the quality up/down by 1 per frame if the resulting frame was more than 12% outside the target range, so the first frames may well end up well over the bitrate budget but at better quality.
I generally try to avoid running specific test cases, but I may give yours a whirl if I can find time, as it is so simple.

You can turn on some GPU side logging if you're that interested. vcgencmd set_logging level=0x4 before the run, and then sudo vcdbg log msg afterwards. You should get some log lines similar to:

041778.728: mjpeg_enc_stripe: eof 4194304 (done 47/196401) (target:208333) qual:17
041862.848: mjpeg_open_module: (3eb30d40) successfully opened module
041913.619: mjpeg_enc_stripe: eof 4194304 (done 48/201193) (target:208333) qual:17
041997.165: mjpeg_open_module: (3eb30d40) successfully opened module

Decoding it:
mjpeg_enc_stripe: eof <buffer size> (done <frame count>/<encoded size of that frame>) (target: <target bytes per frame>) qual <current quality value>
My camera was pointing down at a table so compressed very well and ended up at quality 17. Pointing it at a more useful scene I get:

176740.103: mjpeg_enc_stripe: eof 4194304 (done 63/202375) (target:208333) qual:6
176769.713: mjpeg_open_module: (3eb30d40) successfully opened module
176866.103: mjpeg_enc_stripe: eof 4194304 (done 64/202012) (target:208333) qual:6
176949.807: mjpeg_open_module: (3eb30d40) successfully opened module
177000.997: mjpeg_enc_stripe: eof 4194304 (done 65/201655) (target:208333) qual:6

So pretty close to the target.
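Purely as a convenience, the fields in those log lines can be pulled out with a few lines of Python (a sketch; parse_stripe is a hypothetical helper, and the regex just follows the format described above):

```python
import re

# Matches lines like:
#   041778.728: mjpeg_enc_stripe: eof 4194304 (done 47/196401) (target:208333) qual:17
# Fields, per the format described above: buffer size, frame count,
# encoded size of that frame, target bytes per frame, current quality.
STRIPE_RE = re.compile(
    r"mjpeg_enc_stripe: eof (?P<bufsize>\d+) "
    r"\(done (?P<frame>\d+)/(?P<size>\d+)\) "
    r"\(target:(?P<target>\d+)\) qual:(?P<qual>\d+)"
)

def parse_stripe(line):
    """Return the numeric fields of an mjpeg_enc_stripe line, or None."""
    m = STRIPE_RE.search(line)
    return {k: int(v) for k, v in m.groupdict().items()} if m else None

line = "041778.728: mjpeg_enc_stripe: eof 4194304 (done 47/196401) (target:208333) qual:17"
print(parse_stripe(line))
# {'bufsize': 4194304, 'frame': 47, 'size': 196401, 'target': 208333, 'qual': 17}
```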

There is a known lockup problem with the MJPEG codec which I think I've just hit and got logging lines of
042311.375: mjpg_enc_frame3 blocked as st->free_list_head 3eb30dfc, curr_freespace=1624543, max_frame_size 700604.
042311.407: mjpg_enc_frame3 blocked as st->free_list_head 3eb30dfc, curr_freespace=1624543, max_frame_size 700604.
It's a much bigger task to fix than I have time for currently, but it is on the list.

chconnor commented Nov 17, 2016

Thanks -- crazy theory: perhaps the bitrate setting with mjpeg is being interpreted (through the python API) as a "per-frame" bit target instead of mbps?

Here's the debug output (from running my python code above, which sets fps 15 and bitrate 25000000):

1921397.617: mjpg_enc_open 3280x2464
1921910.702: mjpeg_open_module: (1fa56060) successfully opened module
1921950.570: mjpeg_enc_stripe: eof 4194304 (done 1/348210) (target:3125000) qual:8
1922040.425: mjpeg_open_module: (1fa56060) successfully opened module
1922085.411: mjpeg_enc_stripe: eof 4194304 (done 2/415882) (target:3125000) qual:9
1922130.442: mjpeg_open_module: (1fa56060) successfully opened module
1922180.491: mjpeg_enc_stripe: eof 4194304 (done 3/499745) (target:3125000) qual:10
1922225.704: mjpeg_open_module: (1fa56060) successfully opened module
1922277.080: mjpeg_enc_stripe: eof 4194304 (done 4/580225) (target:3125000) qual:11
1922324.039: mjpeg_open_module: (1fa56060) successfully opened module
1922376.095: mjpeg_enc_stripe: eof 4194304 (done 5/669976) (target:3125000) qual:12
1922376.140: mjpg_enc_frame3 blocked as st->free_list_head 1fa56134, curr_freespace=1680265, max_frame_size 669976.
1922376.161: mjpg_enc_frame3 blocked as st->free_list_head 1fa56134, curr_freespace=1680265, max_frame_size 669976.

It looks like it is indeed hitting the lockup problem you describe, after five frames.

But the target is listed as "3125000"... at 25mbps and 15fps (set via python code) I would have expected 208333, as in your debug output, but it's almost exactly 15X that value -- which happens to be the fps. So I ran another test: I used my python code to set the bitrate to 1666666 (25000000 / 15). This time the log messages show 208333 as the target, and the python script works just fine -- no hang:

...
497779.320: mjpeg_enc_stripe: eof 4194304 (done 82/206924) (target:208333) qual:4
497855.794: mjpeg_open_module: (1fa56060) successfully opened module
497905.615: mjpeg_enc_stripe: eof 4194304 (done 83/206860) (target:208333) qual:4
497983.583: mjpeg_open_module: (1fa56060) successfully opened module
498076.500: mjpeg_enc_stripe: eof 4194304 (done 84/206767) (target:208333) qual:4
498076.591: mjpg_enc_close
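A quick check of the arithmetic behind this (a sketch; it only restates the numbers above and doesn't prove where in the stack the fps division goes missing):

```python
bitrate, fps = 25_000_000, 15
print(bitrate // 8)          # 3125000 -- the target the log reported
print(bitrate // (8 * fps))  # 208333  -- the per-frame target you'd expect
# The workaround tried above: pre-divide the requested bitrate by the fps.
workaround_bitrate = bitrate // fps  # 1666666
print(workaround_bitrate // 8)       # 208333 -- matches the good run
```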

6by9 commented Nov 17, 2016

Er, ouch! I'm not sure what is going on there. One for @waveform80 I think, if only for guidance.
I can't see anything obvious misinterpreting the bitrate, but it's not my code (and I don't do Python).
From #342 (comment) he does say that setting picamera.PiEncoder.DEBUG = True should dump out port configuration - I wonder if that includes bitrate.

Technically the "mjpg_enc_frame3 blocked" message isn't actually the lockup, but is indicative that the codec is having to stall due to thinking that it has insufficient space in the output FIFO.

chconnor commented Nov 17, 2016

Here's the DEBUG=True output in case it's useful (python code otherwise as shown above, and I ran it in python 2.7.9 instead of my normal 3.4.2 because the other bug said that debugging didn't work in python 3):

 vc.ril.camera [1]                                [0] vc.ril.video_splitter [1]                               [0] vc.ril.video_encode [0]
   encoding    OPQV-dual         -->      OPQV-single       encoding        OPQV-single      -->        OPQV-dual      encoding       MJPG
      buf      10x128                          10x128          buf          10x128                         10x128         buf         1x65536
     frame     3280x2464@15fps         3280x2464@0fps         frame         3280x2464@0fps         3280x2464@0fps        frame        3280x2464@0fps

As an aside: is the 25mbps limit a non-negotiable hardware limitation?

6by9 commented Nov 17, 2016

Nothing terribly useful in that debug output, although I am a little surprised to see "1x65536" on the output of video_encode - https://github.com/waveform80/picamera/blob/master/picamera/mmalobj.py#L832 should bump it up to 512kB, but perhaps that happens later on as it's part of the enable call.

25MBit/s isn't a real hardware limit, although it may be a practical one from shovelling data around the place. The JPEG block is expected to achieve about 160MPix/s, but I don't recall a strict limit on the quality setting on the output side.
The limit is however enforced by PiCamera with https://github.com/waveform80/picamera/blob/master/picamera/encoders.py#L627, which only allows the bitrate to exceed 25Mbit/s if H264 level 4.2 has been selected.

chconnor commented Nov 17, 2016

Well, let me hereby register my humble request that we clueless users be allowed to push the mbps as high as hardware/software/codec standards will permit. :-) Meaning, if a 25Mbps limit for mjpg isn't meaningful, it'd be cool to have that limit lifted.

For security cams, it is, of course, all about the clarity of the image, so if I can push a little more clarity out of the camera with a higher bitrate and mjpg (once it gets sorted out), that'd be awesome!


waveform80 commented Nov 18, 2016

But the target is listed as "3125000"... at 25mbps and 15fps (set via python code) I would have expected 208333, as in your debug output, but it's almost exactly 15X that value -- which happens to be the fps. So I ran another test: I used my python code to set the bitrate to 1666666 (25000000 / 15). This time the log messages show 208333 as the target, and the python script works just fine -- no hang:

Hmmm, that's interesting. Bitrate is set for H.264 and MJPEG by the same chunk of code, i.e. there's no special casing for either (the bitrate setter is down in mmalobj and is quite straightforward). I'll add bitrate to the debug output and see what happens...

25MBit/s isn't a real hardware limit, although it may be a practical one from shovelling data around the place. The JPEG block is expected to achieve about 160MPix/s, but I don't recall a strict limit on the quality setting on the output side.
The limit is however enforced by PiCamera with https://github.com/waveform80/picamera/blob/master/picamera/encoders.py#L627, which only allows the bitrate to exceed 25Mbit/s if H264 level 4.2 has been selected.

Yup - in the absence of any particular knowledge about the JPEG block (or the H264) block I just copied the limit for H264 to MJPEG. If no limit exists I can certainly remove that (although it looks like excessively high values may have something to do with the lockup from the observations above).

@waveform80

waveform80 commented Nov 18, 2016

Nothing terribly useful in that debug output, although I am a little surprised to see "1x65536" on the output of video_encode - https://github.com/waveform80/picamera/blob/master/picamera/mmalobj.py#L832 should bump it up to 512kB, but perhaps that happens later on as it's part of the enable call.

Ah, I've just moved where the debug print happens to fix that - it's pretty misleading otherwise. So, here's the output from a quick session with the new debug stuff printing bitrate too:

>>> from picamera import *
>>> camera = PiCamera(resolution='1280x720', framerate=30)
>>> PiEncoder.DEBUG = True
>>> camera.start_recording('foo.mjpg', bitrate=1000000)
 vc.ril.camera [1]                              [0] vc.ril.video_splitter [1]                             [0] vc.ril.video_encode [0]
   encoding    OPQV-dual                OPQV-single       encoding        OPQV-single               OPQV-dual      encoding       MJPG
      buf      10x128           -->          10x128          buf          10x128          -->          10x128         buf         1x524288
    bitrate    0bps                            0bps        bitrate        0bps                           0bps       bitrate       1000000bps
     frame     1280x720@30fps         1280x720@0fps         frame         1280x720@0fps         1280x720@0fps        frame        1280x720@0fps
>>> camera.stop_recording()

So, the buffer size looks okay, and the bitrate has indeed been set to 1,000,000 bps. Hence we'd expect to see 1,000,000 / (8 * 30) ~= 4166 as the target byte size for a frame. However, the target size reported by the MJPEG debug output is 125,000 (or 1,000,000/8):

3147246.085: mjpg_enc_open 1280x720
3147315.161: mjpeg_open_module: (3ea4d020) successfully opened module
3147319.349: mjpeg_enc_stripe: eof 4194304 (done 1/47207) (target:125000) qual:8
3147346.900: mjpeg_open_module: (3ea4d020) successfully opened module
3147351.190: mjpeg_enc_stripe: eof 4194304 (done 2/54157) (target:125000) qual:9
3147380.271: mjpeg_open_module: (3ea4d020) successfully opened module
3147384.499: mjpeg_enc_stripe: eof 4194304 (done 3/62611) (target:125000) qual:10
3147414.564: mjpeg_open_module: (3ea4d020) successfully opened module
3147418.897: mjpeg_enc_stripe: eof 4194304 (done 4/70149) (target:125000) qual:11
3147446.902: mjpeg_open_module: (3ea4d020) successfully opened module
3147451.158: mjpeg_enc_stripe: eof 4194304 (done 5/79942) (target:125000) qual:12
3147481.718: mjpeg_open_module: (3ea4d020) successfully opened module
3147486.110: mjpeg_enc_stripe: eof 4194304 (done 6/90836) (target:125000) qual:13
3147513.558: mjpeg_open_module: (3ea4d020) successfully opened module
3147517.994: mjpeg_enc_stripe: eof 4194304 (done 7/100382) (target:125000) qual:14
3147546.794: mjpeg_open_module: (3ea4d020) successfully opened module
3147551.120: mjpeg_enc_stripe: eof 4194304 (done 8/109732) (target:125000) qual:14
3147580.153: mjpeg_open_module: (3ea4d020) successfully opened module
3147584.473: mjpeg_enc_stripe: eof 4194304 (done 9/109503) (target:125000) qual:14
3147614.472: mjpeg_open_module: (3ea4d020) successfully opened module
3147618.972: mjpeg_enc_stripe: eof 4194304 (done 10/109552) (target:125000) qual:14
3147648.317: mjpeg_open_module: (3ea4d020) successfully opened module
3147652.781: mjpeg_enc_stripe: eof 4194304 (done 11/109794) (target:125000) qual:14
3147680.035: mjpeg_open_module: (3ea4d020) successfully opened module
3147684.498: mjpeg_enc_stripe: eof 4194304 (done 12/109609) (target:125000) qual:14
3147713.427: mjpeg_open_module: (3ea4d020) successfully opened module
3147717.910: mjpeg_enc_stripe: eof 4194304 (done 13/109962) (target:125000) qual:14
3147746.687: mjpeg_open_module: (3ea4d020) successfully opened module
3147750.990: mjpeg_enc_stripe: eof 4194304 (done 14/109637) (target:125000) qual:14
3147780.063: mjpeg_open_module: (3ea4d020) successfully opened module
3147784.518: mjpeg_enc_stripe: eof 4194304 (done 15/109707) (target:125000) qual:14

So I'd guess MJPEG isn't taking into account the framerate when calculating the target size?

I could certainly tweak this on the picamera side of things (fudge the bitrate setting so it divides by the framerate when setting bitrate for MJPEG), but I don't know if it's feasible to tweak this on the firmware side (which is probably preferable as it fixes it for all applications). I should definitely make picamera's buffer size fudge a bit more intelligent though (it should at least base it on the bitrate + some fudge factor which might alleviate the MJPEG lockup).
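For reference, the mismatch can be reduced to two lines of arithmetic (plain Python, nothing picamera-specific; the numbers are the ones from the session above):

```python
bitrate = 1000000   # bps, as passed to start_recording()
framerate = 30      # fps set on the camera

# Per-frame byte target we'd expect the rate control to aim for:
expected_target = bitrate // (8 * framerate)   # -> 4166

# What the firmware log actually reports as the target, i.e. bytes
# per *second* rather than bytes per frame:
observed_target = bitrate // 8                 # -> 125000

print(expected_target, observed_target)
```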

@6by9

6by9 commented Nov 18, 2016

> Yup - in the absence of any particular knowledge about the JPEG block (or the H264) block I just copied the limit for H264 to MJPEG. If no limit exists I can certainly remove that (although it looks like excessively high values may have something to do with the lockup from the observations above).

The H264 limits come from H264 level spec. Actually the current limit is incorrect but only on a technicality (from https://en.wikipedia.org/wiki/H.264/MPEG-4_AVC#Levels):
- level 4.0 - 1080p30.1 @ max 25Mbit/s
- level 4.1 - 1080p30.1 @ max 62.5Mbit/s (it only increases the max bitrate)
- level 4.2 - 1080p64.0 @ max 62.5Mbit/s (we can't actually do 64fps even with overclocking)
There are benefits to being able to select lower levels as it puts stuff in the headers that the decoder can use to allocate less memory (see the wiki section on "Decoded picture buffering"). It's not a big deal though.
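A level-aware bitrate check based on the figures above might look like this (an illustrative sketch only; the dict and function names are invented for the example and are not picamera's actual encoders.py code):

```python
# Illustrative ceilings (bps) per H.264 level, using the figures quoted above
MAX_BITRATE = {
    '4':   25000000,
    '4.1': 62500000,
    '4.2': 62500000,
}

def check_bitrate(bitrate, level='4'):
    """Reject bitrates above the ceiling for the selected H.264 level."""
    limit = MAX_BITRATE[level]
    if bitrate > limit:
        raise ValueError(
            'bitrate %d exceeds %d permitted by H.264 level %s'
            % (bitrate, limit, level))
    return bitrate

check_bitrate(25000000, '4')     # fine at level 4
check_bitrate(50000000, '4.1')   # fine at level 4.1, would raise at level 4
```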

I will try to sort the MJPEG lockup at some point. It can probably be worked around more effectively by ensuring state transitions in an appropriate order, but fixing the firmware would be the better option.

@6by9

6by9 commented Nov 18, 2016

> So I'd guess MJPEG isn't taking into account the framerate when calculating the target size?

OK, I know where to look now, but I can't get to it right now (this annoying work thing).
Memory says that it uses the frame rate value from the OUTPUT port, which your debug shows as 0.
Although your debug says the frame rate is 0fps on ALL ports from the video_splitter input onwards - that can't be right.

Racking my memory, H264 bitrate control can run based solely on the timestamps of the frames that come in should the framerate not be provided. The MJPEG encoder doesn't support that (although I guess it could), but that probably explains what we're seeing. It's probably firing a div0 exception on the GPU (bytes_per_frame = bitrate/framerate), but that probably doesn't get logged anywhere accessible.
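The div0 theory amounts to something like the following (a Python sketch of the suspected firmware arithmetic; the real code is GPU-side firmware and isn't visible here):

```python
def bytes_per_frame(bitrate, framerate):
    """Per-frame byte budget for rate control."""
    # The theory above: with no framerate propagated to the output port
    # this denominator is 0, and the firmware has no guard against it.
    if framerate <= 0:
        raise ValueError('no framerate set on the output port')
    return bitrate // (8 * framerate)

bytes_per_frame(1000000, 30)    # -> 4166
# bytes_per_frame(1000000, 0) raises here; the firmware would div0 instead
```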

@waveform80

waveform80 commented Nov 18, 2016

> The H264 limits come from H264 level spec. Actually the current limit is incorrect but only on a technicality...

Ah, that's good to know! I'll expand the checks in picamera with the relevant bits from that table.

> Although your debug says that frame rate is 0fps on ALL ports from video_splitter input onwards - that can't be right.

Hmm, just had a look at the debug printing code and I can't see how it'd be reporting anything other than what's actually set so I think that's actually the state of the pipeline. I'll see if I can figure out where the framerate info's getting lost (I don't think I set it to zero anywhere, but I could be wrong - or it might be happening implicitly somewhere).

> Racking my memory, H264 bitrate control can run based solely on the timestamps of the frames that come in should the framerate not be provided. The MJPEG encoder doesn't support that (although I guess it could), but that probably explains what we're seeing. It's probably firing a div0 exception on the GPU (bytes_per_frame = bitrate/framerate), but that probably doesn't get logged anywhere accessible.

Given the lack of framerate on the output port that certainly sounds likely. I'll try hacking that manually in a mo and see what happens...

@waveform80

waveform80 commented Nov 18, 2016

> I'll see if I can figure out where the framerate info's getting lost (I don't think I set it to zero anywhere, but I could be wrong - or it might be happening implicitly somewhere).

A little more on this - it seems committing the port format wipes the framerate info:

>>> from picamera import PiCamera
>>> camera = PiCamera()
>>> camera._splitter.inputs[0]
<MMALVideoPort "vc.ril.video_splitter:in:0(OPQV)": format=MMAL_FOURCC('OPQV') buffers=10x128 frames=1920x1080@0fps>
>>> camera._camera.outputs[1]
<MMALVideoPort "vc.ril.camera:out:1(OPQV)": format=MMAL_FOURCC('OPQV') buffers=10x128 frames=1920x1080@30fps>
>>> camera._splitter.enabled = False
>>> camera._splitter.inputs[0].copy_from(camera._camera.outputs[1])
>>> camera._splitter.inputs[0]
<MMALVideoPort "vc.ril.video_splitter:in:0(OPQV)": format=MMAL_FOURCC('OPQV') buffers=10x128 frames=1920x1080@30fps>
>>> camera._splitter.inputs[0].commit()
>>> camera._splitter.inputs[0]
<MMALVideoPort "vc.ril.video_splitter:in:0(OPQV)": format=MMAL_FOURCC('OPQV') buffers=3x128 frames=1920x1080@0fps>
>>> camera._splitter.enabled = True

I can't see anything in picamera's mmalobj layer which'd be causing that so I'll have a dig into userland and see if I can figure out what's going on!

@6by9

6by9 commented Nov 18, 2016

Fair cop.
video_splitter appears to not do anything with the frame rate field when given a new format. Sorry about that.
It's one of the "fun" things in IL that it handles Video domain things (which have a frame rate field) and Image domain things (which don't).
It should be a dozen lines total, so I'll try to get it sorted today and pushed to Pi Towers.

@6by9

6by9 commented Nov 19, 2016

I have a patch that waveform80 has tested for me (thank you) and works. I'm pushing it to Pi Towers, so it'll be released in a firmware update relatively soon and should fix MJPEG getting the bitrate totally wrong due to having no frame rate information.

Any updates to PiCamera to handle more H264 levels or altering MJPEG max bitrate are totally up to waveform80 to do if/when he can.

@chconnor

chconnor commented Nov 19, 2016

Awesome, thanks!

@6by9

6by9 commented Nov 23, 2016

Firmware update so that video_splitter forwards on the framerate has now been released via rpi-update (Hexxeh/rpi-firmware@3ffa1c6)

The normal warnings about rpi-update releases potentially being unstable apply. I don't know when the next firmware bump will be made to the Raspbian repo for a fully official release.

@3bl3gamer

3bl3gamer commented Dec 14, 2016

I have the same issue with full-size MJPEG on an RPi1 with a v1.3 camera. I made this as a workaround:

class MyEncoder(picamera.PiVideoEncoder):
    def start(self, output, motion_output=None):
        # Set up the frame state that PiVideoEncoder.start() would
        # normally initialize, since we bypass that method below
        self.frame = picamera.PiVideoFrame(
                index=0,
                frame_type=None,
                frame_size=0,
                video_size=0,
                split_size=0,
                timestamp=0,
                complete=False,
                )
        # Bump the output port's recommended buffer size to 2MB so that
        # large full-frame JPEGs don't stall the encoder
        self.output_port._port[0].buffer_size_recommended = 1024*1024*2
        # Call the grandparent PiEncoder.start() directly, skipping
        # PiVideoEncoder.start()
        super(picamera.PiVideoEncoder, self).start(output)


class MyCamera(picamera.PiCamera):
    def _get_video_encoder(self, *args, **kwargs):
        return MyEncoder(self, *args, **kwargs)

@waveform80 waveform80 added this to the 1.13 milestone Dec 22, 2016
