Making sure the last recorded frame is actually the last one I get in the output #513
Comments
Sorry, I don't think that's possible from the layer you're in. In the recording pipeline that picamera sets up, there's a set of buffers that each component uses to ensure things run smoothly without stalling. The firmware's defaults are sufficiently large to ensure that a write callback can take a little longer (not uncommon for things like SD card output) without preventing the camera from having a buffer available to capture another frame. You don't have control over this from the picamera interface, unless you go down to the mmalobj layer and start allocating all the components and buffers yourself (I don't think even picamera specifies the buffer counts - it just uses the firmware's defaults). Anyway, all this means that by the time a buffer reaches your `write` method, the camera may well have captured several more frames already, so there's no guarantee it's the last one recorded.
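If you did want to poke at that layer, a speculative sketch (assuming mmalobj's documented `MMALCamera` component and the `buffer_count`/`buffer_size` port properties) that merely inspects the firmware's defaults might look like this:

```python
# Speculative sketch: inspect the buffer counts/sizes the firmware picked
# for the camera's output ports via picamera's mmalobj layer. Assumes
# MMALPort exposes buffer_count and buffer_size as documented.
from picamera import mmalobj as mo

camera = mo.MMALCamera()
try:
    for i, port in enumerate(camera.outputs):
        print('output %d: %d buffers of %d bytes' %
              (i, port.buffer_count, port.buffer_size))
finally:
    camera.close()
```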
I also tried getting a signal from the camera in an attempt to synchronize the frames (that was before you replied here) and I gotta say you were right. This is what I tried: […]

Anyway, as a final option, I resorted to using the […]. Thanks for your help, Dave. I think it's all clear for me now.
As waveform80 says, you're at the end of a pipeline, so it's a touch tricky to judge the latency of a buffer through the system. As a rough estimate for 1080P, the frame takes whatever exposure time is programmed on the sensor, and then ~31ms to read out. I recall there was a forum thread where I actually measured the numbers - ah, https://www.raspberrypi.org/forums/viewtopic.php?t=153410&p=1027417#p1004792. You don't state your resolution, but your 6.7fps works out to ~149ms per frame, which would be about right for 1080P with a 30ms exposure time and a small safety margin.

Can you work the other way around? Every buffer has a timestamp, and you can retrieve the current system time from the GPU (I forget the call). So at point X you can read the GPU's clock, and all buffers stamped before that time will be from before your change.
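A minimal sketch of that timestamp comparison, relying on picamera's documented `PiCamera.timestamp` (current GPU clock, in microseconds) and `PiCamera.frame` (metadata of the frame currently being written); the `TimestampedOutput` class and `mark_change` helper are illustrative, not from the thread:

```python
# Record the GPU clock at the moment of the scene change, then compare it
# with each frame's timestamp in the write callback: any frame stamped at
# or after the change was captured after it.
import picamera

class TimestampedOutput(object):
    def __init__(self, camera):
        self.camera = camera
        self.change_time = None  # GPU timestamp (us) of the scene change

    def mark_change(self):
        # PiCamera.timestamp reads the same GPU clock that stamps every
        # frame buffer, so the two values are directly comparable.
        self.change_time = self.camera.timestamp

    def write(self, buf):
        frame = self.camera.frame
        # frame.timestamp can be None for some buffers (e.g. SPS headers)
        if (self.change_time is not None and
                frame.timestamp is not None and
                frame.timestamp >= self.change_time):
            pass  # this frame was captured after the scene change
        return len(buf)

    def flush(self):
        pass

camera = picamera.PiCamera(resolution=(480, 272))
output = TimestampedOutput(camera)
camera.start_recording(output, format='h264')
# ... whenever the scene is changed, call output.mark_change() ...
camera.wait_recording(5)
camera.stop_recording()
```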
So right now the resolution I'm using is 480x272 and the exposure time is set to 3ms - the LEDs I'm testing are bright enough to compensate for the short exposure, which has the added advantage of filtering out nearby lights. So theoretically the latency could be estimated at ~(31 + 3)ms. For the time being, 6.7 frames/s is enough. As for retrieving the timestamps from the buffers, I'm going to leave that for later, when getting more out of the system becomes necessary - it's a good idea that's worth trying out, though. Thank you 6by9.
Closing for now; do feel free to re-open if you've further questions about this!
I've got a time-sensitive mechanism that needs to be sure the last frame it got is actually the last one that got recorded until that moment. I'm using the `start_recording` method and I'm passing it an output object of a custom class I wrote that implements `write` and `flush` methods. The `write` method resembles this pseudocode (the original snippet is missing here; a rough reconstruction appears below).

Basically, inside `write` I'm changing the state of what the camera sees (like showing a unicorn instead of a bear) so that the change applies to the next frame it records and sends as a parameter to the `write` method. What I need is the assurance that the next frame will show whatever the `change_targeted_state_of_next_frame()` function wants it to show. Is this possible in this setting?

Hopefully, I've been as clear as possible 😃
Thank you!
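For reference, a rough reconstruction of the output class the question describes: only `change_targeted_state_of_next_frame()` is named in the original post; `process_frame`, `StateChangingOutput`, and the surrounding setup are hypothetical stand-ins.

```python
# Hypothetical reconstruction of the lost pseudocode: a custom output
# whose write() analyses the incoming frame data and then changes the
# scene for the next frame.
import picamera

def process_frame(buf):
    pass  # hypothetical: whatever analysis is done on the frame data

def change_targeted_state_of_next_frame():
    pass  # named in the post: e.g. show a unicorn instead of a bear

class StateChangingOutput(object):
    def write(self, buf):
        process_frame(buf)
        change_targeted_state_of_next_frame()
        return len(buf)

    def flush(self):
        pass  # called once when recording stops

camera = picamera.PiCamera()
camera.start_recording(StateChangingOutput(), format='h264')
camera.wait_recording(5)
camera.stop_recording()
```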