2 changes: 1 addition & 1 deletion Gemfile
@@ -8,7 +8,7 @@ source "https://rubygems.org"
#
# This will help ensure the proper Jekyll version is running.
# Happy Jekylling!
gem "jekyll", "~> 4.2.1"
gem "jekyll", "~> 4.2.2"

# This is the default theme for new Jekyll sites. You may change this to anything you like.
gem "minima", "~> 2.0"
16 changes: 8 additions & 8 deletions Gemfile.lock
@@ -10,12 +10,12 @@ GEM
eventmachine (>= 0.12.9)
http_parser.rb (~> 0)
eventmachine (1.2.7)
-ffi (1.15.4)
+ffi (1.15.5)
forwardable-extended (2.6.0)
http_parser.rb (0.8.0)
-i18n (1.8.11)
+i18n (1.10.0)
concurrent-ruby (~> 1.0)
-jekyll (4.2.1)
+jekyll (4.2.2)
addressable (~> 2.4)
colorator (~> 1.0)
em-websocket (~> 0.5)
@@ -35,7 +35,7 @@ GEM
jekyll (>= 3.0.0)
jekyll-feed (0.16.0)
jekyll (>= 3.7, < 5.0)
-jekyll-sass-converter (2.1.0)
+jekyll-sass-converter (2.2.0)
sassc (> 2.0.1, < 3.0)
jekyll-seo-tag (2.7.1)
jekyll (>= 3.8, < 5.0)
@@ -46,7 +46,7 @@ GEM
kramdown-parser-gfm (1.1.0)
kramdown (~> 2.0)
liquid (4.0.3)
-listen (3.7.0)
+listen (3.7.1)
rb-fsevent (~> 0.10, >= 0.10.3)
rb-inotify (~> 0.9, >= 0.9.10)
mercenary (0.4.0)
@@ -62,11 +62,11 @@ GEM
forwardable-extended (~> 2.6)
public_suffix (4.0.6)
racc (1.6.0)
-rb-fsevent (0.11.0)
+rb-fsevent (0.11.1)
rb-inotify (0.10.1)
ffi (~> 1.0)
rexml (3.2.5)
-rouge (3.27.0)
+rouge (3.28.0)
safe_yaml (1.0.5)
sassc (2.4.0)
ffi (~> 1.9)
@@ -83,7 +83,7 @@ PLATFORMS
ruby

DEPENDENCIES
-jekyll (~> 4.2.1)
+jekyll (~> 4.2.2)
jekyll-asciidoc
jekyll-feed (~> 0.16)
minima (~> 2.0)
4 changes: 2 additions & 2 deletions documentation/asciidoc/accessories/build-hat/net-brick.adoc
@@ -39,7 +39,7 @@ Console.WriteLine($"{BitConverter.ToString(info.Signature)}");
Console.WriteLine($"Vin = {brick.InputVoltage.Volts} V");
----

-NOTE: the input voltage is read only once at boot time and is not read again afterwards.
+NOTE: The input voltage is read only once at boot time and is not read again afterwards.

==== Getting sensors and motors details

@@ -111,4 +111,4 @@ The brick can take long before it initializes. A wait for sensor to be connected
brick.WaitForSensorToConnect(SensorPort.PortB);
----

It also takes a `CancellationToken` if you want to implement advanced features like warning the user after some time and retrying.
@@ -41,7 +41,7 @@ The legacy camera stack can be re-enabled in Bullseye using the following steps.
1. Ensure your system is up-to-date and reboot it.
2. Run `sudo raspi-config`.
3. Navigate to `Interface Options` and select `Legacy camera` to enable it.
-4. Reboot your Pi again.
+4. Reboot your Raspberry Pi again.

These steps are shown in the following video.

4 changes: 2 additions & 2 deletions documentation/asciidoc/accessories/camera/csi-2-usage.adoc
@@ -181,11 +181,11 @@ There are a couple of commercially available boards that connect this chip to th

This driver is loaded using the `config.txt` dtoverlay `tc358743`.

-The chip also supports capturing stereo HDMI audio via I2S. The Auvidea boards break the relevant signals out onto a header, which can be connected to the Pi's 40 pin header. The required wiring is:
+The chip also supports capturing stereo HDMI audio via I2S. The Auvidea boards break the relevant signals out onto a header, which can be connected to the Raspberry Pi's 40-pin header. The required wiring is:

[cols=",^,^,^"]
|===
-| Signal | B101 header | Pi 40 pin header | BCM GPIO
+| Signal | B101 header | 40-pin header | BCM GPIO

| LRCK/WFS
| 7
@@ -4,13 +4,13 @@ Building `libcamera` and `libcamera-apps` for yourself can bring the following b

* You can pick up the latest enhancements and features.

-* `libcamera-apps` can be compiled with extra optimisation for Pi 3 and Pi 4 devices running a 32-bit OS.
+* `libcamera-apps` can be compiled with extra optimisation for Raspberry Pi 3 and Raspberry Pi 4 devices running a 32-bit OS.

* You can include the various optional OpenCV and/or TFLite post-processing stages (or add your own).

* You can customise or add your own applications derived from `libcamera-apps`.

-NOTE: When building for Pi 3 or earlier devices there is a risk that the device may run out of swap and fail. We recommend either increasing the amount of swap, or building with fewer threads (the `-j` option to `ninja` and to `make`).
+NOTE: When building on a Raspberry Pi with 1GB or less of RAM, there is a risk that the device may run out of swap and fail. We recommend either increasing the amount of swap, or building with fewer threads (the `-j` option to `ninja` and to `make`).

==== Building `libcamera-apps` without rebuilding `libcamera`

@@ -89,7 +89,7 @@ The only difference is that the latter also builds the `qcam` test application,
To complete the `libcamera` build, please run

----
-ninja -C build # use -j 2 on Pi 3 or earlier devices
+ninja -C build # use -j 2 on Raspberry Pi 3 or earlier devices
sudo ninja -C build install
----

@@ -138,7 +138,7 @@ cd build

At this point you will need to run `cmake` after deciding what extra flags to pass it. The valid flags are:

-* `-DENABLE_COMPILE_FLAGS_FOR_TARGET=armv8-neon` - you may supply this when building for Pi 3 or Pi 4 devices running a 32-bit OS. Some post-processing features may run more quickly.
+* `-DENABLE_COMPILE_FLAGS_FOR_TARGET=armv8-neon` - you may supply this when building for Raspberry Pi 3 or Raspberry Pi 4 devices running a 32-bit OS. Some post-processing features may run more quickly.

* `-DENABLE_DRM=1` or `-DENABLE_DRM=0` - this enables or disables the DRM/KMS preview rendering. This is what implements the preview window when X Windows is not running.

@@ -162,16 +162,16 @@ and for Raspberry Pi OS Lite users:
cmake .. -DENABLE_DRM=1 -DENABLE_X11=0 -DENABLE_QT=0 -DENABLE_OPENCV=0 -DENABLE_TFLITE=0
----

-In both cases, consider `-DENABLE_COMPILE_FLAGS_FOR_TARGET=armv8-neon` if you are using a 32-bit OS on a Pi 3 or Pi 4. Consider `-DENABLE_OPENCV=1` if you have installed _OpenCV_ and wish to use OpenCV-based post-processing stages. Finally also consider `-DENABLE_TFLITE=1` if you have installed _TensorFlow Lite_ and wish to use it in post-processing stages.
+In both cases, consider `-DENABLE_COMPILE_FLAGS_FOR_TARGET=armv8-neon` if you are using a 32-bit OS on a Raspberry Pi 3 or Raspberry Pi 4. Consider `-DENABLE_OPENCV=1` if you have installed _OpenCV_ and wish to use OpenCV-based post-processing stages. Finally also consider `-DENABLE_TFLITE=1` if you have installed _TensorFlow Lite_ and wish to use it in post-processing stages.

After executing the `cmake` command of your choice, the whole process concludes with the following:

----
-make -j4 # use -j1 on Pi 3 or earlier devices
+make -j4 # use -j1 on Raspberry Pi 3 or earlier devices
sudo make install
sudo ldconfig # this is only necessary on the first build
----

NOTE: If you are using an image where `libcamera-apps` have been previously installed as an `apt` package, and you want to run the new `libcamera-apps` executables from the same terminal window where you have just built and installed them, you may need to run `hash -r` to be sure to pick up the new ones over the system supplied ones.

Finally, if you have not already done so, please be sure to follow the `dtoverlay` and display driver instructions in the xref:camera.adoc#getting-started[Getting Started section] (and rebooting if you changed anything there).
@@ -2,7 +2,7 @@

==== Using the camera for the first time

-NOTE: On Pi3 and earlier devices running _Bullseye_ you need to re-enable _Glamor_ in order to make the X-Windows hardware accelerated preview window work. To do this enter `sudo raspi-config` at a terminal window and then choose `Advanced Options`, `Glamor` and `Yes`. Finally quit `raspi-config` and let it reboot your Pi.
+NOTE: On Raspberry Pi 3 and earlier devices running _Bullseye_ you need to re-enable _Glamor_ in order to make the X-Windows hardware accelerated preview window work. To do this enter `sudo raspi-config` at a terminal window and then choose `Advanced Options`, `Glamor` and `Yes`. Finally quit `raspi-config` and let it reboot your Raspberry Pi.

When running a Raspberry Pi OS based on _Bullseye_, the 5 basic `libcamera-apps` are already installed. In this case, official Raspberry Pi cameras will also be detected and enabled automatically.

@@ -17,7 +17,7 @@ You should see a camera preview window for about 5 seconds.

Users running _Buster_ will need to xref:camera.adoc#binary-packages[install one of the `libcamera-apps` packages] first and then configure their `/boot/config.txt` file xref:camera.adoc#if-you-do-need-to-alter-the-configuration[with the appropriate overlay] for the connected camera. Be aware that _libcamera_ and the legacy _raspicam_ stack cannot operate at the same time - to return to the legacy stack after using _libcamera_ you will need to comment out the `dtoverlay` change you made and reboot the system.

-NOTE: Pi 3 and older devices may not by default be using the correct display driver. Refer to the `/boot/config.txt` file and ensure that either `dtoverlay=vc4-fkms-v3d` or `dtoverlay=vc4-kms-v3d` is currently active. Please reboot if you needed to change this.
+NOTE: Raspberry Pi 3 and older devices may not by default be using the correct display driver. Refer to the `/boot/config.txt` file and ensure that either `dtoverlay=vc4-fkms-v3d` or `dtoverlay=vc4-kms-v3d` is currently active. Please reboot if you needed to change this.

==== If you do need to alter the configuration

@@ -4,7 +4,7 @@

`libcamera` is a new software library aimed at supporting complex camera systems directly from the Linux operating system. In the case of the Raspberry Pi it enables us to drive the camera system directly from open source code running on ARM processors. The proprietary code running on the Broadcom GPU, and to which users have no access at all, is almost completely by-passed.

-`libcamera` presents a C++ API to applications and works at the level of configuring the camera and then allowing an application to request image frames. These image buffers reside in system memory and can be passed directly to still image encoders (such as JPEG) or to video encoders (such as h.264), though such ancillary functions as encoding images or displaying them are strictly beyond the purview of `libcamera` itself.
+`libcamera` presents a {cpp} API to applications and works at the level of configuring the camera and then allowing an application to request image frames. These image buffers reside in system memory and can be passed directly to still image encoders (such as JPEG) or to video encoders (such as h.264), though such ancillary functions as encoding images or displaying them are strictly beyond the purview of `libcamera` itself.

For this reason Raspberry Pi supplies a small set of example `libcamera-apps`. These are simple applications, built on top of `libcamera`, and are designed largely to emulate the function of the legacy stack built on Broadcom's proprietary GPU code (some users will recognise these legacy applications as `raspistill` and `raspivid`). The applications we provide are:

@@ -13,7 +13,7 @@ For this reason Raspberry Pi supplies a small set of example `libcamera-apps`. T
* _libcamera-still_ A more complex still image capture application which emulates more of the features of `raspistill`.
* _libcamera-vid_ A video capture application.
* _libcamera-raw_ A basic application for capturing raw (unprocessed Bayer) frames directly from the sensor.
-* _libcamera-detect_ This application is not built by default, but users can build it if they have TensorFlow Lite installed on their Pi. It captures JPEG images when certain objects are detected.
+* _libcamera-detect_ This application is not built by default, but users can build it if they have TensorFlow Lite installed on their Raspberry Pi. It captures JPEG images when certain objects are detected.

Raspberry Pi's `libcamera-apps` are not only command line applications that make it easy to capture images and video from the camera, they are also examples of how users can create their own libcamera-based applications with custom functionality to suit their own requirements. The source code for the `libcamera-apps` is freely available under a BSD 2-Clause licence at https://github.com/raspberrypi/libcamera-apps[].
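
As a rough illustration of what driving the {cpp} API looks like, the sketch below just starts a `CameraManager` and lists the attached cameras. It is written against the upstream `libcamera` API rather than taken from `libcamera-apps`, and the exact headers and calls may vary between `libcamera` versions.

----
#include <libcamera/libcamera.h>

#include <iostream>
#include <memory>

int main()
{
    // Start the camera manager, which enumerates the cameras libcamera can drive.
    auto cm = std::make_unique<libcamera::CameraManager>();
    if (cm->start())
        return 1;

    // Print the identifier of each detected camera.
    for (const auto &camera : cm->cameras())
        std::cout << "Found camera: " << camera->id() << std::endl;

    cm->stop();
    return 0;
}
----

A real application would then acquire one of these cameras, configure its streams and queue requests for frames, which is what the `libcamera-apps` sources demonstrate.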

@@ -25,11 +25,11 @@ The `libcamera` source code can be found and checked out from the https://git.li

Underneath the `libcamera` core, Raspberry Pi provides a custom _pipeline handler_, which is the layer that `libcamera` uses to drive the sensor and ISP (Image Signal Processor) on the Raspberry Pi itself. Also part of this is a collection of well-known _control algorithms_, or _IPAs_ (Image Processing Algorithms) in `libcamera` parlance, such as AEC/AGC (Auto Exposure/Gain Control), AWB (Auto White Balance), ALSC (Auto Lens Shading Correction) and so on.

-All this code is open source and now runs on the Pi's ARM cores. There is only a very thin layer of code on the GPU which translates Raspberry Pi's own control parameters into register writes for the Broadcom ISP.
+All this code is open source and now runs on the Raspberry Pi's ARM cores. There is only a very thin layer of code on the GPU which translates Raspberry Pi's own control parameters into register writes for the Broadcom ISP.

Raspberry Pi's implementation of `libcamera` supports not only the three standard Raspberry Pi cameras (the OV5647 or V1 camera, the IMX219 or V2 camera and the IMX477 or HQ camera) but also third party sensors such as the IMX290, IMX327, OV9281, IMX378. Raspberry Pi is keen to work with vendors who would like to see their sensors supported directly by `libcamera`.

-Moreover, Raspberry Pi supplies a _tuning file_ for each of these sensors which can be edited to change the processing performed by the Pi hardware on the raw images received from the image sensor, including aspects like the colour processing, the amount of noise suppression or the behaviour of the control algorithms.
+Moreover, Raspberry Pi supplies a _tuning file_ for each of these sensors which can be edited to change the processing performed by the Raspberry Pi hardware on the raw images received from the image sensor, including aspects like the colour processing, the amount of noise suppression or the behaviour of the control algorithms.

For further information on `libcamera` for the Raspberry Pi, please consult the https://datasheets.raspberrypi.com/camera/raspberry-pi-camera-guide.pdf[Tuning Guide for the Raspberry Pi cameras and libcamera].

@@ -70,7 +70,7 @@ image::images/negate.jpg[Image with negate]

The `hdr` stage implements both HDR (high dynamic range) imaging and DRC (dynamic range compression). In the terminology that we use here, DRC operates on single images, while HDR accumulates multiple under-exposed images and then performs the same algorithm as DRC.

-The `hdr` stage has no dependencies on 3rd party libraries, but (like some other stages) may execute more quickly on Pi 3 or Pi 4 devices running a 32-bit OS if recompiled using `-DENABLE_COMPILE_FLAGS_FOR_TARGET=armv8-neon` (please see the xref:camera.adoc#building-libcamera-and-libcamera-apps[build instructions]). Specifically, the image accumulation stage will run quicker and result in fewer frame drops, though the tonemapping part of the process is unchanged.
+The `hdr` stage has no dependencies on 3rd party libraries, but (like some other stages) may execute more quickly on Raspberry Pi 3 or Raspberry Pi 4 devices running a 32-bit OS if recompiled using `-DENABLE_COMPILE_FLAGS_FOR_TARGET=armv8-neon` (please see the xref:camera.adoc#building-libcamera-and-libcamera-apps[build instructions]). Specifically, the image accumulation stage will run quicker and result in fewer frame drops, though the tonemapping part of the process is unchanged.

The basic procedure is that we take the image (which in the case of HDR may be multiple images accumulated together) and apply an edge-preserving smoothing filter to generate a low pass (LP) image. We define the high pass (HP) image to be the difference between the LP image and the original. Next we apply a global tonemap to the LP image and add back the HP image. This procedure, in contrast to applying the tonemap directly to the original image, prevents us from squashing and losing all the local contrast in the resulting image.
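
As a concrete, if greatly simplified, sketch of that LP/HP split, the {cpp} fragment below processes a single-channel float image with values between 0 and 1. The box filter and the tonemap curve are stand-ins chosen for brevity and are not the filters the real `hdr` stage uses.

----
#include <vector>

// Sketch of the LP/HP decomposition: smooth to get a low-pass image, tonemap
// only that, then add the high-pass detail back so local contrast survives.
std::vector<float> tonemapPreserveDetail(const std::vector<float> &img,
                                         int width, int height, int radius)
{
    std::vector<float> lp(img.size()), out(img.size());

    // 1. Low-pass image via a naive box filter (stand-in for the
    //    edge-preserving smoothing filter described above).
    for (int y = 0; y < height; y++) {
        for (int x = 0; x < width; x++) {
            float sum = 0.0f;
            int count = 0;
            for (int dy = -radius; dy <= radius; dy++) {
                for (int dx = -radius; dx <= radius; dx++) {
                    int yy = y + dy, xx = x + dx;
                    if (yy < 0 || yy >= height || xx < 0 || xx >= width)
                        continue;
                    sum += img[yy * width + xx];
                    count++;
                }
            }
            lp[y * width + x] = sum / count;
        }
    }

    // 2. HP = original - LP; tonemap the LP image only; add the HP detail back.
    for (size_t i = 0; i < img.size(); i++) {
        float hp = img[i] - lp[i];
        float mapped = lp[i] / (lp[i] + 0.25f); // simple global tonemap curve
        out[i] = mapped + hp;                   // local contrast is preserved
    }
    return out;
}
----
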

Expand All @@ -93,7 +93,7 @@ In summary, the user-configurable parameters fall broadly into three groups: tho

We note that the overall strength of the processing is best controlled by changing the `global_tonemap_strength` and `local_tonemap_strength` parameters.

-The full processing takes between 2 and 3 seconds for a 12MP image on a Pi 4. The stage runs only on the still image capture, it ignores preview and video images. In particular, when accumulating multiple frames, the stage "swallows" the output images so that the application does not receive them, and finally sends through only the combined and processed image.
+The full processing takes between 2 and 3 seconds for a 12MP image on a Raspberry Pi 4. The stage runs only on the still image capture, it ignores preview and video images. In particular, when accumulating multiple frames, the stage "swallows" the output images so that the application does not receive them, and finally sends through only the combined and processed image.

Default `drc.json` file for DRC:

@@ -1,6 +1,6 @@
=== Post-Processing with TensorFlow Lite

-NOTE: These stages require TensorFlow Lite (TFLite) libraries to be installed that export the C++ API. Unfortunately the TFLite libraries are not normally distributed conveniently in this form, however, one place where they can be downloaded is https://lindevs.com/install-precompiled-tensorflow-lite-on-raspberry-pi/[lindevs.com]. Please follow the installation instructions given on that page. Subsequently you may need to recompile `libcamera-apps` with TensorFlow Lite support - please follow the instructions for xref:camera.adoc#building-libcamera-and-libcamera-apps[building `libcamera-apps` for yourself].
+NOTE: These stages require TensorFlow Lite (TFLite) libraries to be installed that export the {cpp} API. Unfortunately the TFLite libraries are not normally distributed conveniently in this form, however, one place where they can be downloaded is https://lindevs.com/install-precompiled-tensorflow-lite-on-raspberry-pi/[lindevs.com]. Please follow the installation instructions given on that page. Subsequently you may need to recompile `libcamera-apps` with TensorFlow Lite support - please follow the instructions for xref:camera.adoc#building-libcamera-and-libcamera-apps[building `libcamera-apps` for yourself].

==== `object_classify_tf` stage

@@ -18,7 +18,7 @@ and forward it to the preview window:
app.ShowPreview(completed_request, app.ViewfinderStream());
----

-One important thing to note is that every `CompletedRequest` must be recycled back to the camera system so that the buffers can be reused, otherwise it will simply run out of buffers in which to receive new camera frames. This recycling process happens automatically when all references to the `CompletedRequest` are dropped, using C++'s _shared pointer_ and _custom deleter_ mechanisms.
+One important thing to note is that every `CompletedRequest` must be recycled back to the camera system so that the buffers can be reused, otherwise it will simply run out of buffers in which to receive new camera frames. This recycling process happens automatically when all references to the `CompletedRequest` are dropped, using {cpp}'s _shared pointer_ and _custom deleter_ mechanisms.
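
The mechanism can be illustrated with a small, self-contained {cpp} sketch. The `Request` and `Camera` types below are invented stand-ins rather than the real `libcamera-apps` classes; the point is only that the custom deleter re-queues the request instead of freeing it.

----
#include <iostream>
#include <memory>

struct Request { int id; };

struct Camera {
    // Stand-in for handing the buffers back to the camera system.
    void requeue(Request *req) { std::cout << "request " << req->id << " recycled\n"; }
};

// Wrap a completed request in a shared_ptr whose custom deleter recycles it.
std::shared_ptr<Request> makeCompletedRequest(Request *req, Camera *cam)
{
    return std::shared_ptr<Request>(req, [cam](Request *r) { cam->requeue(r); });
}

int main()
{
    Camera cam;
    Request raw{42};
    {
        auto completed = makeCompletedRequest(&raw, &cam);
        auto held_by_preview = completed; // another reference, e.g. the preview window
    } // last reference dropped here, so the deleter recycles the request
    return 0;
}
----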

In `libcamera-hello` therefore, two things must happen for the `CompletedRequest` to be returned to the camera.
