DJI FPV Architecture

The DJI FPV system uses a shared architecture between the Goggles, Air Unit, and Vista.

The main Goggles, Air Unit, and Vista processor is a DJI P1 (seemingly codenamed Pigeon), a dual-core ARM processor with several coprocessors on board. At least two Ceva XC4500 DSPs and one Ceva X1643 DSP are known so far, as well as a TrustZone environment with CryptoCell. Other processors in the family include the E1 (Eagle) in the FPV Drone and the S1 (Sparrow) in the V2 Remote Controller.

The basic FPV system runs Linux on one core and a custom real-time application on the second core.

On the Linux core, there's an Android kernel and base userland, but not the Android UI services like SurfaceFlinger or MediaServer.

For this reason, basic Android principles and services are all in play:

adb can be used for file transfer and debugging. The Android logger is employed, so adb logcat will give you interesting logs. setprop can be used to affect the Properties system and trigger actions in the running system.
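For example, assuming adb access to the unit is available (the property name below is a placeholder for illustration, not a real DJI property):

    # Tail the Android log ring buffer and filter for DJI services
    adb logcat -v time | grep -i dji

    # Pull a DJI binary off the device for inspection on the PC
    adb pull /system/bin/dji_glasses .

    # Read and set Android system properties on the device
    # ("some.dji.property" is a made-up name, not a real DJI property)
    adb shell getprop | grep dji
    adb shell setprop some.dji.property 1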

DJI-specific software is all stored in the /system partition.

DJI processes communicate with one another using an event bus and a message passing system called DUML.

The most important processes live in /system/bin and their libraries in /system/lib (a quick way to see which of these is actually running is shown after the list):

  • dji_glasses (V1) or dji_gls_wm150 (V2) : The base DJI DIY FPV UI process. Has debugging symbols, so a great place to start. This process handles rendering the overlay UI using Disko and DirectFB, coordinating video recording and playback using DMI devices, rendering the screensaver using direct frame buffer access, receiving button commands over the DJI event bus, and sending DJI bus messages to dji_hdvt_gnd to handle radio configuration actions. Notably, it does not handle the real-time FPV feed display, which is coordinated by dji_hdvt_gnd.
  • dji_glasses (V2) : The DJI Drone UI process. Whether to start dji_gls_wm150 or dji_glasses is selected by a script at boot time, and a full reboot is required to switch modes as the Drone also uses a different baseband/SDR RTOS image.
  • dji_hdvt_gnd : The "Ground Station" HD Video Transmission service. This supervises the real-time video feed by responding to DJI DUML message bus requests and telling the radio baseband what to do using DMI devices and memory mapping.
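To see which of these is actually running on a given unit (and therefore which mode it booted into), a process listing over adb is enough; the exact ps invocation depends on the Android build:

    # List running processes and filter for DJI services.
    # On some Android builds you may need "ps -A" instead of plain "ps".
    adb shell ps | grep dji
    # Expect dji_hdvt_gnd plus either dji_glasses or dji_gls_wm150,
    # depending on hardware version and the selected boot mode.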

These services communicate over the DJI event bus using DUPC, a simple binary serialized/packed message bus format. dji_mb_ctrl and dji_mb_parser can be used to manipulate the message bus from userland. The message bus is also exposed over a USB-serial gadget, which can be talked to using https://github.com/o-gs/dji-firmware-tools and comm_serialtalk.py, or online using https://b3yond.d3vl.com/duml/ . The DJI event bus is also forwarded between the Air Unit / Vista and the Goggles over the radio link's IP layer.
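A rough sketch of poking the bus from a PC over the USB-serial gadget with comm_serialtalk.py (the flag names, the serial device path, and the receiver/command values here are assumptions that vary by setup and tool version - check the script's --help for the exact options):

    # Send a simple DUML query (General cmd set, cmd id 1, i.e. a
    # "get version"-style request) over the USB-serial gadget.
    # Treat the receiver type and device path as placeholders.
    ./comm_serialtalk.py --port /dev/ttyACM0 -vv \
        --receiver_type=Gimbal --cmd_set=General --cmd_id=1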

In /system/lib:

  • libtp1801_gui.so and libmms*.so : The Disko GUI framework. This uses DirectFB to render widgets on the screen - the whole menu system is implemented this way using XML definitions in /system/gui .
  • libduml_*.so : The frameworks used by all shared DJI applications. FRWK and HAL contain hardware abstraction. OSAL contains OS Abstraction - basically RTOS primitives built on top of PThreads. Util contains... utilities.

UI is driven through a DirectFB mux - dji_glasses takes over DirectFB, and something at a lower level (using ICC and the "DMI devices") handles the mux between the DirectFB overlay and the raw video data from the baseband. An example of using DirectFB can be found at https://github.com/fpv-wtf/dfbdoom .

The base RF communication between the Goggles and Air Unit/Vista happens through a baseband OS called, rather simply, SDR, which runs on a second core on the same P1 processor, with shared address space. The second core is sometimes referred to as CP (Coprocessor), sometimes as SDR, and sometimes as rtos in various places in code.

Division between the RTOS and Linux seems to be through simple core affinity - Linux gets one core and the RF baseband gets the other. Communication between the two systems happens through a combination of a proprietary ICC (Inter Core Communication) system, virtual modem devices called "dmi" which are exposed in /dev by the pigeon_modem module (using the ICC framework under the covers), and simply mapping buffers by opening /dev/mem, since the address space is shared.

cp_RTOS.bin is the RTOS image for the second core. It can be extracted, along with other coprocessor content (like the DSP code and related data), from the file called cp.img in an update package using https://github.com/o-gs/dji-firmware-tools.
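A rough sketch with dji-firmware-tools (the firmware file name and the authentication key names below are placeholders, and the exact keys and module layout differ per device and firmware version - see the dji-firmware-tools documentation for your target):

    # Un-sign / extract the modules from a downloaded firmware package.
    # Key names passed with -k and the package file name are placeholders.
    ./dji_imah_fwsig.py -vv -u -k PRAK-2018-01 -i wm150_firmware_module.sig

    # One of the extracted modules is the coprocessor container (cp.img);
    # cp_RTOS.bin and the DSP images live inside it.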

There are a few interesting kernel modules as well (a quick way to check what's actually loaded is shown after the list):

  • pigeon_modem - this seems to provide the virtual networking interface which tunnels IP over the SDR radio link. It also provides /dev/dmi* devices which are used to, for example, stream video to and from the SDR baseband.
  • icc_chnl_v2 - this is used by pigeon_modem to accomplish Inter Core Communication (ICC) with the SDR OS. It includes the usual queue, mailbox, and shared buffer primitives, seemingly with support for both polled and interrupt-driven queues and mailboxes.
  • eagle-dsp - responsible for firmware upload and communication with a proprietary DSP coprocessor.
  • linux_fusion - this is the DirectFB Fusion IPC driver. Fusion IPC seems to only be used for display tasks as far as I can tell, with DJI IPC mechanisms used in most places instead.
  • prores-enc - it seems there's a hardware ProRes encoder present (!)
  • csa_dev - I think this is some kind of checksum offload accelerator.
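A quick way to confirm what's loaded on a running unit (assuming adb shell access; interface names and device nodes may differ between firmware versions):

    # List loaded kernel modules and look for the ones described above
    adb shell lsmod

    # The virtual modem / DMI device nodes exposed by pigeon_modem
    adb shell ls -l /dev/dmi*

    # Network interfaces, including the one tunnelling IP over the radio link
    adb shell ip link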

Common things people might want to do:

  • UI Customization: the UI for DIY FPV mode is defined using XML in /system/gui/xml/ . The actions triggered by GUI elements are built using the Disko open-source UI framework and are defined in libtp1801_gui.so. You can rearrange menus, add/remove items, etc. simply by editing the XML files (see the sketch after this list). Changing the actual GUI actions is harder and requires patching or intercepting method calls.

  • MSP / Canvas Mode OSD / etc: There are a few ways this could be done. https://github.com/fpv-wtf/msp-osd pursues one approach: sending MSP DisplayPort commands over UDP back to an overlay renderer running on the goggles.

  • Video In: the input path for the Apple iOS Goggles interface is managed in dji_hdvt_gnd and libduml_frwk; these would be good places to start. Alternatively, a custom application which opens the USB bulk endpoints and accepts data directly could work as well - performance optimization would be interesting. There's a video playback DMI device which works similarly to the video recording DMI device used here: https://github.com/fpv-wtf/msp-osd/blob/recording/recorder.c . We might be able to write an application which bulk-streams data from a USB transfer endpoint (using hardware encoding on the PC side) into the video playback DMI device. It remains to be seen whether latency with this method could be made acceptable.
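For the UI customization item above, a minimal sketch of the edit loop over adb (this assumes /system can be remounted read-write on your unit, e.g. after rooting; the XML file name is only an example):

    # Grab the GUI definition tree for local editing
    adb pull /system/gui/xml/ ./gui-xml/

    # ...edit the XML locally (rearrange menus, drop items, etc.)...

    # Remount /system writable and push the changed file back
    # (the file name here is illustrative)
    adb shell mount -o remount,rw /system
    adb push ./gui-xml/some_menu.xml /system/gui/xml/

    # Reboot so the UI process re-reads its XML definitions
    adb reboot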
