Partial sim no Gaz
PART B Sim partial. For almost two months I have been working on (among other things) a proof-of-concept demo of a simulated AI drone with an onboard companion computer (CC) and camera. The CC runs AI and takes over drone mission control when it recognizes an object (in this example, a human face). On 24.0131 I finally got the basic demo working. For (draft) details see
- Gdrive (https://drive.google.com/drive/folders/1HrzLExPTAL5PIKx_j_y0GJ6_RANR8Tjm)
- Document 3a_pymavlink_v24_24.0202_(haar_only).docx 2nd chapter
PS: Why do this? A quadcopter (a drone with four rotors) can only be flown with the help of a flight controller (FC), which controls flight by (amazingly) only varying rotor speeds. The pilot (RC transmitter, ground control station, etc.) merely sends "simple" protocol commands (MAVLink) to the FC. Therefore, to test the flight commands the CC AI script sends to the FC (to ensure nothing crazy happens in real flight), simulation is entirely sufficient. The only real hardware you need is the camera (the CC should also be real, but for reasons explained in the docs I am using an Ubuntu PC for now). The diagram below is my own.
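The takeover logic described above can be sketched in a few lines. This is a minimal illustration, not the actual demo code: the MAVLink link is stubbed out with a stand-in class (in the real demo it would be a pymavlink `mavutil` connection to the FC/SITL), and all names (`StubLink`, `on_frame`, the command strings) are made up for this sketch.

```python
# Sketch of the CC takeover loop: when the AI detects a face, the
# companion computer sends mode-change and loiter commands to the FC.
# The MAVLink connection is stubbed; names here are illustrative only.

class StubLink:
    """Stand-in for a MAVLink connection; just records what was sent."""
    def __init__(self):
        self.sent = []

    def send(self, command, **params):
        self.sent.append((command, params))


def on_frame(face_detected, link, state):
    """Called once per camera frame by the AI loop."""
    if face_detected and not state["taken_over"]:
        # The real script would send the equivalent MAVLink commands
        # (e.g. a mode change to GUIDED, then a loiter/position command)
        # over a pymavlink connection instead of this stub.
        link.send("SET_MODE", mode="GUIDED")
        link.send("LOITER_UNLIM")
        state["taken_over"] = True  # take over only once
    return state


# Simulate three frames: no face, face, face (takeover fires only once).
link, state = StubLink(), {"taken_over": False}
for detected in (False, True, True):
    state = on_frame(detected, link, state)

print(link.sent)
```

The point of the stub is exactly the one made above: because the FC only ever sees protocol commands, the decision logic can be tested end-to-end in simulation, swapping the stub for a real connection only at the end.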
PS: Why am I doing this? Because I could never find a simple demo that shows how, and I spent a lot of time looking. It's possible this approach has limited value, because the camera does not move with the simulated drone and landscape. But I think it has value because you can test the real CC/camera early on. Note that PART A Sim all shows a demo using a simulated camera on the simulated drone, but I would think that method has some big limitations (not least the sim camera/CC, which are crude approximations of the real camera/CC). In any case, this exercise is a good way to learn MAVLink (the low-level communication protocol the FC uses) and how Mission Planner (MP) and SITL work.
The following shows my home office setup (note how the tea cup is usually strategically positioned where it can do maximum damage in the event of a spill).
PART A Sim all. Completed in Dec 2023.
2024.0624 (12) Wiki, Gdrive, ZiptieAI.com, ZiptieAI docs. The author is looking for a job!
(0) Reference
EPIC 1 - Ziptie'd FPVs
(1) FPV simulators (inav notes 0608)
- 2.0 Technical overviews
- 2.1 MVP (min viable platform)
- 2.2 GPS
- 2.3 Video
- 2.4 Carbon frame
- 2.5 Missions
- 2.6 Tuning (new)
- 2.7 Post-crash rebuild 24.0620
(3) SBeeF405/BF (1b)
(4) SBeeF405/AP (1c)
EPIC 2 - Ziptie'd Pixhawks
(5) Pix6c/PX4 (2a)
(6) Pix6c/AP (2b)
EPIC 3 - AI basic
(7) AI JNano (7.2)
- 7.0 Tech overviews (?)
- 7.1 Config object recog (Jnano+cam)
- 7.2 Kitchen build/test
- 7.3 Field build/test
(8) AI PI5 (7.1) (TODO)
EPIC 4 - AI advanced
(9) FC HITL (3)
(10) SITL (total sim) (1)
- 10.1 SITL AP Core (SITL, Gaz, ROS)
- 10.2 SITL AP C++ mission as ROS node
- 10.3 SITL AP AI Yolo obj recog
- 10.4 SITL AP Lidar obj avoid
- 10.5 SITL AP Drone swarm
- 10.6 SITL AP Flight planners
- 10.7 SITL PX4 Core (SITL, Gaz, ROS) 24.0219
- 10.8 SITL PX4 Matlab UAV toolbox 24.0222
- 10.9 SITL PX4 Matlab AI sim 24.0225
(12) ROS ecosystem
(13) CC AI (2)
EPIC 5 - Advanced platforms
(14) Firmware dev (5)
(16) Special projects (5.6,5.5)