hovergames3

Overview

The drone plans a route for the rover to the agricultural field target and guides the rover along it. The drone designates the plot of land to be used by the rover. The rover organizes the plot into rows and plans a route to visit each "plant". A virtual multi-tool robotic arm "attached" to the rover treats each plant according to its state.

Development plan: hardware and software stacks

  1. Rover.
    1. Set up PX4 development environment and test it with the default simulator.
    2. Set up RDDRONE vehicle programming (bootloader, J-LINK).
    3. Program the rover with FMUK66 firmware.
    4. Set up the FS-i6S radio transmitter.
    5. Update QGroundControl.
    6. Set up the rover in QGC.
    7. Figure out how to arm (possibly through GPS).
    8. Manually operate the rover. Don't exhaust the battery.
    9. Tune the steering angle if necessary.
    10. Mount the telemetry transmitter.
    11. Drive the rover with telemetry until the battery is exhausted. Have a spare battery on hand (a new one will not need charging).
    12. (Optional) Try the NXP example application.
  2. NavQ+: rover companion computer. Answer questions and rearrange bullets.
    1. How is NavQ+ programmed?
    2. Inspect and inventory package.
    3. Collect online documentation and resources.
    4. Find a setup guide and follow it. Put steps underneath this one!
    5. Establish the development cycle for NavQ+ applications. (See NXP i.MX 8M Plus block diagram.)
      1. Applications running on the 4-core Cortex-A53.
      2. Applications running on the single-core Cortex-M7. Real time?
      3. Applications requiring machine learning acceleration.
    6. Connectivity of NavQ+ and FMUK66. Answer questions and rearrange bullets. (A minimal telemetry-link sketch follows this list.)
      1. How does the companion computer figure in autonomous rover motion and navigation?
      2. This seems overly complicated. The only documentation from NXP so far is the NavQPlus_MR-Buggy3 Tradeshow Demo Guide, which involves ROS2 and a number of additional CAN devices. Problems:
        1. The hovergames3 forum has reports that ROS2 is not working (there is no current bridge, though the old bridge seems to work).
        2. The PX4 development stack does not support ROS2 under macOS.
    7. Mounting and powering of NavQ+ on the rover.
  3. Coral cam with NavQ+: rover computer vision. Answer questions and rearrange bullets.
    1. Can the view from the Coral cam be seen on QGroundControl?
    2. How can dynamic video object segmentation run on the rover, and how can the result be viewed live?
  4. Bosch BME688: rover environmental sensor.
    1. Connect to the FMUK66 (and read through telemetry?) or to the NavQ+ and integrate into an application (e.g. overlay on the video stream)?
  5. Drone.
    1. Exchange the FMUK66 and GPS with the new kit.
    2. Set up in QGroundControl.
    3. Get to very conservative manual flight.
  6. NavQ: drone companion computer.
    1. Connectivity with FMUK66.
    2. Mounting on side plate.
    3. Mounting of Coral cam on front plate (possibly on the bottom for downward vision).
  7. Coral cam with NavQ: drone computer vision.
  8. Communication between rover and drone?
    1. Set up a WiFi link between the two vehicles (see the UDP relay sketch after this list).
    2. How can the two vehicles get on the same PX4 uORB network?
    3. Is this where ROS2 becomes inevitable?
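
As a starting point for item 2.6, here is a minimal sketch of reading MAVLink telemetry from the FMUK66 on the NavQ+ over a serial link using pymavlink, with no ROS2 involved. The device path /dev/ttymxc2 and the 921600 baud rate are assumptions; check the NavQ+ UART pinout and the PX4 telemetry-port configuration for the actual wiring.

```python
# Minimal NavQ+ <-> FMUK66 telemetry check over a serial MAVLink link.
# Assumes: pip install pymavlink; the FMUK66 TELEM port is wired to the
# UART that appears as /dev/ttymxc2 (device path and baud are assumptions).
from pymavlink import mavutil

link = mavutil.mavlink_connection("/dev/ttymxc2", baud=921600)
link.wait_heartbeat()  # block until the FMUK66 announces itself
print(f"Heartbeat from system {link.target_system}, "
      f"component {link.target_component}")

# Print attitude and position messages as they arrive.
while True:
    msg = link.recv_match(type=["ATTITUDE", "GLOBAL_POSITION_INT"],
                          blocking=True)
    print(msg.get_type(), msg.to_dict())
```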
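
For item 8, one way to sidestep the shared-uORB question: once both companion computers are on the same WiFi network, plain MAVLink over UDP can carry the rover's state to the drone side. A hedged sketch to run on the rover's NavQ+; the drone companion's address 192.168.1.20, the port 14550, and the serial device are all assumptions.

```python
# Relay the rover's position from the FMUK66 serial link to the drone's
# companion computer over WiFi as plain MAVLink/UDP (no ROS2 bridge).
# Peer address/port and serial device are assumptions for this sketch.
from pymavlink import mavutil

serial_in = mavutil.mavlink_connection("/dev/ttymxc2", baud=921600)
udp_out = mavutil.mavlink_connection("udpout:192.168.1.20:14550")

serial_in.wait_heartbeat()
while True:
    msg = serial_in.recv_match(type="GLOBAL_POSITION_INT", blocking=True)
    udp_out.mav.send(msg)  # re-send the rover's position to the drone side
```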

Development plan: applications

  1. Drone plans a route for the rover to a work destination and guides it along.

    1. Autonomously fly from start to destination and land. If enough battery remains, return.
      1. Where can one fly drones?
      2. What about houses, trees, and power lines?
      3. Flying above all obstacles or flying at 7-10 feet to avoid them?
    2. Fly above a ground path.
      1. How to define it manually?
      2. How can the drone, given a destination, use vision and terrain understanding to plan the path for the rover?
    3. Get in the air and find the rover on the ground.
      1. Should there be a beacon on the rover for the drone to home in on? Are there other ways to give the drone a heuristic for locating the rover, based on the rover's normal radio activity?
      2. Segment the view from the drone camera and label the rover.
    4. Follow the rover.
      1. Borrow from the Follow-Me application (see the MAVSDK sketch after this list).
    5. Create a common 3D AR virtual space for the two camera views.
      1. This should be a moving box strictly corresponding to the environment, used for high-resolution planning.
      2. The two cameras should show it overlaid on their video streams and should continuously update and fine-tune it to ensure maximum correspondence of the virtual space to the physical space, including the locations of the two vehicles.
    6. Guide the rover along a path.
      1. Regardless of how the path is generated, drone-planned or geo-labeled, guide the rover along it.
      2. Display the path overlaid on the drone camera view in QGroundControl.
      3. Create AR markers "in front of" the rover's camera view to follow the path. How will the virtual space overlays be communicated and shared between the vehicles? This may need to be borrowed from multi-player game development and AR applications.
    7. Dynamic path replanning to avoid obstacles.
      1. Cars when crossing the street.
      2. Puddles and ice patches.
      3. Fallen branches and other debris.
      4. Other obstacles, defined by the criterion that the rover cannot pass through them (e.g. a space between two slats of a bridge that may trap one of the wheels).
      5. Avoid people. This requires adding a forward-looking camera to the drone and training the drone to pick its flight path based on what it "sees". This is an application in itself!
  2. Rover performs agricultural mission on the site/plot it has been led to.

    1. Define the plot of land in the common virtual space.
      1. The drone should define the boundaries, while the rover should organize the plot into rows and "plant" positions with sufficient spacing for rover navigation (see the row-layout sketch after this list).
    2. Virtual (partial digital-twin) multi-tool (multi-applicator) robotic arm for the rover.
      1. Possible tools/applicators: liquids (water, pesticide, fertilizer), instruments (high-zoom camera, sampler), and physical sensors (environmental, particulate).
    3. Work the "plants" in the plot.
      1. Plan a trajectory including all the "plants" (objects in the shared virtual space) and "work" on each one.
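
For item 1.4, PX4's Follow-Me mode can be driven from the drone's companion computer by streaming target locations. A hedged sketch using MAVSDK-Python (pip install mavsdk): the connection URL is an assumption, the drone is assumed to already be airborne, and rover_positions() is a hypothetical stand-in for the rover feed arriving over the WiFi link. The follow_me.Config fields differ between MAVSDK versions, so configuration is omitted here.

```python
import asyncio

from mavsdk import System
from mavsdk.follow_me import TargetLocation


async def rover_positions():
    # Hypothetical stand-in: yield (lat_deg, lon_deg, abs_alt_m) tuples for
    # the rover, e.g. parsed from the UDP relay in the previous section.
    # The fixed coordinates below are the PX4 SITL default home, not a
    # real site.
    while True:
        yield (47.397742, 8.545594, 488.0)
        await asyncio.sleep(1.0)


async def follow_rover():
    drone = System()
    # Assumed connection URL; adjust for the actual FMUK66 <-> NavQ link.
    await drone.connect(system_address="udp://:14540")
    await drone.follow_me.start()  # drone assumed to already be flying
    async for lat, lon, alt in rover_positions():
        await drone.follow_me.set_target_location(
            TargetLocation(lat, lon, alt, 0.0, 0.0, 0.0))


asyncio.run(follow_rover())
```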
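
For items 2.1 and 2.3, a small self-contained sketch of how the rover side might organize a rectangular plot into rows of "plant" positions and order them into a serpentine (boustrophedon) visiting route. All dimensions are made-up local coordinates in meters; mapping them into the shared virtual space is left open.

```python
# Sketch: lay out "plant" positions in rows inside a rectangular plot and
# order them so the rover visits every plant without retracing a row.

def plot_to_rows(width_m, height_m, row_spacing_m, plant_spacing_m):
    """Plant positions grouped by row, origin at the plot's SW corner."""
    rows = []
    y = row_spacing_m / 2
    while y < height_m:
        row = []
        x = plant_spacing_m / 2
        while x < width_m:
            row.append((x, y))
            x += plant_spacing_m
        rows.append(row)
        y += row_spacing_m
    return rows


def serpentine_route(rows):
    """Visit rows in order, alternating direction on each row."""
    route = []
    for i, row in enumerate(rows):
        route.extend(row if i % 2 == 0 else list(reversed(row)))
    return route


rows = plot_to_rows(width_m=10.0, height_m=6.0,
                    row_spacing_m=1.5, plant_spacing_m=1.0)
route = serpentine_route(rows)
print(f"{len(rows)} rows, {len(route)} plants; first stops: {route[:3]}")
```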
