CINE: a Cinematic Intelligent Navigation Engine with Model-based Perception and B-spline Generation. You can find the associated sensing repository here: https://github.com/fishnos/CINE-Sensing


CINE: Cinematic Intelligent Navigation Engine

A comprehensive system for generating and optimizing cinematographic trajectories for autonomous quadrotor drones. This repository integrates EGO-Planner with cinematographic constraints to produce smooth, visually appealing flight paths that maintain proper framing and focus distance while accounting for lighting. Environment sensing combines image perception models with gyroscopic input.

Overview

CINE extends the EGO-Planner trajectory planning framework with cinematographic constraints to enable autonomous drone cinematography. The system modifies B-spline trajectories generated by EGO-Planner to incorporate:

  • Focus Distance Control: Maintains optimal camera-to-subject distance for proper focus
  • Framing Constraints: Ensures subjects remain properly framed using rule-of-thirds composition
  • Lighting Awareness: Adjusts trajectories to avoid poor lighting conditions
  • Smooth Motion: Applies velocity and acceleration limits for cinematic motion quality
  • Dynamic Waypoint Generation: Generates waypoints that optimize cinematographic cost functions

Key Features

Trajectory Planning

  • Baseline Trajectories: Standard EGO-Planner trajectories without cinematographic constraints
  • Constrained Trajectories: EGO-Planner trajectories modified with cinematographic constraints
  • B-spline Optimization: Real-time trajectory adjustment using B-spline control point manipulation
  • Dynamic Time Scaling: Adjusts trajectory timing to respect velocity and acceleration limits
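The time-scaling idea is simple for uniform B-splines: stretching the knot interval by a factor k scales velocity by 1/k and acceleration by 1/k². A minimal sketch of that rule (illustrative function and parameter names, not the repository's actual implementation):

```python
import numpy as np

def time_scale_factor(vel, acc, v_max, a_max):
    """Factor by which to stretch the trajectory's knot interval so that
    velocity and acceleration limits hold. Stretching time by k divides
    velocity by k and acceleration by k^2, hence the sqrt on the
    acceleration ratio. vel and acc are (N, 3) samples along the path."""
    v_peak = np.max(np.linalg.norm(vel, axis=1))
    a_peak = np.max(np.linalg.norm(acc, axis=1))
    return max(1.0, v_peak / v_max, np.sqrt(a_peak / a_max))
```

A factor of 1.0 means the trajectory already respects both limits and is left untouched.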

Cinematographic Constraints

  • Focus Distance: Maintains target focus distance (default: 1.5m) with configurable tolerance
  • Framing: Rule-of-thirds composition with configurable anchor points
  • Lighting: Avoids poor lighting conditions by adjusting yaw and trajectory
  • Smoothness: Penalizes abrupt changes in viewing direction and motion
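These constraints are typically combined as a weighted cost. The toy sketch below illustrates the shape of such a cost; the function, weights, and penalty forms are illustrative assumptions, not the repository's actual cost calculator:

```python
import numpy as np

def cinematic_cost(subject_px, anchor_px, depth_m, focus_m, yaw_rate,
                   w_frame=1.0, w_focus=1.0, w_smooth=0.1):
    """Toy weighted sum of the constraint terms described above.
    subject_px: subject location in the image (pixels).
    anchor_px:  rule-of-thirds anchor point (pixels).
    depth_m:    measured camera-to-subject distance.
    focus_m:    desired focus distance (1.5 m by default)."""
    framing = np.sum((np.asarray(subject_px, float) - np.asarray(anchor_px, float)) ** 2)
    focus = (depth_m - focus_m) ** 2       # quadratic focus-distance penalty
    smooth = yaw_rate ** 2                 # penalize abrupt viewing-direction changes
    return w_frame * framing + w_focus * focus + w_smooth * smooth
```

A trajectory point that hits the anchor, holds the focus distance, and turns slowly scores near zero; each violated constraint adds its weighted penalty.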

Data Collection & Analysis

  • Automated Data Collection: Scripts for collecting baseline and constrained trajectory data
  • Trajectory Statistics: Analysis tools for computing dataset-level statistics
  • Trajectory Averaging: Tools for resampling and averaging multiple trajectory runs
  • CSV Export: Structured data format for post-processing and analysis

Architecture

Core Components

  1. EGO-Planner Integration (ego-planner-swarm/)

    • Base trajectory planning using ESDF-free gradient-based local planning
    • B-spline trajectory representation and optimization
    • Path searching and collision avoidance
  2. Cinematographic Adjustment Agent (cinematographic_adjustment_agent/)

    • Cinematic B-spline Filter: Modifies EGO-Planner trajectories with cinematographic constraints
    • Cost Calculator: Computes framing, depth, lighting, and smoothness costs
    • Waypoint Generator: Generates cinematographic waypoints using cost optimization
    • State Tracker: Tracks drone state and target information
  3. Depth Estimation (depth_publisher/)

    • Depth image processing using Depth-Anything with CUDA acceleration
    • Provides depth information for focus distance calculations
  4. Object Detection (cinematographic_adjustment_agent/object_detection/)

    • Simulated object detector for target tracking
    • Simulated focus/light publishers for testing without real sensors
  5. Data Collection (cinematographic_adjustment_agent/data_collection/)

    • Trajectory data logger for recording baseline and constrained trajectories
    • CSV export with position, velocity, acceleration, and timing data

Data Flow

EGO-Planner → B-spline Trajectory → Cinematic B-spline Filter → Constrained Trajectory
                    ↓
            Cost Calculator (framing, depth, lighting, smoothness)
                    ↓
            Waypoint Generator → Optimized Waypoints

Repository Structure

CINE/
├── trajectory-optimization_ws/          # ROS 2 workspace
│   ├── src/
│   │   ├── ego-planner-swarm/          # EGO-Planner trajectory planner
│   │   ├── cinematographic_adjustment_agent/  # Cinematographic constraints
│   │   ├── depth_publisher/            # Depth estimation
│   │   └── cinematography_msgs/        # Custom ROS 2 messages
│   ├── collect_baseline_data.sh        # Baseline data collection script
│   ├── collect_constrained_data.sh     # Constrained data collection script
│   └── start_simulation.sh             # Simulation launcher
├── scripts/                            # Analysis scripts
│   ├── trajectory_stats.py            # Compute trajectory statistics
│   └── trajectory_mean.py             # Average multiple trajectories
├── trajectory_data/                    # Collected trajectory data
│   ├── baseline/                       # Baseline trajectory CSVs
│   └── cinematic/                     # Constrained trajectory CSVs
├── configs/                            # Configuration files
│   └── user_params/                   # User-configurable parameters
└── reference.bib                       # Bibliography of referenced works

Prerequisites

System Requirements

  • Ubuntu 24.04 (Noble), or another release supported by your ROS 2 distribution
  • ROS 2 Jazzy (or compatible version)
  • Python 3.8+
  • CUDA-capable GPU (for depth estimation)

Required Libraries

  • PCL (Point Cloud Library): For 3D point cloud processing
  • VTK: Visualization toolkit (PCL dependency)
  • Eigen3: Linear algebra library
  • OpenCV: Computer vision library
  • CycloneDDS: DDS middleware (recommended over FastDDS for performance)

ROS 2 Dependencies

  • rclcpp, rclpy: ROS 2 C++ and Python client libraries
  • geometry_msgs, nav_msgs, visualization_msgs: Standard ROS 2 message types
  • cv_bridge: OpenCV-ROS bridge
  • pcl_conversions: PCL-ROS conversions

Installation

1. Install ROS 2

Follow the official ROS 2 installation guide for your distribution:

# For Ubuntu 24.04 (Jazzy)
sudo apt install ros-jazzy-desktop

2. Install CycloneDDS (Recommended)

sudo apt install ros-jazzy-rmw-cyclonedds-cpp
echo "export RMW_IMPLEMENTATION=rmw_cyclonedds_cpp" >> ~/.bashrc
source ~/.bashrc

3. Install System Dependencies

sudo apt install \
  libpcl-dev \
  libvtk9-dev \
  libeigen3-dev \
  libopencv-dev \
  python3-pip

4. Build the Workspace

cd trajectory-optimization_ws
source /opt/ros/jazzy/setup.bash
colcon build
source install/setup.bash

Usage

Data Collection

Collect Baseline Trajectories

Collects standard EGO-Planner trajectories without cinematographic constraints:

cd trajectory-optimization_ws
./collect_baseline_data.sh

Data is saved to trajectory_data/baseline_*.csv

Collect Constrained Trajectories

Collects EGO-Planner trajectories modified with cinematographic constraints:

cd trajectory-optimization_ws
# Optionally configure parameters in collect_constrained_data.env
./collect_constrained_data.sh

Data is saved to trajectory_data/cinematic_*.csv

Configuration: Edit collect_constrained_data.env to adjust:

  • USE_SIMULATED_FOCUS: Use simulated focus (default: 1)
  • USE_SIMULATED_LIGHT: Use simulated lighting (default: 1)
  • TARGET_OFFSET_XYZ: Target position offset
  • FOCUS_DEPTH_M: Desired focus distance in meters
  • DRONE_ID: Drone identifier
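A possible collect_constrained_data.env might look like the following; the values (and the TARGET_OFFSET_XYZ format) are illustrative, not the repository's defaults, apart from the 1.5 m focus distance stated above:

```shell
# collect_constrained_data.env -- example values (illustrative)
USE_SIMULATED_FOCUS=1              # 1 = simulated focus publisher, 0 = real sensor
USE_SIMULATED_LIGHT=1              # 1 = simulated lighting publisher
TARGET_OFFSET_XYZ="0.0 2.0 1.0"    # target position offset (meters; format assumed)
FOCUS_DEPTH_M=1.5                  # desired focus distance in meters
DRONE_ID=0                         # drone identifier
```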

Trajectory Analysis

Compute Statistics

Generate summary statistics for trajectory datasets:

python scripts/trajectory_stats.py \
  --trajectory-root trajectory_data \
  --datasets baseline cinematic \
  --output-dir trajectory_data

Outputs: baseline_stats.csv, cinematic_stats.csv
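The kind of per-run summary such a script computes can be sketched in a few lines, assuming the time and velocity columns described under Data Format (this is an illustration, not trajectory_stats.py itself):

```python
import numpy as np

def summarize(time, vel):
    """Per-run summary statistics.
    time: (N,) timestamps in seconds.
    vel:  (N, 3) velocities from the vel_x/vel_y/vel_z columns."""
    speed = np.linalg.norm(vel, axis=1)
    return {
        "duration_s": float(time[-1] - time[0]),
        "mean_speed_mps": float(speed.mean()),
        "peak_speed_mps": float(speed.max()),
    }
```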

Compute Trajectory Means

Resample trajectories onto a uniform time grid and compute mean trajectories:

python scripts/trajectory_mean.py \
  --trajectory-root trajectory_data \
  --datasets baseline cinematic \
  --time-step 0.005 \
  --output-dir trajectory_data

Outputs: baseline_mean_std_trajectory.csv, cinematic_mean_std_trajectory.csv
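The resample-then-average step can be sketched with np.interp; this mirrors the idea behind trajectory_mean.py under stated assumptions (a shared grid clipped to the overlap of all runs), not its actual code:

```python
import numpy as np

def resample_and_average(runs, dt=0.005):
    """Resample each run onto a shared uniform time grid and average.
    runs: list of (time, pos) pairs, where time is (N,) and pos is (N, 3).
    Returns (grid, mean, std), each position array shaped (len(grid), 3)."""
    t_end = min(t[-1] for t, _ in runs)          # overlap of all runs
    grid = np.arange(0.0, t_end + dt / 2, dt)    # uniform time grid
    resampled = np.stack([
        np.column_stack([np.interp(grid, t, p[:, k]) for k in range(3)])
        for t, p in runs
    ])
    return grid, resampled.mean(axis=0), resampled.std(axis=0)
```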

Running Simulations

Launch Simulation with Baseline Planning

cd trajectory-optimization_ws
./start_simulation.sh baseline

Launch Simulation with Cinematographic Constraints

cd trajectory-optimization_ws
./start_simulation.sh constrained

Configuration

Cinematographic Parameters

Edit configs/user_params/constants.py to adjust:

  • Focus Constraints: DESIRED_FOCUS_DISTANCE_M, FOCUS_DISTANCE_MIN_M
  • Framing: THIRDS_ANCHOR (rule-of-thirds anchor points)
  • Image Parameters: IMAGE_WIDTH, IMAGE_HEIGHT

Trajectory Constraints

Edit cinematographic_adjustment_agent/config/constants.py to adjust:

  • Velocity Limits: FOCUS_MAX_VELOCITY_MPS, DEFAULT_MAX_VELOCITY_MPS
  • Acceleration Limits: FOCUS_MAX_ACCEL_MPS2, DEFAULT_MAX_ACCEL_MPS2
  • Yaw Constraints: YAWDOT_MAX, YAWDOT_MAX_FRAMING
  • Lighting: LIGHT_AVOID_GAIN, LIGHT_INFLUENCE_DISTANCE_M

Data Format

Trajectory CSV files contain the following columns:

  • time: Timestamp (seconds)
  • pos_x, pos_y, pos_z: Position (meters)
  • vel_x, vel_y, vel_z: Velocity (m/s)
  • acc_x, acc_y, acc_z: Acceleration (m/s²)
  • yaw: Yaw angle (radians)
  • Additional metadata columns as needed
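This layout can be consumed directly with the Python standard library; the sample row below is illustrative:

```python
import csv
import io
import math

# One illustrative row in the column layout described above.
sample = """time,pos_x,pos_y,pos_z,vel_x,vel_y,vel_z,acc_x,acc_y,acc_z,yaw
0.0,0.0,0.0,1.0,0.5,0.0,0.0,0.0,0.0,0.0,0.0
"""

for row in csv.DictReader(io.StringIO(sample)):
    # Speed is the magnitude of the velocity columns.
    speed = math.hypot(float(row["vel_x"]), float(row["vel_y"]), float(row["vel_z"]))
```

Replace the in-memory string with open("trajectory_data/...") to process real files.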

Key Technologies

  • EGO-Planner: ESDF-free gradient-based local planner for quadrotors
  • ROS 2: Robot Operating System framework
  • B-splines: Trajectory representation using uniform B-splines
  • L-BFGS: Limited-memory BFGS optimization algorithm
  • PCL: Point Cloud Library for 3D processing
  • OpenCV: Computer vision and image processing
  • Eigen: C++ linear algebra library

See reference.bib for the complete bibliography of referenced works.

License

See LICENSE file for details.

Citation

If you use this work, please cite:

  • EGO-Planner: Zhou et al., "EGO-Planner: An ESDF-free Gradient-based Local Planner for Quadrotors," IEEE Robotics and Automation Letters, 2021
  • See reference.bib for additional citations

Contributing

This repository is part of research on autonomous cinematography for drones. For questions or contributions, please refer to the project maintainers.
