Passive, GPS-free localization for lunar rovers using star pattern matching and particle filter inference.
A lunar rover with no GPS and no prior maps can still determine its position to within ~40 km by looking at the sky. StarNav implements this capability: it takes a raw camera image of the lunar sky, detects stars, matches their geometric arrangement against a real star catalog, and runs a particle filter over a 200 km region of interest to output estimated surface coordinates.
| Metric | Baseline (v1) | StarNav (v2) |
|---|---|---|
| Star detection rate | ~60% | 100% |
| Triangle signatures extracted | 20 | 364 |
| Search space | 38M km² (whole Moon) | 126k km² (200 km region) |
| Particle density | 0.00013 / km² | 0.008 / km² |
| Typical localization error | 40–50° | ~40 km |
Green circles = stars detected in the image. Red circles = predicted star positions at the estimated location. Near-perfect alignment confirms a correct localization.
Particle weights mapped over the search region. The bright cluster marks where the star patterns match best — the algorithm's estimate of the rover's position.
1000 particles sampled across the 200 km region of interest, colored by weight after star-pattern matching.
Camera image (1024×1024)
│
▼ Multi-threshold blob detection
Star centroids [(x₁,y₁), ..., (x₁₄,y₁₄)]
│
▼ Pinhole camera model + ENU frame transform
Unit vectors in lunar ENU frame
│
▼ All C(14,3) = 364 triangle combinations
Triangle signatures — rotation-invariant angular distances
│
▼ Particle filter over 200 km region
Weighted average of 1000 candidate locations
│
▼
Estimated (lat, lon) ± uncertainty
A multi-threshold algorithm sweeps thresholds [50, 70, 90, 110, 130] to catch stars across all brightness levels, followed by connected-components analysis and OpenCV's SimpleBlobDetector. Duplicate detections within 6 pixels are merged. This achieves 100% detection on simulated images, vs ~60% with a single-threshold approach.
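The sweep-and-merge idea can be sketched in a few lines. This is an illustrative, self-contained version (function name and synthetic frame are my own; it uses `scipy.ndimage` in place of OpenCV's `SimpleBlobDetector` so it runs without OpenCV):

```python
# Minimal multi-threshold star detection sketch (illustrative, not the
# actual StarNav API). Sweeps thresholds, extracts connected-component
# centroids, and merges duplicates within a pixel radius.
import numpy as np
from scipy import ndimage

def detect_stars(image, thresholds=(50, 70, 90, 110, 130), merge_radius=6.0):
    """Return (x, y) centroids of blobs found at any threshold, de-duplicated."""
    centroids = []
    for t in thresholds:
        labels, n = ndimage.label(image > t)
        if n == 0:
            continue
        # center_of_mass returns (row, col); convert to (x, y)
        for cy, cx in ndimage.center_of_mass(image, labels, list(range(1, n + 1))):
            centroids.append((cx, cy))
    merged = []
    for c in centroids:
        if all(np.hypot(c[0] - m[0], c[1] - m[1]) > merge_radius for m in merged):
            merged.append(c)
    return merged

# Synthetic frame: one bright star and one faint star that only the
# lowest threshold can catch — a single threshold of 70+ would miss it.
img = np.zeros((128, 128))
img[30:33, 40:43] = 200    # bright star near (41, 31)
img[90:93, 100:103] = 60   # faint star near (101, 91)
stars = detect_stars(img)
print(len(stars))  # 2
```

The faint star illustrates why the single-threshold baseline missed ~40% of detections: any fixed threshold trades off faint-star recall against noise.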
Detected pixel centroids are unprojected through a pinhole camera model into the lunar ENU (East-North-Up) frame using the camera's known pointing direction. The star catalog (BRIGHT_STARS, 150+ real stars in ICRS J2000 coordinates) is transformed through:
ICRS → Moon-fixed frame (via tidal-lock orientation) → local ENU
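The unprojection side of this step can be sketched geometrically. The function below is an illustrative assumption, not the StarNav implementation: it assumes azimuth measured east of north, altitude above the horizon, zero roll, and a square sensor, and maps a pixel to a unit vector in the local ENU frame:

```python
# Pinhole unprojection of a pixel into a local ENU (East-North-Up) frame.
# Illustrative geometry sketch; axis conventions are assumptions.
import numpy as np

def pixel_to_enu(px, py, width=1024, height=1024, fov_deg=45.0,
                 altitude_deg=30.0, azimuth_deg=45.0):
    """Unproject pixel (px, py) to a unit vector in ENU, assuming roll = 0."""
    # Focal length in pixels from the horizontal field of view
    f = (width / 2) / np.tan(np.radians(fov_deg) / 2)
    ray = np.array([px - width / 2, py - height / 2, f])  # camera frame: +z boresight
    ray /= np.linalg.norm(ray)
    alt, az = np.radians(altitude_deg), np.radians(azimuth_deg)
    # Boresight direction in ENU
    b = np.array([np.cos(alt) * np.sin(az), np.cos(alt) * np.cos(az), np.sin(alt)])
    r = np.cross(b, [0.0, 0.0, 1.0])   # image "right" (horizontal)
    r /= np.linalg.norm(r)
    d = np.cross(b, r)                 # image "down"
    return ray[0] * r + ray[1] * d + ray[2] * b

v = pixel_to_enu(512, 512)  # center pixel maps to the boresight direction
```

With the center pixel, the ray reduces to the boresight, so `v` equals the alt/az direction itself; off-center pixels tilt away from it by up to half the FOV.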
All C(n, 3) triangle combinations of detected star vectors are computed. Each triangle is represented by three sorted inter-star angular distances (in radians) — a signature that is invariant to rotation and translation. For 14 detected stars this yields 364 signatures.
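The signature computation is compact. This sketch (illustrative, not the actual `localization.py` code) builds all C(n, 3) sorted-angle signatures from a set of unit vectors:

```python
# Rotation-invariant triangle signatures: for every 3-combination of unit
# vectors, the three pairwise angular distances, sorted ascending.
import numpy as np
from itertools import combinations
from math import comb

def triangle_signatures(unit_vectors):
    """Return one sorted (a, b, c) angle tuple per 3-star combination."""
    sigs = []
    for i, j, k in combinations(range(len(unit_vectors)), 3):
        u, v, w = unit_vectors[i], unit_vectors[j], unit_vectors[k]
        angles = sorted(
            np.arccos(np.clip(np.dot(p, q), -1.0, 1.0))
            for p, q in ((u, v), (v, w), (u, w))
        )
        sigs.append(tuple(angles))
    return sigs

rng = np.random.default_rng(0)
vecs = rng.normal(size=(14, 3))
vecs /= np.linalg.norm(vecs, axis=1, keepdims=True)
sigs = triangle_signatures(vecs)
print(len(sigs), comb(14, 3))  # 364 364
```

Sorting the three angles is what removes the dependence on star ordering; clipping the dot product guards `arccos` against floating-point values just outside [-1, 1].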
1000 particles are sampled uniformly within a 200 km radius of a known approximate location (e.g., from prior mission knowledge). Each particle is scored by counting how many observed triangle signatures match the catalog-predicted signatures at that location (tolerance = 0.12 rad). Weights are raised to the 1.8 power to sharpen the distribution, then normalized and used in a weighted mean to produce the final estimate.
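The scoring and estimation steps can be sketched as follows (function names and the toy data are illustrative; the 0.12 rad tolerance and 1.8 exponent match the values above):

```python
# Particle weighting sketch: count signature matches, sharpen with an
# exponent, normalize, and take the weighted mean of particle locations.
import numpy as np

def score_particle(observed, predicted, tol=0.12):
    """Count observed triangle signatures with a predicted match within tol (rad)."""
    pred = np.asarray(predicted)
    count = 0
    for sig in observed:
        if np.any(np.all(np.abs(pred - np.asarray(sig)) < tol, axis=1)):
            count += 1
    return count

def estimate(particles, scores, exponent=1.8):
    """Sharpen scores, normalize to weights, return the weighted mean (lat, lon)."""
    w = np.asarray(scores, dtype=float) ** exponent
    w /= w.sum()
    return np.average(np.asarray(particles), axis=0, weights=w)

obs = [(0.10, 0.20, 0.30)]
pred = [(0.15, 0.22, 0.31), (1.00, 1.10, 1.20)]
matches = score_particle(obs, pred)   # only the first predicted triangle matches

# A particle with 10x the match count dominates after sharpening
particles = [(10.0, 20.0), (10.5, 20.5), (9.5, 19.5)]
scores = [300, 30, 30]
lat, lon = estimate(particles, scores)
```

Raising raw counts to the 1.8 power before normalizing concentrates weight on the best-matching particles, so a single strong match cluster pulls the weighted mean toward itself rather than being diluted by many weak particles.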
StarNav/
├── lunar_sky_model.py # Star catalog (150+ real stars) + lunar coordinate transforms
├── localization.py # Main system: star detection, triangle matching, particle filter
├── localization_v1.py # Baseline system (global Moon search, single threshold)
├── generate_report_plots.py # Visualization scripts for results
├── tests/
│ ├── test_localization.py # Tests for the improved system
│ ├── test_integration.py # End-to-end integration tests
│ └── test_simple.py # Basic sanity checks
├── tools/
│ ├── debug_vectors.py # ENU vector diagnostics
│ └── diagnose.py # System diagnostics
├── results/
│ └── report_plots/ # Output visualizations
└── docs/
└── SYSTEM_ARCHITECTURE.md # Detailed system diagrams and data flow
git clone https://github.com/kothari1/StarNav.git
cd StarNav
pip install -r requirements.txt

Dependencies: numpy, astropy, opencv-python, scipy, pillow, matplotlib
import numpy as np
from astropy.time import Time
from localization import ImprovedLunarLocalizer

# Define the 200 km region of interest around approximate known position
localizer = ImprovedLunarLocalizer(
    region_center_lat=10.0,   # degrees
    region_center_lon=20.0,   # degrees
    region_radius_km=200.0
)

# Run particle filter on a rover camera image
image = np.array(...)  # grayscale, shape (1024, 1024)
estimated_loc, covariance = localizer.run_particle_filter(
    image,
    Time("2025-06-01 12:00:00", scale="utc"),
    n_particles=1000,
    camera_params={
        'altitude': 30.0,  # camera pointing altitude (degrees)
        'azimuth': 45.0,   # camera pointing azimuth (degrees)
        'roll': 0.0
    }
)

print(f"Estimated location: {estimated_loc[0]:.2f}°N, {estimated_loc[1]:.2f}°E")

# Generate all report plots
python generate_report_plots.py
# Run the test suite
python tests/test_localization.py

| Parameter | Value | Notes |
|---|---|---|
| Star catalog size | 150+ stars | Real ICRS J2000 coordinates |
| Detection thresholds | [50, 70, 90, 110, 130] | Multi-scale |
| Duplicate removal radius | 6 px | Merge nearby detections |
| Camera FOV | 45° | Configurable |
| Triangle tolerance | 0.12 rad (6.9°) | Match threshold |
| Weight exponent | 1.8 | Sharpens particle distribution |
| Region radius | 200 km | ~6.6° on the lunar surface |
| Particles | 1000–5000 | Tradeoff: accuracy vs speed |
- Speed: The particle filter currently runs on CPU (~30s for 1000 particles). GPU acceleration would give ~100× speedup.
- Catalog size: Extending to magnitude 5–6 stars would improve coverage near poles.
- Real imagery: System is validated on synthetic images; real lunar camera imagery would require noise modeling and PSF calibration.
- Multi-frame fusion: Combining star patterns across multiple frames could reduce error significantly.
lunar_sky_model.py (star catalog, lunar coordinate frame transforms, camera model) was originally developed by Gautam Neon as part of the RASCAL Navigation project. The localization system (localization.py, localization_v1.py), test suite, and analysis pipeline are my own work.
Aditya Kothari · github.com/kothari1 · kothari1@stanford.edu


