Neural Garage Tools πŸ§ πŸ”§

AI-powered code analysis tools built in Rust

Part of the Neural Garage suite of developer tools for the AI era.


Tools

This monorepo contains multiple code analysis tools:

bury πŸͺ¦ - Dead Code Detector

Bury the dead code before it haunts your codebase!

Finds unused code in your Python and TypeScript projects using reachability analysis.

Key Features

  • πŸš€ Blazingly Fast - Written in Rust with parallel processing
  • 🎯 Accurate - Uses reachability analysis, not simple pattern matching
  • 🌍 Multi-Language - Supports Python and TypeScript (more coming!)
  • πŸ€– LLM-Friendly - Outputs structured JSON perfect for AI tools
  • βš™οΈ Configurable - Define entry points and ignore patterns
  • πŸ“Š Multiple Output Formats - JSON, Markdown, or terminal

complexity πŸ“Š - Complexity Analyzer (Coming Soon)

Analyzes code complexity using cyclomatic and cognitive complexity metrics.
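
Cyclomatic complexity counts the linearly independent paths through a function. As a rough illustration of the metric (not the tool's Rust implementation), a minimal Python version can count branch nodes in an AST; the set of node types counted here is a simplification:

```python
import ast

# Branch-point node types that each add one decision path.
# Simplified: a BoolOp with n operands really adds n - 1 paths.
BRANCH_NODES = (ast.If, ast.For, ast.While, ast.ExceptHandler,
                ast.BoolOp, ast.IfExp)

def cyclomatic_complexity(source: str) -> int:
    """Cyclomatic complexity = 1 + number of decision points."""
    tree = ast.parse(source)
    return 1 + sum(isinstance(node, BRANCH_NODES)
                   for node in ast.walk(tree))

code = """
def classify(x):
    if x < 0:
        return "negative"
    elif x == 0:
        return "zero"
    return "positive"
"""
print(cyclomatic_complexity(code))  # prints 3 (1 + two if-branches)
```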

Repository Structure

This is a Cargo workspace containing multiple crates:

neural-garage/tools/
β”œβ”€β”€ crates/
β”‚   β”œβ”€β”€ shared/          # Shared library (neural-shared)
β”‚   β”‚   β”œβ”€β”€ parser/      # AST parsing (tree-sitter)
β”‚   β”‚   β”œβ”€β”€ scanner/     # File discovery
β”‚   β”‚   └── report/      # Output generation
β”‚   β”‚
β”‚   β”œβ”€β”€ bury/            # Dead code detector
β”‚   β”‚   β”œβ”€β”€ analyzer/    # Reachability analysis
β”‚   β”‚   └── cli/         # CLI interface
β”‚   β”‚
β”‚   └── complexity/      # Complexity analyzer (WIP)
β”‚       └── analyzer/    # Complexity metrics
β”‚
β”œβ”€β”€ Cargo.toml           # Workspace configuration
└── README.md

Installation

# Install bury
cargo install --path crates/bury

# Install complexity (when ready)
cargo install --path crates/complexity

# Or build all tools
cargo build --workspace --release

Quick Start

Using bury

# Analyze current directory
bury

# Analyze specific path
bury ./src

# Output as JSON
bury --format json ./src

# Verbose mode
bury --verbose ./src

How Bury Works

Bury performs its reachability analysis in four steps:

  1. Scan - Find all source files (respecting .gitignore)
  2. Parse - Build AST using tree-sitter for each language
  3. Analyze - Perform reachability analysis from entry points
  4. Report - Output dead code findings
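
In Python terms, the scan step boils down to a filtered directory walk. This is only a sketch: the extension set and ignore list are assumptions, and bury's real scanner also honors .gitignore:

```python
import os

LANG_EXTS = {".py", ".ts"}          # languages the analyzer understands
IGNORE_DIRS = {"node_modules", "__pycache__", ".git"}

def scan(root: str):
    """Yield analyzable source files under root, skipping ignored dirs."""
    for dirpath, dirnames, filenames in os.walk(root):
        # Prune ignored directories in place so os.walk never descends.
        dirnames[:] = [d for d in dirnames if d not in IGNORE_DIRS]
        for name in filenames:
            if os.path.splitext(name)[1] in LANG_EXTS:
                yield os.path.join(dirpath, name)
```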

Reachability Analysis

Entry Points (main, tests, exports)
    ↓
Build Call Graph (function β†’ callees)
    ↓
Mark Reachable Code (BFS/DFS traversal)
    ↓
Dead Code = Definitions - Reachable
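
The traversal above can be sketched in a few lines of Python. The call-graph shape and names here are illustrative, not bury's internal representation:

```python
from collections import deque

# Hypothetical call graph: function -> functions it calls.
call_graph = {
    "main":                ["Calculator.add"],
    "Calculator.add":      [],
    "Calculator.multiply": [],
}
entry_points = ["main"]

def find_dead_code(graph, entries):
    """BFS from the entry points; anything never visited is dead."""
    reachable = set(entries)
    queue = deque(entries)
    while queue:
        for callee in graph.get(queue.popleft(), []):
            if callee not in reachable:
                reachable.add(callee)
                queue.append(callee)
    return sorted(set(graph) - reachable)

print(find_dead_code(call_graph, entry_points))  # ['Calculator.multiply']
```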

Example

# module.py

class Calculator:
    def add(self, a, b):      # βœ… Used
        return a + b

    def multiply(self, a, b):  # ❌ DEAD CODE
        return a * b

def main():
    calc = Calculator()
    result = calc.add(1, 2)  # Only calls add()

Output:

{
  "dead_code": [
    {
      "kind": "Method",
      "name": "multiply",
      "file": "module.py",
      "line": 6,
      "reason": "Not reachable from any entry point",
      "confidence": "High"
    }
  ]
}
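
Because the report is plain JSON, findings are easy to post-process in scripts or LLM pipelines. This sketch assumes only the field names shown above:

```python
import json

# Sample report in the shape shown above (field names assumed).
report = json.loads("""
{
  "dead_code": [
    {"kind": "Method", "name": "multiply", "file": "module.py",
     "line": 6, "reason": "Not reachable from any entry point",
     "confidence": "High"}
  ]
}
""")

# Keep only high-confidence findings and format them for review.
for finding in report["dead_code"]:
    if finding["confidence"] == "High":
        print(f'{finding["file"]}:{finding["line"]} '
              f'{finding["kind"]} {finding["name"]}')
```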

Configuration

Create a .bury.json file:

{
  "entry_points": {
    "patterns": [
      "**/main.py",
      "**/test_*.py",
      "src/index.ts"
    ],
    "functions": [
      "main",
      "test_*"
    ]
  },
  "ignore": [
    "**/node_modules/**",
    "**/__pycache__/**"
  ]
}
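
Entry-point patterns are standard globs. As an illustration of how such patterns select files (not bury's exact matching semantics), using Python's pathlib:

```python
from pathlib import PurePosixPath

# Patterns from the .bury.json above; matching behavior here is
# an illustration, not bury's glob implementation.
patterns = ["**/main.py", "**/test_*.py", "src/index.ts"]

def is_entry_point(path: str) -> bool:
    p = PurePosixPath(path)
    return any(p.match(pat) for pat in patterns)

print(is_entry_point("app/main.py"))   # True
print(is_entry_point("app/utils.py"))  # False
```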

Development

Building

# Build all crates
cargo build --workspace

# Build specific crate
cargo build -p bury
cargo build -p complexity
cargo build -p neural-shared

# Run tests
cargo test --workspace

# Run clippy
cargo clippy --workspace --all-targets

Running

# Run bury
cargo run -p bury -- --help
cargo run -p bury -- ./src

# Run complexity
cargo run -p complexity

Roadmap

Phase 1 - Monorepo Migration βœ… Complete

  • Create workspace structure
  • Extract shared library (neural-shared)
  • Migrate bury to workspace
  • Create complexity placeholder
  • Generic reporter trait
  • All tests passing

Phase 2 - Bury Enhancements

  • Configuration file support
  • Cross-file analysis
  • Import/export tracking
  • Dynamic code pattern detection
  • Performance optimization (parallel processing)

Phase 3 - Complexity Analyzer

  • Implement cyclomatic complexity
  • Implement cognitive complexity
  • CLI interface
  • Reporter integration
  • Documentation

Phase 4 - Conductor Platform

  • Multi-agent orchestration system (private repo)
  • Remote execution infrastructure
  • Session management
  • Dashboard and UI
  • Integration with analysis tools
  • LLM context generation
  • AI-powered refactoring suggestions

Phase 5 - Premium Features

  • Additional languages (Java, Go, Rust, C#)
  • CI/CD integrations
  • Team dashboards
  • Historical tracking
  • Custom rules engine

Architecture Overview

Shared Library (neural-shared)

The neural-shared crate provides common functionality for all analysis tools:

  • Parser Module - Tree-sitter-based AST parsing

    • Language detection from file extensions
    • Pluggable parser architecture (Python, TypeScript)
    • Symbol extraction (definitions, usages, entry points)
  • Scanner Module - File system traversal

    • .gitignore support
    • Parallel file scanning
    • Language-specific file filtering
  • Report Module - Generic reporting framework

    • Finding trait for all analysis results
    • JSON and Markdown reporters
    • Extensible for custom formats

Tool-Specific Analyzers

Each tool (bury, complexity, etc.) implements its own analysis logic:

  • Bury - Reachability analysis for dead code detection
  • Complexity - Cyclomatic and cognitive complexity metrics (WIP)

Why a Monorepo?

  • Code Sharing - All tools share parser, scanner, and reporter code
  • Consistent Versioning - Coordinated releases across tools
  • Easier Development - Test changes across all tools simultaneously
  • Better CI/CD - Unified testing and deployment

Each tool can still be:

  • Published independently to crates.io
  • Installed separately via cargo install
  • Used as a library in other projects

Why Open Source?

  • Build a strong community
  • Enable contributions
  • Ensure transparency
  • Provide value to individual developers

Premium features (additional languages, enterprise integrations) will be available separately to support continued development.

Contributing

Contributions are welcome! See CONTRIBUTING.md for guidelines.

Areas where we need help:

  • Parser improvements (AST traversal)
  • Language support (Java, Go, Rust, C#)
  • Documentation
  • Test fixtures
  • Performance optimizations

License

Licensed under either of:

  • Apache License, Version 2.0 (LICENSE-APACHE)
  • MIT license (LICENSE-MIT)

at your option.

Inspiration

Bury was inspired by excellent tools like:

Support

Part of Neural Garage πŸ§ πŸ”§

This toolset is part of the Neural Garage suite - next-generation developer tools built for the AI era.

Neural Garage Ecosystem:

  • Analysis Tools (this repo) - Open source CLI tools (bury, complexity, etc.)
  • Conductor (private repo) - Multi-agent orchestration platform
  • Context Generator (coming soon) - LLM context optimization

Built with ❀️ and πŸ¦€ by Paolo Rechia and the Neural Garage community
