AI-powered code analysis tools built in Rust
Part of the Neural Garage suite of developer tools for the AI era.
This monorepo contains multiple code analysis tools:
Bury the dead code before it haunts your codebase!
Finds unused code in your Python and TypeScript projects using reachability analysis.
- 🚀 Blazingly Fast - Written in Rust with parallel processing
- 🎯 Accurate - Uses reachability analysis, not simple pattern matching
- 🌍 Multi-Language - Supports Python and TypeScript (more coming!)
- 🤖 LLM-Friendly - Outputs structured JSON perfect for AI tools
- ⚙️ Configurable - Define entry points and ignore patterns
- 📊 Multiple Output Formats - JSON, Markdown, or terminal
Analyzes code complexity using cyclomatic and cognitive complexity metrics.
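As background on the metric (the complexity tool itself is still WIP and uses tree-sitter ASTs, not this approach): cyclomatic complexity is the number of decision points in a function plus one. A toy keyword-counting sketch, purely for illustration:

```rust
/// Toy cyclomatic-complexity estimate: count branching keywords in a
/// Python snippet and add one. The real tool walks a tree-sitter AST;
/// this token scan only illustrates the metric itself.
fn cyclomatic_complexity(source: &str) -> usize {
    let branch_keywords = ["if", "elif", "for", "while", "and", "or", "except", "case"];
    let decision_points = source
        .split(|c: char| !c.is_alphanumeric() && c != '_')
        .filter(|tok| branch_keywords.contains(tok))
        .count();
    decision_points + 1
}

fn main() {
    let snippet = "def clamp(x):\n    if x < 0:\n        return 0\n    elif x > 10:\n        return 10\n    return x\n";
    // Two decision points (`if`, `elif`) + 1 = 3
    println!("{}", cyclomatic_complexity(snippet));
}
```

A straight-line function with no branches scores the minimum of 1.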
This is a Cargo workspace containing multiple crates:
```
neural-garage/tools/
├── crates/
│   ├── shared/          # Shared library (neural-shared)
│   │   ├── parser/      # AST parsing (tree-sitter)
│   │   ├── scanner/     # File discovery
│   │   └── report/      # Output generation
│   │
│   ├── bury/            # Dead code detector
│   │   ├── analyzer/    # Reachability analysis
│   │   └── cli/         # CLI interface
│   │
│   └── complexity/      # Complexity analyzer (WIP)
│       └── analyzer/    # Complexity metrics
│
├── Cargo.toml           # Workspace configuration
└── README.md
```
```bash
# Install bury
cargo install --path crates/bury

# Install complexity (when ready)
cargo install --path crates/complexity

# Or build all tools
cargo build --workspace --release
```

```bash
# Analyze current directory
bury

# Analyze specific path
bury ./src

# Output as JSON
bury --format json ./src

# Verbose mode
bury --verbose ./src
```

Bury runs a four-phase pipeline:
1. Scan - Find all source files (respecting `.gitignore`)
2. Parse - Build AST using tree-sitter for each language
3. Analyze - Perform reachability analysis from entry points
4. Report - Output dead code findings
```
Entry Points (main, tests, exports)
        ↓
Build Call Graph (function → callees)
        ↓
Mark Reachable Code (BFS/DFS traversal)
        ↓
Dead Code = Definitions - Reachable
```
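The marking step above can be sketched as a breadth-first walk over the call graph. This is a simplified illustration, not bury's actual implementation; the graph and entry points are hypothetical:

```rust
use std::collections::{HashMap, HashSet, VecDeque};

/// Toy reachability phase: walk the call graph breadth-first from the
/// entry points; every definition never visited is reported as dead.
fn find_dead_code(
    call_graph: &HashMap<&str, Vec<&str>>,
    entry_points: &[&str],
) -> Vec<String> {
    let mut reachable: HashSet<&str> = HashSet::new();
    let mut queue: VecDeque<&str> = entry_points.iter().copied().collect();

    while let Some(func) = queue.pop_front() {
        if reachable.insert(func) {
            // Enqueue every callee of a newly reached function.
            for &callee in call_graph.get(func).into_iter().flatten() {
                queue.push_back(callee);
            }
        }
    }

    // Dead code = all definitions minus the reachable set.
    let mut dead: Vec<String> = call_graph
        .keys()
        .filter(|f| !reachable.contains(*f))
        .map(|f| f.to_string())
        .collect();
    dead.sort();
    dead
}

fn main() {
    // Mirrors the Calculator example below: main calls add; multiply is never called.
    let graph = HashMap::from([
        ("main", vec!["Calculator.add"]),
        ("Calculator.add", vec![]),
        ("Calculator.multiply", vec![]),
    ]);
    println!("{:?}", find_dead_code(&graph, &["main"])); // ["Calculator.multiply"]
}
```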
```python
# module.py
class Calculator:
    def add(self, a, b):  # ✅ Used
        return a + b

    def multiply(self, a, b):  # ❌ DEAD CODE
        return a * b

def main():
    calc = Calculator()
    result = calc.add(1, 2)  # Only calls add()
```

Output:
```json
{
  "dead_code": [
    {
      "kind": "Method",
      "name": "multiply",
      "file": "module.py",
      "line": 6,
      "reason": "Not reachable from any entry point",
      "confidence": "High"
    }
  ]
}
```

To configure entry points and ignore patterns, create a `.bury.json` file:
```json
{
  "entry_points": {
    "patterns": [
      "**/main.py",
      "**/test_*.py",
      "src/index.ts"
    ],
    "functions": [
      "main",
      "test_*"
    ]
  },
  "ignore": [
    "**/node_modules/**",
    "**/__pycache__/**"
  ]
}
```

```bash
# Build all crates
cargo build --workspace

# Build specific crate
cargo build -p bury
cargo build -p complexity
cargo build -p neural-shared

# Run tests
cargo test --workspace

# Run clippy
cargo clippy --workspace --all-targets
```

```bash
# Run bury
cargo run -p bury -- --help
cargo run -p bury -- ./src

# Run complexity
cargo run -p complexity
```

- Create workspace structure
- Extract shared library (neural-shared)
- Migrate bury to workspace
- Create complexity placeholder
- Generic reporter trait
- All tests passing
- Configuration file support
- Cross-file analysis
- Import/export tracking
- Dynamic code pattern detection
- Performance optimization (parallel processing)
- Implement cyclomatic complexity
- Implement cognitive complexity
- CLI interface
- Reporter integration
- Documentation
- Multi-agent orchestration system (private repo)
- Remote execution infrastructure
- Session management
- Dashboard and UI
- Integration with analysis tools
- LLM context generation
- AI-powered refactoring suggestions
- Additional languages (Java, Go, Rust, C#)
- CI/CD integrations
- Team dashboards
- Historical tracking
- Custom rules engine
The neural-shared crate provides common functionality for all analysis tools:
- Parser Module - Tree-sitter-based AST parsing
  - Language detection from file extensions
  - Pluggable parser architecture (Python, TypeScript)
  - Symbol extraction (definitions, usages, entry points)
- Scanner Module - File system traversal
  - `.gitignore` support
  - Parallel file scanning
  - Language-specific file filtering
- Report Module - Generic reporting framework
  - `Finding` trait for all analysis results
  - JSON and Markdown reporters
  - Extensible for custom formats
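The actual neural-shared API isn't documented here; as a hypothetical sketch (all names illustrative), a shared `Finding` trait with pluggable reporters might look like this:

```rust
/// Hypothetical sketch of a shared reporting design - names are
/// illustrative, not the actual neural-shared API. Each tool's result
/// type implements a common trait, and reporters render any findings.
trait Finding {
    fn kind(&self) -> &str;
    fn name(&self) -> &str;
    fn location(&self) -> String; // e.g. "module.py:6"
}

/// Example result type a tool like bury might produce.
struct DeadCode {
    name: String,
    file: String,
    line: usize,
}

impl Finding for DeadCode {
    fn kind(&self) -> &str { "Method" }
    fn name(&self) -> &str { &self.name }
    fn location(&self) -> String { format!("{}:{}", self.file, self.line) }
}

/// A Markdown reporter works for every tool because it only sees the trait.
fn markdown_report(findings: &[&dyn Finding]) -> String {
    findings
        .iter()
        .map(|f| format!("- **{}** `{}` at {}", f.kind(), f.name(), f.location()))
        .collect::<Vec<_>>()
        .join("\n")
}

fn main() {
    let dead = DeadCode { name: "multiply".into(), file: "module.py".into(), line: 6 };
    println!("{}", markdown_report(&[&dead]));
    // - **Method** `multiply` at module.py:6
}
```

Keeping reporters behind a trait object is what lets new output formats be added without touching any tool's analysis code.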
Each tool (bury, complexity, etc.) implements its own analysis logic:
- Bury - Reachability analysis for dead code detection
- Complexity - Cyclomatic and cognitive complexity metrics (WIP)
- Code Sharing - All tools share parser, scanner, and reporter code
- Consistent Versioning - Coordinated releases across tools
- Easier Development - Test changes across all tools simultaneously
- Better CI/CD - Unified testing and deployment
Each tool can still be:
- Published independently to crates.io
- Installed separately via `cargo install`
- Used as a library in other projects
The core tools are open source to:
- Build a strong community
- Enable contributions
- Ensure transparency
- Provide value to individual developers
Premium features (additional languages, enterprise integrations) will be available separately to support continued development.
Contributions are welcome! See CONTRIBUTING.md for guidelines.
Areas where we need help:
- Parser improvements (AST traversal)
- Language support (Java, Go, Rust, C#)
- Documentation
- Test fixtures
- Performance optimizations
Licensed under either of:
- Apache License, Version 2.0 (LICENSE-APACHE or http://www.apache.org/licenses/LICENSE-2.0)
- MIT license (LICENSE-MIT or http://opensource.org/licenses/MIT)
at your option.
Bury was inspired by excellent tools like:
- Knip - TypeScript dead code finder
- Vulture - Python dead code finder
- cargo-udeps - Rust unused dependencies
- 🐛 Report bugs
- 💡 Request features
- 💬 Discussions
This toolset is part of the Neural Garage suite - next-generation developer tools built for the AI era.
Neural Garage Ecosystem:
- Analysis Tools (this repo) - Open source CLI tools (bury, complexity, etc.)
- Conductor (private repo) - Multi-agent orchestration platform
- Context Generator (coming soon) - LLM context optimization
Built with ❤️ and 🦀 by Paolo Rechia and the Neural Garage community