15 changes: 15 additions & 0 deletions .env.example
@@ -0,0 +1,15 @@
# Spark History Server MCP Configuration

# MCP Server Settings
MCP_PORT=18888
MCP_DEBUG=false

# Spark Authentication (Optional)
# SPARK_USERNAME=your_spark_username
# SPARK_PASSWORD=your_spark_password
# SPARK_TOKEN=your_spark_token

# Example for production:
# SPARK_USERNAME=prod_user
# SPARK_PASSWORD=secure_password_here
# SPARK_TOKEN=jwt_token_here
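
For orientation, a minimal sketch of how these variables might be read on the Python side. The variable names and defaults come from this file; the `load_mcp_settings` helper and its return shape are illustrative assumptions, not the project's actual loader.

```python
import os


def load_mcp_settings() -> dict:
    """Hypothetical loader for the variables defined in .env.example."""
    return {
        # MCP server settings (defaults mirror the values above)
        "port": int(os.getenv("MCP_PORT", "18888")),
        "debug": os.getenv("MCP_DEBUG", "false").lower() == "true",
        # Optional Spark authentication; None when unset
        "spark_username": os.getenv("SPARK_USERNAME"),
        "spark_password": os.getenv("SPARK_PASSWORD"),
        "spark_token": os.getenv("SPARK_TOKEN"),
    }


if __name__ == "__main__":
    print(load_mcp_settings())
```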
54 changes: 54 additions & 0 deletions .github/pull_request_template.md
@@ -0,0 +1,54 @@
# 🔄 Pull Request

## 📝 Description
Brief description of changes and motivation.

## 🎯 Type of Change
<!-- Mark with [x] -->
- [ ] 🐛 Bug fix (non-breaking change that fixes an issue)
- [ ] ✨ New feature (non-breaking change that adds functionality)
- [ ] 💥 Breaking change (fix or feature that would cause existing functionality to not work as expected)
- [ ] 📖 Documentation update
- [ ] 🧪 Test improvement
- [ ] 🔧 Refactoring (no functional changes)

## 🧪 Testing
<!-- Describe how you tested your changes -->
- [ ] ✅ All existing tests pass (`uv run pytest`)
- [ ] 🔬 Tested with MCP Inspector
- [ ] 📊 Tested with sample Spark data
- [ ] 🚀 Tested with real Spark History Server (if applicable)

### 🔬 Test Commands Run
```bash
# Example:
# uv run pytest test_tools.py -v
# npx @modelcontextprotocol/inspector uv run main.py
```

## 🛠️ New Tools Added (if applicable)
<!-- For new MCP tools -->
- **Tool Name**: `new_tool_name`
- **Purpose**: What it does
- **Usage**: Example parameters

## 📸 Screenshots (if applicable)
<!-- For UI changes or new tools, include MCP Inspector screenshots -->

## ✅ Checklist
- [ ] 🔍 Code follows project style guidelines
- [ ] 🧪 Added tests for new functionality
- [ ] 📖 Updated documentation (README, TESTING.md, etc.)
- [ ] 🔧 Pre-commit hooks pass
- [ ] 📝 Added entry to CHANGELOG.md (if significant change)

## 📚 Related Issues
<!-- Link any related issues -->
Fixes #(issue number)
Related to #(issue number)

## 🤔 Additional Context
<!-- Add any additional context, screenshots, or notes -->

---
**🎉 Thank you for contributing!** Your effort helps make Spark monitoring more intelligent.
2 changes: 1 addition & 1 deletion .github/workflows/build.yaml
@@ -21,4 +21,4 @@ jobs:
enable-cache: true

- name: Run pre-commit
run: uv run pre-commit run --all-files --show-diff-on-failure
run: uv run pre-commit run --all-files --show-diff-on-failure
139 changes: 139 additions & 0 deletions .github/workflows/ci.yml
@@ -0,0 +1,139 @@
name: CI

on:
push:
branches: [ main, develop ]
pull_request:
branches: [ main ]

jobs:
build:
name: Code Quality Checks
runs-on: ubuntu-latest

steps:
- name: Checkout code
uses: actions/checkout@v4

- name: Install uv
uses: astral-sh/setup-uv@v2

- name: Set up Python
run: uv python install 3.12

- name: Install dependencies
run: uv sync --group dev

- name: Install pre-commit
run: uv add --group dev pre-commit

- name: Run pre-commit
run: uv run pre-commit run --all-files --show-diff-on-failure

test:
runs-on: ubuntu-latest
strategy:
matrix:
python-version: ["3.12"]

steps:
- name: Checkout code
uses: actions/checkout@v4

- name: Install uv
uses: astral-sh/setup-uv@v2

- name: Set up Python ${{ matrix.python-version }}
run: uv python install ${{ matrix.python-version }}

- name: Install dependencies
run: uv sync --group dev

- name: Lint with ruff
run: uv run ruff check .

# TODO: Re-enable mypy after fixing type annotations
# - name: Type check with mypy
# run: uv run mypy *.py --ignore-missing-imports

- name: Test with pytest
run: uv run pytest --cov=. --cov-report=xml --cov-report=term-missing

- name: Upload coverage to Codecov
uses: codecov/codecov-action@v3
if: success()

integration:
runs-on: ubuntu-latest
needs: test

steps:
- name: Checkout code
uses: actions/checkout@v4

- name: Install uv
uses: astral-sh/setup-uv@v2

- name: Set up Python
run: uv python install 3.12

- name: Install dependencies
run: uv sync

- name: Setup test configuration
run: |
# Ensure config.yaml exists and is properly configured for CI
if [ ! -f config.yaml ]; then
echo "Creating default config.yaml for CI"
cat > config.yaml << EOF
servers:
default:
default: true
url: "http://localhost:18080"
EOF
fi

- name: Verify test data
run: |
echo "Verifying test data structure..."
ls -la examples/basic/
ls -la examples/basic/events/
cat examples/basic/history-server.conf

- name: Start Spark History Server
run: |
echo "Starting Spark History Server with Docker..."
docker run -d \
--name spark-history-server \
-v $(pwd)/examples/basic:/mnt/data \
-p 18080:18080 \
docker.io/apache/spark:3.5.5 \
/opt/java/openjdk/bin/java \
-cp '/opt/spark/conf:/opt/spark/jars/*' \
-Xmx1g \
org.apache.spark.deploy.history.HistoryServer \
--properties-file /mnt/data/history-server.conf

- name: Wait for Spark History Server
run: |
timeout 60 bash -c 'until curl -f http://localhost:18080; do sleep 2; done'

- name: Test MCP Server startup
run: |
# Test import structure
uv run python -c "import app; print('✓ App imports successfully')"
uv run python -c "import main; print('✓ Main imports successfully')"

# Test MCP server can start (brief startup test)
timeout 10 uv run python main.py &
SERVER_PID=$!
sleep 5
kill $SERVER_PID 2>/dev/null || true
echo "✓ MCP Server startup test completed"

- name: Cleanup
if: always()
run: |
echo "Cleaning up Docker containers..."
docker stop spark-history-server 2>/dev/null || true
docker rm spark-history-server 2>/dev/null || true
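
As a companion to the "Wait for Spark History Server" and startup-test steps above, here is a minimal standalone sketch that polls the History Server on the same port and then lists applications via its REST API. It assumes only the standard `/api/v1/applications` endpoint; it is an illustrative check, not part of the workflow itself.

```python
import json
import time
import urllib.request

# Port published by the docker run step in the workflow above
HISTORY_SERVER_URL = "http://localhost:18080"


def wait_for_history_server(timeout_s: int = 60) -> list:
    """Poll the History Server REST API until it responds, then return the application list."""
    deadline = time.time() + timeout_s
    while time.time() < deadline:
        try:
            with urllib.request.urlopen(f"{HISTORY_SERVER_URL}/api/v1/applications") as resp:
                return json.load(resp)
        except OSError:
            time.sleep(2)  # server not up yet; retry, mirroring the curl loop in the workflow
    raise TimeoutError(f"History Server did not respond within {timeout_s}s")


if __name__ == "__main__":
    apps = wait_for_history_server()
    print(f"✓ History Server is up with {len(apps)} application(s)")
```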