
feat: add AI / Local LLM scan category #15

Merged
Spoonman1091 merged 1 commit into trustedsec:feature/scanning-rework from bandrel:feature/ai-default-ports on Apr 22, 2026

Conversation

@bandrel (Contributor) commented on Apr 22, 2026

Closes #14

Summary

  • New AI / Local LLM entry in SERVICE_CATEGORIES covering 9 ports used by Ollama, LM Studio, llama.cpp, text-generation-webui, vLLM, Jan, Open WebUI, KoboldCpp, and Tabby
  • Five AI-specific ports (11434, 1234, 7860, 5001, 1337) added to EXTERNAL_SENSITIVE_PORTS at HIGH severity; ports already covered by existing Web findings rules (8000, 8080, 5000, 3000) were not duplicated
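The two changes might look roughly like the sketch below. The constant names come from the PR; the dict shapes, the per-port tool attributions for the shared web ports, and the module layout are assumptions, not the actual implementation:

```python
# Hypothetical sketch of the PR's two data-structure changes; the real
# scanner's dict shapes and module layout may differ.

# New category entry: the label shown in the interactive prompt mapped
# to the 9 ports it scans.
SERVICE_CATEGORIES = {
    "AI / Local LLM": [
        11434,  # Ollama API
        1234,   # LM Studio API
        7860,   # text-generation-webui (Gradio default)
        5001,   # KoboldCpp
        1337,   # Jan
        8000,   # shared web port (e.g. vLLM); covered by Web rules
        8080,   # shared web port; covered by Web rules
        5000,   # shared web port; covered by Web rules
        3000,   # shared web port; covered by Web rules
    ],
}

# Only the five ports not already flagged by existing Web findings
# rules are added as HIGH-severity external findings.
EXTERNAL_SENSITIVE_PORTS = {
    11434: "HIGH",
    1234: "HIGH",
    7860: "HIGH",
    5001: "HIGH",
    1337: "HIGH",
}
```

The split avoids duplicate findings: the category drives port selection for scanning, while the severity map only flags the ports that uniquely indicate a local LLM service.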

Test plan

  • uv run pytest tests/ — all existing tests pass
  • AI / Local LLM appears in the interactive category selection prompt
  • scan_categories: ["AI / Local LLM"] in config.json targets the correct ports
  • findings.txt flags 11434, 1234, 7860, 5001, and 1337 as HIGH on an external scan result
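For the third check, the config.json override would be a minimal fragment along these lines (any other keys the scanner reads from config.json are unchanged and omitted here):

```json
{
  "scan_categories": ["AI / Local LLM"]
}
```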

🤖 Generated with Claude Code

Adds Ollama, LM Studio, text-generation-webui, KoboldCpp, and Jan to
EXTERNAL_SENSITIVE_PORTS as HIGH severity, and introduces a new
'AI / Local LLM' SERVICE_CATEGORIES entry covering common LLM API/UI ports.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
@bandrel bandrel changed the base branch from main to feature/scanning-rework April 22, 2026 17:27
@Spoonman1091 Spoonman1091 merged commit 138c8e5 into trustedsec:feature/scanning-rework Apr 22, 2026