Feature 019: LLM Proxy Wrapper Mode #39

@levleontiev

Summary

Implement LLM Proxy Wrapper Mode per the spec at features-edge/019-llm-proxy-wrapper/spec.md.

Makes Fairvisor a drop-in LLM endpoint: clients point their SDK at Fairvisor instead of api.openai.com. A composite Bearer token of the form `CLIENT_JWT:UPSTREAM_KEY` carries both the client's identity (for policy enforcement) and the upstream provider key (for forwarding).
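The composite-token idea can be sketched as follows. This is an illustrative Python sketch only (the real parser lives in src/fairvisor/wrapper.lua); the function name and error messages are hypothetical. The one detail that matters is splitting on the *first* colon, since a JWT contains dots but no colons, while an upstream key may itself contain colons.

```python
def parse_composite_bearer(auth_header: str) -> tuple[str, str]:
    """Split 'Bearer CLIENT_JWT:UPSTREAM_KEY' into (client_jwt, upstream_key).

    Hypothetical sketch of the wrapper's composite-bearer parsing.
    Split on the FIRST colon only: JWTs never contain colons, but an
    upstream key might, so everything after the first colon is the key.
    """
    scheme, _, token = auth_header.partition(" ")
    if scheme.lower() != "bearer" or not token:
        raise ValueError("expected an Authorization: Bearer header")
    client_jwt, sep, upstream_key = token.partition(":")
    if not sep or not client_jwt or not upstream_key:
        raise ValueError("expected CLIENT_JWT:UPSTREAM_KEY composite token")
    return client_jwt, upstream_key
```

The JWT half feeds the enforcement phase; the key half is injected into the upstream request and never logged.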

Scope

  • src/fairvisor/wrapper.lua — composite bearer parsing, provider registry (13 providers), auth injection, provider-native error bodies, streaming cutoff formats
  • src/nginx/wrapper_access.lua — pre-flight enforcement phase handler
  • src/nginx/wrapper_body_filter.lua — SSE body filter with provider-native cutoff
  • docker/nginx.conf.template — GATEWAY_MODE=wrapper server block
  • Busted unit tests + Cucumber BDD scenarios
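The "provider-native cutoff" in the body filter means that when enforcement terminates a stream mid-flight, the injected final event must match the format the client's SDK already expects. A hedged Python sketch, with only two of the thirteen formats shown and the error `type` strings chosen as assumptions (the real filter is src/nginx/wrapper_body_filter.lua):

```python
import json


def make_cutoff_event(provider: str, message: str) -> str:
    """Build a provider-native SSE tail announcing a mid-stream cutoff.

    Illustrative only: event shapes here are assumptions, not the
    wrapper's actual payloads.
    """
    if provider == "anthropic":
        # Anthropic streams typed events (event: <type> / data: <json>).
        payload = json.dumps(
            {"type": "error", "error": {"type": "cutoff", "message": message}}
        )
        return f"event: error\ndata: {payload}\n\n"
    # OpenAI-compatible streams carry plain data: lines and end with
    # the data: [DONE] sentinel.
    payload = json.dumps({"error": {"message": message, "type": "cutoff"}})
    return f"data: {payload}\n\ndata: [DONE]\n\n"
```

Emitting a well-formed tail (rather than abruptly closing the connection) lets streaming SDK clients surface the enforcement error instead of a generic network failure.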

Provider Registry (13)

OpenAI, Anthropic, Gemini native, Gemini OpenAI-compat, Grok, Groq, Mistral, DeepSeek, Perplexity, Together AI, Fireworks AI, Cerebras, Ollama
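These providers differ mainly in where the upstream key goes: most OpenAI-compatible APIs take `Authorization: Bearer`, Anthropic takes an `x-api-key` header, and Gemini's native API accepts `x-goog-api-key`. A minimal sketch of what a registry entry and auth injection could look like, in Python rather than the actual Lua, with hosts and field names as assumptions:

```python
# Hypothetical shape of the provider registry; only four of the
# thirteen entries are shown, and hosts are assumptions.
REGISTRY = {
    "openai":    {"host": "api.openai.com",                    "auth": "bearer"},
    "anthropic": {"host": "api.anthropic.com",                 "auth": "x-api-key"},
    "gemini":    {"host": "generativelanguage.googleapis.com", "auth": "x-goog-api-key"},
    "groq":      {"host": "api.groq.com",                      "auth": "bearer"},
}


def inject_auth(provider: str, upstream_key: str) -> dict[str, str]:
    """Return the header(s) to attach when forwarding upstream."""
    entry = REGISTRY[provider]
    if entry["auth"] == "bearer":
        return {"Authorization": f"Bearer {upstream_key}"}
    # Header-named auth styles carry the key verbatim.
    return {entry["auth"]: upstream_key}
```

Keeping auth style as data in the registry means adding a fourteenth provider is a table entry, not new branching logic.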

Spec

/srv/agentshare/fairvisor-vault/features-edge/019-llm-proxy-wrapper/spec.md
