Summary
Implement LLM Proxy Wrapper Mode per spec features-edge/019-llm-proxy-wrapper/spec.md.
This makes Fairvisor a drop-in LLM endpoint: clients point their SDK at Fairvisor instead of api.openai.com. A composite Bearer token, `CLIENT_JWT:UPSTREAM_KEY`, carries both the client identity (for enforcement) and the upstream API key (for forwarding).
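A minimal sketch of how the composite token could be split, assuming the first `:` is the delimiter (a JWT's base64url segments never contain `:`, so this is unambiguous). The function name and error strings are illustrative, not the actual wrapper.lua API:

```lua
-- Hypothetical parser for a composite "Bearer CLIENT_JWT:UPSTREAM_KEY" header.
-- Splits at the first ":" (safe, since JWTs contain no colons) and returns
-- the two halves, or nil plus an error string.
local function parse_composite_bearer(header)
  local token = header and header:match("^Bearer%s+(.+)$")
  if not token then
    return nil, "missing bearer token"
  end
  local jwt, upstream_key = token:match("^([^:]+):(.+)$")
  if not jwt then
    return nil, "token is not CLIENT_JWT:UPSTREAM_KEY"
  end
  return { client_jwt = jwt, upstream_key = upstream_key }
end
```

The JWT half would then feed the enforcement phase, while the upstream key is re-injected on the forwarded request.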
Scope
- `src/fairvisor/wrapper.lua` - composite bearer parsing, provider registry (13 providers), auth injection, provider-native error bodies, streaming cutoff formats
- `src/nginx/wrapper_access.lua` - pre-flight enforcement phase handler
- `src/nginx/wrapper_body_filter.lua` - SSE body filter with provider-native cutoff
- `docker/nginx.conf.template` - `GATEWAY_MODE=wrapper` server block
- Busted unit tests + Cucumber BDD scenarios
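When enforcement cuts a stream off mid-response, the body filter has to close it in the provider's native SSE framing so client SDKs terminate cleanly. A sketch under stated assumptions: the frame shapes below follow the providers' public streaming formats (OpenAI-compatible streams end with a `data: [DONE]` sentinel, Anthropic streams with a `message_stop` event), and the function name is illustrative:

```lua
-- Hypothetical provider-native stream-termination frames; the exact frames
-- emitted by wrapper_body_filter.lua may differ.
local cutoff_frames = {
  openai    = "data: [DONE]\n\n",
  anthropic = 'event: message_stop\ndata: {"type":"message_stop"}\n\n',
}

local function cutoff_frame(provider)
  -- OpenAI-compatible providers (Groq, Mistral, DeepSeek, ...) share the
  -- [DONE] sentinel, so fall back to it for unlisted providers.
  return cutoff_frames[provider] or cutoff_frames.openai
end
```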
Provider Registry (13)
OpenAI, Anthropic, Gemini native, Gemini OpenAI-compat, Grok, Groq, Mistral, DeepSeek, Perplexity, Together AI, Fireworks AI, Cerebras, Ollama
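The registry could map each provider to its upstream host and auth-injection style. A hedged sketch for a few entries (hosts and header names reflect the providers' public documentation; the table schema and function below are assumptions, not the real wrapper.lua layout):

```lua
-- Hypothetical provider registry excerpt. OpenAI-style APIs take a Bearer
-- Authorization header, Anthropic takes x-api-key, Gemini native takes
-- x-goog-api-key, and local Ollama is typically unauthenticated.
local providers = {
  openai    = { host = "api.openai.com",                    auth = "bearer" },
  anthropic = { host = "api.anthropic.com",                 auth = "x-api-key" },
  gemini    = { host = "generativelanguage.googleapis.com", auth = "x-goog-api-key" },
  groq      = { host = "api.groq.com",                      auth = "bearer" },
  ollama    = { host = "localhost:11434",                   auth = "none" },
}

-- Return the header name/value pair to inject upstream, or nil for none.
local function auth_header(provider, upstream_key)
  local p = providers[provider]
  if not p or p.auth == "none" then
    return nil
  end
  if p.auth == "bearer" then
    return "Authorization", "Bearer " .. upstream_key
  end
  return p.auth, upstream_key
end
```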
Spec
/srv/agentshare/fairvisor-vault/features-edge/019-llm-proxy-wrapper/spec.md