
Support Gate proxy authentication via ANTHROPIC_API_KEY and ANTHROPIC_BASE_URL #1096

Merged

git-hyagi merged 1 commit into pulp:main from git-hyagi:pulp-service-agent-splunk-gcloud-credentials on Apr 23, 2026

Conversation

@git-hyagi
Collaborator

@git-hyagi git-hyagi commented Apr 23, 2026

…_BASE_URL

When running inside Alcove, Gate proxies all LLM traffic and injects real credentials. This change lets agent-splunk use that flow by reading ANTHROPIC_API_KEY and ANTHROPIC_BASE_URL to send requests through Gate instead of authenticating directly to Vertex AI.

Also adds CLAUDE_MODEL env var to control the default model, and VERTEX_SA_JSON for direct Vertex AI auth via service account JSON.

Three auth modes are now supported:

  • Proxy: ANTHROPIC_API_KEY + ANTHROPIC_BASE_URL (Alcove/Gate)
  • Service account: VERTEX_SA_JSON (direct Vertex AI)
  • ADC: ANTHROPIC_VERTEX_PROJECT_ID (original behavior)

Assisted By: claude-opus-4.6

Summary by Sourcery

Add support for proxy-based Anthropic authentication in agent-splunk alongside existing Vertex AI modes and make the default Claude model configurable via environment variables.

New Features:

  • Support proxy-based Anthropic authentication via ANTHROPIC_API_KEY and ANTHROPIC_BASE_URL in agent-splunk.
  • Allow configuring the default Claude model through the CLAUDE_MODEL environment variable.

Enhancements:

  • Extend Claude model configuration to choose between proxy and Vertex AI-backed clients at runtime based on environment variables.
  • Relax mandatory Vertex AI project ID requirements when running in proxy mode and improve startup auth-mode logging.

@sourcery-ai
Contributor

sourcery-ai Bot commented Apr 23, 2026

Reviewer's Guide

Adds proxy-based Anthropic auth support (via ANTHROPIC_API_KEY/ANTHROPIC_BASE_URL), expands Vertex AI auth options, and introduces a configurable default Claude model for the Splunk agent.

Sequence diagram for Claude ModelRun auth path selection

sequenceDiagram
    participant Caller
    participant Claude
    participant AnthropicSDK as AnthropicSDK
    participant VertexClient as VertexClient
    participant GateProxy as GateProxy
    participant AnthropicAPI as AnthropicAPI

    Caller->>Claude: ModelRun(ctx, cfg)
    Claude->>Claude: Build options with WithModel
    alt Proxy mode (APIKey present)
        Claude->>AnthropicSDK: New(WithModel, WithToken(APIKey), optional WithBaseURL(BaseURL))
        AnthropicSDK-->>Claude: llm client
        Claude->>GateProxy: llm request (BaseURL)
        GateProxy->>AnthropicAPI: Forward request with real credentials
        AnthropicAPI-->>GateProxy: Response
        GateProxy-->>Claude: Proxied response
    else Vertex AI mode (no APIKey)
        Claude->>VertexClient: Init VertexClient(ProjectID, Region, Model, SACredential)
        Claude->>AnthropicSDK: New(WithModel, WithToken(vertex-ai), WithHTTPClient(VertexClient))
        AnthropicSDK-->>Claude: llm client
        Claude->>VertexClient: llm request
        VertexClient->>AnthropicAPI: Vertex AI routed request
        AnthropicAPI-->>VertexClient: Response
        VertexClient-->>Claude: Response
    end
    Claude-->>Caller: Answer string
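In proxy mode, the request the agent sends to Gate might look roughly like this. The `/v1/messages` path and the `x-api-key` header follow the public Anthropic HTTP API convention; Gate's actual contract is not shown in this PR, so treat this as an assumption:

```go
package main

import (
	"net/http"
	"strings"
)

// newProxyRequest builds a Gate-bound request: the agent talks to
// baseURL using the injected ANTHROPIC_API_KEY, and Gate swaps in the
// real credentials before forwarding upstream.
func newProxyRequest(baseURL, apiKey, body string) (*http.Request, error) {
	req, err := http.NewRequest(http.MethodPost, baseURL+"/v1/messages", strings.NewReader(body))
	if err != nil {
		return nil, err
	}
	req.Header.Set("x-api-key", apiKey)
	req.Header.Set("content-type", "application/json")
	return req, nil
}
```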

Updated class diagram for Claude model configuration

classDiagram
    class Claude {
        string Model
        string ProjectID
        string Region
        byte[] SACredential
        string APIKey
        string BaseURL
        ModelRun(ctx context.Context, cfg RunConfig) string
    }

Flow diagram for auth mode resolution in main.run

flowchart TD
    A[Start run] --> B[Read ANTHROPIC_API_KEY into apiKey]
    B --> C[Read ANTHROPIC_BASE_URL into baseURL]
    C --> D[Read VERTEX_SA_JSON into saJSON]
    D --> E{saJSON is non empty?}
    E -- Yes --> F[Set saCredential from saJSON and log using VERTEX_SA_JSON]
    E -- No --> G[saCredential remains empty]
    F --> H[Read ANTHROPIC_VERTEX_PROJECT_ID into projectID]
    G --> H
    H --> I{projectID is empty and saCredential present?}
    I -- Yes --> J[Parse saCredential as JSON and extract project_id into projectID]
    I -- No --> K[Keep existing projectID]
    J --> L{apiKey is empty and projectID is empty?}
    K --> L
    L -- Yes --> M[[Error: require ANTHROPIC_API_KEY or ANTHROPIC_VERTEX_PROJECT_ID / VERTEX_SA_JSON]]
    L -- No --> N{apiKey is non empty?}
    N -- Yes --> O[Log using proxy mode with ANTHROPIC_BASE_URL]
    N -- No --> P[Proceed with Vertex AI mode]
    O --> Q[Read CLOUD_ML_REGION or default us-east5]
    P --> Q
    Q --> R[Resolve defaultModel from CLAUDE_MODEL env or claude-opus-4-6]
    R --> S[Parse flags including -model]
    S --> T[Construct Claude with ProjectID, Region, SACredential, APIKey, BaseURL]
    T --> U[Continue agent execution]
    M --> U

File-Level Changes

Change | Details | Files
Add configurable Anthropic client initialization supporting proxy and Vertex-backed modes.
  • Extend Claude struct to carry API key and base URL used for Anthropic auth.
  • Refactor client construction to build an options slice and conditionally configure proxy mode when an API key is present.
  • Retain Vertex AI-specific HTTP client and token-based flow as the fallback when no API key is supplied.
tools/agents/agent-splunk/models/anthropic.go
Introduce multiple authentication modes and configurable default model in the Splunk agent entrypoint.
  • Read ANTHROPIC_API_KEY and ANTHROPIC_BASE_URL from the environment to enable proxy mode and log which auth mode is used.
  • Add VERTEX_SA_JSON handling for service-account-based Vertex AI auth and relax the project ID requirement when using proxy mode.
  • Improve startup validation to require either proxy credentials or Vertex AI configuration before running.
  • Add CLAUDE_MODEL env support to override the default Claude model used when the --model flag is omitted.
  • Plumb the Anthropic API key and base URL into the Claude model configuration so downstream calls use the selected auth mode.
tools/agents/agent-splunk/main.go
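The options-slice refactor described above follows the functional-options pattern. This sketch uses a hypothetical local `clientOpt` type standing in for the SDK's options; the real code builds the slice for the SDK client, and its Vertex fallback configures an HTTP client rather than the placeholder token shown here:

```go
package main

type client struct {
	model, token, baseURL string
}

// clientOpt stands in for the SDK's functional option type.
type clientOpt func(*client)

func withModel(m string) clientOpt   { return func(c *client) { c.model = m } }
func withToken(t string) clientOpt   { return func(c *client) { c.token = t } }
func withBaseURL(u string) clientOpt { return func(c *client) { c.baseURL = u } }

// newClient starts with the model option, appends proxy options only
// when an API key is present, and otherwise falls back to the
// Vertex-backed configuration (simplified to a sentinel token here).
func newClient(model, apiKey, baseURL string) *client {
	opts := []clientOpt{withModel(model)}
	if apiKey != "" {
		opts = append(opts, withToken(apiKey))
		if baseURL != "" {
			opts = append(opts, withBaseURL(baseURL))
		}
	} else {
		opts = append(opts, withToken("vertex-ai"))
	}
	c := &client{}
	for _, o := range opts {
		o(c)
	}
	return c
}
```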

Tips and commands

Interacting with Sourcery

  • Trigger a new review: Comment @sourcery-ai review on the pull request.
  • Continue discussions: Reply directly to Sourcery's review comments.
  • Generate a GitHub issue from a review comment: Ask Sourcery to create an
    issue from a review comment by replying to it. You can also reply to a
    review comment with @sourcery-ai issue to create an issue from it.
  • Generate a pull request title: Write @sourcery-ai anywhere in the pull
    request title to generate a title at any time. You can also comment
    @sourcery-ai title on the pull request to (re-)generate the title at any time.
  • Generate a pull request summary: Write @sourcery-ai summary anywhere in
    the pull request body to generate a PR summary at any time exactly where you
    want it. You can also comment @sourcery-ai summary on the pull request to
    (re-)generate the summary at any time.
  • Generate reviewer's guide: Comment @sourcery-ai guide on the pull
    request to (re-)generate the reviewer's guide at any time.
  • Resolve all Sourcery comments: Comment @sourcery-ai resolve on the
    pull request to resolve all Sourcery comments. Useful if you've already
    addressed all the comments and don't want to see them anymore.
  • Dismiss all Sourcery reviews: Comment @sourcery-ai dismiss on the pull
    request to dismiss all existing Sourcery reviews. Especially useful if you
    want to start fresh with a new review - don't forget to comment
    @sourcery-ai review to trigger a new review!

Customizing Your Experience

Access your dashboard to:

  • Enable or disable review features such as the Sourcery-generated pull request
    summary, the reviewer's guide, and others.
  • Change the review language.
  • Add, remove or edit custom review instructions.
  • Adjust other review settings.

Getting Help

Contributor

@sourcery-ai sourcery-ai Bot left a comment


Hey - I've found 1 issue and left some high-level feedback:

  • When ANTHROPIC_API_KEY is set but ANTHROPIC_BASE_URL is empty, the code silently falls back to the SDK default base URL while still claiming "proxy mode" in the log; consider validating that ANTHROPIC_BASE_URL is non-empty in proxy mode (or clearly defining the intended behavior in that case).
  • The -model flag help string still hardcodes claude-opus-4-6 as the default even though CLAUDE_MODEL can change it at runtime; consider formatting the help text using the computed defaultModel so the CLI output matches actual behavior.
Prompt for AI Agents
Please address the comments from this code review:

## Overall Comments
- When `ANTHROPIC_API_KEY` is set but `ANTHROPIC_BASE_URL` is empty, the code silently falls back to the SDK default base URL while still claiming "proxy mode" in the log; consider validating that `ANTHROPIC_BASE_URL` is non-empty in proxy mode (or clearly defining the intended behavior in that case).
- The `-model` flag help string still hardcodes `claude-opus-4-6` as the default even though `CLAUDE_MODEL` can change it at runtime; consider formatting the help text using the computed `defaultModel` so the CLI output matches actual behavior.

## Individual Comments

### Comment 1
<location path="tools/agents/agent-splunk/main.go" line_range="79" />
<code_context>
+		defaultModel = envModel
+	}
+
+	inputModel := flag.String("model", defaultModel, "Define the model (claude-opus-4-6,gemini-2.5-pro). Default: claude-opus-4-6")
 	inputQuestion := flag.String("question", "", "Question to ask the model")
 	flag.Parse()
</code_context>
<issue_to_address>
**suggestion:** Flag help text no longer matches the dynamic default model behavior

The description still hardcodes `Default: claude-opus-4-6`, but the actual default can change via `CLAUDE_MODEL`. Please either remove the hardcoded default from the help text or interpolate `defaultModel` so the help remains accurate when the env var is set.

```suggestion
	inputModel := flag.String("model", defaultModel, "Define the model (claude-opus-4-6,gemini-2.5-pro). Default: "+defaultModel)
```
</issue_to_address>

Sourcery is free for open source - if you like our reviews please consider sharing them ✨
Help me be more useful! Please click 👍 or 👎 on each comment and I'll use the feedback to improve your reviews.

@git-hyagi git-hyagi force-pushed the pulp-service-agent-splunk-gcloud-credentials branch from 2301d7d to 64e3999 Compare April 23, 2026 15:41
@git-hyagi git-hyagi merged commit ffffb0b into pulp:main Apr 23, 2026
4 checks passed
@git-hyagi git-hyagi deleted the pulp-service-agent-splunk-gcloud-credentials branch April 23, 2026 16:29
