
BAML playground doesn't send Host: request header as overrides #1856

@jchum

Description

  1. In clients.baml I have this configuration:
client<llm> CustomLlama31_405B {
  provider "openai-generic"
  options {
    default_role "user"
    model "meta/llama-3.1-405b-instruct"
    api_key ""  // No API key needed for internal endpoint
    base_url "http://localhost:9494/v1"
    headers {
      "Host" "llama3.dgx-k8s.production"
      "Content-Type" "application/json"
    }
    temperature 0.8
    max_tokens 3000
  }
}
  2. When I click the "Run ..." button on the prompt in the BAML Playground in VSCode, I receive an error. When I copy the generated curl command and paste it into a terminal, it works fine.

The raw output suggests that the Host request header override isn't being passed when I click the "Run ..." button.
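For reference, a request roughly like the one below works from the terminal when the Host header is sent explicitly. This is a sketch based on the client configuration above and the openai-generic provider's chat-completions endpoint; the exact path, payload, and prompt body in the playground-generated curl may differ:

  # Assumed reconstruction of the working terminal request (endpoint path and
  # message body are illustrative, not copied from the playground output)
  curl http://localhost:9494/v1/chat/completions \
    -H "Host: llama3.dgx-k8s.production" \
    -H "Content-Type: application/json" \
    -d '{
      "model": "meta/llama-3.1-405b-instruct",
      "messages": [{"role": "user", "content": "Hello"}],
      "temperature": 0.8,
      "max_tokens": 3000
    }'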
