
openrouter? #17

@stuartcrobinson

Description


Feels like maybe a stupid question, but does your project support OpenRouter? Or is the intended path to use the OpenAI service for it, like the explanation below? That seems reasonable, but I wanted to double-check since I didn't see it mentioned anywhere and OpenRouter seems to be getting popular lately.

Everything else in this message is Claude output:


Yes! OpenRouter is fully OpenAI-compatible, so you should be able to use it with LLM.js by treating it as an OpenAI endpoint.

OpenRouter implements the OpenAI API specification for /completions and /chat/completions endpoints, allowing you to use any model with the same request/response format.

OpenRouter markets itself as "fully OpenAI compatible", advertising that the "OpenAI SDK works out of the box".
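
That claim is easy to check with the official openai Node SDK, which only needs a baseURL override to talk to OpenRouter. A minimal sketch (the model id is just an illustration):

import OpenAI from "openai"

// Point the official OpenAI SDK at OpenRouter's OpenAI-compatible endpoint
const client = new OpenAI({
  baseURL: "https://openrouter.ai/api/v1",
  apiKey: process.env.OPENROUTER_API_KEY,  // an OpenRouter key, not an OpenAI one
})

const completion = await client.chat.completions.create({
  model: "anthropic/claude-3.5-sonnet",  // any model id OpenRouter exposes
  messages: [{ role: "user", content: "hello" }],
})

console.log(completion.choices[0].message.content)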

So you should be able to do this with LLM.js:

import LLM from "@themaximalist/llm.js"

// Use OpenRouter as if it were OpenAI
const response = await LLM("hello", {
  service: "openai",  // Tell LLM.js to use OpenAI format
  apiKey: "sk-or-...",  // Your OpenRouter API key
  baseURL: "https://openrouter.ai/api/v1",  // Override the base URL
  model: "anthropic/claude-3.5-sonnet"  // Any OpenRouter model
})

The key is that when using OpenRouter, you just need to set the baseURL to "https://openrouter.ai/api/v1" and use your OpenRouter API key.
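
Independent of LLM.js, a raw request against the chat completions endpoint is a quick way to sanity-check the key and base URL, since the payload is the standard OpenAI format (a sketch; assumes Node 18+ for global fetch):

// Minimal sanity check against OpenRouter's OpenAI-compatible endpoint
const res = await fetch("https://openrouter.ai/api/v1/chat/completions", {
  method: "POST",
  headers: {
    "Authorization": `Bearer ${process.env.OPENROUTER_API_KEY}`,
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    model: "anthropic/claude-3.5-sonnet",
    messages: [{ role: "user", content: "hello" }],
  }),
})
const data = await res.json()
console.log(data.choices[0].message.content)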

This means:

  1. Your users with OpenRouter keys can use any of the 100+ models through OpenRouter
  2. Your users with direct vendor keys can use those specific vendors
  3. You don't need to build special OpenRouter support - just document how to configure it (a sketch follows below)
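
As a sketch of point 3, assuming LLM.js accepts the same service/apiKey/baseURL options used in the example above (those option names come from that example and aren't verified against the library), the app-level switch might look like:

import LLM from "@themaximalist/llm.js"

// Hypothetical helper: route a prompt through OpenRouter or a direct vendor
// key, depending on which one the user supplied.
async function ask(prompt, keys) {
  if (keys.openrouter) {
    return LLM(prompt, {
      service: "openai",                        // OpenAI-compatible wire format
      apiKey: keys.openrouter,                  // sk-or-... key
      baseURL: "https://openrouter.ai/api/v1",
      model: "anthropic/claude-3.5-sonnet",     // any OpenRouter model id
    })
  }
  // Fall back to a direct vendor key, e.g. OpenAI
  return LLM(prompt, { service: "openai", apiKey: keys.openai, model: "gpt-4o" })
}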

Test this approach first to confirm it works with LLM.js, but based on OpenRouter's OpenAI compatibility, it should work seamlessly!
