
feat: add Novita AI as a native LLM provider#4939

Open
Alex-wuhu wants to merge 1 commit into crewAIInc:main from Alex-wuhu:feat/add-novita-ai-provider

Conversation


@Alex-wuhu Alex-wuhu commented Mar 18, 2026

Summary

  • Adds Novita AI as a native LLM provider via its OpenAI-compatible endpoint (https://api.novita.ai/openai)
  • New NovitaCompletion class that extends OpenAICompletion — minimal code, inherits all chat completions, streaming, tool-calling, and structured output support
  • Registers novita in the provider routing (SUPPORTED_NATIVE_PROVIDERS, provider_mapping, _get_native_provider, etc.)
  • Supported models: moonshotai/kimi-k2.5 (default), zai-org/glm-5, minimax/minimax-m2.5

Usage

from crewai import LLM

# With provider prefix
llm = LLM(model="novita/moonshotai/kimi-k2.5")

# Direct vendor/model ID (auto-detected)
llm = LLM(model="moonshotai/kimi-k2.5")

# Explicit provider kwarg
llm = LLM(model="zai-org/glm-5", provider="novita")

The API key is read from the NOVITA_API_KEY environment variable or passed via the api_key constructor parameter.

Design decisions

  • Extends OpenAICompletion instead of duplicating code — Novita's API is fully OpenAI-compatible
  • Forces api="completions" — the Responses API is OpenAI-specific and not supported by Novita
  • Vendor-prefixed model IDs (e.g. moonshotai/kimi-k2.5) are auto-detected via NOVITA_MODELS lookup before the generic /-split logic kicks in, so users don't need the novita/ prefix
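The auto-detection order described above can be sketched as follows. This is a simplified illustration of the routing logic, not the actual crewAI source; the function name `resolve_provider` is hypothetical.

```python
# Models served by Novita, keyed by their vendor-prefixed IDs.
NOVITA_MODELS = {"moonshotai/kimi-k2.5", "zai-org/glm-5", "minimax/minimax-m2.5"}

def resolve_provider(model: str) -> tuple[str, str]:
    """Return (provider, model_id) for a model string (illustrative sketch)."""
    # 1. An explicit "novita/" prefix always wins.
    if model.startswith("novita/"):
        return "novita", model[len("novita/"):]
    # 2. Vendor-prefixed IDs are checked against NOVITA_MODELS *before*
    #    the generic "/"-split, so "moonshotai/kimi-k2.5" routes to Novita
    #    without a "novita/" prefix.
    if model in NOVITA_MODELS:
        return "novita", model
    # 3. Fallback: generic "provider/model" split for everything else.
    provider, _, rest = model.partition("/")
    return provider, rest
```

The key design point is step 2 happening before step 3: without it, `moonshotai/kimi-k2.5` would be split into provider `moonshotai`, which crewAI does not know.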

Test plan

  • Verify LLM(model="novita/moonshotai/kimi-k2.5") routes to NovitaCompletion
  • Verify LLM(model="moonshotai/kimi-k2.5") auto-detects Novita provider
  • Verify NOVITA_API_KEY env var is picked up correctly
  • Verify chat completion and tool calling work end-to-end with a Novita API key
  • Verify existing providers are unaffected (no changes to existing provider files)

Note

Medium Risk
Introduces a new native provider and modifies LLM provider inference/routing, which could change how some model strings are resolved and instantiated at runtime. Low algorithmic complexity, but affects core LLM initialization paths and requires correct API key/base URL configuration.

Overview
Adds Novita AI as a native LLM provider backed by Novita’s OpenAI-compatible endpoint.

Updates LLM provider routing to recognize novita (including vendor-prefixed model IDs like moonshotai/kimi-k2.5), adds NOVITA_MODELS constants, and registers context window sizes for the supported Novita models.

Introduces NovitaCompletion, a thin wrapper over OpenAICompletion that defaults the base URL to https://api.novita.ai/openai, requires NOVITA_API_KEY, and forces use of the OpenAI Completions API.
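The wrapper described above can be approximated like this. The stub base class stands in for crewAI's real OpenAICompletion, and the constructor signature is an assumption for illustration; only the base URL, the NOVITA_API_KEY requirement, and the forced Completions API come from the PR description.

```python
import os

class OpenAICompletion:
    """Stand-in stub for crewAI's OpenAICompletion (not the real class)."""
    def __init__(self, model, api_key=None, base_url=None, api="responses", **kwargs):
        self.model = model
        self.api_key = api_key
        self.base_url = base_url
        self.api = api

class NovitaCompletion(OpenAICompletion):
    """Sketch of the thin wrapper: defaults the Novita base URL,
    requires an API key, and forces the Completions API."""
    def __init__(self, model="moonshotai/kimi-k2.5", api_key=None,
                 base_url=None, **kwargs):
        key = api_key or os.environ.get("NOVITA_API_KEY")
        if not key:
            raise ValueError("NOVITA_API_KEY is required for the Novita provider")
        super().__init__(
            model=model,
            api_key=key,
            base_url=base_url or "https://api.novita.ai/openai",
            api="completions",  # Novita does not support OpenAI's Responses API
            **kwargs,
        )
```

Because everything else is inherited, the subclass stays minimal: chat completions, streaming, tool calling, and structured output all come from the OpenAI-compatible base implementation.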

Written by Cursor Bugbot for commit 2bef65c.

Add native support for Novita AI's OpenAI-compatible API
(https://api.novita.ai/openai) as a first-class LLM provider.

- New NovitaCompletion class extending OpenAICompletion
- API key via NOVITA_API_KEY env var or constructor param
- Supported models: moonshotai/kimi-k2.5 (default), zai-org/glm-5,
  minimax/minimax-m2.5
- Auto-detection of vendor/model IDs (e.g. "moonshotai/kimi-k2.5")
  routes to Novita without requiring "novita/" prefix

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

@cursor Bot left a comment


Cursor Bugbot has reviewed your changes and found 1 potential issue.


if self.client_params:
    client_params.update(self.client_params)

return client_params


Novita models get wrong context window size

Medium Severity

NovitaCompletion inherits get_context_window_size() from OpenAICompletion, which only checks OpenAI-specific model prefixes (e.g. gpt-4, o1). Since Novita models like moonshotai/kimi-k2.5 don't match any OpenAI prefix, the method returns the default ~6,963 tokens instead of the intended ~111,411 tokens (131072 × 0.85). The entries added to LLM_CONTEXT_WINDOW_SIZES are dead code here — that dict is only consumed by the LLM (litellm) class's override, not by native provider classes. Every other native provider (Anthropic, Gemini, Bedrock) overrides get_context_window_size() with its own model-specific windows; NovitaCompletion needs the same.
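The fix the reviewer asks for would look roughly like this: an override with Novita-specific windows, applying the same usage ratio. Only the 131072-token window for kimi-k2.5 and the 0.85 ratio come from the comment above; the windows for the other two models and all names are assumptions for illustration.

```python
# Hypothetical sketch of the override NovitaCompletion needs,
# mirroring what other native providers (Anthropic, Gemini, Bedrock) do.
NOVITA_CONTEXT_WINDOWS = {
    "moonshotai/kimi-k2.5": 131072,  # per the review comment
    "zai-org/glm-5": 131072,         # assumed, not confirmed by the PR
    "minimax/minimax-m2.5": 131072,  # assumed, not confirmed by the PR
}
CONTEXT_WINDOW_USAGE_RATIO = 0.85   # 131072 * 0.85 -> ~111,411 usable tokens
DEFAULT_CONTEXT_WINDOW = 8192       # 8192 * 0.85 -> the ~6,963 default above

def get_context_window_size(model: str) -> int:
    """Return the usable context window for a Novita model (sketch)."""
    window = NOVITA_CONTEXT_WINDOWS.get(model, DEFAULT_CONTEXT_WINDOW)
    return int(window * CONTEXT_WINDOW_USAGE_RATIO)
```

With this lookup in place, the inherited OpenAI prefix matching is never consulted for Novita models, so they no longer fall through to the small default.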

Additional Locations (1)


github-actions Bot commented May 3, 2026

This PR is stale because it has been open for 45 days with no activity.
