
[Refactor] Overhaul Spark adapter with model catalog and dynamic configuration#765

Merged
dingyi222666 merged 3 commits into v1-dev from fix/spark-adapter on Mar 8, 2026

Conversation

@dingyi222666
Member

Summary

This PR refactors the Spark adapter with a centralized model catalog system and dynamic configuration management. Models are now only shown when their API passwords are properly configured.

New Features

  • Model Catalog System: Introduce SparkModelDefinition interface for centralized model configuration with support for all Spark models (lite, pro, pro-128k, max, max-32k, 4.0-ultra, x1.5, x2)
  • Dynamic Model Filtering: Models only appear in the list when their API passwords are configured
  • Model Aliases Support: Flexible password configuration with multiple aliases per model for better user experience
  • Improved API Routing: Better support for different Spark API versions (v1, v2, x2 endpoints)
  • Load Balancing: Request config selection with round-robin load balancing across available configurations
  • Enhanced Utilities: New getSparkModelPassword, hasSparkModelPassword, and getSparkModelDefinition utilities
  • Documentation: Added comprehensive usage documentation for Spark, Gemini, OpenAI, and OpenAI-like adapters
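The catalog-driven filtering described above can be sketched roughly as follows. This is a minimal illustration, not the adapter's actual code: the field names `httpModel`, `apiPath`, and `maxTokens` come from this PR, while the `aliases` field shape, the example entries, and `listAvailableModels` are assumptions.

```typescript
// Minimal sketch of catalog-driven model filtering. Field names follow the
// PR description where given (httpModel, apiPath, maxTokens); the entries
// and the aliases shape are illustrative assumptions.
interface SparkModelDefinition {
    model: string       // canonical model id, e.g. 'lite' or '4.0-ultra'
    httpModel: string   // model name sent in the request body
    apiPath: string     // endpoint path for the model's API version
    maxTokens: number
    aliases: string[]   // accepted password keys for this model
}

const sparkModelCatalog: SparkModelDefinition[] = [
    {
        model: 'lite',
        httpModel: 'lite',
        apiPath: '/v1/chat/completions',
        maxTokens: 4096,
        aliases: ['lite', 'spark-lite']
    },
    {
        model: '4.0-ultra',
        httpModel: '4.0Ultra',
        apiPath: '/v1/chat/completions',
        maxTokens: 8192,
        aliases: ['4.0-ultra', 'ultra']
    }
]

// A model appears in the list only when a password is configured under
// one of its aliases.
function listAvailableModels(passwords: Record<string, string>): string[] {
    return sparkModelCatalog
        .filter((def) => def.aliases.some((a) => (passwords[a] ?? '').length > 0))
        .map((def) => def.model)
}
```

With this shape, an empty configuration yields an empty model list, which is the "models only appear when configured" behaviour the PR describes.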

Bug Fixes

  • Fixed message role mapping (proper user/tool distinction in chat messages)
  • Improved error handling for missing passwords and invalid models
  • Better handling of chat completion tool parameters
  • Enhanced response format handling with thinking and JSON response support
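The role-mapping fix can be illustrated with a small standalone sketch. The real helper in `utils.ts` is `messageTypeSparkAIRole`; this version only mirrors the corrected behaviour and is not the adapter's implementation.

```typescript
// Sketch of the corrected role mapping: 'function' and 'tool' message types
// previously fell back to 'user'; after the fix they map to 'tool'.
type SparkRole = 'user' | 'assistant' | 'system' | 'tool'

function messageTypeToSparkRole(type: string): SparkRole {
    switch (type) {
        case 'ai':
            return 'assistant'
        case 'system':
            return 'system'
        case 'function':
        case 'tool':
            return 'tool' // previously (and incorrectly) 'user'
        default:
            return 'user'
    }
}
```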

Other Changes

  • Updated adapter package versions (spark 1.3.7, gemini 1.3.29, openai 1.3.9, openai-like 1.3.10)
  • Improved locale strings for clearer setup instructions (Chinese and English)
  • Enhanced SparkClient to use dynamic model filtering
  • Better SparkRequester implementation with config selection logic
  • Simplified model refresh logic with password-based filtering
  • Support for additional request parameters (top_k, keep_alive, presence_penalty, frequency_penalty, etc.)

refactor(adapter): overhaul Spark adapter with model catalog and dynamic configuration

This commit significantly improves the Spark adapter with a centralized model
catalog system and dynamic configuration management. Models are now only shown
when their API passwords are properly configured.

Key improvements:
- Introduce SparkModelDefinition interface for centralized model configuration
- Create sparkModelCatalog with all Spark models and their properties
- Implement dynamic model filtering based on configured passwords
- Support model aliases for flexible password configuration
- Add comprehensive getSparkModelPassword and hasSparkModelPassword utilities
- Improve API path routing for different Spark API versions
- Enhance request config selection with load balancing across configurations
- Better error handling for missing passwords and invalid models
- Update locale strings for clearer setup instructions in Chinese and English

Other improvements:
- Add usage documentation for Gemini, OpenAI, and OpenAI-like adapters
- Simplify model refresh logic in SparkClient
- Improve message role mapping (user/tool distinction)
- Add support for additional request parameters (top_k, keep_alive, etc.)
- Enhance response format handling with thinking and JSON response support
- Better handling of chat completion tool parameters

Benefits:
- Cleaner configuration with prefilled model aliases
- Reduced duplication through centralized catalog
- Better API flexibility with support for multiple Spark API versions
- More intuitive user experience with only available models shown
- Improved maintainability with model definitions in one place
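The load-balanced config selection mentioned above could look roughly like this. It is a hypothetical sketch: the actual requester keeps its cursor in `_modelConfigCursor` and works with `ClientConfigWrapper` instances, neither of which is reproduced here.

```typescript
// Hypothetical sketch of round-robin load balancing across the configs
// that hold a password for the requested model.
interface SparkConfig {
    passwords: Record<string, string>
}

class ConfigSelector {
    private _cursor = 0

    constructor(private configs: SparkConfig[]) {}

    selectForModel(model: string): SparkConfig {
        const matched = this.configs.filter((c) => !!c.passwords[model])
        if (matched.length < 1) {
            // mirrors the adapter's API_KEY_UNAVAILABLE error case
            throw new Error(`API key for model "${model}" not found`)
        }
        const selected = matched[this._cursor % matched.length]
        this._cursor = (this._cursor + 1) % matched.length
        return selected
    }
}
```

Each call advances the cursor, so successive requests for the same model rotate through every configuration that can serve it.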

chore(packages): bump adapter package versions

Update package versions for all adapter packages following the refactoring
and documentation improvements made in the previous commit.

Version updates:
- adapter-spark: 1.3.6 -> 1.3.7
- adapter-gemini: 1.3.28 -> 1.3.29
- adapter-openai: 1.3.8 -> 1.3.9
- adapter-openai-like: 1.3.9 -> 1.3.10

These version bumps reflect the significant improvements made to the adapters
including the new model catalog system, dynamic configuration management, and
comprehensive documentation.
@coderabbitai
Contributor

coderabbitai Bot commented Mar 8, 2026

Caution

Review failed

The pull request is closed.


📥 Commits

Reviewing files that changed from the base of the PR and between 9fc40c4 and bc5cc20.

⛔ Files ignored due to path filters (6)
  • packages/adapter-gemini/package.json is excluded by !**/*.json
  • packages/adapter-openai-like/package.json is excluded by !**/*.json
  • packages/adapter-openai/package.json is excluded by !**/*.json
  • packages/adapter-spark/package.json is excluded by !**/*.json
  • packages/adapter-spark/src/locales/en-US.schema.yml is excluded by !**/*.yml
  • packages/adapter-spark/src/locales/zh-CN.schema.yml is excluded by !**/*.yml
📒 Files selected for processing (8)
  • packages/adapter-gemini/src/index.ts
  • packages/adapter-openai-like/src/index.ts
  • packages/adapter-openai/src/index.ts
  • packages/adapter-spark/src/client.ts
  • packages/adapter-spark/src/index.ts
  • packages/adapter-spark/src/requester.ts
  • packages/adapter-spark/src/types.ts
  • packages/adapter-spark/src/utils.ts

Walkthrough

Three adapters (Gemini, OpenAI-like, OpenAI) gained usage documentation constants. The Spark adapter underwent a major refactor: the hardcoded model list was replaced with a model-definition catalog, model-aware configuration selection logic was implemented, and the type system was extended to support new request fields and tool functionality.

Changes

Cohort / File(s) Summary
Usage documentation constant exports
packages/adapter-gemini/src/index.ts, packages/adapter-openai-like/src/index.ts, packages/adapter-openai/src/index.ts
Added an exported usage constant to each of the three adapters, containing configuration notes and usage-guide documentation.
Spark model definitions and utilities
packages/adapter-spark/src/utils.ts
Added the SparkModelDefinition interface and the sparkModelCatalog array, providing model metadata (httpModel, apiPath, maxTokens, capabilities). Added four public helper functions for model-definition resolution and key management, replacing the hardcoded model mapping.
Spark type-system extensions
packages/adapter-spark/src/types.ts
Extended ChatCompletionRequest with user, top_k, keep_alive, penalty, tool_choice, response_format, and other fields. Made ChatCompletionResponse fields optional; added code, message, sid, and status fields; added support for tool calls and reasoning content.
Spark client refactor
packages/adapter-spark/src/client.ts
Updated the plugin parameter type signature of SparkClient; replaced the hardcoded rawModels list with catalog-driven methods that dynamically filter and build model entries based on key availability.
Spark adapter configuration and documentation
packages/adapter-spark/src/index.ts
Extended the appConfigs validation logic to require an enabled flag or key availability, updated schema default values, and added a new exported usage constant.
Spark requester core logic
packages/adapter-spark/src/requester.ts
Updated type signatures to accept a Config generic parameter. Added _modelConfigCursor and _requestConfig internal state to track model-configuration rotation. Implemented three new private methods (_selectConfigForModel, _getRequestConfig, _getModelDefinition) for model-aware configuration selection. Extended the request-building path with the new fields and tool support, and improved response handling and logging.

Sequence Diagram

sequenceDiagram
    participant Client as Client Request
    participant Requester as SparkRequester
    participant ConfigSelector as _selectConfigForModel
    participant ModelDef as _getModelDefinition
    participant ModelCatalog as SparkModelCatalog
    participant APIClient as Spark API

    Client->>Requester: completion(model, messages)
    Requester->>ConfigSelector: _selectConfigForModel(model)
    ConfigSelector->>ModelCatalog: look up model definition
    ModelCatalog-->>ConfigSelector: SparkModelDefinition
    ConfigSelector->>ConfigSelector: check key availability<br/>round-robin config selection
    ConfigSelector-->>Requester: ClientConfigWrapper

    Requester->>ModelDef: _getModelDefinition(model)
    ModelDef->>ModelCatalog: resolve model metadata
    ModelCatalog-->>ModelDef: apiPath, maxTokens, capabilities
    ModelDef-->>Requester: SparkModelDefinition

    Requester->>Requester: build request payload<br/>add new fields, tools, penalties
    Requester->>Requester: adjust HTTP path per model definition
    Requester->>APIClient: POST request
    APIClient-->>Requester: response stream
    Requester->>Requester: parse, log, handle tool calls
    Requester-->>Client: completion result

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~60 minutes


Poem

🐰 The model catalog neatly arrayed, key rotation and configs deftly made,
Tool calls stand ready at the gate, the Spark flame burns the brighter.
Three adapters sing as one, docs point clearly where to run~ ✨


Comment thread packages/adapter-spark/src/types.ts Outdated
@gemini-code-assist
Contributor

Summary of Changes

Hello, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request significantly refactors the Spark adapter by implementing a robust model catalog system and dynamic configuration management. The changes streamline how models are defined, configured, and accessed, ensuring that only properly credentialed models are available. This overhaul not only enhances the adapter's flexibility and reliability but also improves the overall user experience through better error handling, expanded API parameter support, and clearer documentation across multiple adapters.

Highlights

  • Centralized Model Catalog and Dynamic Configuration: Introduced a SparkModelDefinition interface and sparkModelCatalog for all Spark models, enabling centralized configuration and dynamic filtering. Models are now only displayed and enabled if their API passwords are properly configured, improving clarity and security.
  • Enhanced Spark Adapter Functionality: The Spark adapter now supports model aliases for flexible password configuration, improved API routing for different Spark API versions (v1, v2, x2 endpoints), and round-robin load balancing across available configurations for better request handling.
  • Improved API Request Parameters and Error Handling: Added support for a wider range of request parameters including user, top_k, keep_alive, presence_penalty, frequency_penalty, tool_choice, tool_calls_switch, response_format, and thinking. Error handling for missing passwords, invalid models, and streaming responses has also been significantly improved.
  • Updated Documentation and Locale Strings: Comprehensive usage documentation has been added for Spark, Gemini, OpenAI, and OpenAI-like adapters, including detailed instructions in Chinese for API key configuration and model alias mapping. Locale strings for the Spark adapter have been updated for clearer setup instructions in both English and Chinese.
  • Message Role Mapping Correction: Fixed an issue where function and tool message roles were incorrectly mapped to user, ensuring proper distinction in chat messages.
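As a concrete illustration of the expanded parameter surface, a request body might carry fields like the following. Only the field names are taken from this PR; the values and the exact value shapes (e.g. of `response_format`) are assumptions.

```typescript
// Illustrative request payload exercising the newly supported parameters.
// Field names come from the PR description; values are made up.
const request = {
    model: '4.0-ultra',
    messages: [{ role: 'user', content: 'Hello, Spark!' }],
    user: 'session-123',
    top_k: 4,
    keep_alive: true,
    presence_penalty: 0.2,
    frequency_penalty: 0.2,
    tool_choice: 'auto',
    response_format: { type: 'json_object' }
}
```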


Changelog
  • packages/adapter-gemini/package.json
    • Updated package version to 1.3.29.
  • packages/adapter-gemini/src/index.ts
    • Added Chinese usage instructions for Gemini adapter API key configuration.
  • packages/adapter-openai-like/package.json
    • Updated package version to 1.3.10.
  • packages/adapter-openai-like/src/index.ts
    • Added Chinese usage instructions for OpenAI-like adapter API key configuration.
  • packages/adapter-openai/package.json
    • Updated package version to 1.3.9.
  • packages/adapter-openai/src/index.ts
    • Added Chinese usage instructions for OpenAI adapter API key configuration.
  • packages/adapter-spark/package.json
    • Updated package version to 1.3.7.
  • packages/adapter-spark/src/client.ts
    • Removed unused ModelCapabilities import.
    • Updated ChatLunaPlugin type definition.
    • Refactored refreshModels to dynamically filter models based on configured API passwords using the new model catalog.
  • packages/adapter-spark/src/index.ts
    • Imported new utility functions for model catalog and password handling.
    • Modified configuration parsing to filter appConfigs based on the presence of model API passwords.
    • Updated Config schema for appConfigs to pre-fill with default Spark model configurations.
    • Added detailed Chinese usage instructions for the Spark adapter, including a model alias table and upgrade notes.
  • packages/adapter-spark/src/locales/en-US.schema.yml
    • Updated description for appConfigs to clarify model-password mapping and pre-filled models.
    • Added key and value descriptions for appConfigs table entries.
    • Added temperature parameter description.
  • packages/adapter-spark/src/locales/zh-CN.schema.yml
    • Updated Chinese description for appConfigs to clarify model-password mapping and pre-filled models.
    • Added Chinese key and value descriptions for appConfigs table entries.
    • Added Chinese temperature parameter description.
  • packages/adapter-spark/src/requester.ts
    • Imported ClientConfigWrapper and deepAssign.
    • Updated SparkRequester to support dynamic configuration selection and load balancing.
    • Modified _post method to accept a more specific request body and model parameter.
    • Refactored _buildHeaders to use the new getSparkModelPassword utility.
    • Implemented _getModelDefinition, _getRequestConfig, and _selectConfigForModel for centralized model lookup and configuration management.
    • Expanded _call method to support new request parameters and improved error handling for streaming responses.
    • Ensured request configuration is reset after each call.
  • packages/adapter-spark/src/types.ts
    • Expanded ChatCompletionRequest interface with new parameters like user, top_k, keep_alive, presence_penalty, frequency_penalty, tool_choice, tool_calls_switch, response_format, and thinking.
    • Extended ChatCompletionResponse with optional error-related fields (code, message, sid, status).
    • Added tool_call_id to ChatCompletionDelta and index to ToolCall.
    • Included reasoning_content in ChatCompletionMessage.
    • Defined new types: ChatCompletionToolChoice, ChatCompletionResponseFormat, ChatCompletionThinking.
  • packages/adapter-spark/src/utils.ts
    • Imported ModelCapabilities.
    • Adjusted langchainMessageToSparkMessage to conditionally set name for assistant or tool roles.
    • Corrected messageTypeSparkAIRole mapping for function and tool types to return 'tool'.
    • Replaced the static modelMapping object with a sparkModelCatalog array of SparkModelDefinition objects, centralizing model metadata.
    • Introduced defaultSparkAppConfig for pre-populating model configurations.
    • Added new utility functions: getSparkModelDefinition, getSparkModelConfigAliases, getSparkModelPassword, hasSparkModelPassword, and humanizeSparkAlias for robust model and password management.
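The alias-based password lookup listed above (`getSparkModelPassword`) presumably walks a model's aliases and returns the first configured value. A standalone sketch follows; the signature and behaviour are assumptions, not the adapter's actual code.

```typescript
// Hypothetical sketch of alias-based password lookup: the first alias with
// a non-empty configured value wins; undefined means the model stays hidden.
function getSparkModelPassword(
    appConfigs: Record<string, string>,
    aliases: string[]
): string | undefined {
    for (const alias of aliases) {
        const password = appConfigs[alias]
        if (password && password.length > 0) {
            return password
        }
    }
    return undefined
}
```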

@dingyi222666 dingyi222666 marked this pull request as ready for review March 8, 2026 18:07
@dingyi222666 dingyi222666 merged commit fa8e63c into v1-dev Mar 8, 2026
2 checks passed
@dingyi222666 dingyi222666 deleted the fix/spark-adapter branch March 8, 2026 18:08
Contributor

@gemini-code-assist gemini-code-assist Bot left a comment


Code Review

This pull request introduces a significant and well-executed refactoring of the Spark adapter, featuring a centralized model catalog that greatly enhances maintainability and extensibility. The dynamic model filtering based on password configuration is a clever and user-friendly addition. The PR also provides valuable documentation updates for several adapters and crucial bug fixes, such as correcting the message role mapping. My feedback is limited to a couple of minor points regarding an affiliate link in the documentation and a hardcoded non-English string in an error message. Overall, these are excellent changes that substantially improve the adapter's architecture.


**If you don't have a usable Gemini-format API, register at the following address:**

[https://api.bltcy.ai/register](https://api.bltcy.ai/register?aff=ec5e312997)
Contributor


medium

The new usage documentation includes a hardcoded affiliate link. While this might be intentional, it's generally better to avoid embedding affiliate links directly in the source code to maintain neutrality and avoid potential user trust issues. Consider removing the affiliate query parameter (?aff=...) or making it configurable if this is part of a partnership.

Suggested change
[https://api.bltcy.ai/register](https://api.bltcy.ai/register?aff=ec5e312997)
[https://api.bltcy.ai/register](https://api.bltcy.ai/register)


**If you don't have a usable OpenAI-format API, register at the following address:**

[https://api.bltcy.ai/register](https://api.bltcy.ai/register?aff=ec5e312997)
Contributor


medium

The new usage documentation includes a hardcoded affiliate link. While this might be intentional, it's generally better to avoid embedding affiliate links directly in the source code to maintain neutrality and avoid potential user trust issues. Consider removing the affiliate query parameter (?aff=...) or making it configurable if this is part of a partnership.

Suggested change
[https://api.bltcy.ai/register](https://api.bltcy.ai/register?aff=ec5e312997)
[https://api.bltcy.ai/register](https://api.bltcy.ai/register)


**If you don't have a usable OpenAI-format API, register at the following address:**

[https://api.bltcy.ai/register](https://api.bltcy.ai/register?aff=ec5e312997)
Contributor


medium

The new usage documentation includes a hardcoded affiliate link. While this might be intentional, it's generally better to avoid embedding affiliate links directly in the source code to maintain neutrality and avoid potential user trust issues. Consider removing the affiliate query parameter (?aff=...) or making it configurable if this is part of a partnership.

Suggested change
[https://api.bltcy.ai/register](https://api.bltcy.ai/register?aff=ec5e312997)
[https://api.bltcy.ai/register](https://api.bltcy.ai/register)

if (matchedConfigs.length < 1) {
    throw new ChatLunaError(
        ChatLunaErrorCode.API_KEY_UNAVAILABLE,
        new Error(`没有找到模型 "${model}" 的 API 密钥`)
Contributor


medium

This error message is hardcoded in Chinese. For better maintainability and to make the codebase more accessible to a wider range of contributors, it's recommended to use English for internal error messages. If localization is desired, these strings should be moved to locale files and referenced via keys.

Suggested change
new Error(`没有找到模型 "${model}" 的 API 密钥`)
new Error(`API key for model "${model}" not found`)

dingyi222666 added a commit that referenced this pull request Mar 10, 2026
refactor(adapter): overhaul Spark adapter with model catalog and dynamic configuration (#765)

* refactor(adapter): overhaul Spark adapter with model catalog and dynamic configuration

This commit significantly improves the Spark adapter with a centralized model
catalog system and dynamic configuration management. Models are now only shown
when their API passwords are properly configured.

Key improvements:
- Introduce SparkModelDefinition interface for centralized model configuration
- Create sparkModelCatalog with all Spark models and their properties
- Implement dynamic model filtering based on configured passwords
- Support model aliases for flexible password configuration
- Add comprehensive getSparkModelPassword and hasSparkModelPassword utilities
- Improve API path routing for different Spark API versions
- Enhance request config selection with load balancing across configurations
- Better error handling for missing passwords and invalid models
- Update locale strings for clearer setup instructions in Chinese and English

Other improvements:
- Add usage documentation for Gemini, OpenAI, and OpenAI-like adapters
- Simplify model refresh logic in SparkClient
- Improve message role mapping (user/tool distinction)
- Add support for additional request parameters (top_k, keep_alive, etc.)
- Enhance response format handling with thinking and JSON response support
- Better handling of chat completion tool parameters

Benefits:
- Cleaner configuration with prefilled model aliases
- Reduced duplication through centralized catalog
- Better API flexibility with support for multiple Spark API versions
- More intuitive user experience with only available models shown
- Improved maintainability with model definitions in one place

* chore(packages): bump adapter package versions

Update package versions for all adapter packages following the refactoring
and documentation improvements made in the previous commit.

Version updates:
- adapter-spark: 1.3.6 -> 1.3.7
- adapter-gemini: 1.3.28 -> 1.3.29
- adapter-openai: 1.3.8 -> 1.3.9
- adapter-openai-like: 1.3.9 -> 1.3.10

These version bumps reflect the significant improvements made to the adapters
including the new model catalog system, dynamic configuration management, and
comprehensive documentation.

* Update packages/adapter-spark/src/types.ts

Co-authored-by: codefactor-io[bot] <47775046+codefactor-io[bot]@users.noreply.github.com>

---------

Co-authored-by: codefactor-io[bot] <47775046+codefactor-io[bot]@users.noreply.github.com>