update wenxin llm #2929

Merged 7 commits on Mar 27, 2024
Changes from 2 commits
```diff
@@ -0,0 +1,37 @@
+model: ernie-3.5-8k
+label:
+  en_US: Ernie-3.5-8K
+model_type: llm
+features:
+  - agent-thought
+model_properties:
+  mode: chat
+  context_size: 4096
+parameter_rules:
+  - name: temperature
+    use_template: temperature
+    min: 0.1
+    max: 1.0
+    default: 0.8
+  - name: top_p
+    use_template: top_p
+  - name: max_output_tokens
+    use_template: max_output_tokens
+    default: 1024
+    min: 2
+    max: 2048
+  - name: penalty_score
+    use_template: penalty_score
+  - name: frequency_penalty
+    use_template: frequency_penalty
+  - name: response_format
+    use_template: response_format
+  - name: disable_search
+    label:
+      zh_Hans: 禁用搜索
+      en_US: Disable Search
+    type: boolean
+    help:
+      zh_Hans: 禁用模型自行进行外部搜索。
+      en_US: Disable the model to perform external search.
+    required: false
```
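Each entry in `parameter_rules` above declares an optional `default` plus `min`/`max` bounds for one request parameter. A minimal sketch of how such rules could be applied to user-supplied values — `apply_rules` and `RULES` are hypothetical illustrations, not Dify's actual runtime code:

```python
# Sketch of applying parameter_rules like those in the YAML above: fill
# defaults for missing parameters and clamp values into [min, max].
# Helper and variable names here are hypothetical, not Dify's own.
RULES = [
    {"name": "temperature", "min": 0.1, "max": 1.0, "default": 0.8},
    {"name": "max_output_tokens", "min": 2, "max": 2048, "default": 1024},
]

def apply_rules(rules, params):
    """Return params with defaults filled in and values clamped to bounds."""
    out = {}
    for rule in rules:
        value = params.get(rule["name"], rule.get("default"))
        if value is not None:
            if "min" in rule:
                value = max(value, rule["min"])
            if "max" in rule:
                value = min(value, rule["max"])
        out[rule["name"]] = value
    return out

print(apply_rules(RULES, {"temperature": 1.5}))
# → {'temperature': 1.0, 'max_output_tokens': 1024}
```

An out-of-range `temperature` of 1.5 is clamped to the rule's `max` of 1.0, and the unset `max_output_tokens` falls back to its declared default.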
```diff
@@ -1,12 +1,12 @@
-model: ernie-bot-8k
+model: ernie-3.5-8k-0205
 label:
-  en_US: Ernie Bot 8k
+  en_US: Ernie-3.5-8K-0205
 model_type: llm
 features:
   - agent-thought
 model_properties:
   mode: chat
-  context_size: 8000
+  context_size: 8192
 parameter_rules:
   - name: temperature
     use_template: temperature
@@ -15,14 +15,13 @@ parameter_rules:
     default: 0.8
   - name: top_p
     use_template: top_p
-  - name: max_tokens
-    use_template: max_tokens
-    required: true
+  - name: max_output_tokens
+    use_template: max_output_tokens
     default: 1024
     min: 1
     max: 8000
-  - name: presence_penalty
-    use_template: presence_penalty
+  - name: penalty_score
+    use_template: penalty_score
   - name: frequency_penalty
     use_template: frequency_penalty
   - name: response_format
```
```diff
@@ -0,0 +1,37 @@
+model: ernie-3.5-8k-1222
+label:
+  en_US: Ernie-3.5-8K-1222
+model_type: llm
+features:
+  - agent-thought
+model_properties:
+  mode: chat
+  context_size: 8192
+parameter_rules:
+  - name: temperature
+    use_template: temperature
+    min: 0.1
+    max: 1.0
+    default: 0.8
+  - name: top_p
+    use_template: top_p
+  - name: max_output_tokens
+    use_template: max_output_tokens
+    default: 1024
+    min: 1
+    max: 8000
+  - name: penalty_score
+    use_template: penalty_score
+  - name: frequency_penalty
+    use_template: frequency_penalty
+  - name: response_format
+    use_template: response_format
+  - name: disable_search
+    label:
+      zh_Hans: 禁用搜索
+      en_US: Disable Search
+    type: boolean
+    help:
+      zh_Hans: 禁用模型自行进行外部搜索。
+      en_US: Disable the model to perform external search.
+    required: false
```
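The `disable_search` rule above is a plain boolean forwarded to the chat request. A sketch of merging such optional flags into a request body while omitting anything left unset — `build_payload` is a hypothetical helper, not Dify's actual implementation:

```python
# Sketch of assembling a chat request payload from optional parameters
# such as disable_search. build_payload is a hypothetical helper name.
def build_payload(messages, **options):
    """Start from the required fields, then add only the options that were set."""
    payload = {"messages": messages}
    for key, value in options.items():
        if value is not None:  # unset optional parameters are omitted entirely
            payload[key] = value
    return payload

payload = build_payload(
    [{"role": "user", "content": "hello"}],
    temperature=0.8,
    disable_search=True,
    top_p=None,  # unset: should not appear in the payload
)
print(payload)
```

Dropping unset keys (rather than sending nulls) keeps the request aligned with the server's own defaults for parameters the user never touched.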
```diff
@@ -1,12 +1,12 @@
-model: ernie-bot-4
+model: ernie-3.5-8k
 label:
-  en_US: Ernie Bot 4
+  en_US: Ernie-3.5-8K
 model_type: llm
 features:
   - agent-thought
 model_properties:
   mode: chat
-  context_size: 4800
+  context_size: 8192
 parameter_rules:
   - name: temperature
     use_template: temperature
@@ -15,14 +15,13 @@ parameter_rules:
     default: 0.8
   - name: top_p
     use_template: top_p
-  - name: max_tokens
-    use_template: max_tokens
-    required: true
-    default: 256
+  - name: max_output_tokens
+    use_template: max_output_tokens
+    default: 1024
     min: 1
-    max: 4800
-  - name: presence_penalty
-    use_template: presence_penalty
+    max: 8000
+  - name: penalty_score
+    use_template: penalty_score
   - name: frequency_penalty
     use_template: frequency_penalty
   - name: response_format
```
```diff
@@ -1,12 +1,12 @@
-model: ernie-bot
+model: ernie-4.0-8k
 label:
-  en_US: Ernie Bot
+  en_US: Ernie-4.0-8K
 model_type: llm
 features:
   - agent-thought
 model_properties:
   mode: chat
-  context_size: 4800
+  context_size: 8192
 parameter_rules:
   - name: temperature
     use_template: temperature
@@ -15,16 +15,17 @@ parameter_rules:
     default: 0.8
   - name: top_p
     use_template: top_p
-  - name: max_tokens
-    use_template: max_tokens
-    required: true
+  - name: max_output_tokens
+    use_template: max_output_tokens
     default: 256
     min: 1
     max: 4800
-  - name: presence_penalty
-    use_template: presence_penalty
+  - name: penalty_score
+    use_template: penalty_score
   - name: frequency_penalty
     use_template: frequency_penalty
+  - name: response_format
+    use_template: response_format
   - name: disable_search
     label:
       zh_Hans: 禁用搜索
@@ -34,5 +35,3 @@ parameter_rules:
       zh_Hans: 禁用模型自行进行外部搜索。
      en_US: Disable the model to perform external search.
     required: false
-  - name: response_format
-    use_template: response_format
```

This file was deleted.

```diff
@@ -0,0 +1,27 @@
+model: ernie-lite-8k-0308
+label:
+  en_US: ERNIE-Lite-8K-0308
+model_type: llm
+features:
+  - agent-thought
+model_properties:
+  mode: chat
+  context_size: 8192
+parameter_rules:
+  - name: temperature
+    use_template: temperature
+    min: 0.1
+    max: 1.0
+    default: 0.95
+  - name: top_p
+    use_template: top_p
+    min: 0
+    max: 1.0
+    default: 0.7
+  - name: max_output_tokens
+    use_template: max_output_tokens
+  - name: penalty_score
+    use_template: penalty_score
+    default: 1.0
+    min: 1.0
+    max: 2.0
```
```diff
@@ -0,0 +1,28 @@
+model: ernie-lite-8k-0922
+label:
+  en_US: ERNIE-Lite-8K-0922
+model_type: llm
+features:
+  - agent-thought
+model_properties:
+  mode: chat
+  context_size: 8192
+parameter_rules:
+  - name: temperature
+    use_template: temperature
+    min: 0.1
+    max: 1.0
+    default: 0.95
+  - name: top_p
+    use_template: top_p
+    min: 0
+    max: 1.0
+    default: 0.7
+  - name: max_output_tokens
+    use_template: max_output_tokens
+  - name: penalty_score
+    use_template: penalty_score
+    default: 1.0
+    min: 1.0
+    max: 2.0
```

```diff
@@ -0,0 +1,27 @@
+model: ernie-speed-128k
+label:
+  en_US: ERNIE-Speed-128K
+model_type: llm
+features:
+  - agent-thought
+model_properties:
+  mode: chat
+  context_size: 11200
+parameter_rules:
+  - name: temperature
+    use_template: temperature
+    min: 0.1
+    max: 1.0
+    default: 0.95
+  - name: top_p
+    use_template: top_p
+    min: 0
+    max: 1.0
+    default: 0.7
+  - name: max_output_tokens
+    use_template: max_output_tokens
+  - name: penalty_score
+    use_template: penalty_score
+    default: 1.0
+    min: 1.0
+    max: 2.0
```
```diff
@@ -0,0 +1,27 @@
+model: ernie-speed-8k
+label:
+  en_US: ERNIE-Speed-8K
+model_type: llm
+features:
+  - agent-thought
+model_properties:
+  mode: chat
+  context_size: 8192
+parameter_rules:
+  - name: temperature
+    use_template: temperature
+    min: 0.1
+    max: 1.0
+    default: 0.95
+  - name: top_p
+    use_template: top_p
+    min: 0
+    max: 1.0
+    default: 0.7
+  - name: max_output_tokens
+    use_template: max_output_tokens
+  - name: penalty_score
+    use_template: penalty_score
+    default: 1.0
+    min: 1.0
+    max: 2.0
```
```diff
@@ -0,0 +1,27 @@
+model: ernie-speed-appbuilder
+label:
+  en_US: ERNIE-Speed-AppBuilder
+model_type: llm
+features:
+  - agent-thought
+model_properties:
+  mode: chat
+  context_size: 8192
+parameter_rules:
+  - name: temperature
+    use_template: temperature
+    min: 0.1
+    max: 1.0
+    default: 0.95
+  - name: top_p
+    use_template: top_p
+    min: 0
+    max: 1.0
+    default: 0.7
+  - name: max_output_tokens
+    use_template: max_output_tokens
+  - name: penalty_score
+    use_template: penalty_score
+    default: 1.0
+    min: 1.0
+    max: 2.0
```
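Each of these configs pairs a `context_size` with an output-token limit, and the two share one window. A rough sketch of budgeting prompt tokens under that assumption — `prompt_budget` is a hypothetical helper, and real providers count tokens with their own tokenizer:

```python
def prompt_budget(context_size: int, max_output_tokens: int) -> int:
    """Tokens left for the prompt once the reply's budget is reserved.

    Assumes input and output share a single context window, which is a
    simplification of how providers actually account for tokens.
    """
    remaining = context_size - max_output_tokens
    if remaining <= 0:
        raise ValueError("max_output_tokens exhausts the context window")
    return remaining

print(prompt_budget(8192, 1024))  # → 7168
```

With `context_size: 8192` and the common `max_output_tokens` default of 1024, that leaves 7168 tokens for the prompt.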
20 changes: 14 additions & 6 deletions api/core/model_runtime/model_providers/wenxin/llm/ernie_bot.py
```diff
@@ -121,15 +121,23 @@ def __init__(self, content: str, role: str = 'user') -> None:

 class ErnieBotModel:
     api_bases = {
-        'ernie-bot': 'https://aip.baidubce.com/rpc/2.0/ai_custom/v1/wenxinworkshop/chat/completions',
-        'ernie-bot-4': 'https://aip.baidubce.com/rpc/2.0/ai_custom/v1/wenxinworkshop/chat/completions_pro',
-        'ernie-bot-8k': 'https://aip.baidubce.com/rpc/2.0/ai_custom/v1/wenxinworkshop/chat/ernie_bot_8k',
-        'ernie-bot-turbo': 'https://aip.baidubce.com/rpc/2.0/ai_custom/v1/wenxinworkshop/chat/eb-instant',
+        'ernie-3.5-8k': 'https://aip.baidubce.com/rpc/2.0/ai_custom/v1/wenxinworkshop/chat/completions',
+        'ernie-3.5-8k-0205': 'https://aip.baidubce.com/rpc/2.0/ai_custom/v1/wenxinworkshop/chat/ernie-3.5-8k-0205',
+        'ernie-3.5-8k-1222': 'https://aip.baidubce.com/rpc/2.0/ai_custom/v1/wenxinworkshop/chat/ernie-3.5-8k-1222',
+        'ernie-3.5-4k-0205': 'https://aip.baidubce.com/rpc/2.0/ai_custom/v1/wenxinworkshop/chat/ernie-3.5-4k-0205',
+        'ernie-4.0-8k': 'https://aip.baidubce.com/rpc/2.0/ai_custom/v1/wenxinworkshop/chat/completions_pro',
+        'ernie-speed-8k': 'https://aip.baidubce.com/rpc/2.0/ai_custom/v1/wenxinworkshop/chat/ernie_speed',
+        'ernie-speed-128k': 'https://aip.baidubce.com/rpc/2.0/ai_custom/v1/wenxinworkshop/chat/ernie-speed-128k',
+        'ernie-speed-appbuilder': 'https://aip.baidubce.com/rpc/2.0/ai_custom/v1/wenxinworkshop/chat/ai_apaas',
+        'ernie-lite-8k-0922': 'https://aip.baidubce.com/rpc/2.0/ai_custom/v1/wenxinworkshop/chat/eb-instant',
+        'ernie-lite-8k-0308': 'https://aip.baidubce.com/rpc/2.0/ai_custom/v1/wenxinworkshop/chat/ernie-lite-8k',
     }

     function_calling_supports = [
-        'ernie-bot',
-        'ernie-bot-8k',
+        'ernie-3.5-8k',
+        'ernie-3.5-8k-0205',
+        'ernie-3.5-8k-1222',
+        'ernie-3.5-4k-0205'
     ]

     api_key: str = ''
```
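The two tables in `ernie_bot.py` drive endpoint selection and tool-call gating: every model name maps to a chat URL, and only a subset supports function calling. A minimal sketch of consulting them — `resolve_endpoint` is a hypothetical helper, and only the mapping entries themselves come from the diff:

```python
# Sketch of using the api_bases / function_calling_supports tables from the
# diff above. resolve_endpoint is a hypothetical helper, not Dify's code;
# the table is abridged to two entries for brevity.
API_BASES = {
    'ernie-3.5-8k': 'https://aip.baidubce.com/rpc/2.0/ai_custom/v1/wenxinworkshop/chat/completions',
    'ernie-4.0-8k': 'https://aip.baidubce.com/rpc/2.0/ai_custom/v1/wenxinworkshop/chat/completions_pro',
}
FUNCTION_CALLING_SUPPORTS = {'ernie-3.5-8k', 'ernie-3.5-8k-0205'}

def resolve_endpoint(model: str, wants_tools: bool = False) -> str:
    """Look up the chat endpoint, rejecting unknown models and unsupported tool use."""
    if model not in API_BASES:
        raise ValueError(f"unknown wenxin model: {model}")
    if wants_tools and model not in FUNCTION_CALLING_SUPPORTS:
        raise ValueError(f"{model} does not support function calling")
    return API_BASES[model]

print(resolve_endpoint('ernie-4.0-8k'))
```

Note that per this diff only the ernie-3.5 variants are listed as supporting function calling, so a tool-using request against `ernie-4.0-8k` would be rejected up front.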