
Start Ollama locally or via Docker, `ollama pull llama3` or `deepseek-llm`, add them to the model selection list in model_config.py, and add Ollama's OLLAMA_BASE_URL (http://127.0.0.1:11434) under the online LLM section #4028

Open
Ptianyu opened this issue May 16, 2024 · 2 comments


Ptianyu commented May 16, 2024

llama3 has to be reminded in every request to answer in Simplified Chinese, or the instruction has to be set in the prompt.
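The setup described in the title could be sketched as the following additions to model_config.py. This is a hypothetical excerpt: the key names (`LLM_MODELS`, `ONLINE_LLM_MODEL`, `OLLAMA_BASE_URL`) are assumptions and may differ between Langchain-Chatchat versions.

```python
# Hypothetical excerpt from model_config.py -- key names are assumptions
# and may vary by project version.

# Add the Ollama-served models to the selectable model list.
LLM_MODELS = ["llama3", "deepseek-llm"]

# Point the "online LLM" section at the local Ollama server.
# Ollama listens on port 11434 by default, whether started
# locally or inside Docker.
ONLINE_LLM_MODEL = {
    "ollama": {
        "OLLAMA_BASE_URL": "http://127.0.0.1:11434",
    },
}
```

If Ollama runs in Docker, the base URL may need to reference the container's host address rather than 127.0.0.1, depending on the network mode.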



dosubot bot commented May 16, 2024

To ensure the llama3 model responds in Simplified Chinese, you should provide prompts in Simplified Chinese, as the model's response language is influenced by the input language. The model_config.py file allows specifying which models to use but doesn't offer a direct setting for default language preferences. If you're aiming for responses in Simplified Chinese, guiding the model with Simplified Chinese prompts is the most straightforward approach. There's no documented method in the repository for setting a language reminder or default within the model's configuration directly.
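Putting the language instruction directly in the request, as suggested above, can be sketched with Ollama's documented `/api/chat` endpoint. This builds the request payload only; actually sending it requires a running Ollama server, so that part is left as a comment.

```python
import json

# Steer llama3 toward Simplified Chinese replies by placing the
# instruction in the chat messages themselves (system role).
payload = {
    "model": "llama3",
    "messages": [
        {"role": "system", "content": "请始终使用简体中文回答。"},
        {"role": "user", "content": "Explain what a vector database is."},
    ],
    "stream": False,
}

# Serialize without escaping the Chinese characters.
body = json.dumps(payload, ensure_ascii=False)

# To actually send it (requires a running Ollama server):
# import urllib.request
# req = urllib.request.Request(
#     "http://127.0.0.1:11434/api/chat",
#     data=body.encode("utf-8"),
#     headers={"Content-Type": "application/json"},
# )
# print(urllib.request.urlopen(req).read().decode("utf-8"))
```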


@blackwhites commented

llama3 supports a system prompt; you can set one that requires it to always answer in Chinese.
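The system-prompt suggestion can also be baked into the model itself with an Ollama Modelfile, so every session starts with the instruction and no per-request reminder is needed. A minimal sketch (the derived model name `llama3-zh` is an arbitrary choice):

```python
# Write an Ollama Modelfile that pins a Simplified-Chinese system prompt
# onto llama3. Modelfile syntax: FROM <base model>, SYSTEM """...""".
modelfile = (
    "FROM llama3\n"
    'SYSTEM """你必须始终使用简体中文回答。"""\n'
)

with open("Modelfile", "w", encoding="utf-8") as f:
    f.write(modelfile)

# Then build and use the customized model:
#   ollama create llama3-zh -f Modelfile
#   ollama run llama3-zh
```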
