Start Date
No response
Implementation PR
No response
Reference Issues
No response
Summary
When calling qwen:14B through Ollama, how do I set the length of the generated text? Please advise.
Basic Example
My current request (the prompt asks for a 2000-character story):
curl https://tmxycampusai.com/api/generate -d '{"model": "qwen:14b", "prompt": "输出2000字的故事", "stream": true}'
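For reference, Ollama's `/api/generate` endpoint accepts an `options` object in the request body, and its `num_predict` field caps how many tokens the model may generate. A sketch of the adjusted request, assuming the default local Ollama endpoint (`localhost:11434`; substitute your own host):

```shell
# Same request as above, with an "options" object added.
# "num_predict" limits the number of generated tokens; note it counts
# tokens, not characters, so budget generously for a 2000-character story.
curl http://localhost:11434/api/generate -d '{
  "model": "qwen:14b",
  "prompt": "输出2000字的故事",
  "stream": true,
  "options": { "num_predict": 2048 }
}'
```

If the model stops short of the limit, that is the model emitting an end-of-text token on its own; `num_predict` is an upper bound, not a target length.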
Drawbacks
I could not find any documentation for a parameter that adjusts the output text length.
Unresolved questions
No response