
💡 [REQUEST] - <When calling qwen:14B via ollama, how do I set the output text length?> #1243

Closed
chaoskklt opened this issue May 9, 2024 · 1 comment
Labels
question Further information is requested

Comments

@chaoskklt

Start Date

No response

Implementation PR

No response

Reference Issues

No response

Summary

When calling qwen:14B via ollama, how do I set the output text length? Please advise.

Basic Example

Current code example (the prompt asks for a 2000-character story):
curl https://tmxycampusai.com/api/generate -d '{"model": "qwen:14b", "prompt": "输出2000字的故事", "stream": true}'
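For reference, one possible approach is ollama's options field on the /api/generate endpoint: its num_predict option caps the number of tokens the model will generate. The sketch below reuses the endpoint URL from the example above, and the value 2000 is an illustrative cap on generated tokens, not a guarantee of 2000 Chinese characters, since tokens and characters do not map one-to-one.

# num_predict limits how many tokens this request may generate (illustrative value)
curl https://tmxycampusai.com/api/generate -d '{
  "model": "qwen:14b",
  "prompt": "输出2000字的故事",
  "stream": true,
  "options": {
    "num_predict": 2000
  }
}'

To let the model run until it stops on its own, num_predict can reportedly be set to -1; omitting it falls back to the model's default.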

Drawbacks

I could not find documentation for a parameter that adjusts the output text length.

Unresolved questions

No response

@chaoskklt chaoskklt added the question Further information is requested label May 9, 2024
@chaoskklt chaoskklt changed the title 💡 [REQUEST] - <title> 💡 [REQUEST] - <When calling qwen:14B via ollama, how do I set the output text length?> May 9, 2024
@chaoskklt
Author

sry
