
Great project, hoping to support chain's formatted output. #51

Closed
gaye746560359 opened this issue May 19, 2023 · 4 comments · Fixed by #60
Labels
feature New feature or request

Comments

@gaye746560359

Some chains may output the sources of their answers. It would be helpful to support streaming output in JSON format so the content is easy to extract and display on the web.

@gaye746560359 gaye746560359 added the feature New feature or request label May 19, 2023
@ajndkr
Owner

ajndkr commented May 19, 2023

hi! that sounds like a good idea! do you maybe have a format in mind? Maybe the streaming output could be something like

{"token": "..."}

and at the end, it can be:

{"token": "", "sources": [...]}

let me know what you think @gaye746560359
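The format proposed above (per-event `{"token": "..."}` objects, with a final event carrying the sources) can be sketched on the consumer side. This is a hypothetical illustration, not lanarky's actual API; the function name and event handling are assumptions based only on the shapes shown in this thread.

```python
import json


def consume_stream(events):
    """Accumulate streamed tokens and capture sources from the final event.

    Each event is expected to be a JSON string of the form
    {"token": "..."}, with the last one {"token": "", "sources": [...]}.
    """
    text_parts = []
    sources = []
    for raw in events:
        event = json.loads(raw)
        # Non-empty tokens are part of the answer text.
        if event.get("token"):
            text_parts.append(event["token"])
        # The final event carries the sources alongside an empty token.
        if "sources" in event:
            sources = event["sources"]
    return "".join(text_parts), sources
```

For example, `consume_stream(['{"token": "Hello"}', '{"token": " world"}', '{"token": "", "sources": ["doc1"]}'])` yields `("Hello world", ["doc1"])`.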

@gaye746560359
Author


Your idea is great and meets the requirements. Alternatively, you could follow the response format of chat.openai.com and use something similar, e.g.:

{"token": ["hello", ...], "sources": [...]}

or add custom response prefixes and suffixes, e.g.:

{"prefix": "We have found the following answers for you:", "token": ["hello", ...], "sources": [...], "suffix": "This information is confidential."}

Additionally, performance, response speed, and ease of reading the API response should be considered.

@gaye746560359
Author


one: {"prefix": "We have found the following answers for you:", "token": ["hello"], "sources": [...], "suffix": "This information is confidential."}
two: {"prefix": "We have found the following answers for you:", "token": ["hello", "world", ...], "sources": [...], "suffix": "This information is confidential."}
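The alternative prefix/suffix format above can be sketched as a small builder. This is a hypothetical illustration of the shape suggested in this thread; the function name and defaults are assumptions, not part of lanarky.

```python
import json


def build_response(tokens, sources, prefix="", suffix=""):
    """Serialize a response in the suggested prefix/suffix format.

    tokens is the list of streamed tokens accumulated so far;
    prefix and suffix are optional custom framing strings.
    """
    return json.dumps({
        "prefix": prefix,
        "token": tokens,
        "sources": sources,
        "suffix": suffix,
    })
```

A design trade-off worth noting: carrying the full token list in every payload (as in the "two:" example) makes each event self-describing but grows the payload with the answer length, whereas per-token events keep payloads small at the cost of client-side accumulation.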

This was referenced May 23, 2023
@ajndkr
Owner

ajndkr commented May 23, 2023

@gaye746560359 hi! I've added support for JSON streaming in 0.6.5. It should be available soon:

pip install lanarky==0.6.5

let me know if it solves your use case! feel free to open new issues for any improvements or bug fixes. cheers!
