When you keep asking questions within one conversation, the entire context is sent with every request, but in practice only the last few exchanges are usually needed. So I'd like a setting for the maximum number of context messages to submit. It would be a good way to reduce token consumption.
Thanks for your great work.
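The requested behavior is essentially a sliding window over the message history. A minimal sketch, assuming a generic `Message` shape and a `trimContext` helper (both hypothetical, not the project's actual API):

```typescript
// Hypothetical sketch: keep only the most recent `maxContextCount`
// messages (plus the system prompt) when building a request payload.
interface Message {
  role: "system" | "user" | "assistant";
  content: string;
}

function trimContext(history: Message[], maxContextCount: number): Message[] {
  // Always preserve the leading system prompt, if present.
  const system = history.filter((m) => m.role === "system");
  const rest = history.filter((m) => m.role !== "system");
  // Keep only the last N non-system messages.
  return [...system, ...rest.slice(-maxContextCount)];
}

const history: Message[] = [
  { role: "system", content: "You are a helpful assistant." },
  { role: "user", content: "Q1" },
  { role: "assistant", content: "A1" },
  { role: "user", content: "Q2" },
  { role: "assistant", content: "A2" },
  { role: "user", content: "Q3" },
];

const trimmed = trimContext(history, 2);
console.log(trimmed.length); // 3: the system prompt plus the last 2 messages
```

Counting messages rather than tokens keeps the setting simple for users, at the cost of less precise control over the actual token budget.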
chore: max context count (Close chatgpt-web-dev#394, Close chatgpt-web-dev#221)
d5a1a09
f34201c
Honestly, rather than adding this, it would be better to add a quick way to delete a specific context message. For example, when the mouse hovers over a message, show a floating delete button (a trash can or ×) on its right; clicking it deletes that message. That way we can remove unneeded context as we go, which makes it easy to prune unnecessary context.
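The per-message delete idea reduces to removing one entry from the history by its identifier. A minimal sketch, with illustrative names (`ChatMessage`, `deleteMessage`) that are not the project's real code:

```typescript
// Hypothetical sketch: clicking the floating delete button removes
// that one message from the conversation history.
interface ChatMessage {
  id: number;
  content: string;
}

function deleteMessage(history: ChatMessage[], id: number): ChatMessage[] {
  // Return a new array without the deleted message, so state
  // libraries that rely on immutability detect the change.
  return history.filter((m) => m.id !== id);
}

const messages: ChatMessage[] = [
  { id: 1, content: "Q1" },
  { id: 2, content: "A1" },
  { id: 3, content: "Q2" },
];

console.log(deleteMessage(messages, 2).map((m) => m.id)); // [1, 3]
```

Unlike a fixed context-count cap, this gives the user manual control over exactly which messages stay in the context, so the two features complement each other.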