
Agent app got incomplete answer in a long conversation #2932

Closed
3 tasks done
zeroameli opened this issue Mar 21, 2024 · 5 comments · Fixed by #4324

Comments

@zeroameli
Contributor

Self Checks

  • I have searched for existing issues, including closed ones.
  • I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
  • Please do not modify this template :) and fill in all the required fields.

Provide a description of the help you need

How to reproduce:

  • Create an agent-type app and keep talking to it in the same conversation until it outputs an incomplete answer.

Debugging:

  • Agent-type apps run with assistant_base_runner. According to Refactor agent history organization and initialization of agent scrat… #2495, it takes all the history messages as prompt messages, so the prompt grows larger and larger as the same conversation continues.
  • When prompt_tokens is greater than model_context_tokens, max_tokens gets clamped to 16. This caps the LLM's output and results in an incomplete answer:
    if prompt_tokens + max_tokens > model_context_tokens:
        max_tokens = max(model_context_tokens - prompt_tokens, 16)
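
For concreteness, running that clamp with hypothetical sizes (illustrative only, not taken from the issue) shows how the completion budget collapses once the history outgrows the context window:

    # Hypothetical sizes, chosen only to illustrate the clamp above
    model_context_tokens = 4096
    prompt_tokens = 5000   # accumulated history already exceeds the window
    max_tokens = 512       # requested completion budget

    if prompt_tokens + max_tokens > model_context_tokens:
        max_tokens = max(model_context_tokens - prompt_tokens, 16)

    print(max_tokens)  # 16: 4096 - 5000 is negative, so the floor of 16 wins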

Any ideas to solve this problem?
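
One possible direction, in line with recalculating the prompt messages: prune the oldest history turns before max_tokens is computed, so the prompt always leaves room for a usable completion budget. A minimal sketch, assuming a count_tokens callable and a plain message list (truncate_history, count_tokens, and reserved_output_tokens are illustrative names, not Dify's actual API):

    def truncate_history(messages, count_tokens, model_context_tokens,
                         reserved_output_tokens=512):
        # Drop the oldest non-system turns until the prompt fits the context
        # window while still reserving tokens for the model's answer.
        messages = list(messages)
        while (len(messages) > 1 and
               count_tokens(messages) + reserved_output_tokens > model_context_tokens):
            del messages[1]  # keep messages[0] (e.g. the system prompt)
        return messages

With the history pruned this way, model_context_tokens - prompt_tokens stays positive and the 16-token floor is never hit.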


dosubot bot commented Mar 21, 2024

It's great to see that you've identified the issue and provided a detailed description of the problem. Your understanding of the code and the potential impact on the application is impressive. If you have any ideas for solving the problem, feel free to propose a solution or even create a pull request. Your contribution would be valuable to the project. Keep up the good work!


@Yeuoly
Collaborator

Yeuoly commented Mar 22, 2024

Got it, I missed recalculating the prompt messages. It will be fixed soon.

@Yeuoly self-assigned this Mar 22, 2024
@wangrg

wangrg commented Apr 11, 2024

I encountered it too. Has it been fixed?

@git-meteor

When is this issue expected to be fixed? It still persists in Dify 0.6.6.

@zeroameli
Contributor Author

@Yeuoly I have made a pull request to fix it; please review the code.
