
Count spend tokens on issue context (all tokens spent mean issue is too complex and is not solvable by AutoPR) #94

Closed
Konard opened this issue May 5, 2023 · 1 comment

Comments

@Konard
Contributor

Konard commented May 5, 2023

AutoPR also should not begin solving an issue if the issue itself uses too many tokens even after summarization. It should always re-summarize the issue context between steps and keep track of how many tokens are left.

Some context may be temporarily offloaded to pull request comments if it is not required for the next step.

This may also require making one change at a time; once that change is done, it can be removed from the context.

Related to #88 and #93
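
The budget-tracking idea above can be sketched roughly as follows. All names here (`TokenBudget`, `count_tokens`, `should_attempt`) are hypothetical, not AutoPR's actual internals, and the whitespace-split tokenizer is a stand-in for a real one such as `tiktoken`:

```python
def count_tokens(text: str) -> int:
    """Rough token count; a real implementation would use the
    model's tokenizer (e.g. tiktoken) instead of a whitespace split."""
    return len(text.split())


class TokenBudget:
    """Tracks tokens spent across steps so AutoPR can stop (or
    re-summarize) before the context window is exhausted."""

    def __init__(self, limit: int):
        self.limit = limit
        self.spent = 0

    def charge(self, text: str) -> None:
        # Record tokens consumed by a step's context.
        self.spent += count_tokens(text)

    @property
    def remaining(self) -> int:
        return self.limit - self.spent

    def exhausted(self) -> bool:
        return self.remaining <= 0


def should_attempt(issue_text: str, limit: int) -> bool:
    """Skip issues whose context alone would consume the whole
    budget even after summarization, per the suggestion above."""
    return count_tokens(issue_text) < limit
```

Between steps, the loop would re-summarize the issue, `charge` the new summary against the budget, and bail out (marking the issue as too complex) once `exhausted()` returns true.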

@irgolic
Owner

irgolic commented Jun 8, 2023

I might be misunderstanding, but this isn't a viable way of using the LLM. If you ask it to summarize something, it'll do it, into whatever short form you like.

Another way to interpret what you're saying is that it should split issues if they're too long. I'm down to explore this, as a separate trigger on the issue itself (akin to #86 (comment)).

@irgolic irgolic closed this as completed Jun 8, 2023