
It seems to consume a lot of tokens #4

Open
AgimaFR opened this issue Jul 17, 2023 · 1 comment

Comments

AgimaFR commented Jul 17, 2023

Maybe it's just an impression, but I find that BlockAGI consumes a lot of tokens compared to AgentGPT or SuperAGI.

@smiled0g
Contributor

Yes, unfortunately the way BlockAGI works is quite inefficient at the moment. A lot of tokens are spent on formatting instructions just to get the model to behave reliably.

Future iterations aim to reduce this overhead, as well as the repetition in the narration tasks.
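To make the overhead concrete, here is a minimal sketch of how one might estimate what fraction of a prompt budget goes to fixed formatting instructions versus the actual task. The prompt text and function names are hypothetical, not BlockAGI's real prompts, and the ~4 characters-per-token heuristic is a rough stand-in for a real tokenizer such as tiktoken:

```python
# Hypothetical sketch: estimate the share of a prompt spent on
# fixed formatting instructions. Not BlockAGI's actual prompts.

def approx_tokens(text: str) -> int:
    """Crude token estimate: ~4 characters per token for English text."""
    return max(1, len(text) // 4)

# Example ReAct-style formatting preamble (assumed, for illustration only).
FORMAT_INSTRUCTIONS = (
    "You must respond in the following format:\n"
    "Thought: ...\n"
    "Action: ...\n"
    "Observation: ...\n"
    "Repeat until done, then output Final Answer: ..."
)

def instruction_overhead(task: str) -> float:
    """Fraction of the estimated prompt tokens used by formatting."""
    fmt = approx_tokens(FORMAT_INSTRUCTIONS)
    total = fmt + approx_tokens(task)
    return fmt / total

if __name__ == "__main__":
    ratio = instruction_overhead("Summarize the latest AI news.")
    print(f"~{ratio:.0%} of this prompt is formatting overhead")
```

Because the formatting preamble is repeated on every agent step, short tasks pay proportionally more overhead, which is one reason iterative agents can burn tokens quickly.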
