
Determine chunk size based on model being used #2541

Closed
Pwuts opened this issue Apr 19, 2023 · 3 comments

Pwuts (Member) commented Apr 19, 2023

See:

The token limit depends directly on the LLM being used, so these settings should be consolidated and calculated where possible, instead of letting the user set them by hand.
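The consolidation could look like a single lookup that derives the chunk size from the model's context window rather than a hand-set config value. A minimal sketch follows; the model names, token limits, and `max_chunk_size` helper are purely illustrative assumptions, not AutoGPT's actual configuration or API:

```python
# Illustrative sketch only: derive chunk size from the model's context
# window instead of letting the user configure it by hand. The limits
# below are assumed values for demonstration, not authoritative.
MODEL_TOKEN_LIMITS = {
    "gpt-3.5-turbo": 4096,
    "gpt-4": 8192,
}

def max_chunk_size(model: str, reserved_for_response: int = 1024) -> int:
    """Return the largest input chunk that fits the model's context
    window while leaving room for the completion."""
    try:
        limit = MODEL_TOKEN_LIMITS[model]
    except KeyError:
        raise ValueError(f"Unknown model: {model!r}")
    return limit - reserved_for_response

print(max_chunk_size("gpt-4"))  # 7168
```

With a lookup like this, the chunk-size setting disappears from user configuration entirely and tracks whichever model is selected.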

ntindle (Member) commented Apr 23, 2023

Would love a prompt-info utils module with all this info in it.

github-actions bot (Contributor) commented Sep 6, 2023

This issue has automatically been marked as stale because it has not had any activity in the last 50 days. You can unstale it by commenting or removing the label. Otherwise, this issue will be closed in 10 days.

github-actions bot added the Stale label Sep 6, 2023

github-actions bot (Contributor) commented

This issue was closed automatically because it has been stale for 10 days with no activity.

github-actions bot closed this as not planned (stale) Sep 17, 2023