lbeurerkellner changed the title from "As a smith I would like the openai_chunksize to be determined by token length constraints if possible" to "As a smith I would like the openai_chunksize to be affected by token length constraints if possible" on May 3, 2023.
If a query specifies `len(TOKENS(VAR)) < 20`, we should automatically cap the chunk size at 20, to avoid requesting unnecessary speculative tokens.
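A minimal sketch of the proposed behavior (function and default names here are illustrative assumptions, not actual LMQL internals): when a query declares an upper bound on a variable's token length, clamp the speculative chunk size to that bound instead of always using the configured default.

```python
# Hypothetical sketch of chunk-size clamping; names and the default
# value are assumptions for illustration, not actual LMQL internals.

DEFAULT_CHUNKSIZE = 32  # assumed default openai_chunksize

def effective_chunksize(token_upper_bound=None, default=DEFAULT_CHUNKSIZE):
    """Return the chunk size to request, clamped to the tightest
    known token-length upper bound from the query's constraints."""
    if token_upper_bound is None:
        # no length constraint on this variable -> use the default
        return default
    # a constraint like len(TOKENS(VAR)) < 20 yields an upper bound
    # of 20, so requesting chunks larger than that wastes tokens
    return min(default, token_upper_bound)

print(effective_chunksize(20))  # constrained variable -> 20
print(effective_chunksize())    # unconstrained variable -> 32
```

In this sketch the clamp only ever shrinks the chunk size; variables without a length constraint keep the configured default, so existing queries are unaffected.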