Module behaviour when OpenAI request content is too large #2024
Labels:
- autoclosed: Closed by the bot. We still want this, but it didn't quite make the latest prioritization round.
- Bad UX: Issues that frustrate our users.
- Module System
Currently, the OpenAI module might return:
Can this be handled in the module? E.g., for this specific example, could the 142,471-token string be split into 70 snippets (142471 / 2046), and a centroid calculated and stored?
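A minimal sketch of the chunk-and-centroid idea described above, assuming a per-request limit of 2046 tokens. The `embed` function here is a hypothetical stand-in for the real OpenAI embedding call, and all names are illustrative, not the module's actual API:

```python
import math

MAX_TOKENS = 2046  # assumed per-request token limit from the issue


def chunk(tokens, size=MAX_TOKENS):
    """Split a token sequence into snippets of at most `size` tokens."""
    return [tokens[i:i + size] for i in range(0, len(tokens), size)]


def centroid(vectors):
    """Element-wise mean of equal-length embedding vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]


def embed(snippet):
    """Hypothetical stand-in for the OpenAI embedding request."""
    return [float(len(snippet)), float(sum(snippet) % 97)]


tokens = list(range(142471))  # the 142,471-token input from the issue
snippets = chunk(tokens)
print(len(snippets))          # 70, i.e. math.ceil(142471 / 2046)
vector = centroid([embed(s) for s in snippets])
```

Averaging per-snippet vectors into a single centroid loses some local detail, but it would let the module store one vector for an object whose content exceeds the request limit instead of failing outright.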