[FEATURE] Is there any chain for summarizing large context? #507
Comments
Will need to dive deeper into how to solve the max-tokens limitation in Flowise, as it will hit OpenAI's limit. langchainjs reference on how to implement this into Flowise:
I just want a loadSummarizationChain component; langchain already provides it: loadSummarizationChain
Alright, I can do that. Once the PR is opened I will update you here.
@LegendLeo have you tried using ConversationalRetrievalQAChain with a different chain type? You can read more about what these are for here: https://js.langchain.com/docs/modules/chains/document/map_reduce. These chain types are used to summarize long context.
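For context, a minimal offline sketch of the map_reduce pattern those chain types implement: summarize each chunk independently (map), then summarize the concatenated partial summaries (reduce). The `summarize` function below is a hypothetical stand-in for the LLM call, not langchain's API; it just keeps the first sentence so the example runs without a model.

```typescript
// Hypothetical stand-in for an LLM summarization call: keep the first
// sentence of the input so the sketch runs offline.
function summarize(text: string): string {
  const firstSentence = text.split(". ")[0];
  return firstSentence.endsWith(".") ? firstSentence : firstSentence + ".";
}

// map step: summarize each chunk on its own, so each call stays under
// the model's token limit; reduce step: summarize the joined partials.
function mapReduceSummarize(chunks: string[]): string {
  const partials = chunks.map(summarize); // map
  return summarize(partials.join(" "));   // reduce
}

const chunks = [
  "Flowise is a visual builder. It wraps langchain components.",
  "Long documents exceed model token limits. Chunking works around this.",
];
console.log(mapReduceSummarize(chunks)); // → "Flowise is a visual builder."
```

The refine chain type mentioned in the same docs works differently: it folds chunks in sequentially, revising one running summary instead of reducing independent partials.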
ConversationalRetrievalQAChain is used in conversational situations, but I just want to summarize a very long text, not a chat history.
Describe the feature you'd like
I want to summarize a webpage, but the webpage content exceeds the maximum token limit. langchain provides a load_summarize_chain for this situation. Is there any component I can use?
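Whatever chain handles it, the first step is splitting the webpage text into pieces that each fit under the model's limit. A rough sketch of that chunking step follows; real chains count tokens, while this sketch uses a character budget (`maxChars`) as a stand-in, and it splits on sentence boundaries, both of which are simplifying assumptions.

```typescript
// Split long text into chunks whose length stays under a character
// budget (a crude proxy for a token limit). Sentences are kept whole;
// a sentence longer than the budget becomes its own oversized chunk.
function splitIntoChunks(text: string, maxChars: number): string[] {
  const chunks: string[] = [];
  let current = "";
  for (const sentence of text.split(/(?<=\.)\s+/)) {
    if (current && current.length + 1 + sentence.length > maxChars) {
      chunks.push(current); // budget exceeded: close the current chunk
      current = sentence;
    } else {
      current = current ? current + " " + sentence : sentence;
    }
  }
  if (current) chunks.push(current);
  return chunks;
}

console.log(splitIntoChunks("One. Two. Three.", 12));
// → [ "One. Two.", "Three." ]
```

Each resulting chunk can then be summarized separately and the partial summaries combined, which is exactly what the map_reduce chain type automates.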