
[FEATURE] Is there any chain for summarizing large context? #507

Open
LegendLeo opened this issue Jul 8, 2023 · 5 comments
Labels
enhancement New feature or request

Comments

@LegendLeo

Describe the feature you'd like
I want to summarize a webpage, but the webpage content exceeds the maximum token limit. langchain provides a load_summarize_chain for this situation. Is there any component I can use?
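For context, the usual workaround when a page exceeds the model's context window is to split the text into smaller chunks first, then summarize each chunk. A minimal sketch of that splitting step in plain JavaScript (the 4-characters-per-token ratio is a rough heuristic, not a real tokenizer):

```javascript
// Minimal sketch: split long text into chunks that stay under a rough
// token budget, so each chunk can be summarized separately.
// charsPerToken = 4 is a common rough heuristic, not an exact count.
function splitIntoChunks(text, maxTokens = 1000, charsPerToken = 4) {
  const maxChars = maxTokens * charsPerToken;
  const chunks = [];
  let current = "";
  // Split on sentence-ish boundaries so each chunk stays coherent.
  for (const sentence of text.split(/(?<=[.!?])\s+/)) {
    if (current.length + sentence.length + 1 > maxChars && current.length > 0) {
      chunks.push(current.trim());
      current = "";
    }
    // Note: a single sentence longer than maxChars still becomes its
    // own oversized chunk; a real splitter would fall back to words.
    current += sentence + " ";
  }
  if (current.trim().length > 0) chunks.push(current.trim());
  return chunks;
}
```

In langchainjs this job is normally done by a text splitter such as RecursiveCharacterTextSplitter rather than hand-rolled code like the above.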

@LegendLeo LegendLeo changed the title [FEATURE] Is there any chain for sumarize large context ? [FEATURE] Is there any chain for sumarizing large context ? Jul 9, 2023
@chungyau97
Contributor

Will need to dive deeper into how to solve the max token limitation in Flowise, as it will hit OpenAI's limit.

langchainjs reference on how to implement this in Flowise

Reference:
langchainjs #1827
Auto-GPT #2906

@LegendLeo
Author


I just want a loadSummarizationChain component. langchain already provides it: loadSummarizationChain

@chungyau97
Contributor

Alright, I can do that.

Once the PR is opened, I will update you here.

@HenryHengZJ HenryHengZJ added the enhancement New feature or request label Jul 17, 2023
@HenryHengZJ
Contributor

@LegendLeo have you tried using ConversationalRetrievalQAChain with a different chain type?

You can read more about what these are for at https://js.langchain.com/docs/modules/chains/document/map_reduce. These chain types are used to summarize long context.
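For reference, the map_reduce chain type summarizes each chunk independently (map) and then summarizes the concatenated partial summaries (reduce). A minimal sketch of the pattern, with a stand-in summarize function where a real chain would call an LLM:

```javascript
// Sketch of the map_reduce pattern behind loadSummarizationChain:
// summarize each chunk independently (map), then summarize the
// combined partial summaries (reduce). `summarize` is a stub here,
// standing in for an LLM call.
async function mapReduceSummarize(chunks, summarize, maxChars = 2000) {
  // Map step: produce one partial summary per chunk (in parallel).
  const partials = await Promise.all(chunks.map(c => summarize(c)));
  let combined = partials.join("\n");
  // Reduce step: if the combined summaries are still too long,
  // collapse them again until they fit the target budget.
  // (Assumes `summarize` actually shortens its input.)
  while (combined.length > maxChars) {
    combined = await summarize(combined);
  }
  return combined;
}
```

The real loadSummarizationChain in langchainjs wires the same map and reduce steps up to LLM prompts, so the stub above is only meant to show the control flow.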

@LegendLeo
Author

ConversationalRetrievalQAChain is meant for conversational use, but I just want to summarize a very long text, not a chat history.


3 participants