How can I use it in LangChain? #31
Comments
Hi @whm233, thank you for your support and interest in LLMLingua. Although I'm not an expert in LangChain, based on my experience, its usage in LangChain should be similar to that in LlamaIndex, i.e., operating at the postprocessor or reranker level. I briefly reviewed the LangChain pipeline; I think you'll need to extend the relevant base class and then use it within your retriever. Thank you again for your interest.
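The postprocessor/reranker pattern described above can be sketched in plain Python without LangChain itself: a compression retriever wraps a base retriever and post-processes its documents before they reach the LLM. All class and method names below are illustrative stand-ins (loosely mirroring LangChain's `ContextualCompressionRetriever`), not the real LangChain API:

```python
class TruncatingCompressor:
    """Stand-in for a document compressor such as LLMLingua: here it simply
    truncates each document to max_chars to illustrate the interface."""
    def __init__(self, max_chars=25):
        self.max_chars = max_chars

    def compress_documents(self, documents, query):
        return [doc[: self.max_chars] for doc in documents]


class KeywordRetriever:
    """Stand-in base retriever: returns corpus documents containing the query."""
    def __init__(self, corpus):
        self.corpus = corpus

    def get_relevant_documents(self, query):
        return [doc for doc in self.corpus if query.lower() in doc.lower()]


class CompressionRetriever:
    """Wraps a base retriever and compresses its results, mirroring the
    postprocessor-level integration discussed in this issue."""
    def __init__(self, base_compressor, base_retriever):
        self.base_compressor = base_compressor
        self.base_retriever = base_retriever

    def get_relevant_documents(self, query):
        docs = self.base_retriever.get_relevant_documents(query)
        return self.base_compressor.compress_documents(docs, query)


corpus = [
    "LLMLingua compresses long prompts while preserving key semantics.",
    "Unrelated document about cooking pasta.",
]
retriever = CompressionRetriever(
    base_compressor=TruncatingCompressor(max_chars=25),
    base_retriever=KeywordRetriever(corpus),
)
docs = retriever.get_relevant_documents("LLMLingua")
print(docs)  # only the matching document survives retrieval, then it is truncated
```

In the real integration, the compressor slot would be filled by an LLMLingua-backed document compressor rather than this toy truncation.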
**Description**: This PR adds support for using the [LLMLingua project](https://github.com/microsoft/LLMLingua), especially LongLLMLingua (Enhancing Large Language Model Inference via Prompt Compression), as a document compressor / transformer. LLMLingua is an interesting project that can greatly improve RAG systems by compressing prompts and contexts while keeping their semantic relevance.

**Issue**: microsoft/LLMLingua#31

**Dependencies**: [llmlingua](https://pypi.org/project/llmlingua/)

@baskaryan

Co-authored-by: Ayodeji Ayibiowu <ayodeji.ayibiowu@getinge.com>
Co-authored-by: Harrison Chase <hw.chase.17@gmail.com>
Thanks to @thehapyone's contribution, LLMLingua is now available in LangChain. You can follow this notebook for guidance.
Code like this:

```python
from langchain.chains import RetrievalQA
from langchain.prompts import PromptTemplate

prompt = PromptTemplate(template=prompt_template, input_variables=["context", "question"])
kc = RetrievalQA.from_llm(llm=qwllm, retriever=compression_retriever, prompt=prompt)
```
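The snippet assumes a `prompt_template` variable that is not shown in the thread. A plausible minimal definition (purely illustrative, not the template the commenter actually used) would interpolate the same two input variables:

```python
# Hypothetical prompt_template matching input_variables=["context", "question"];
# the actual template from the issue is not shown.
prompt_template = """Use the following context to answer the question.

Context: {context}

Question: {question}

Answer:"""
print(prompt_template)
```

Whatever template you use, its placeholders must match the `input_variables` passed to `PromptTemplate`, or LangChain will raise a validation error.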