Issue: I'd like to use Hugging Face's Chat UI frontend with LangChain. #14207
Comments
🤖 Hello @haje01! 👋 I'm Dosu, an assistant here to help you with your LangChain-related questions and issues while we wait for a human maintainer. I'm here to make your development process easier and smoother. Let's get started!

To serve the chain you have configured with LangChain in Text Generation Inference format, for use with Hugging Face's Chat UI frontend, you need to replace the […]. Here is an example of how you can do it:

```python
from fastapi import FastAPI
from fastapi.responses import RedirectResponse
from langserve import add_routes
# Note: this class name comes from the bot's suggestion and may not exist
# in current LangChain releases; substitute your own chain here.
from langchain import TextGenerationInferenceChain

app = FastAPI()

@app.get("/")
async def redirect_root_to_docs():
    return RedirectResponse("/docs")

# Create your chain
chain = TextGenerationInferenceChain()

# Add your chain to the routes
add_routes(app, chain)

if __name__ == "__main__":
    import uvicorn
    uvicorn.run(app, host="0.0.0.0", port=8000)
```

In this example, […]. You can find more details about this in the LangChain repository. As for the […], I hope this helps! If you have any other questions, feel free to ask.

-Dosu
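To make concrete what `add_routes` exposes, here is a minimal sketch of the request/response payload shapes LangServe uses for its `/invoke` endpoint: the chain input goes under an `"input"` key and the result comes back under an `"output"` key. The exact schema can vary across LangServe versions, and the helper functions below are illustrative, not part of LangServe itself.

```python
import json


def build_invoke_request(user_input):
    # LangServe's /invoke endpoint expects the chain input under an "input" key.
    return json.dumps({"input": user_input})


def parse_invoke_response(body):
    # The response wraps the chain's result under an "output" key.
    return json.loads(body)["output"]


# Round-trip the payloads locally to show the shapes.
request_body = build_invoke_request("Hello, chain!")
fake_response = json.dumps({"output": "Hi there!", "metadata": {}})
print(request_body)                        # {"input": "Hello, chain!"}
print(parse_invoke_response(fake_response))  # Hi there!
```

In practice you would POST `request_body` to `http://localhost:8000/<route>/invoke` against the FastAPI app above.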
I also find this functionality very useful. We are pushing for an integration between Hugging Face's Chat UI and endpoints built using LangServe, but I think this is something that needs to be contributed to the Hugging Face project instead of here. We have already opened an MR for this: huggingface/chat-ui#1009. Take a look if you are interested in this.
Issue you'd like to raise.
I'd like to use Hugging Face's Chat UI frontend with LangChain.
https://github.com/huggingface/chat-ui
But it looks like the Chat UI is only available through Hugging Face's Text Generation Inference endpoint.
huggingface/chat-ui#466
How can I serve the chain I have configured with LangChain in TGI format so I can use Chat UI?
Thank you in advance.
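For context on what "TGI format" means here: Text Generation Inference's `/generate` endpoint takes a JSON body with an `inputs` string (plus optional `parameters`) and returns `{"generated_text": ...}`. The sketch below shows an adapter that translates such a payload into a call to an arbitrary chain callable; the function and parameter names are illustrative assumptions, not part of LangChain or TGI.

```python
import json


def tgi_generate(raw_body, run_chain):
    # Parse a TGI-style request body: {"inputs": "...", "parameters": {...}}
    payload = json.loads(raw_body)
    prompt = payload["inputs"]
    params = payload.get("parameters", {})
    # Delegate to whatever callable produces the completion
    # (a LangChain chain's invoke method would go here).
    text = run_chain(prompt, **params)
    # Return a TGI-style response body: {"generated_text": "..."}
    return json.dumps({"generated_text": text})


# Toy "chain" standing in for a real LangChain runnable.
def echo_chain(prompt, **params):
    return prompt.upper()


print(tgi_generate('{"inputs": "hello"}', echo_chain))
# {"generated_text": "HELLO"}
```

Wiring `tgi_generate` into a FastAPI route would give Chat UI an endpoint it can treat as a TGI backend, assuming the schema above matches the TGI version Chat UI targets.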
Suggestion:
No response