
Issue: I'd like to use Hugging Face's Chat UI frontend with LangChain. #14207

Closed
haje01 opened this issue Dec 4, 2023 · 2 comments
Labels
Ɑ: models Related to LLMs or chat model modules 🤖:question A specific question about the codebase, product, project, or how to use a feature

Comments


haje01 commented Dec 4, 2023

Issue you'd like to raise.

I'd like to use Hugging Face's Chat UI frontend with LangChain.

https://github.com/huggingface/chat-ui

But it looks like Chat UI is only available through Hugging Face's Text Generation Inference (TGI) endpoints.

huggingface/chat-ui#466

How can I serve the chain I have configured with LangChain in TGI format so I can use Chat UI?

Thank you in advance.

Suggestion:

No response

dosubot bot commented Dec 4, 2023

🤖

Hello @haje01! 👋

I'm Dosu, an assistant here to help with your LangChain-related questions and issues while you wait for a human maintainer. Let's get started!

To serve the chain you have configured with LangChain so it can be used with Hugging Face's Chat UI frontend, you can expose it over HTTP with LangServe: in the LangServe app template, replace the NotImplemented placeholder passed to the add_routes function with your chain.

Here is an example of how you can do it:

from fastapi import FastAPI
from fastapi.responses import RedirectResponse
from langserve import add_routes
from langchain_community.llms import HuggingFaceTextGenInference

app = FastAPI()

@app.get("/")
async def redirect_root_to_docs():
    return RedirectResponse("/docs")

# Wrap your TGI server as a LangChain LLM (adjust the URL to your deployment)
llm = HuggingFaceTextGenInference(inference_server_url="http://localhost:8080/")

# Add your runnable to the routes (exposes /chain/invoke, /chain/stream, etc.)
add_routes(app, llm, path="/chain")

if __name__ == "__main__":
    import uvicorn

    uvicorn.run(app, host="0.0.0.0", port=8000)

In this example, the HuggingFaceTextGenInference LLM stands in for your configured chain; replace it with your actual runnable. add_routes adds the necessary routes to your FastAPI application to serve it. Then, you can point Hugging Face's Chat UI frontend at this server.

You can find more details about this in the LangChain repository.

As for the HuggingFaceTextGenInference class, it is a subclass of the LLM base class in the LangChain codebase, designed to interact with a Hugging Face Text Generation Inference server. It has several attributes (such as inference_server_url, max_new_tokens, and temperature) that control the behavior of the text generation. You can find more details about this class in the LangChain repository.

I hope this helps! If you have any other questions, feel free to ask.

-Dosu

This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.

Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.

If you want to continue the conversation, start your reply with @dosu-bot.

@dosubot dosubot bot added Ɑ: models Related to LLMs or chat model modules 🤖:question A specific question about the codebase, product, project, or how to use a feature labels Dec 4, 2023
@dosubot dosubot bot added the stale Issue has not had recent activity or appears to be solved. Stale issues will be automatically closed label Mar 11, 2024
@dosubot dosubot bot closed this as not planned Won't fix, can't repro, duplicate, stale Mar 18, 2024
@dosubot dosubot bot removed the stale Issue has not had recent activity or appears to be solved. Stale issues will be automatically closed label Mar 18, 2024
@rahermur
I also find this functionality very useful. We are pushing for an integration between Hugging Face Chat UI and endpoints built with LangServe, but I think this is something that needs to be contributed to the Hugging Face project rather than here. We have already opened an MR for this: huggingface/chat-ui#1009. Take a look if you are interested in this.
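For context, Chat UI selects its backend through the MODELS variable in its .env.local file. A minimal sketch of a TGI-style endpoint entry follows; the model name and URL are placeholders, and note that a plain LangServe /invoke route does not speak the TGI streaming protocol, which is the gap the MR above addresses:

```
MODELS=`[
  {
    "name": "my-langchain-model",
    "endpoints": [{ "type": "tgi", "url": "http://127.0.0.1:8080" }]
  }
]`
```

With an entry like this, Chat UI will send TGI-format generation requests to the given URL, so whatever serves that URL must implement the TGI wire protocol.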
