Commit
Adds playground chat widget for message list inputs, additional global callback for chunks (#489)

Co-authored-by: Eugene Yurtsev <eyurtsev@gmail.com>
jacoblee93 and eyurtsev committed Feb 28, 2024
1 parent 8ea0cb4 commit ce727fc
Showing 11 changed files with 426 additions and 313 deletions.
55 changes: 51 additions & 4 deletions README.md
@@ -111,7 +111,7 @@ directory.
| **Auth** with `add_routes`: Simple authentication mechanism based on path dependencies. (Not useful on its own for implementing per-user logic.) | [server](https://github.com/langchain-ai/langserve/tree/main/examples/auth/path_dependencies/server.py) |
| **Auth** with `add_routes`: Implement per user logic and auth for endpoints that use per request config modifier. (**Note**: At the moment, does not integrate with OpenAPI docs.) | [server](https://github.com/langchain-ai/langserve/tree/main/examples/auth/per_req_config_modifier/server.py), [client](https://github.com/langchain-ai/langserve/tree/main/examples/auth/per_req_config_modifier/client.ipynb) |
| **Auth** with `APIHandler`: Implement per user logic and auth that shows how to search only within user owned documents. | [server](https://github.com/langchain-ai/langserve/tree/main/examples/auth/api_handler/server.py), [client](https://github.com/langchain-ai/langserve/tree/main/examples/auth/api_handler/client.ipynb) |
| **Widgets** Different widgets that can be used with playground (file upload and chat) | [server](https://github.com/langchain-ai/langserve/tree/main/examples/widgets/server.py) |
| **Widgets** Different widgets that can be used with playground (file upload and chat) | [server](https://github.com/langchain-ai/langserve/tree/main/examples/widgets/chat/tuples/server.py) |
| **Widgets** File upload widget used for LangServe playground. | [server](https://github.com/langchain-ai/langserve/tree/main/examples/file_processing/server.py), [client](https://github.com/langchain-ai/langserve/tree/main/examples/file_processing/client.ipynb) |

## Sample Application
@@ -161,6 +161,23 @@ if __name__ == "__main__":
uvicorn.run(app, host="localhost", port=8000)
```

If you intend to call your endpoint from the browser, you will also need to set CORS headers.
You can use FastAPI's built-in middleware for that:

```python
from fastapi.middleware.cors import CORSMiddleware

# Set all CORS enabled origins
app.add_middleware(
CORSMiddleware,
allow_origins=["*"],
allow_credentials=True,
allow_methods=["*"],
allow_headers=["*"],
expose_headers=["*"],
)
```

### Docs

If you've deployed the server above, you can view the generated OpenAPI docs using:
@@ -553,7 +570,7 @@ Here are a few examples:

| Description | Links |
|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| **Widgets** Different widgets that can be used with playground (file upload and chat) | [server](https://github.com/langchain-ai/langserve/tree/main/examples/widgets/server.py), [client](https://github.com/langchain-ai/langserve/tree/main/examples/widgets/client.ipynb) |
| **Widgets** Different widgets that can be used with playground (file upload and chat) | [server](https://github.com/langchain-ai/langserve/tree/main/examples/widgets/chat/tuples/server.py), [client](https://github.com/langchain-ai/langserve/tree/main/examples/widgets/client.ipynb) |
| **Widgets** File upload widget used for LangServe playground. | [server](https://github.com/langchain-ai/langserve/tree/main/examples/file_processing/server.py), [client](https://github.com/langchain-ai/langserve/tree/main/examples/file_processing/client.ipynb) |

#### Schema
@@ -625,7 +642,7 @@ Example widget:
### Chat Widget

Look
at [widget example](https://github.com/langchain-ai/langserve/tree/main/examples/widgets/server.py).
at the [widget example](https://github.com/langchain-ai/langserve/tree/main/examples/widgets/chat/tuples/server.py).

To define a chat widget, make sure that you pass "type": "chat".

@@ -637,7 +654,6 @@ To define a chat widget, make sure that you pass "type": "chat".
Here's a snippet:

```python

class ChatHistory(CustomUserType):
chat_history: List[Tuple[str, str]] = Field(
...,
@@ -676,6 +692,37 @@ Example widget:
<img src="https://github.com/langchain-ai/langserve/assets/3205522/a71ff37b-a6a9-4857-a376-cf27c41d3ca4" width="50%"/>
</p>

You can also specify a list of messages as a parameter directly, as shown in this snippet:

```python
prompt = ChatPromptTemplate.from_messages(
[
        ("system", "You are a helpful assistant named Cob."),
MessagesPlaceholder(variable_name="messages"),
]
)

chain = prompt | ChatAnthropic(model="claude-2")


class MessageListInput(BaseModel):
"""Input for the chat endpoint."""
messages: List[Union[HumanMessage, AIMessage]] = Field(
...,
description="The chat messages representing the current conversation.",
extra={"widget": {"type": "chat", "input": "messages"}},
)


add_routes(
app,
chain.with_types(input_type=MessageListInput),
path="/chat",
)
```

See [this sample file](https://github.com/langchain-ai/langserve/tree/main/examples/widgets/chat/message_list/server.py) for an example.
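For orientation, here is a sketch of the JSON body a browser client might POST to the `/chat/invoke` endpoint for the chain above. The exact wire format is an assumption based on LangChain's standard `type`/`content` message serialization and LangServe's convention of wrapping chain input under an `"input"` key; it is not shown verbatim in this commit.

```python
import json

# Hypothetical request body for POST /chat/invoke. LangServe wraps the
# chain input under an "input" key; each message carries a "type"
# discriminator ("human" or "ai") matching the Union in MessageListInput.
payload = {
    "input": {
        "messages": [
            {"type": "human", "content": "Hi, who are you?"},
            {"type": "ai", "content": "I am Cob, a helpful assistant."},
            {"type": "human", "content": "What did I just ask you?"},
        ]
    }
}

body = json.dumps(payload, indent=2)
print(body)
```

The playground chat widget builds an equivalent payload for you from the messages typed into its UI.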

### Enabling / Disabling Endpoints (LangServe >=0.0.33)

You can enable / disable which endpoints are exposed when adding routes for a given chain.
64 changes: 64 additions & 0 deletions examples/widgets/chat/message_list/server.py
@@ -0,0 +1,64 @@
#!/usr/bin/env python
"""Example of a simple chatbot that just passes current conversation
state back and forth between server and client.
"""
from typing import List, Union

from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware
from langchain.chat_models import ChatAnthropic
from langchain_core.messages import AIMessage, HumanMessage, SystemMessage
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder

from langserve import add_routes
from langserve.pydantic_v1 import BaseModel, Field

app = FastAPI(
title="LangChain Server",
version="1.0",
    description="Spin up a simple API server using LangChain's Runnable interfaces",
)


# Set all CORS enabled origins
app.add_middleware(
CORSMiddleware,
allow_origins=["*"],
allow_credentials=True,
allow_methods=["*"],
allow_headers=["*"],
expose_headers=["*"],
)


# Declare a chain
prompt = ChatPromptTemplate.from_messages(
[
        ("system", "You are a helpful assistant named Cob."),
MessagesPlaceholder(variable_name="messages"),
]
)

chain = prompt | ChatAnthropic(model="claude-2") | StrOutputParser()


class InputChat(BaseModel):
"""Input for the chat endpoint."""

messages: List[Union[HumanMessage, AIMessage, SystemMessage]] = Field(
...,
description="The chat messages representing the current conversation.",
extra={"widget": {"type": "chat", "input": "messages"}},
)


add_routes(
app,
chain.with_types(input_type=InputChat),
)

if __name__ == "__main__":
import uvicorn

uvicorn.run(app, host="localhost", port=8000)
File renamed without changes.