
Pass parameters between Runnables in a chain #504

Closed · eyurtsev opened this issue Mar 4, 2024 · Discussed in #479 · 3 comments
Labels: bug, investigate

eyurtsev commented Mar 4, 2024

Discussed in #479

Originally posted by shaojun February 22, 2024
Hi,
I have a CustomUserType input request with a field shared_pass_through_parameter that contains non-natural-language data, which should be available to every subsequent Runnable in the chain (except the LLM node).
In the sample code below, the purpose of this field is to reach the final RunnableLambda(output_process) so it can produce customized output.

from typing import Any

from fastapi import FastAPI
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.runnables import RunnableLambda
from langchain_openai import ChatOpenAI
from langserve import CustomUserType, add_routes

app = FastAPI(
    title="LangChain Server",
    version="1.0",
    description="A simple api server using Langchain's Runnable interfaces",
)


class MyCustomRequest(CustomUserType):
    content: str
    prompt: str
    shared_pass_through_parameter: str = "No-natural language content"


def process(request: MyCustomRequest):
    prompt = ChatPromptTemplate.from_messages([
        ("system", "{system_prompt}"),
        ("human", "{user_content}"),
    ])
    return prompt.invoke({"system_prompt": request.prompt, "user_content": request.content})


def output_process(input: Any):
    # !!!HOW TO GET the shared_pass_through_parameter here?
    # if input.shared_pass_through_parameter == 'test':
    #     print('shared_pass_through_parameter is test')
    return input


model = ChatOpenAI()
add_routes(
    app,
    RunnableLambda(process).with_types(input_type=MyCustomRequest)
    | model
    | RunnableLambda(output_process),
    path="/MyService1",
)

if __name__ == "__main__":
    import uvicorn

    uvicorn.run(app, host="0.0.0.0", port=9000)

Obviously that field is dropped, since it's not included in the output of process.
One solution I can see is to wrap the whole route chain into a single function, but that would lose the LLM's streaming capability.
Any suggestions?
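
A common LCEL workaround for this (a minimal sketch, not the resolution adopted in this thread) is to fan the request out with a RunnableParallel, so the pass-through field rides alongside the LLM branch instead of being dropped. process, model, and MyCustomRequest below are the definitions from the snippet above:

# Sketch only: carry the pass-through field next to the model output.
from langchain_core.runnables import RunnableLambda, RunnableParallel


def output_process(input: dict):
    # Both the model output and the pass-through field arrive together here.
    if input["shared"] == "test":
        print("shared_pass_through_parameter is test")
    return input["llm_output"]


chain = (
    RunnableParallel(
        llm_output=RunnableLambda(process) | model,
        shared=RunnableLambda(lambda req: req.shared_pass_through_parameter),
    ).with_types(input_type=MyCustomRequest)
    | RunnableLambda(output_process)
)

Note that a plain function at the end still buffers the model output, so token-level streaming stops at output_process; keeping streaming end-to-end would require writing it as an (async) generator instead.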


eyurtsev commented Mar 7, 2024

Looking now


eyurtsev commented Mar 7, 2024

Bug

import asyncio

from langchain_core.beta.runnables.context import Context
from langchain_openai import ChatOpenAI

model = ChatOpenAI()


async def to_dict(input):
    # Async generator between the setter and the getter; LCEL coerces it
    # into a RunnableGenerator automatically.
    async for chunk in input:
        yield {'foo': chunk}


# Expected: the final getter should yield the original input ('hello') back.
chain = Context.setter('input') | model | to_dict | Context.getter('input')


async def main():
    async for chunk in chain.astream('hello'):
        print(chunk)


asyncio.run(main())
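
For context, the setter/getter pair above is the beta Context API's pass-through mechanism: the setter stashes the chain input and the getter retrieves it downstream, which is exactly what the original question asks for. A minimal round trip without the intermediate generator (a sketch, assuming the same beta API) should behave as expected:

from langchain_core.beta.runnables.context import Context
from langchain_core.runnables import RunnableLambda

chain = (
    Context.setter('saved')        # stash the incoming value
    | RunnableLambda(str.upper)    # transform it arbitrarily
    | Context.getter('saved')      # retrieve the stashed value
)

print(chain.invoke('hello'))  # prints 'hello', not 'HELLO'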

eyurtsev added the bug label Mar 7, 2024

eyurtsev commented Mar 7, 2024

The issue will be tracked in LangChain, since this is not a LangServe issue.

eyurtsev closed this as completed Mar 7, 2024
eyurtsev added a commit to langchain-ai/langchain that referenced this issue Mar 8, 2024
eyurtsev added a commit to langchain-ai/langchain that referenced this issue Mar 14, 2024
gkorland pushed a commit to FalkorDB/langchain that referenced this issue Mar 30, 2024
hinthornw pushed a commit to langchain-ai/langchain that referenced this issue Apr 26, 2024
hinthornw pushed a commit to langchain-ai/langchain that referenced this issue Apr 26, 2024