
Streaming only final output of agent (#2483) #4630

Merged
merged 6 commits into langchain-ai:master from 2483-agent-final-only-streaming
May 20, 2023

Conversation

UmerHA
Contributor

@UmerHA UmerHA commented May 13, 2023

Streaming only final output of agent (#2483)

As requested in issue #2483, this callback allows streaming only the final output of an agent (i.e., not the intermediate steps).

Fixes #2483
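The core idea of the merged callback can be sketched as a standalone snippet: buffer the last n tokens, compare the buffer against the answer-prefix token sequence, and start emitting only after a match. The class and attribute names below are illustrative, not the actual LangChain API; the logic mirrors the handler described in this PR and the TS port shared later in this thread.

```python
# Illustrative sketch of the PR's detection logic (names are hypothetical;
# the merged handler lives in LangChain itself).
DEFAULT_ANSWER_PREFIX_TOKENS = ["Final", " Answer", ":"]

class FinalOnlyStream:
    def __init__(self, answer_prefix_tokens=None):
        self.answer_prefix_tokens = answer_prefix_tokens or DEFAULT_ANSWER_PREFIX_TOKENS
        self.last_tokens = [""] * len(self.answer_prefix_tokens)
        self.answer_reached = False
        self.output = []  # stands in for stdout / a writable stream

    def on_llm_new_token(self, token: str) -> None:
        # Remember the last n tokens, where n = len(answer_prefix_tokens)
        self.last_tokens.append(token)
        if len(self.last_tokens) > len(self.answer_prefix_tokens):
            self.last_tokens.pop(0)
        # Check if the last n tokens match the answer prefix ...
        if self.last_tokens == self.answer_prefix_tokens:
            self.answer_reached = True
            # ... the matching token itself is not part of the answer
            return
        # ... once the prefix has been seen, emit every later token
        if self.answer_reached:
            self.output.append(token)

tokens = ["Thought", ":", " I", " know", "Final", " Answer", ":", " 42", "!"]
stream = FinalOnlyStream()
for t in tokens:
    stream.on_llm_new_token(t)
print("".join(stream.output))  # only tokens after "Final Answer:" are emitted
```

Intermediate "Thought"/"Action" tokens are silently buffered and dropped; only the text after the prefix reaches the output.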

Who can review?

Community members can review the PR once tests pass. Tag maintainers/contributors who might be interested: @agola11

Twitter: @UmerHAdil | Discord: RicChilligerDude#7589

Contributor

@hwchase17 hwchase17 left a comment


this is fantastic! can we add an example showing how to use this in a notebook?

@UmerHA
Contributor Author

UmerHA commented May 13, 2023

Done ✅

@UmerHA
Contributor Author

UmerHA commented May 15, 2023

@hwchase17 ping :)

@UmerHA UmerHA requested a review from hwchase17 May 18, 2023 02:55
Contributor

@hwchase17 hwchase17 left a comment


this lgtm! @nfcampos @agola11 any thoughts?

only nit would be we should probably move this to the agent documentation section, as its pretty specific to that

@UmerHA
Contributor Author

UmerHA commented May 19, 2023

this lgtm! @nfcampos @agola11 any thoughts?

only nit would be we should probably move this to the agent documentation section, as its pretty specific to that

Okay, moved the Jupyter example to the agent section of the docs & slightly adjusted the docs to better fit the new location.

@dev2049 dev2049 merged commit 7388248 into langchain-ai:master May 20, 2023
13 checks passed
@UmerHA UmerHA deleted the 2483-agent-final-only-streaming branch May 20, 2023 16:51
@soumyabeebolt

@UmerHA Do we have any library in node js for streaming only the result?

@UmerHA
Contributor Author

UmerHA commented May 22, 2023

Hey @soumyabeebolt, so far we don't, and it's also not currently on my roadmap. But the PR isn't that complicated. You can port it if you want :)

@Alexmhack

If someone needs the same implementation in TS:

import {
  type AIStreamCallbacksAndOptions,
  createCallbacksTransformer,
} from "ai";
import { createStreamDataTransformer } from "ai";

const DEFAULT_ANSWER_PREFIX_TOKENS = ["Final", " Answer", ":"];

function arraysEqual(a: Array<string>, b: Array<string>) {
  if (a === b) return true;
  if (a == null || b == null) return false;
  if (a.length !== b.length) return false;

  // If you don't care about the order of the elements inside
  // the array, you should sort both arrays here.
  // Please note that calling sort on an array will modify that array.
  // you might want to clone your array first.

  for (let i = 0; i < a.length; ++i) {
    if (a[i] !== b[i]) return false;
  }
  return true;
}

export function CustomLangChainStream(
  callbacks?: AIStreamCallbacksAndOptions,
  answerPrefixTokens?: typeof DEFAULT_ANSWER_PREFIX_TOKENS
) {
  const stream = new TransformStream();
  const writer = stream.writable.getWriter();

  const runs = new Set();

  if (!answerPrefixTokens) {
    answerPrefixTokens = DEFAULT_ANSWER_PREFIX_TOKENS;
  }
  let lastTokens = Array(answerPrefixTokens?.length).fill("");
  let answerReached: boolean = false;

  const handleError = async (e: Error, runId: string) => {
    runs.delete(runId);
    await writer.ready;
    await writer.abort(e);
  };

  const handleStart = async (runId: string) => {
    runs.add(runId);
  };

  const handleEnd = async (runId: string) => {
    runs.delete(runId);

    if (runs.size === 0) {
      await writer.ready;
      await writer.close();
    }
  };

  return {
    stream: stream.readable
      .pipeThrough(createCallbacksTransformer(callbacks))
      .pipeThrough(createStreamDataTransformer()),
    writer,
    handlers: {
      handleLLMNewToken: async (token: string) => {
        // Remember the last n tokens, where n = len(answer_prefix_tokens)
        lastTokens.push(token);
        if (lastTokens.length > answerPrefixTokens.length) {
          lastTokens.shift();
        }

        // Check whether the last n tokens match the answerPrefixTokens list;
        // if so, every token from here on belongs to the final answer
        if (arraysEqual(lastTokens, answerPrefixTokens)) {
          answerReached = true;
          // Do not print the last token in answerPrefixTokens,
          // as it's not part of the answer yet
          return;
        }

        // Once the answer prefix has been seen, forward the token
        if (answerReached) {
          await writer.ready;
          await writer.write(token);
        }
      },
      handleLLMStart: async (_llm: any, _prompts: string[], runId: string) => {
        handleStart(runId);
        answerReached = false;
      },
      handleLLMEnd: async (_output: any, runId: string) => {
        await handleEnd(runId);
      },
      handleLLMError: async (e: Error, runId: string) => {
        await handleError(e, runId);
      },
      handleChainStart: async (_chain: any, _inputs: any, runId: string) => {
        handleStart(runId);
      },
      handleChainEnd: async (_outputs: any, runId: string) => {
        await handleEnd(runId);
      },
      handleChainError: async (e: Error, runId: string) => {
        await handleError(e, runId);
      },
      handleToolStart: async (_tool: any, _input: string, runId: string) => {
        handleStart(runId);
      },
      handleToolEnd: async (_output: string, runId: string) => {
        await handleEnd(runId);
      },
      handleToolError: async (e: Error, runId: string) => {
        await handleError(e, runId);
      },
    },
  };
}

Successfully merging this pull request may close these issues: "using a Agent and wanted to stream just the final response" (#2483)
5 participants