
core[patch]: Adding raiseError Field to Callbacks #5373

Merged · 5 commits · May 15, 2024

Conversation

CahidArda (Contributor) commented May 14, 2024

Hi there!

We want to add a callback that manages rate limiting based on the number of requests or the number of tokens. For this to work, the callback needs to be able to stop the execution of the chain. This is possible in LangChain Python (see the raise_error field) but not in LangChain JS, where an error thrown from a callback is simply logged and the chain continues.

To make this possible, we update BaseRunManager and its subclasses to rethrow the error when the handler's new raiseError field is true.
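
Roughly, the new control flow looks like the sketch below. This is an illustration only, not the exact diff: runHandlers and its parameters are made up for this example, and only the raiseError field is the actual addition.

import type { BaseCallbackHandler } from "@langchain/core/callbacks/base";

// Illustrative helper only: the real change lives inside BaseRunManager and
// its subclasses. It shows the intended control flow for handler errors.
async function runHandlers(
  handlers: BaseCallbackHandler[],
  event: (handler: BaseCallbackHandler) => Promise<void>
): Promise<void> {
  for (const handler of handlers) {
    try {
      await event(handler);
    } catch (err) {
      if (handler.raiseError) {
        // New behavior: rethrow so the error propagates and the chain stops.
        throw err;
      }
      // Existing behavior: log the error and continue with the other handlers.
      console.error(`Error in handler ${handler.constructor.name}: ${err}`);
    }
  }
}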

To give you an idea of the rate-limiting callback we want to add to LangChain community, here is a code sample:

Rate-limiting callback
import {
  BaseCallbackHandler, type BaseCallbackHandlerInput
} from "@langchain/core/src/callbacks/base.js";
import {
  CallbackManager
} from "@langchain/core/src/callbacks/manager.js";
import { Serialized } from "@langchain/core/src/load/serializable.js";
import { ChainValues } from "@langchain/core/src/utils/types/index.js";

// mock rate limit check (always denies here, to demonstrate the error path)
const checkRateLimit = async (_identifier: string) => {
  return false;
};

export class RateLimitError extends Error {}

class RateLimitHandler extends BaseCallbackHandler {
  name = "rate-limit-handler";
  raiseError = true; // makes the callback manager throw the error instead of logging it
  checked = false;   // run the rate limit check only once per pipeline
  identifier: string;

  constructor(identifier: string, inputs?: BaseCallbackHandlerInput) {
    super(inputs);
    this.identifier = identifier;
  }

  async handleChainStart(_chain: Serialized, _inputs: ChainValues): Promise<void> {
    if (!this.checked) {
      const success = await checkRateLimit(this.identifier);
      if (success) {
        this.checked = true;
      } else {
        throw new RateLimitError("Limit reached!");
      }
    }
  }
}

const serialized: Serialized = {
  lc: 1,
  type: "constructor",
  id: ["test"],
  kwargs: {},
};

const handler = new RateLimitHandler("ip-address");
const manager = new CallbackManager();
manager.addHandler(handler);

try {
  await manager.handleChainStart(serialized, { input: "test" });
} catch (err) {
  if (err instanceof RateLimitError) {
    // handle the rate limit error
    console.log("handling error!");
  } else {
    // rethrow anything else
    throw err;
  }
}

jacoblee93 changed the title from "Adding raiseError Field to Callbacks" to "core[patch]: Adding raiseError Field to Callbacks" on May 14, 2024
jacoblee93 (Collaborator) commented:

These errors will be uncatchable if backgrounding is enabled. I think it'd be reasonable to await the callback if this condition is true.
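
To sketch the concern (dispatch and backgrounded below are hypothetical names, not real LangChain APIs): when callbacks run in the background, a handler's error never reaches the caller's try/catch, so a handler that sets raiseError would need to be awaited for the error to be catchable.

import type { BaseCallbackHandler } from "@langchain/core/callbacks/base";

// Hypothetical sketch, not the actual implementation.
async function dispatch(
  handler: BaseCallbackHandler,
  event: () => Promise<void>,
  backgrounded: boolean
): Promise<void> {
  if (!backgrounded || handler.raiseError) {
    // Awaited: a thrown RateLimitError propagates to the caller's try/catch.
    await event();
  } else {
    // Fire-and-forget: errors only surface in this .catch, never in the caller.
    void event().catch((err) =>
      console.error(`Error in handler ${handler.constructor.name}: ${err}`)
    );
  }
}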

jacoblee93 added the lgtm label on May 14, 2024
jacoblee93 (Collaborator) commented:

@baskaryan can you take a quick look?

jacoblee93 (Collaborator) commented:

@CahidArda we will aim to ship this next week as part of 0.2. If you have a burning need for it sooner, I can cherry-pick and cut a 0.1.x core release.
