
feat: Added setLlmTokenCountCallback API endpoint to register a callback for calculating token count when none is provided #2065

Merged: 2 commits into newrelic:main on Mar 6, 2024

Conversation

bizob2828 (Member)

Description

Adds `setLlmTokenCountCallback` to register a callback that is invoked to compute token counts when they are not present on LLM events. The callback must be synchronous and must return the token count.
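A minimal sketch of such a callback, assuming the `(model, content)` signature discussed in this PR. The 4-characters-per-token heuristic below is an illustrative assumption, not a real tokenizer:

```javascript
// Hypothetical synchronous token-count callback. The signature
// (model, content) -> number matches the API described in this PR;
// the counting heuristic itself is an assumption for illustration.
function tokenCallback(model, content) {
  // Rough approximation: about 4 characters per token.
  // A real implementation might dispatch on `model` to a proper tokenizer.
  return Math.ceil(content.length / 4)
}

// In an application you would register it once at startup, e.g.:
//   const newrelic = require('newrelic')
//   newrelic.setLlmTokenCountCallback(tokenCallback)

console.log(tokenCallback('gpt-3.5-turbo', 'Hello, world!')) // → 4
```

Because the agent calls this synchronously on the hot path, the callback should avoid I/O and return quickly.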

Related Issues

Closes #2055

Commits:
…ack for calculating token counts when none is provided
…ll `agent.llm.tokenCountCallback` when token_count is not present

codecov bot commented Mar 5, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 97.18%. Comparing base (3c50606) to head (03adc06).
Report is 1 commit behind head on main.

Additional details and impacted files
@@           Coverage Diff           @@
##             main    #2065   +/-   ##
=======================================
  Coverage   97.17%   97.18%           
=======================================
  Files         248      248           
  Lines       41880    41936   +56     
=======================================
+ Hits        40698    40754   +56     
  Misses       1182     1182           
Flag Coverage Δ
integration-tests-16.x 78.23% <71.42%> (-0.02%) ⬇️
integration-tests-18.x 78.21% <71.42%> (-0.02%) ⬇️
integration-tests-20.x 78.22% <71.42%> (-0.02%) ⬇️
unit-tests-16.x 90.49% <100.00%> (+0.01%) ⬆️
unit-tests-18.x 90.47% <100.00%> (+0.01%) ⬆️
unit-tests-20.x 90.47% <100.00%> (+0.01%) ⬆️
versioned-tests-16.x 74.49% <89.83%> (+<0.01%) ⬆️
versioned-tests-18.x 75.61% <89.83%> (-0.01%) ⬇️
versioned-tests-20.x 75.62% <89.83%> (-0.01%) ⬇️

Flags with carried forward coverage won't be shown.


@jsumners-nr (Contributor) left a comment:
Looks good to me.

* @example
* // @param {string} model - name of model (i.e. gpt-3.5-turbo)
* // @param {string} content - prompt or completion response
* function tokenCallback(model, content) {
Contributor:

The value of content is extremely varied. I can see us fielding plenty of issues asking what it will be.

Member Author:

It must be a string. In the OpenAI cases they are always strings. I know for LangChain they aren't always strings, but we don't assign tokens there. Looking at Bedrock, they are strings as well.

Member Author:

It shouldn't be the parsed body; it should be the key in the body that contains the content.

@bizob2828 bizob2828 merged commit d2faf1a into newrelic:main Mar 6, 2024
24 checks passed
Node.js Engineering Board automation moved this from Needs PR Review to Done: Issues recently completed Mar 6, 2024
@bizob2828 bizob2828 deleted the add-set-llm-token-count-cb branch April 3, 2024 19:43
Labels: none yet
Projects: Node.js Engineering Board (Done: Issues recently completed)
Linked issue (may be closed by this PR): Add new api endpoint setLlmTokenCountCallback to provide ability to do token counting
2 participants