
Sweep: turn this into a warning #3825

Merged 1 commit on May 20, 2024
Conversation

sweep-nightly[bot] commented May 20, 2024

Description

This pull request modifies the behavior of the openai_call_embedding function within the vector_db.py file of the sweepai/core module. Specifically, it changes the logging level from error to warning when the token count exceeds the maximum allowed by the model during batch processing.

Summary

  • Changed logging level from error to warning in vector_db.py when the token count exceeds the model's maximum context length.
  • This modification ensures that exceeding the token limit is treated as a warning rather than an error, reflecting its non-critical nature.
  • Affected file: sweepai/core/vector_db.py.

Fixes #3820.
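The one-line change described above can be sketched as follows. This is a hedged reconstruction, not the repository's actual code: `count_tokens`, `MAX_TOKENS`, and the whitespace tokenizer are illustrative stand-ins for `tiktoken_client.count` and the model's real 8192-token context limit.

```python
import logging

logger = logging.getLogger("vector_db")

# Assumed context limit, taken from the log message quoted in this PR.
MAX_TOKENS = 8192

def count_tokens(text: str) -> int:
    # Stand-in for tiktoken_client.count; the real code uses a tokenizer.
    return len(text.split())

def clip_batch(batch: list[str]) -> list[str]:
    """Warn (previously: error) and truncate when a batch exceeds the limit."""
    max_tokens = max(count_tokens(text) for text in batch)
    if max_tokens > MAX_TOKENS:
        # Before #3825 this call was logger.error(...); the PR downgrades it.
        logger.warning(
            f"Token count exceeded for batch: {max_tokens} "
            f"truncating down to {MAX_TOKENS} tokens."
        )
        batch = [" ".join(text.split()[:MAX_TOKENS]) for text in batch]
    return batch
```

Since the oversized batch is truncated and processing continues, the condition is recoverable, which is the rationale for logging it at warning rather than error severity.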


💡 To get Sweep to edit this pull request, you can:

  • Comment below, and Sweep can edit the entire PR
  • Comment on a file, Sweep will only modify the commented file
  • Edit the original issue to get Sweep to recreate the PR from scratch

This is an automated message generated by Sweep AI.

sweep-nightly[bot] commented May 20, 2024

Rollback Files For Sweep

  • Rollback changes to sweepai/core/vector_db.py


vercel bot commented May 20, 2024

The latest updates on your projects.

Name             Status       Updated (UTC)
sweep-chat       🔄 Building  May 20, 2024 9:15pm
sweep-chat-demo  🔄 Building  May 20, 2024 9:15pm
sweep-docs       🔄 Building  May 20, 2024 9:15pm

sweep-nightly[bot] commented May 20, 2024

Sweep: PR Review

sweepai/core/vector_db.py

The logging level of the token-count-exceeded message has been changed from error to warning.

Sweep Found These Issues

  • Changing the log level from error to warning for the message about token count exceeding the model's maximum context length may reduce the visibility of this issue in production logs, potentially leading to delayed responses to this problem.
  • logger.warning(f"Token count exceeded for batch: {max([tiktoken_client.count(text) for text in batch])} truncating down to 8192 tokens.")

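The reviewer's visibility concern can be demonstrated with the standard library's `logging` module: if a production handler is configured at ERROR level, the downgraded message silently disappears from the logs. This is an illustrative sketch, not Sweep's actual logging configuration.

```python
import io
import logging

# Capture log output in a string buffer, filtering at ERROR level,
# as a strict production configuration might.
stream = io.StringIO()
handler = logging.StreamHandler(stream)
handler.setLevel(logging.ERROR)

log = logging.getLogger("visibility-demo")
log.setLevel(logging.WARNING)
log.addHandler(handler)

log.warning("Token count exceeded for batch")  # dropped by the ERROR-level handler
log.error("embedding request failed")          # still recorded

output = stream.getvalue()
```

With this setup, only the error line reaches `output`; the truncation warning is filtered out, which is the trade-off the review flags.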


@sweep-nightly sweep-nightly bot added the sweep Assigns Sweep to an issue or pull request. label May 20, 2024
Labels
sweep Assigns Sweep to an issue or pull request.
Projects
None yet
Development

Successfully merging this pull request may close these issues.

Sweep: turn this into a warning
1 participant