
feat: implement LLM monitoring with langchainrb integration #2411

Open · wants to merge 3 commits into master

Conversation

@monotykamary commented Sep 23, 2024

Description

This PR introduces a crude implementation of LLM Monitoring with LangChainrb integration to the Sentry Ruby SDK. The changes include:

  1. Addition of a new monitoring.rb file in the sentry-ruby/lib/sentry/ai/ directory, which implements AI monitoring functionality.
  2. Creation of a langchain.rb file in both sentry-ruby/lib/sentry/ and sentry-ruby/lib/sentry/ai/ directories, providing LangChain integration for the Sentry Ruby SDK.
  3. Potential updates to span.rb and transaction.rb to support these new features.

These changes enhance Sentry's capabilities in monitoring and integrating with AI-related technologies, particularly focusing on LangChain integration.
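The PR's `monitoring.rb` exposes an `ai_track` helper (used in the test script below). The following standalone sketch illustrates the general shape such a helper could take: name an AI pipeline and record it on a span. The `Tracer` class is a stand-in for Sentry's span API so the sketch runs on its own; it is an illustration, not the PR's actual code.

```ruby
# Minimal stand-in for Sentry's span/tracing API, for illustration only.
class Tracer
  Span = Struct.new(:op, :description)

  def self.spans
    @spans ||= []
  end

  # Record a span, then run the caller's block with it.
  def self.with_child_span(op:, description:)
    span = Span.new(op, description)
    spans << span
    yield span
  end
end

module AIMonitoring
  # Hypothetical helper mirroring the PR's ai_track: tag the work with an
  # "ai.pipeline" op and the given description, then run the block.
  def self.ai_track(description, tracer: Tracer)
    tracer.with_child_span(op: "ai.pipeline", description: description) do |span|
      yield span if block_given?
    end
  end
end

AIMonitoring.ai_track("Testing") { |span| puts span.description } # prints "Testing"
```

In the real integration, `Tracer.with_child_span` would be replaced by Sentry's own span-creation API so the pipeline shows up under the active transaction.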

Current problems

  • Spans currently don't show up on the LLM Monitoring page, even though most, if not all, of the expected span data is set in the implementation.
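One plausible culprit for spans not appearing: in other Sentry SDKs, the LLM Monitoring page keys off an "ai.pipeline"-op root span plus specific span data fields such as serialized input messages and token counts. The exact keys below are an assumption based on the Python SDK's conventions, not something confirmed by this PR.

```ruby
require "json"

# Illustrative shape of the span data the LLM Monitoring page may expect.
# Keys ("ai.input_messages", "ai.total_tokens.used") are assumptions drawn
# from sentry-python's AI monitoring conventions.
span_data = {
  "op" => "ai.run", # child spans grouped under an "ai.pipeline" root
  "description" => "AI Query Execution",
  "data" => {
    "ai.input_messages" => JSON.generate([{ role: "user", content: "testing input" }]),
    "ai.total_tokens.used" => 42 # illustrative token count
  }
}

puts span_data["data"]["ai.input_messages"]
```

Comparing the keys the integration emits against what an SDK known to work (e.g. sentry-python) sends could narrow down why the page stays empty.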

Related Issues/PRs

Refactoring

  • No major refactoring was performed in this PR. All changes are related to new feature additions.

Changelog Entry

Added

  • Introduced AI monitoring capabilities (sentry-ruby/lib/sentry/ai/monitoring.rb)
  • Added LangChain integration (sentry-ruby/lib/sentry/langchain.rb and sentry-ruby/lib/sentry/ai/langchain.rb)
  • Enhanced span and transaction handling to support AI monitoring

Basic Testing:

require 'sentry-ruby'
require 'langchain'
require 'sentry/langchain'

puts "Initializing Sentry..."
Sentry.init do |config|
  config.dsn = ENV['SENTRY_DSN']
  config.traces_sample_rate = 1.0
  config.debug = true # Enable debug mode for more verbose logging
end

Sentry.with_scope do |scope|
  Sentry.set_tags(ai_operation: "Testing")
  
  transaction = Sentry.start_transaction(
    op: "ai.query",
    name: "AI Query Execution"
  )

  Sentry.configure_scope do |scope|
    scope.set_span(transaction)
  end

  begin
    Sentry::AI::Monitoring.ai_track("Testing")
    llm = Langchain::LLM::OpenAI.new(api_key: ENV['OPENAI_API_KEY'])
    result = llm.chat(messages: [{role: "user", content: "testing input"}]).completion
    puts(result)
  rescue => e
    Sentry.capture_exception(e)
    raise e
  ensure
    transaction.finish
  end
end

@monotykamary (Author)

@sl0thentr0py I'm not too familiar with Ruby or with how to get this completely up and running with LLM Monitoring, but I think I have a good start. Can you have a look?

[screenshot]

@sl0thentr0py sl0thentr0py self-requested a review September 23, 2024 15:10
@sl0thentr0py (Member)

ty @monotykamary I will review this in a few days and see how best to package the new integration.

@rwojsznis commented Feb 18, 2025

This looks like an awesome addition to the sentry ruby ecosystem, please do not let it end up in a graveyard of forgotten open source pull requests 😅

@solnic solnic self-requested a review February 24, 2025 10:34
@solnic (Collaborator) commented Feb 24, 2025

> This looks like an awesome addition to the sentry ruby ecosystem, please do not let it end up in a graveyard of forgotten open source pull requests 😅

@rwojsznis we're not letting that happen 🙃

@monotykamary thanks again for kick-starting this effort - are you still interested in working on this feature?

@monotykamary (Author)

I'm down for a redemption arc 🤘

@solnic (Collaborator) left a comment

> I'm down for a redemption arc 🤘

@monotykamary I'm taking this as a... yes? 🙃 If you'd like to continue, the next step would be to add some basic tests. Please let me know if I can help you with this!


def summarize(...)
wrap_with_sentry("chat_completions") { super(...) }
end
Review comment (Collaborator):

Here and in the previous methods we shouldn't be using the ... argument-forwarding syntax, because unfortunately it's not going to work with the older Rubies we still need to support. If the original methods have the same signature across all backends, I would suggest using the exact same signature and passing the arguments to super.
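A standalone sketch of the suggested change: forward positional, keyword, and block arguments explicitly instead of using `...` (which requires newer Rubies). The class and module names here (`BaseLLM`, `SentryWrapper`) are illustrative, and `wrap_with_sentry` is a trivial stand-in for the PR's helper.

```ruby
class BaseLLM
  def summarize(text, style: :short)
    "#{style}: #{text}"
  end
end

module SentryWrapper
  # Stand-in for the PR's wrap_with_sentry helper; the real one would
  # open a Sentry span named after `op` around the block.
  def wrap_with_sentry(op)
    yield
  end

  # Explicit *args/**kwargs/&block forwarding works on the older Rubies
  # where `super(...)` would be a syntax error.
  def summarize(*args, **kwargs, &block)
    wrap_with_sentry("chat_completions") { super(*args, **kwargs, &block) }
  end
end

class BaseLLM
  prepend SentryWrapper
end

puts BaseLLM.new.summarize("hello", style: :long) # prints "long: hello"
```

If every backend's `summarize` shares one signature, mirroring that exact signature in the wrapper (as the reviewer suggests) is cleaner still, since it keeps documentation and argument errors accurate.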
