
🚀 Feature: Add Azure Application Insights as an OpenTelemetry destination #93

Closed

aavetis opened this issue Oct 14, 2023 · 12 comments

@aavetis

aavetis commented Oct 14, 2023

Which component is this feature for?

OpenAI Instrumentation

🔖 Feature description

In addition to the trace destinations currently supported, it would be great to add Azure Application Insights. OpenTelemetry is a key recommendation for almost any solution deployed on Azure, since those solutions should be tracking traces and metrics, and native support would be great to have.

I was going to add links to Azure documentation for OTEL + App Insights, but realized it might be easier to share this ChatGPT conversation about a sample Python app instrumented with OTEL, sending traces to App Insights.

https://chat.openai.com/share/8b6cc9a7-b8c5-4dea-8e0b-7c6728ff40a0

🎤 Why is this feature needed ?

Holistic Azure solution, e.g. using Azure OpenAI and sending metrics to Azure Application Insights. All data would be air-gapped and stored in your own subscription.

✌️ How do you aim to achieve this?

Support Azure App Insights as a trace export destination, using instrumentation key and endpoint.

🔄️ Additional Information

No response

👀 Have you spent some time to check if this feature request has been raised before?

  • I checked and didn't find similar issue

Are you willing to submit PR?

Yes I am willing to submit a PR!

@nirga
Member

nirga commented Oct 14, 2023

Sounds good @aavetis! Indeed it makes sense.
Let me know once you've verified that it works, and update the docs in this repo (under /docs) and the README. We'll then announce the support with proper credit 😄

@aavetis
Author

aavetis commented Oct 15, 2023

Here's the setup I used, and everything "just works" 😄

config.py
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from azure.monitor.opentelemetry.exporter import AzureMonitorTraceExporter
from opentelemetry.sdk.trace.export import SimpleSpanProcessor

# Set the tracer provider
trace.set_tracer_provider(TracerProvider())

# Configure the tracer provider to export traces to Azure Application Insights
exporter = AzureMonitorTraceExporter(
    connection_string = "$INSTRUMENTATION_KEY_HERE"
)
span_processor = SimpleSpanProcessor(exporter)
trace.get_tracer_provider().add_span_processor(span_processor)

app.py
import config  # Import the configuration file
from traceloop.sdk import Traceloop
from traceloop.sdk.decorators import workflow
from opentelemetry import trace
import openai  # Ensure you have the openai library installed

Traceloop.init(app_name="your_app_name")
tracer = trace.get_tracer(__name__)

@workflow(name="llm_execution")
def execute_llm():
    with tracer.start_as_current_span("llm_span"):
        # Replace with your actual OpenAI API call
        response = openai.ChatCompletion.create(
            model="gpt-3.5-turbo",
            messages=[{"role": "user", "content": "checking if this is logged"}],
            max_tokens=60
        )
        print(response['choices'][0]['message']['content'])
        return response['choices'][0]['message']['content']

if __name__ == "__main__":
    execute_llm()

Resulting in:

[screenshot: traces appearing in Azure Application Insights]

One thing - it seems that all telemetry goes through Traceloop (the SaaS) before landing in the final destination. Is there a way to turn that off? While I personally think it's very cool how you can set up demo dashboards so people get an idea of how great the platform is, it's a concern for any real production scenario - you definitely want to keep your LLM logs and traces isolated.

[screenshot: the same telemetry also appearing in the Traceloop dashboard]

@nirga
Member

nirga commented Oct 15, 2023

Can you try passing your exporter to the SDK's init function? It's just that we define an exporter internally in the SDK, so that's what you're seeing - traces are sent to both destinations in parallel.

@aavetis
Author

aavetis commented Oct 16, 2023

Confirmed - when I explicitly pass in the exporter, traces ONLY show up in Azure and not in the Traceloop dashboard. However, it appears the dashboard is created anyway; it just stays empty if the exporter is set.

# this appears to ensure traces are only sent to your desired exporter
exporter = AzureMonitorTraceExporter(connection_string="InstrumentationKey=.......")
Traceloop.init(app_name="your_app_name", exporter=exporter)

As you can see, the dashboard is still created (just a note).
[screenshot: the Traceloop dashboard created but empty]

@nirga
Member

nirga commented Oct 16, 2023

Huh! That's a bug. Fixed in #96.
@aavetis can you add the details of how you set this up in the docs?

@aavetis
Author

aavetis commented Oct 16, 2023

Will work on getting this into the docs.

On another note, are Metrics / MeterProvider expected to work? I've noticed traces, spans, and dependencies are all tracked seamlessly.

But I haven't been able to see custom metrics (counters) getting passed through - not sure if I'm making a mistake or if the SDK isn't expected to support it.
https://opentelemetry.io/docs/specs/otel/metrics/api/#meter-operations

@nirga
Member

nirga commented Oct 16, 2023

Not yet, hopefully soon :)

@drewby

drewby commented Oct 18, 2023

Re: SimpleSpanProcessor - it may be better to document use of the APPLICATIONINSIGHTS_CONNECTION_STRING env variable as a good practice. I believe you can then even omit the connection string from the exporter constructor to simplify the code.

Traceloop.init(app_name="your_app_name", exporter=AzureMonitorTraceExporter())
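A sketch of that env-variable approach. The connection string value here is a hypothetical placeholder, and the bare-constructor behavior is as described above (the exporter reportedly picks up APPLICATIONINSIGHTS_CONNECTION_STRING when no connection_string argument is given):

```python
import os

# Hypothetical placeholder value; normally this is set in the shell or
# the app service configuration, not in code.
os.environ.setdefault(
    "APPLICATIONINSIGHTS_CONNECTION_STRING",
    "InstrumentationKey=00000000-0000-0000-0000-000000000000",
)

# With the variable set, the constructor can reportedly be called bare:
#   Traceloop.init(app_name="your_app_name", exporter=AzureMonitorTraceExporter())
```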

@nirga
Member

nirga commented Oct 20, 2023

FYI @aavetis, docs have been moved to a separate repo - https://github.com/traceloop/docs
(to support the fact that we now have a JS SDK as well)

@nirga
Member

nirga commented Oct 30, 2023

@aavetis any update on this?

@aavetis
Author

aavetis commented Nov 1, 2023

traceloop/docs#2

@aavetis
Author

aavetis commented Nov 30, 2023

Closing, as traceloop/docs#2 is merged and deployed.

aavetis closed this as completed Nov 30, 2023