feat: DataDog LLM Observability Pipeline #90

Merged · 4 commits · Jun 17, 2024

Conversation

@0xThresh (Contributor)

Adds a DataDog pipeline, a new Dockerfile image with Rust installed (required for the DataDog library to build and install correctly), and a new script to start a basic dev environment in Docker that includes Open WebUI, Ollama, and Pipelines.
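For context on how an observability pipeline like this plugs into Pipelines, here is a minimal sketch of a filter-style pipeline. The class name, `type` attribute, and `inlet`/`outlet` hooks follow the general Pipelines filter pattern; the structure and comments are illustrative assumptions, not the PR's actual code.

```python
from typing import Optional


class Pipeline:
    """Hypothetical sketch of a filter-style pipeline; names and structure
    are illustrative assumptions, not this PR's actual implementation."""

    def __init__(self):
        self.type = "filter"  # filters intercept requests and responses
        self.name = "DataDog LLM Observability"

    async def inlet(self, body: dict, user: Optional[dict] = None) -> dict:
        # A real implementation would report the incoming prompt to DataDog
        # here (e.g. via ddtrace's LLM Observability SDK); this sketch only
        # passes the request through unchanged.
        return body

    async def outlet(self, body: dict, user: Optional[dict] = None) -> dict:
        # Likewise, the model's response would be submitted to DataDog here.
        return body
```

Because a filter sits on both sides of the model call, it can capture the prompt on the way in and the completion on the way out without modifying either.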

Here's an example of what you see in DataDog when you enable this pipeline and communicate with your LLMs through Open WebUI:
[Screenshot: DataDog LLM Observability view of a chat run through Open WebUI]

To correctly set Pipelines up for use with DataDog, you have to first build the Dockerfile.rust image, then spin up your Pipelines container using that image. A basic rundown of the steps is below:

  1. Sign up for a DataDog account.
  2. Create a .env file and fill out the following variables:

     DD_API_KEY=<your API key>
     DD_LLMOBS_ENABLED=1
     DD_LLMOBS_ML_APP=<your app name for the DataDog UI, such as "pipelines-test">
     DD_LLMOBS_AGENTLESS_ENABLED=1
     DD_SITE=<your DataDog site based on your selected region, such as us1.datadoghq.com>

  3. Start your environment locally using the commands below:

     docker build -t datadog-pipelines -f Dockerfile.rust .
     docker run -d -p 9099:9099 --add-host=host.docker.internal:host-gateway -v pipelines:/app/pipelines --name pipelines --restart always --env-file .env datadog-pipelines
     # If you also want to start Open WebUI with integrated Ollama, run the command below
     docker run -d -p 3000:8080 -v ~/.ollama:/root/.ollama -v open-webui:/app/backend/data --name open-webui --restart always -e OPENAI_API_BASE_URL=http://host.docker.internal:9099 -e OPENAI_API_KEY=0p3n-w3bu! ghcr.io/open-webui/open-webui:ollama

  4. Start a chat in Open WebUI and validate that you see results in your DataDog LLM Observability UI.
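Since the container reads its configuration entirely from the .env file in step 2, a quick preflight check can catch missing variables before the `docker run`. The helper below is a hypothetical convenience, not part of this PR; the variable names match the .env file above.

```python
import os

# The DataDog settings the .env file is expected to provide (from step 2)
REQUIRED_VARS = [
    "DD_API_KEY",
    "DD_LLMOBS_ENABLED",
    "DD_LLMOBS_ML_APP",
    "DD_LLMOBS_AGENTLESS_ENABLED",
    "DD_SITE",
]


def missing_vars(env=None):
    """Return the required DataDog variables that are unset or empty."""
    env = os.environ if env is None else env
    return [name for name in REQUIRED_VARS if not env.get(name)]


if __name__ == "__main__":
    missing = missing_vars()
    if missing:
        print("Missing DataDog settings:", ", ".join(missing))
    else:
        print("All DataDog settings present.")
```

Running this inside the container (or against a parsed copy of the .env file) confirms the pipeline will have the credentials it needs to ship traces.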

@0xThresh 0xThresh marked this pull request as draft June 13, 2024 23:23
@0xThresh (Contributor, Author)

I need to figure out the best way to handle the fact that a Rust compiler is required to build the ddtrace package for DataDog's LLM tooling to work correctly.
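To make the Rust requirement concrete, a Dockerfile.rust for this setup might look roughly like the sketch below. This is a hypothetical reconstruction under stated assumptions (base image, rustup install, a start.sh entrypoint), not the file actually committed in this PR.

```dockerfile
# Hypothetical sketch of Dockerfile.rust; the real file in this PR may differ.
FROM python:3.11-slim

# Build tools plus the Rust toolchain, needed at the time for ddtrace to compile
RUN apt-get update && apt-get install -y --no-install-recommends \
        build-essential curl \
    && curl -sSf https://sh.rustup.rs | sh -s -- -y \
    && rm -rf /var/lib/apt/lists/*
ENV PATH="/root/.cargo/bin:${PATH}"

WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .
EXPOSE 9099
CMD ["bash", "start.sh"]
```

The extra toolchain layer is exactly the overhead the follow-up comment below this one was able to drop once ddtrace no longer needed compilation.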

@0xThresh 0xThresh marked this pull request as ready for review June 16, 2024 18:18
@0xThresh (Contributor, Author)

DataDog updated ddtrace to include the LLM Observability capabilities without requiring the Rust install. I've tested this setup again, and it's good to go with ddtrace added to the requirements.txt file 🎉

@tjbck (Collaborator) commented Jun 17, 2024

Beautiful 🤩

@tjbck tjbck merged commit a6daafe into open-webui:main Jun 17, 2024
0xThresh pushed a commit to 0xThresh/open-webui-pipelines that referenced this pull request Jul 1, 2024
…lter

feat: DataDog LLM Observability Pipeline