
Commit bd9a4e3

lydiayou authored

docs: add haystack integration (langfuse#626)

Co-authored-by: Marc Klingen <git@marcklingen.com>
Co-authored-by: Clemens Rawert <clemens@langfuse.com>
1 parent a98330b commit bd9a4e3

File tree

17 files changed: +42590 −4 lines changed


components/Authors.tsx

Lines changed: 7 additions & 2 deletions

```diff
@@ -37,6 +37,11 @@ export const allAuthors = {
     image: "/images/people/marliesmayerhofer.jpg",
     twitter: "marliessophie",
   },
+  lydiayou: {
+    firstName: "Lydia",
+    name: "Lydia You",
+    image: "/images/people/lydiayou.jpg",
+  },
 } as const;

 export const Authors = (props: { authors: (keyof typeof allAuthors)[] }) => {
@@ -63,10 +68,10 @@ export const Author = (props: { author: string }) => {

   return (
     <a
-      href={`https://twitter.com/${author.twitter}`}
+      href={author.twitter ? `https://twitter.com/${author.twitter}` : "#"}
       className="group shrink-0"
       target="_blank"
-      key={author.twitter}
+      key={props.author}
       rel="noopener noreferrer"
     >
       <div className="flex items-center gap-4" key={author.name}>
```

cookbook/_routes.json

Lines changed: 4 additions & 0 deletions

```diff
@@ -27,6 +27,10 @@
     "notebook": "integration_llama-index.ipynb",
     "docsPath": "docs/integrations/llama-index/example-python"
   },
+  {
+    "notebook": "integration_haystack.ipynb",
+    "docsPath": "docs/integrations/haystack/example-python"
+  },
   {
     "notebook": "evaluation_with_langchain.ipynb",
     "docsPath": null
```

cookbook/integration_haystack.ipynb

Lines changed: 41762 additions & 0 deletions
Large diffs are not rendered by default.

next.config.mjs

Lines changed: 1 addition & 0 deletions

```diff
@@ -153,6 +153,7 @@ const nonPermanentRedirects = [
   ["/docs/integrations/llama-index", "/docs/integrations/llama-index/get-started"],
   ["/docs/integrations/llama-index/overview", "/docs/integrations/llama-index/get-started"],
   ["/docs/integrations/llama-index/cookbook", "/docs/integrations/llama-index/example-python"],
+  ["/docs/integrations/haystack", "/docs/integrations/haystack/get-started"],
   ["/docs/integrations/openai/get-started", "/docs/integrations/openai/python/get-started"],
   ["/docs/integrations/openai/examples", "/docs/integrations/openai/python/examples"],
   ["/docs/integrations/openai/track-errors", "/docs/integrations/openai/python/track-errors"],
```
Lines changed: 135 additions & 0 deletions (new file; full contents below)

---
title: Haystack <> Langfuse Integration
date: 2024/05/16
description: Easily monitor and trace your Haystack pipelines with this new Langfuse integration.
tag: integration
ogImage: /images/blog/haystack/haystack_og.png
author: Lydia
---

import { BlogHeader } from "@/components/blog/BlogHeader";

<BlogHeader
  title="Haystack <> Langfuse Integration"
  description="OSS observability and analytics for the popular RAG application framework."
  date="May 17, 2024"
  authors={["lydiayou"]}
/>

We're excited to highlight a new Langfuse integration with Haystack!

This integration allows you to easily trace your Haystack pipelines in the Langfuse UI. We've previously launched integrations with popular tools that devs love -- including [LlamaIndex](/blog/llama-index-integration), [LangChain](/docs/integrations/langchain/tracing) and [LiteLLM](/docs/integrations/litellm) -- and we're excited to continue that with Haystack.

Thanks to the team at deepset for developing the integration. We're excited to see how you use it!

## What's New

The `langfuse-haystack` package integrates tracing capabilities into Haystack (2.x) pipelines using Langfuse. You can then add `LangfuseConnector` as a tracer to automatically trace the operations and data flow within the pipeline.
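To follow along, install Haystack 2.x and the integration package. The post names `langfuse-haystack`; `haystack-ai` as the Haystack 2.x distribution name is an assumption based on PyPI:

```shell
pip install haystack-ai langfuse-haystack
```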
<CloudflareVideo
  videoId="36a42457d879923f84681f8fe62443e4"
  aspectRatio={16 / 10.26}
  title="Haystack Trace"
  gifStyle
/>

## What is Haystack?

[Haystack](https://haystack.deepset.ai/) is the open-source Python framework developed by deepset. Its modular design allows users to implement custom pipelines to build production-ready LLM applications, like retrieval-augmented generation (RAG) pipelines and state-of-the-art search systems. It integrates with Hugging Face Transformers, Elasticsearch, OpenSearch, OpenAI, Cohere, Anthropic and others, making it an extremely popular framework for teams of all sizes.

**RAG has proven to be a pragmatic and efficient way of working with LLMs**. The integration of custom data sources through RAG can significantly enhance the quality of an LLM's responses, improving user experience. Haystack is a lightweight and powerful tool for building data-augmented LLM applications; you can read more about their approach to the building blocks of pipelines [here](https://docs.haystack.deepset.ai/docs/intro).

Haystack recently introduced Haystack 2.0 with a new architecture and modular design. Building on top of Haystack makes applications easier to understand, maintain and extend.

## How Can Langfuse Help?

[Langfuse tracing](/docs/tracing) can be helpful for Haystack pipelines in the following ways:

- Capture comprehensive details of each execution trace in a beautiful UI dashboard
  - Latency
  - Token usage
  - Cost
  - Scores
- Capture the full context of the execution
- Monitor and score traces
- Build fine-tuning and testing datasets

Integrating Langfuse with a tool like Haystack can help you monitor model performance, pinpoint areas for improvement, and create datasets for fine-tuning and testing from your pipeline executions.
## Overview

<CloudflareVideo
  videoId="aaa30e674c281a9c9591af03b5f668d2"
  aspectRatio={16 / 10}
  title="Haystack Integration Overview"
/>

## Quickstart

Here are the steps to get started! First, add your API keys. You can find your Langfuse public and secret keys in your project settings. Make sure to set `HAYSTACK_CONTENT_TRACING_ENABLED` to `"True"`.

```python
import os

# Get keys for your project from the project settings page
# https://cloud.langfuse.com
os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-lf-..."
os.environ["LANGFUSE_SECRET_KEY"] = "sk-lf-..."
os.environ["LANGFUSE_HOST"] = "https://cloud.langfuse.com"  # 🇪🇺 EU region
# os.environ["LANGFUSE_HOST"] = "https://us.cloud.langfuse.com"  # 🇺🇸 US region
os.environ["HAYSTACK_CONTENT_TRACING_ENABLED"] = "True"

# Your OpenAI key
os.environ["OPENAI_API_KEY"] = "sk-proj-..."
```
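Because the tracer silently stays inactive when these variables are missing, a quick sanity check before building the pipeline can save a debugging round trip. This helper is not part of the integration -- just an illustrative sketch:

```python
import os

# Variables the quickstart above sets (this list mirrors the quickstart;
# it is not an official API of the integration)
REQUIRED_VARS = [
    "LANGFUSE_PUBLIC_KEY",
    "LANGFUSE_SECRET_KEY",
    "LANGFUSE_HOST",
    "HAYSTACK_CONTENT_TRACING_ENABLED",
]

def missing_env(required=REQUIRED_VARS):
    """Return the names of required variables that are unset or empty."""
    return [name for name in required if not os.environ.get(name)]
```

Calling `missing_env()` before constructing the pipeline lets you fail fast with a clear error instead of running a pipeline that produces no traces.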
Here's how you add `LangfuseConnector` as a tracer to a basic RAG pipeline. The document store, retriever, and prompt builder are defined inline here so the snippet is self-contained:

```python
from haystack import Pipeline
from haystack.components.builders import PromptBuilder
from haystack.components.embedders import SentenceTransformersTextEmbedder
from haystack.components.generators import OpenAIGenerator
from haystack.components.retrievers import InMemoryEmbeddingRetriever
from haystack.document_stores.in_memory import InMemoryDocumentStore
from haystack_integrations.components.connectors.langfuse import LangfuseConnector

# Components used by the pipeline below
document_store = InMemoryDocumentStore()
retriever = InMemoryEmbeddingRetriever(document_store=document_store)
prompt_builder = PromptBuilder(
    template="""Given the documents below, answer the question.
{% for document in documents %}{{ document.content }}
{% endfor %}
Question: {{ question }}
Answer:"""
)

basic_rag_pipeline = Pipeline()
# Add components to your pipeline; the "tracer" component reports every run to Langfuse
basic_rag_pipeline.add_component("tracer", LangfuseConnector("Basic RAG Pipeline"))
basic_rag_pipeline.add_component(
    "text_embedder", SentenceTransformersTextEmbedder(model="sentence-transformers/all-MiniLM-L6-v2")
)
basic_rag_pipeline.add_component("retriever", retriever)
basic_rag_pipeline.add_component("prompt_builder", prompt_builder)
basic_rag_pipeline.add_component("llm", OpenAIGenerator(model="gpt-3.5-turbo", generation_kwargs={"n": 2}))

# Connect the components so the query flows embedder -> retriever -> prompt -> LLM
basic_rag_pipeline.connect("text_embedder.embedding", "retriever.query_embedding")
basic_rag_pipeline.connect("retriever", "prompt_builder.documents")
basic_rag_pipeline.connect("prompt_builder", "llm")
```
For each trace, you can see:

- Latency for each component of the pipeline
- Input and output for each step
- Token usage and cost for each generation, calculated automatically

If you want to learn more about traces and what they can do in Langfuse, read our [documentation](/docs/tracing).

## Dive In

Head to the [Langfuse Docs](/docs/integrations/haystack/get-started) or see an example integration in this [end-to-end cookbook](/docs/integrations/haystack/example-python) to dive straight in.

import { FileCode, BookOpen, Video } from "lucide-react";

<Cards num={3}>
  <Card
    title="Docs"
    href="/docs/integrations/haystack/get-started"
    icon={<BookOpen />}
  />
  <Card
    title="Cookbook"
    href="/docs/integrations/haystack/example-python"
    icon={<FileCode />}
  />
  <Card title="Video" href="/guides/videos/haystack" icon={<Video />} />
</Cards>

pages/blog/showcase-llm-chatbot.mdx

Lines changed: 1 addition & 1 deletion

```diff
@@ -60,7 +60,7 @@ The fully integrated showcase is available on GitHub if you are interested: [lan

 ### Backend

-We demonstrate the intgeration via the Langfuse Typescript SDK in [app/api/chat/route.ts](https://github.com/langfuse/ai-chatbot/blob/main/app/api/chat/route.ts). This API route handles the sreaming response from OpenAI using the Vercel AI SDK and saves the chat history in Vercel KV.
+We demonstrate the integration via the Langfuse Typescript SDK in [app/api/chat/route.ts](https://github.com/langfuse/ai-chatbot/blob/main/app/api/chat/route.ts). This API route handles the streaming response from OpenAI using the Vercel AI SDK and saves the chat history in Vercel KV.

 _**Using Langchain?** Read the [Langchain Integration announcement](/blog/langchain-integration) to skip the details and integrate in seconds._
```

Lines changed: 45 additions & 0 deletions (new file; full contents below)

---
date: 2024-05-17
title: Haystack Integration
description: Easily monitor and trace your Haystack pipelines with this new Langfuse integration.
author: Lydia
showOgInHeader: false
ogImage: /images/blog/haystack/haystack_og.png
---

import { ChangelogHeader } from "@/components/changelog/ChangelogHeader";

<ChangelogHeader />

With the new Langfuse and Haystack integration, you can use [Langfuse Tracing](/docs/tracing) for your Haystack pipelines. You just need to install the `langfuse-haystack` package and enable the `HAYSTACK_CONTENT_TRACING_ENABLED` environment variable.
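As a minimal sketch of that setup (assuming a bash-like shell; `haystack-ai` as the Haystack 2.x package name is an assumption, and the variable must be set before the pipeline is built):

```shell
pip install haystack-ai langfuse-haystack
export HAYSTACK_CONTENT_TRACING_ENABLED=true
```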
<CloudflareVideo
  videoId="aaa30e674c281a9c9591af03b5f668d2"
  aspectRatio={16 / 10}
  title="Haystack Integration Overview"
  className="mt-10"
/>

This integration adds a tracing feature, giving insights into various aspects of pipeline execution, including latency, token usage, and cost, among others.

## Learn more

import { FileCode, BookOpen } from "lucide-react";

<Cards num={3}>
  <Card
    title="Docs"
    href="/docs/integrations/haystack/get-started"
    icon={<BookOpen />}
  />
  <Card
    title="Cookbook"
    href="/docs/integrations/haystack/example-python"
    icon={<FileCode />}
  />
  <Card
    title="Blog post"
    href="/blog/2024-05-haystack-integration"
    icon={<BookOpen />}
  />
</Cards>

pages/docs/integrations/_meta.json

Lines changed: 1 addition & 0 deletions

```diff
@@ -3,6 +3,7 @@
   "openai": "OpenAI SDK",
   "langchain": "Langchain",
   "llama-index": "LlamaIndex",
+  "haystack": "Haystack",
   "litellm": "LiteLLM",
   "instructor": "Instructor",
   "mirascope": "Mirascope",
```
Lines changed: 4 additions & 0 deletions (new file; full contents below)

```json
{
  "get-started": "Get Started",
  "example-python": "Example (Python)"
}
```
