---
layout: integration
name: Anthropic
description: Use Anthropic Models with Haystack
authors:
pypi:
repo:
type: Model Provider
report_issue:
logo: /logos/anthropic.png
version: Haystack 2.0
toc: true
---
This integration supports Anthropic Claude models provided through Anthropic's own inference infrastructure. For a full list of available models, check out the Anthropic Claude documentation.

You can use Anthropic models with `AnthropicGenerator` and `AnthropicChatGenerator`.
Currently, available models are:

- `claude-2.1`
- `claude-3-haiku-20240307`
- `claude-3-sonnet-20240229` (default)
- `claude-3-opus-20240229`
```bash
pip install anthropic-haystack
```
Based on your use case, you can choose between `AnthropicGenerator` and `AnthropicChatGenerator` to work with Anthropic models. To learn more about the difference, visit the Generators vs Chat Generators guide.

Before using, make sure to set the `ANTHROPIC_API_KEY` environment variable.
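As a quick orientation, here is a minimal sketch of both components side by side. It assumes the components' default behavior of reading the key from the `ANTHROPIC_API_KEY` environment variable and using the default Claude model when none is passed:

```python
import os

from haystack.dataclasses import ChatMessage
from haystack_integrations.components.generators.anthropic import (
    AnthropicChatGenerator,
    AnthropicGenerator,
)

# Illustration only: normally you would export ANTHROPIC_API_KEY in your shell.
os.environ["ANTHROPIC_API_KEY"] = "YOUR_ANTHROPIC_API_KEY"

# AnthropicGenerator takes a plain string prompt ...
generator = AnthropicGenerator()
print(generator.run("Summarize what Claude is in one sentence."))

# ... while AnthropicChatGenerator takes a list of ChatMessage objects.
chat_generator = AnthropicChatGenerator()
print(chat_generator.run([ChatMessage.from_user("Summarize what Claude is in one sentence.")]))
```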
Below is an example RAG pipeline where we answer a predefined question using content fetched from a URL pointing to the Anthropic prompt engineering guide. We fetch the contents of the URL and generate an answer with the `AnthropicChatGenerator`.
```python
from haystack import Pipeline
from haystack.components.builders import DynamicChatPromptBuilder
from haystack.components.converters import HTMLToDocument
from haystack.components.fetchers import LinkContentFetcher
from haystack.components.generators.utils import print_streaming_chunk
from haystack.dataclasses import ChatMessage
from haystack.utils import Secret

from haystack_integrations.components.generators.anthropic import AnthropicChatGenerator

messages = [
    ChatMessage.from_system("You are a prompt expert who answers questions based on the given documents."),
    ChatMessage.from_user("Here are the documents: {{documents}} \\n Answer: {{query}}"),
]

rag_pipeline = Pipeline()
rag_pipeline.add_component("fetcher", LinkContentFetcher())
rag_pipeline.add_component("converter", HTMLToDocument())
rag_pipeline.add_component("prompt_builder", DynamicChatPromptBuilder(runtime_variables=["documents"]))
rag_pipeline.add_component(
    "llm",
    AnthropicChatGenerator(
        api_key=Secret.from_env_var("ANTHROPIC_API_KEY"),
        model="claude-3-sonnet-20240229",
        streaming_callback=print_streaming_chunk,
    ),
)

rag_pipeline.connect("fetcher", "converter")
rag_pipeline.connect("converter", "prompt_builder")
rag_pipeline.connect("prompt_builder", "llm")

question = "What are the best practices in prompt engineering?"
rag_pipeline.run(
    data={
        "fetcher": {"urls": ["https://docs.anthropic.com/claude/docs/prompt-engineering"]},
        "prompt_builder": {"template_variables": {"query": question}, "prompt_source": messages},
    }
)
```
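The `run()` call returns a dictionary keyed by component name. Below is a minimal sketch of capturing the result and reading the generated `ChatMessage` out of it, assuming the chat generator's standard `replies` output:

```python
result = rag_pipeline.run(
    data={
        "fetcher": {"urls": ["https://docs.anthropic.com/claude/docs/prompt-engineering"]},
        "prompt_builder": {"template_variables": {"query": question}, "prompt_source": messages},
    }
)

# Chat generators return a list of ChatMessage objects under "replies".
print(result["llm"]["replies"][0].content)
```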
Below is an example of using `AnthropicGenerator`:
```python
from haystack_integrations.components.generators.anthropic import AnthropicGenerator

client = AnthropicGenerator(model="claude-2.1")
response = client.run("What's Natural Language Processing? Be brief.")
print(response)
```
```
>> {'replies': ['Natural language processing (NLP) is a branch of artificial intelligence focused on enabling
>> computers to understand, interpret, and manipulate human language. The goal of NLP is to read, decipher,
>> understand, and make sense of the human languages in a manner that is valuable.'], 'meta': {'model':
>> 'claude-2.1', 'index': 0, 'finish_reason': 'end_turn', 'usage': {'input_tokens': 18, 'output_tokens': 58}}}
```
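Like the chat example above, `AnthropicGenerator` can stream tokens as they are produced. Below is a minimal sketch, assuming it accepts the same `streaming_callback` parameter as `AnthropicChatGenerator`:

```python
from haystack.components.generators.utils import print_streaming_chunk
from haystack_integrations.components.generators.anthropic import AnthropicGenerator

streaming_client = AnthropicGenerator(
    model="claude-3-haiku-20240307",
    streaming_callback=print_streaming_chunk,  # prints each chunk to stdout as it arrives
)
streaming_client.run("What's Natural Language Processing? Be brief.")
```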
You can use Anthropic Claude in your Haystack 1.x pipelines with the PromptNode, which can also be used with an Agent.
```bash
pip install farm-haystack[inference]
```
You can use Anthropic models in various ways:

To use Claude for prompting and generating answers, initialize a `PromptNode` with the model name, your Anthropic API key, and a prompt template. You can then use this `PromptNode` in a question answering pipeline to generate answers based on the given context; see the pipeline sketch after the example below.

Below is an example of a `PromptNode` that uses a custom `PromptTemplate`:
```python
from haystack.nodes import PromptNode, PromptTemplate

prompt_text = """
Answer the following question.
Question: {query}
Answer:
"""
prompt_template = PromptTemplate(prompt=prompt_text)

prompt_node = PromptNode(
    model_name_or_path="claude-2",
    default_prompt_template=prompt_template,
    api_key="YOUR_ANTHROPIC_API_KEY",
    max_length=768,
    model_kwargs={"stream": True},
)
```
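You can then drop this node into a Haystack 1.x pipeline. Below is a minimal sketch, assuming the standard 1.x `Pipeline.add_node` API and that the `{query}` template variable is filled from the pipeline's `query` argument:

```python
from haystack import Pipeline

pipe = Pipeline()
pipe.add_node(component=prompt_node, name="prompt_node", inputs=["Query"])

# The {query} placeholder in the template is filled from the query passed at run time.
result = pipe.run(query="What are the best practices in prompt engineering?")
print(result["results"])
```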
To use Claude with an `Agent`, simply provide a `PromptNode` that uses Claude to the `Agent`:
```python
from haystack.agents import Agent
from haystack.nodes import PromptNode

prompt_node = PromptNode(
    model_name_or_path="claude-2",
    api_key="YOUR_ANTHROPIC_API_KEY",
    stop_words=["Observation:"],
)
agent = Agent(prompt_node=prompt_node)
```
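The `Agent` then uses Claude to decide which registered tools to call. Below is a minimal sketch of adding a tool and asking a question; it assumes the Haystack 1.x `Tool` API, and `search_pipeline` stands in for a question answering pipeline of yours (hypothetical name):

```python
from haystack.agents import Tool

agent.add_tool(
    Tool(
        name="DocumentSearch",
        pipeline_or_node=search_pipeline,  # hypothetical: any pipeline that answers questions over your documents
        description="Useful for answering questions about the indexed documents",
    )
)

result = agent.run("What are the best practices in prompt engineering?")
print(result["answers"])
```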