feat: Add ChatAnthropic integration (#477)
davidmigloz committed Jul 2, 2024
1 parent 8d92d9b commit 44c7faf
Showing 19 changed files with 1,418 additions and 6 deletions.
1 change: 1 addition & 0 deletions docs/_sidebar.md
@@ -56,6 +56,7 @@
  - [Tool calling](/modules/model_io/models/chat_models/how_to/tools.md)
  - [LLMChain](/modules/model_io/models/chat_models/how_to/llm_chain.md)
- Integrations
  - [Anthropic](/modules/model_io/models/chat_models/integrations/anthropic.md)
  - [OpenAI](/modules/model_io/models/chat_models/integrations/openai.md)
  - [Firebase Vertex AI](/modules/model_io/models/chat_models/integrations/firebase_vertex_ai.md)
  - [GCP Vertex AI](/modules/model_io/models/chat_models/integrations/gcp_vertex_ai.md)
1 change: 1 addition & 0 deletions docs/modules/model_io/models/chat_models/how_to/tools.md
@@ -3,6 +3,7 @@
> We use the term "tool calling" interchangeably with "function calling". Although function calling is sometimes meant to refer to invocations of a single function, we treat all models as though they can return multiple tool or function calls in each message.
> Tool calling is currently supported by:
> - [`ChatAnthropic`](/modules/model_io/models/chat_models/integrations/anthropic.md)
> - [`ChatOpenAI`](/modules/model_io/models/chat_models/integrations/openai.md)
> - [`ChatFirebaseVertexAI`](/modules/model_io/models/chat_models/integrations/firebase_vertex_ai.md)
> - [`ChatGoogleGenerativeAI`](/modules/model_io/models/chat_models/integrations/googleai.md)
145 changes: 145 additions & 0 deletions docs/modules/model_io/models/chat_models/integrations/anthropic.md
@@ -0,0 +1,145 @@
# ChatAnthropic

Wrapper around [Anthropic Messages API](https://docs.anthropic.com/en/api/messages) (aka Claude API).

## Setup

The Anthropic API uses API keys for authentication. Visit your [API Keys](https://console.anthropic.com/settings/keys) page to retrieve the API key you'll use in your requests.

The following models are available:
- `claude-3-5-sonnet-20240620`
- `claude-3-haiku-20240307`
- `claude-3-opus-20240229`
- `claude-3-sonnet-20240229`
- `claude-2.0`
- `claude-2.1`

Note that this list may not be up to date. See https://docs.anthropic.com/en/docs/about-claude/models for the latest available models.
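
A minimal sketch to verify that your key and chosen model work (assuming the `ANTHROPIC_API_KEY` environment variable is set and `langchain_anthropic`/`langchain_core` are added as dependencies; the prompt text is arbitrary):

```dart
import 'dart:io';

import 'package:langchain_anthropic/langchain_anthropic.dart';
import 'package:langchain_core/prompts.dart';

void main() async {
  // Read the API key from the environment (assumed to be set beforehand).
  final apiKey = Platform.environment['ANTHROPIC_API_KEY'];
  final chatModel = ChatAnthropic(
    apiKey: apiKey,
    defaultOptions: const ChatAnthropicOptions(
      model: 'claude-3-5-sonnet-20240620',
    ),
  );
  final res = await chatModel.invoke(PromptValue.string('Hello!'));
  print(res.output.content);
  chatModel.close(); // Release the underlying HTTP client when done.
}
```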

## Usage

```dart
final apiKey = Platform.environment['ANTHROPIC_API_KEY'];
final chatModel = ChatAnthropic(
  apiKey: apiKey,
  defaultOptions: ChatAnthropicOptions(
    model: 'claude-3-5-sonnet-20240620',
    temperature: 0,
  ),
);
final chatPrompt = ChatPromptTemplate.fromTemplates([
  (ChatMessageType.system, 'You are a helpful assistant that translates {input_language} to {output_language}.'),
  (ChatMessageType.human, 'Text to translate:\n{text}'),
]);
final chain = chatPrompt | chatModel | StringOutputParser();
final res = await chain.invoke({
  'input_language': 'English',
  'output_language': 'French',
  'text': 'I love programming.',
});
print(res);
// -> 'J'adore programmer.'
```

## Multimodal support

```dart
final apiKey = Platform.environment['ANTHROPIC_API_KEY'];
final chatModel = ChatAnthropic(
  apiKey: apiKey,
  defaultOptions: ChatAnthropicOptions(
    model: 'claude-3-5-sonnet-20240620',
    temperature: 0,
  ),
);
final res = await chatModel.invoke(
  PromptValue.chat([
    ChatMessage.human(
      ChatMessageContent.multiModal([
        ChatMessageContent.text('What fruit is this?'),
        ChatMessageContent.image(
          mimeType: 'image/jpeg',
          data: base64.encode(
            await File('./bin/assets/apple.jpeg').readAsBytes(),
          ),
        ),
      ]),
    ),
  ]),
);
print(res.output.content);
// -> 'The fruit in the image is an apple.'
```

## Streaming

```dart
final apiKey = Platform.environment['ANTHROPIC_API_KEY'];
final promptTemplate = ChatPromptTemplate.fromTemplates([
  (ChatMessageType.system, 'You are a helpful assistant that replies only with numbers in order without any spaces or commas.'),
  (ChatMessageType.human, 'List the numbers from 1 to {max_num}'),
]);
final chatModel = ChatAnthropic(
  apiKey: apiKey,
  defaultOptions: ChatAnthropicOptions(
    model: 'claude-3-5-sonnet-20240620',
    temperature: 0,
  ),
);
final chain = promptTemplate.pipe(chatModel).pipe(const StringOutputParser());
final stream = chain.stream({'max_num': '30'});
await stream.forEach(print);
// 123
// 456789101
// 112131415161
// 718192021222
// 324252627282
// 930
```

## Tool calling

`ChatAnthropic` supports tool calling.

Check the [docs](https://langchaindart.dev/#/modules/model_io/models/chat_models/how_to/tools) for more information on how to use tools.

Example:
```dart
const tool = ToolSpec(
  name: 'get_current_weather',
  description: 'Get the current weather in a given location',
  inputJsonSchema: {
    'type': 'object',
    'properties': {
      'location': {
        'type': 'string',
        'description': 'The city and state, e.g. San Francisco, CA',
      },
    },
    'required': ['location'],
  },
);
final apiKey = Platform.environment['ANTHROPIC_API_KEY'];
final chatModel = ChatAnthropic(
  apiKey: apiKey,
  defaultOptions: ChatAnthropicOptions(
    model: 'claude-3-5-sonnet-20240620',
    temperature: 0,
    tools: [tool],
  ),
);
final res = await chatModel.invoke(
  PromptValue.string('What’s the weather like in Boston and Madrid right now in celsius?'),
);
```
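
The snippet above only sends the request; a minimal sketch of inspecting the returned tool calls follows (assuming the `toolCalls` getter that `langchain_core` exposes on the `AIChatMessage` output, with `name` and `arguments` fields on each call):

```dart
// Hypothetical continuation of the example above: list the tool calls the
// model decided to make (assumes `res` holds the ChatResult from `invoke`).
for (final toolCall in res.output.toolCalls) {
  print(toolCall.name);      // e.g. get_current_weather
  print(toolCall.arguments); // e.g. {location: Boston, MA}
}
```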
@@ -0,0 +1,109 @@
// ignore_for_file: avoid_print
import 'dart:convert';
import 'dart:io';

import 'package:langchain/langchain.dart';
import 'package:langchain_anthropic/langchain_anthropic.dart';

void main(final List<String> arguments) async {
  await _invokeModel();
  await _multiModal();
  await _streaming();
}

Future<void> _invokeModel() async {
  final apiKey = Platform.environment['ANTHROPIC_API_KEY'];

  final chatModel = ChatAnthropic(
    apiKey: apiKey,
    defaultOptions: const ChatAnthropicOptions(
      model: 'claude-3-5-sonnet-20240620',
      temperature: 0,
    ),
  );

  final chatPrompt = ChatPromptTemplate.fromTemplates(const [
    (
      ChatMessageType.system,
      'You are a helpful assistant that translates {input_language} to {output_language}.'
    ),
    (ChatMessageType.human, 'Text to translate:\n{text}'),
  ]);

  final chain = chatPrompt | chatModel | const StringOutputParser();

  final res = await chain.invoke({
    'input_language': 'English',
    'output_language': 'French',
    'text': 'I love programming.',
  });
  print(res);
  // -> 'J'adore programmer.'

  chatModel.close();
}

Future<void> _multiModal() async {
  final apiKey = Platform.environment['ANTHROPIC_API_KEY'];

  final chatModel = ChatAnthropic(
    apiKey: apiKey,
    defaultOptions: const ChatAnthropicOptions(
      model: 'claude-3-5-sonnet-20240620',
      temperature: 0,
    ),
  );
  final res = await chatModel.invoke(
    PromptValue.chat([
      ChatMessage.human(
        ChatMessageContent.multiModal([
          ChatMessageContent.text('What fruit is this?'),
          ChatMessageContent.image(
            mimeType: 'image/jpeg',
            data: base64.encode(
              await File('./bin/assets/apple.jpeg').readAsBytes(),
            ),
          ),
        ]),
      ),
    ]),
  );
  print(res.output.content);
  // -> 'The fruit in the image is an apple.'

  chatModel.close();
}

Future<void> _streaming() async {
  final apiKey = Platform.environment['ANTHROPIC_API_KEY'];

  final promptTemplate = ChatPromptTemplate.fromTemplates(const [
    (
      ChatMessageType.system,
      'You are a helpful assistant that replies only with numbers '
      'in order without any spaces or commas.',
    ),
    (ChatMessageType.human, 'List the numbers from 1 to {max_num}'),
  ]);

  final chatModel = ChatAnthropic(
    apiKey: apiKey,
    defaultOptions: const ChatAnthropicOptions(
      model: 'claude-3-5-sonnet-20240620',
      temperature: 0,
    ),
  );

  final chain = promptTemplate.pipe(chatModel).pipe(const StringOutputParser());

  final stream = chain.stream({'max_num': '30'});
  await stream.forEach(print);
  // 123
  // 456789101
  // 112131415161
  // 718192021222
  // 324252627282
  // 930

  chatModel.close();
}
14 changes: 14 additions & 0 deletions examples/docs_examples/pubspec.lock
@@ -9,6 +9,13 @@ packages:
      url: "https://pub.dev"
    source: hosted
    version: "1.0.6"
  anthropic_sdk_dart:
    dependency: "direct overridden"
    description:
      path: "../../packages/anthropic_sdk_dart"
      relative: true
    source: path
    version: "0.0.1"
  args:
    dependency: transitive
    description:
@@ -239,6 +246,13 @@
      relative: true
    source: path
    version: "0.7.2"
  langchain_anthropic:
    dependency: "direct main"
    description:
      path: "../../packages/langchain_anthropic"
      relative: true
    source: path
    version: "0.0.1-dev.1"
  langchain_chroma:
    dependency: "direct main"
    description:
1 change: 1 addition & 0 deletions examples/docs_examples/pubspec.yaml
@@ -8,6 +8,7 @@ environment:

dependencies:
  langchain: ^0.7.2
  langchain_anthropic: ^0.0.1-dev.1
  langchain_chroma: ^0.2.0+5
  langchain_community: 0.2.1+1
  langchain_google: ^0.5.1
6 changes: 5 additions & 1 deletion examples/docs_examples/pubspec_overrides.yaml
@@ -1,9 +1,13 @@
# melos_managed_dependency_overrides: chromadb,langchain,langchain_chroma,langchain_google,langchain_mistralai,langchain_ollama,langchain_openai,mistralai_dart,ollama_dart,openai_dart,vertex_ai,langchain_core,langchain_community,tavily_dart
# melos_managed_dependency_overrides: chromadb,langchain,langchain_chroma,langchain_google,langchain_mistralai,langchain_ollama,langchain_openai,mistralai_dart,ollama_dart,openai_dart,vertex_ai,langchain_core,langchain_community,tavily_dart,anthropic_sdk_dart,langchain_anthropic
dependency_overrides:
  anthropic_sdk_dart:
    path: ../../packages/anthropic_sdk_dart
  chromadb:
    path: ../../packages/chromadb
  langchain:
    path: ../../packages/langchain
  langchain_anthropic:
    path: ../../packages/langchain_anthropic
  langchain_chroma:
    path: ../../packages/langchain_chroma
  langchain_community:
1 change: 1 addition & 0 deletions packages/anthropic_sdk_dart/README.md
@@ -23,6 +23,7 @@ Unofficial Dart client for [Anthropic](https://docs.anthropic.com/en/api) API (a
- [Usage](#usage)
  * [Authentication](#authentication)
  * [Messages](#messages)
  * [Tool use](#tool-use)
- [Advance Usage](#advance-usage)
  * [Default HTTP client](#default-http-client)
  * [Custom HTTP client](#custom-http-client)
@@ -1,3 +1,41 @@
void main() {
// TODO
// ignore_for_file: avoid_print, unused_element
import 'dart:io';

import 'package:langchain_anthropic/langchain_anthropic.dart';
import 'package:langchain_core/chat_models.dart';
import 'package:langchain_core/prompts.dart';

/// Check the docs for more examples:
/// https://langchaindart.dev
void main() async {
  // Uncomment the example you want to run:
  await _example1();
  // await _example2();
}

/// The most basic example of LangChain is calling a model on some input
Future<void> _example1() async {
  final apiKey = Platform.environment['ANTHROPIC_API_KEY'];
  final llm = ChatAnthropic(
    apiKey: apiKey,
    defaultOptions: const ChatAnthropicOptions(temperature: 1),
  );
  final ChatResult res = await llm.invoke(
    PromptValue.string('Tell me a joke'),
  );
  print(res);
}

/// Instead of waiting for the full response from the model, you can stream it
/// while it's being generated
Future<void> _example2() async {
  final apiKey = Platform.environment['ANTHROPIC_API_KEY'];
  final llm = ChatAnthropic(
    apiKey: apiKey,
    defaultOptions: const ChatAnthropicOptions(temperature: 1),
  );
  final Stream<ChatResult> stream = llm.stream(
    PromptValue.string('Tell me a joke'),
  );
  await stream.forEach((final chunk) => stdout.write(chunk.output.content));
}
2 changes: 2 additions & 0 deletions packages/langchain_anthropic/lib/langchain_anthropic.dart
@@ -1,2 +1,4 @@
/// Anthropic module for LangChain.dart.
library;

export 'src/chat_models/chat_models.dart';