anthropic-rs
is an unofficial Rust library for interacting with the Anthropic REST API, with async support.
Wanna play with Claude in Rust? This is the place to be!
// NOTE: module paths are indicative; check the crate documentation for the exact `use` statements.
use anthropic::client::Client;
use anthropic::config::AnthropicConfig;
use anthropic::types::CompleteRequestBuilder;
use anthropic::{AI_PROMPT, HUMAN_PROMPT};
use dotenv::dotenv;
use std::error::Error;

#[tokio::main]
async fn main() -> Result<(), Box<dyn Error>> {
    // Load the environment variables from the .env file.
    dotenv().ok();

    // Build the client from configuration (reads ANTHROPIC_API_KEY and friends).
    let cfg = AnthropicConfig::new()?;
    let client = Client::try_from(cfg)?;

    // Build a completion request.
    let complete_request = CompleteRequestBuilder::default()
        .prompt(format!("{HUMAN_PROMPT}How many toes do dogs have?{AI_PROMPT}"))
        .model("claude-instant-1".to_string())
        .stream_response(false)
        .stop_sequences(vec![HUMAN_PROMPT.to_string()])
        .build()?;

    // Send the completion request.
    let complete_response = client.complete(complete_request).await?;
    println!("completion response: {complete_response:?}");

    Ok(())
}
You can find full working examples in the examples directory.
anthropic-rs uses dotenv to automatically load environment variables from a .env file. You can also set these variables manually in your environment. Here is an example of the configuration variables used:
ANTHROPIC_API_KEY="..."
ANTHROPIC_DEFAULT_MODEL="claude-v1"
Replace the "..." with your actual token and preferences.
You can also set these variables manually when you create a new Client instance; see the usage section for more details.
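If you prefer not to use a .env file at all, you can export the variables from code before building the configuration. The sketch below is only an illustration: it assumes that AnthropicConfig::new() reads ANTHROPIC_API_KEY and ANTHROPIC_DEFAULT_MODEL from the process environment, as the dotenv flow above implies.

use anthropic::client::Client;
use anthropic::config::AnthropicConfig;
use std::error::Error;

fn build_client() -> Result<Client, Box<dyn Error>> {
    // Set the variables in the process environment instead of loading a .env file.
    // Replace "..." with your real API key.
    std::env::set_var("ANTHROPIC_API_KEY", "...");
    std::env::set_var("ANTHROPIC_DEFAULT_MODEL", "claude-v1");

    // AnthropicConfig::new() is assumed to pick these values up from the environment.
    let cfg = AnthropicConfig::new()?;
    let client = Client::try_from(cfg)?;
    Ok(client)
}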
- Completion (/v1/complete)
- Manage stream mode (see the streaming sketch below)
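To illustrate stream mode, here is a sketch that builds the same kind of request with stream_response(true). It assumes the builder produces a CompleteRequest type (the usual builder convention); sending the request and consuming the streamed chunks depends on the crate's streaming API and is not shown here.

use anthropic::types::{CompleteRequest, CompleteRequestBuilder};
use anthropic::{AI_PROMPT, HUMAN_PROMPT};
use std::error::Error;

fn build_streaming_request() -> Result<CompleteRequest, Box<dyn Error>> {
    let request = CompleteRequestBuilder::default()
        .prompt(format!("{HUMAN_PROMPT}Tell me a short story.{AI_PROMPT}"))
        .model("claude-instant-1".to_string())
        // Ask the API to stream the completion back as it is generated.
        .stream_response(true)
        .stop_sequences(vec![HUMAN_PROMPT.to_string()])
        .build()?;
    Ok(request)
}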
Contributions to anthropic-rs
are welcome! Feel free to submit a pull request or create an issue.
anthropic-rs is licensed under the MIT License.
- The Anthropic API reference, for its clear and concise documentation.
- The architecture of the SDK is inspired by async-openai, an asynchronous Rust library for the OpenAI API; many thanks to its creators for their excellent work. We see real value in a standardized interface for interacting with different AI providers' APIs: for example, it would make it easy to build versatile wrappers that work seamlessly across providers.