An open-source Rust library for building AI-powered applications, inspired by the Vercel AI SDK. It provides a type-safe interface for interacting with Large Language Models (LLMs).
⚠️ Early Stage Warning: This project is in very early development and not ready for production use. APIs may change significantly, and features are limited. Use at your own risk.
- OpenAI Provider Support: Initial support for OpenAI models with text generation and streaming.
- Type-Safe API: Built with Rust's type system for reliability.
- Asynchronous: Uses Tokio for async operations.
- Prompt Templating: Filesystem-based prompts using Tera templates (coming soon).
Add `aisdk` to your `Cargo.toml`:
```toml
[dependencies]
aisdk = "0.1.0"
```
Enable the OpenAI feature:

```toml
aisdk = { version = "0.1.0", features = ["openai"] }
```
Generate text with the default provider settings:

```rust
use aisdk::{
    core::LanguageModelRequest,
    providers::openai::OpenAI,
};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // With default OpenAI provider settings
    let openai = OpenAI::new("gpt-5");
    let result = LanguageModelRequest::builder()
        .model(openai)
        .prompt("hello world")
        .build()
        .generate_text()
        .await;
    println!("{}", result.text);
    Ok(())
}
```
To stream the response instead:

```rust
use aisdk::{
    core::LanguageModelRequest,
    providers::openai::OpenAI,
};
use futures::StreamExt;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // With custom OpenAI provider settings
    let openai = OpenAI::builder()
        .api_key("your-api-key")
        .model_name("gpt-4o")
        .build()?;
    let mut stream = LanguageModelRequest::builder()
        .model(openai)
        .prompt("Count from 1 to 10.")
        .build()
        .stream_text()
        .await?;
    while let Some(chunk) = stream.stream.next().await {
        print!("{}", chunk.text);
    }
    Ok(())
}
```
| Model/Input | Max Tokens | Temperature | Top P | Top K | Stop | Seed |
|---|---|---|---|---|---|---|
| OpenAI | ✅ | ✅ | ✅ | NA | ✅ | NA[^1] |
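As a rough sketch of how these inputs might be set on a request, the example below mirrors the text-generation example above. The setter names (`max_tokens`, `temperature`, `top_p`) are illustrative assumptions, not confirmed `aisdk` API; check the crate docs for the exact builder methods.

```rust
use aisdk::{
    core::LanguageModelRequest,
    providers::openai::OpenAI,
};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let openai = OpenAI::new("gpt-4o");
    let result = LanguageModelRequest::builder()
        .model(openai)
        .prompt("Write a haiku about Rust.")
        .max_tokens(256)   // hypothetical setter name
        .temperature(0.7)  // hypothetical setter name
        .top_p(0.9)        // hypothetical setter name
        .build()
        .generate_text()
        .await;
    println!("{}", result.text);
    Ok(())
}
```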
You can define a tool with the `#[tool]` attribute macro from `aisdk::core::tool`:
```rust
use aisdk::core::tool;

#[tool]
/// Returns the username
fn get_username(id: String) {
    // Your code here
}
```
A tool has a name, a description, an input, and a body. The first three can be inferred from a standard Rust function. The name is the function name, `get_username` in the example above. The description is inferred from the function's doc comments, so `/// Returns the username` becomes this tool's description; make sure to use a verbose, language-model-friendly description in the comments. The input is built from the function arguments and converted to a JSON schema using `schemars`, so make sure any type you use in the function arguments derives `JsonSchema` (see the sketch below). Anything you implement in the function body is executed on the language model's request and is thread safe.
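For illustration, here is a minimal sketch of a tool whose argument is a custom type. The struct, its fields, and the `serde::Deserialize` derive are assumptions for this example; the `schemars::JsonSchema` derive is the part the schema generation requires.

```rust
use aisdk::core::tool;
use schemars::JsonSchema;
use serde::Deserialize;

/// Custom argument types must derive `JsonSchema` so an input schema
/// can be generated; `Deserialize` is assumed here so the model's JSON
/// arguments can be decoded into the struct.
#[derive(JsonSchema, Deserialize)]
struct UserQuery {
    /// The numeric id of the user to look up.
    id: u64,
    /// Whether to include the user's email in the result.
    include_email: bool,
}

#[tool]
/// Looks up a user by id and returns a short profile summary.
/// Verbose, model-friendly doc comments like this one become the
/// tool's description.
fn get_user_profile(query: UserQuery) {
    // Your lookup logic here; this runs when the model calls the tool.
}
```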
The name and description can be overridden with the `#[tool(name = ..., desc = ...)]` attribute arguments:
```rust
#[tool(
    name = "the-name-for-this-tool",
    desc = "the-description-for-this-tool"
)]
fn get_username(id: String) {
    // Your code here
}
```
The `./prompts` directory contains various example prompt files that demonstrate the capabilities of the `aisdk` prompt templating system, powered by the `tera` engine. These examples showcase features like variable substitution, conditionals, loops, and template inclusion, simulating common AI prompt constructions.
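As a rough sketch of what these templates build on, here is the `tera` crate rendering a prompt with variable substitution, a conditional, and a loop. The template content and variable names are illustrative, not files shipped in `./prompts`:

```rust
use tera::{Context, Tera};

fn main() -> Result<(), tera::Error> {
    // An illustrative in-memory template; real templates would live
    // as files under ./prompts.
    let mut tera = Tera::default();
    tera.add_raw_template(
        "summarize.txt",
        "Summarize the following {{ doc_type }}.\n\
         {% if audience %}Write for {{ audience }}.{% endif %}\n\
         Key points to cover:\n\
         {% for point in points %}- {{ point }}\n{% endfor %}",
    )?;

    let mut ctx = Context::new();
    ctx.insert("doc_type", "changelog");
    ctx.insert("audience", "new users");
    ctx.insert("points", &vec!["breaking changes", "new features"]);

    let prompt = tera.render("summarize.txt", &ctx)?;
    println!("{prompt}");
    Ok(())
}
```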
- Rust: Core language.
- Tokio: Async runtime.
- Tera: Template engine for prompts.
- async-openai: OpenAI API client.
We welcome contributions! Please see CONTRIBUTING.md for guidelines.
Licensed under the MIT License. See LICENSE for details.
[^1]: Seed is deprecated in the newer Responses API, so it is not supported for OpenAI.