AISDK


An open-source Rust library for building AI-powered applications, inspired by the Vercel AI SDK. It provides a type-safe interface for interacting with Large Language Models (LLMs).

⚠️ Early Stage Warning: This project is in very early development and not ready for production use. APIs may change significantly, and features are limited. Use at your own risk.

Key Features

  • OpenAI Provider Support: Initial support for OpenAI models with text generation and streaming.
  • Type-Safe API: Built with Rust's type system for reliability.
  • Asynchronous: Uses Tokio for async operations.
  • Prompt Templating: Filesystem-based prompts using Tera templates (coming soon).

Installation

Add aisdk to your Cargo.toml:

```toml
[dependencies]
aisdk = "0.1.0"
```

To use the OpenAI provider, enable the `openai` feature:

```toml
aisdk = { version = "0.1.0", features = ["openai"] }
```

Usage

Basic Text Generation

```rust
use aisdk::{
    core::{GenerateTextCallOptions, generate_text},
    providers::openai::{OpenAI, OpenAIProviderSettings},
};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let settings = OpenAIProviderSettings::builder()
        .api_key("your-api-key".to_string())
        .model_name("gpt-4o".to_string())
        .build()?;

    let openai = OpenAI::new(settings);

    let options = GenerateTextCallOptions::builder()
        .prompt("Say hello.")
        .build()?;

    let result = generate_text(openai, options).await?;
    println!("{}", result.text);
    Ok(())
}
```

Streaming Text Generation

```rust
use aisdk::{
    core::{GenerateTextCallOptions, generate_stream},
    providers::openai::{OpenAI, OpenAIProviderSettings},
};
use futures::StreamExt;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let settings = OpenAIProviderSettings::builder()
        .api_key("your-api-key".to_string())
        .model_name("gpt-4o".to_string())
        .build()?;

    let openai = OpenAI::new(settings);

    let options = GenerateTextCallOptions::builder()
        .prompt("Count from 1 to 10.")
        .build()?;

    let mut stream = generate_stream(openai, options).await?;
    while let Some(chunk) = stream.stream.next().await {
        print!("{}", chunk.text);
    }
    Ok(())
}
```

Providers

  • ✅: Yes
  • NA: Not Applicable

| Model/Input | Max Tokens | Temperature | Top P | Top K | Stop |
| ----------- | ---------- | ----------- | ----- | ----- | ---- |
| OpenAI      | NA         | NA          | NA    | NA    | NA   |

Prompts

The ./prompts directory contains various example prompt files that demonstrate the capabilities of the aisdk prompt templating system, powered by the Tera engine. These examples showcase features such as variable substitution, conditionals, loops, and template inclusion, simulating common AI prompt constructions.
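The features listed above might appear together in a template like the following sketch, written in standard Tera syntax. The variable names are illustrative, not taken from the repository's prompt files:

```jinja2
You are {{ assistant_name }}, a helpful assistant.
{% if tone %}Respond in a {{ tone }} tone.{% endif %}
Follow these rules:
{% for rule in rules %}
- {{ rule }}
{% endfor %}
```

At render time, Tera substitutes `assistant_name` and `tone` from the supplied context and expands the loop once per entry in `rules`.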

Technologies Used

  • Rust: Core language.
  • Tokio: Async runtime.
  • Tera: Template engine for prompts.
  • async-openai: OpenAI API client.

Contributing

We welcome contributions! Please see CONTRIBUTING.md for guidelines.

License

Licensed under the MIT License. See LICENSE for details.
