The last LLM tool you'll need.
LLM agents typically get dozens of specialized tools:
- A calculator for arithmetic
- A date formatter for timestamps
- A string manipulator for text operations
- A JSON parser, a base64 encoder, a hash generator...
Each tool requires API design, documentation, and testing. Tools don't compose well. And you're always limited by what tools you thought to create.
onetool provides one universal computation tool powered by a sandboxed Lua runtime.
Instead of hunting for the right tool, your LLM can solve problems programmatically. State persists between calls for multi-step reasoning. It's safe by design with comprehensive sandboxing. And it integrates seamlessly with major LLM libraries.
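The pitch above can be made concrete with a short snippet of the kind an LLM might send in a single tool call. The specific values are illustrative; every function used is plain Lua 5.4 from the sandboxed standard library:

```lua
-- One call replaces several single-purpose tools:
total = 0
for _, n in ipairs({12, 7, 23}) do total = total + n end  -- the "calculator"
stamp = os.date("!%Y-%m-%d", 0)                           -- the "date formatter"
shout = string.upper("hello")                             -- the "string manipulator"
print(total, stamp, shout)                                -- 42  1970-01-01  HELLO
```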
onetool provides adapters for popular Rust LLM frameworks:
- genai - Multi-provider LLM client (OpenAI, Google, Anthropic)
- mistral.rs - Fast local model inference
- rig - Modular LLM application framework
- aisdk - Rust port of Vercel's AI SDK
See Framework Integration for usage details.
use onetool::Repl;
// Create the sandboxed Lua runtime
let repl = Repl::new()?;
// Execute Lua code
let response = repl.eval("return 2 + 2")?;
// Access results
println!("Result: {}", response.result.unwrap().join("\n")); // "4"
println!("Output: {}", response.output.join("\n")); // (print() output)
The REPL maintains state between calls, so variables and functions persist:
repl.eval("x = 10")?;
repl.eval("y = 20")?;
let result = repl.eval("return x + y")?; // "30"
onetool provides ready-to-use adapters for popular LLM frameworks:
- genai - `LuaRepl::new(&repl)` with `definition()` and `call()` methods
- mistralrs - `LuaRepl::new(repl)` with `definition()` and `call()` methods
- rig - `LuaRepl::new(repl)` implements the `Tool` trait
- aisdk - `LuaRepl::new(repl)` with a `.tool()` method
Each adapter handles tool definition registration and execution for its framework. See Framework Integration for detailed usage.
Here's an actual interaction from the included example:
User: "What's the sum of the 10 first prime numbers?"
LLM calls lua_repl with:
{
"source_code": "
local primes = {}
local num = 2
while #primes < 10 do
local is_prime = true
for i = 2, math.sqrt(num) do
if num % i == 0 then
is_prime = false
break
end
end
if is_prime then
table.insert(primes, num)
end
num = num + 1
end
local sum = 0
for _, p in ipairs(primes) do
sum = sum + p
end
return sum
"
}
Response: {
"result": "129",
"output": ""
}
LLM: "The sum of the first 10 prime numbers is 129."
The LLM wrote a complete algorithm, executed it safely, and got the answer - all without needing a specialized "prime number calculator" tool.
Feature flag: genai
The genai adapter provides seamless integration with the genai multi-provider LLM client.
Key Methods:
- `LuaRepl::new(&repl)` - Creates the adapter
- `.definition()` - Returns `genai::chat::Tool` for registration
- `.call(&tool_call)` - Executes the tool call and returns a `ToolResponse`
Example:
use onetool::{Repl, genai::LuaRepl};
let repl = Repl::new()?;
let lua_repl = LuaRepl::new(&repl);
// Register with genai client
let chat_req = genai::chat::ChatRequest::new(messages)
.with_tools(vec![lua_repl.definition()]);
// Execute tool calls
let tool_response = lua_repl.call(&tool_calls[0]);
Full example: examples/genai-basic.rs
Feature flag: mistralrs
The mistralrs adapter integrates with mistral.rs for fast local model inference.
Key Methods:
- `LuaRepl::new(repl)` - Creates the adapter
- `.definition()` - Returns `mistralrs::Tool` for registration
- `.call(&tool_call)` - Executes the tool call and returns the result string
Example:
use onetool::{Repl, mistralrs::LuaRepl};
let repl = Repl::new()?;
let lua_repl = LuaRepl::new(repl);
// Register with mistralrs model
let messages = RequestBuilder::new()
.add_message(TextMessageRole::User, "Calculate something")
.set_tools(vec![lua_repl.definition()]);
// Execute tool calls
let result = lua_repl.call(&tool_calls[0]);
Full example: examples/mistralrs-basic.rs
Feature flag: rig
The rig adapter implements the Tool trait from rig-core.
Key Methods:
- `LuaRepl::new(repl)` - Creates the tool (implements the `Tool` trait)
Example:
use onetool::{Repl, rig::LuaRepl};
let repl = Repl::new()?;
let lua_tool = LuaRepl::new(repl);
// Use with rig agents
let agent = client
.agent(model)
.tool(lua_tool)
.build();
Full example: examples/rig-basic.rs
Feature flag: aisdk
The aisdk adapter provides integration with aisdk.
Key Methods:
- `LuaRepl::new(repl)` - Creates the adapter
- `.tool()` - Returns a tool function for use with aisdk
Example:
use onetool::{Repl, aisdk::LuaRepl};
let repl = Repl::new()?;
let lua_repl = LuaRepl::new(repl);
// Use with aisdk
let result = LanguageModelRequest::builder()
.model(OpenAI::gpt_4o())
.prompt("Calculate something")
.with_tool(lua_repl.tool())
.build()
.generate_text()
.await?;
Full example: examples/aisdk-basic.rs
onetool includes a complete tool definition system that works with any LLM framework:
use onetool::tool_definition;
// Tool metadata
tool_definition::NAME // "lua_repl"
tool_definition::DESCRIPTION // Full description for LLM context
tool_definition::PARAM_SOURCE_CODE // "source_code"
// JSON Schema (framework-agnostic)
let schema = tool_definition::json_schema();
Framework-specific helpers:
// genai (requires "genai" feature)
let tool = tool_definition::genai_tool();
// For mistralrs, rig, aisdk: use the adapter's .definition() method
// See Framework Integration section above
Compatible with:
- OpenAI function calling
- Google Gemini function calling
- Anthropic tool use
- Any JSON Schema-based tool system
- Sandboxed Lua 5.4 runtime - Dangerous operations are blocked at the language level
Available:
- String manipulation (`string.*`)
- Table operations (`table.*`)
- Math functions (`math.*`)
- UTF-8 support (`utf8.*`)
- Safe OS functions (`os.time`, `os.date`)
- All Lua control flow and data structures
Blocked:
- File I/O (`io`, `file`)
- Network access
- Code loading (`require`, `dofile`, `load*`)
- OS commands (`os.execute`, `os.getenv`, etc.)
- Metatable manipulation
- Coroutines
- Garbage collection control
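A short sketch of what this split looks like from inside the sandbox. The allowed calls below are all standard Lua 5.4; the comment about blocked globals assumes the nil-based sandboxing described in the architecture notes:

```lua
-- Everything here comes from the allowed list:
s   = string.upper("hello")            -- string.*
t   = {3, 1, 2}; table.sort(t)         -- table.*
r   = math.floor(math.sqrt(144))       -- math.*
len = utf8.len("héllo")                -- utf8.*
now = os.time()                        -- one of the safe os functions
-- Inside the sandbox, blocked globals such as io, require, dofile,
-- load, and os.execute are simply nil, so any attempt to call them
-- fails at the language level.
print(s, t[1], r, len)
```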
For LLM Integration:
- Universal computation tool (replaces dozens of specialized tools)
- Built-in tool definitions (OpenAI, Google, Anthropic compatible)
- JSON Schema generation
- Comprehensive documentation in tool description
For Developers:
- Drop-in integration with genai, mistralrs, rig, and aisdk libraries
- Separate `print()` output from return values
- Clear error messages
- Type-safe Rust API via mlua
For LLM Agents:
- Persistent state between calls (variables, functions, tables)
- Runtime introspection via the `docs` global
- Can solve multi-step problems programmatically
- Self-documenting environment
Basic REPL only (no LLM framework):
[dependencies]
onetool = "0.0.1-alpha.4"
With genai:
[dependencies]
onetool = { version = "0.0.1-alpha.4", features = ["genai"] }
genai = "0.5"
With mistralrs:
[dependencies]
onetool = { version = "0.0.1-alpha.4", features = ["mistralrs"] }
mistralrs = { git = "https://github.com/EricLBuehler/mistral.rs.git" }
With rig:
[dependencies]
onetool = { version = "0.0.1-alpha.4", features = ["rig"] }
rig-core = "0.3"
With aisdk:
[dependencies]
onetool = { version = "0.0.1-alpha.4", features = ["aisdk"] }
aisdk = "0.2"
Feature flags:
| Feature | Includes | Description |
|---|---|---|
| `genai` | `json_schema` | genai adapter + tool definition |
| `mistralrs` | `json_schema` | mistralrs adapter + tool definition |
| `rig` | `json_schema` | rig-core `Tool` implementation |
| `aisdk` | `json_schema` | aisdk `#[tool]` macro integration |
| `json_schema` | - | JSON Schema generation (included by all of the above) |
Note: Currently in alpha - API may change.
All examples solve the same problem (sum of first 10 primes = 129) to demonstrate consistent behavior across frameworks.
genai (multi-provider client):
export OPENAI_API_KEY=your_key_here # or GEMINI_API_KEY, etc.
cargo run --features genai --example genai-basic
Source: examples/genai-basic.rs
mistralrs (local inference):
cargo run --features mistralrs --example mistralrs-basic
Downloads and runs Phi-3.5-mini locally. No API key required.
Source: examples/mistralrs-basic.rs
rig (modular framework):
export OPENAI_API_KEY=your_key_here
cargo run --features rig --example rig-basic
Source: examples/rig-basic.rs
aisdk (Vercel AI SDK port):
export OPENAI_API_KEY=your_key_here
cargo run --features aisdk --example aisdk-basic
Source: examples/aisdk-basic.rs
Test the sandboxed environment directly:
cargo run --example lua-repl
This lets you experiment with Lua code and understand what the LLM sees. No API key required.
custom-functions (runtime extension):
cargo run --example custom-functions
Shows how to extend the runtime with custom Rust functions. Includes an interactive REPL for testing.
Source: examples/custom-functions.rs
onetool allows you to extend the Lua runtime with custom Rust functions, enabling domain-specific capabilities for your LLM agents.
There are two approaches to adding custom functions:
Best for adding functions after creating the REPL:
use onetool::Repl;
let repl = Repl::new()?;
// Add a custom function
repl.with_runtime(|lua| {
let my_func = lua.create_function(|_, name: String| {
Ok(format!("Hello, {}!", name))
})?;
lua.globals().set("greet", my_func)?;
Ok(())
})?;
// Now callable from Lua
let result = repl.eval("return greet('World')")?;
Use when:
- Adding functions to an existing REPL
- Functions don't need to interact with sandboxing
- Simpler initialization flow
Best for complex initialization scenarios:
use onetool::{Repl, runtime};
let lua = mlua::Lua::new();
// Set up custom globals
lua.globals().set("API_KEY", "secret")?;
// Register custom functions
let fetch = lua.create_function(|_, url: String| {
// ... implementation
Ok("response".to_string())
})?;
lua.globals().set("fetch", fetch)?;
// Apply sandboxing AFTER custom setup
runtime::sandbox::apply(&lua)?;
let repl = Repl::new_with(lua)?;
Use when:
- Need to set up complex state before sandboxing
- Custom functions require special initialization
- Building framework adapters
See examples/custom-functions.rs for a complete demonstration including:
- Multiple function patterns (simple, error handling, stateful)
- Stateful closures with Arc + Atomic
- Error propagation from Rust to Lua
- Documentation registration
- Interactive testing
Make your custom functions discoverable via the docs system:
use onetool::runtime::docs::{register, LuaDoc, LuaDocTyp};
repl.with_runtime(|lua| {
// ... create and register function ...
// Register documentation
register(lua, &LuaDoc {
name: "my_function".to_string(),
typ: LuaDocTyp::Function,
description: "Does something useful".to_string(),
})?;
Ok(())
})?;
The LLM can then query `docs["my_function"]` at runtime to understand available functions.
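From the Lua side, that query might look like the sketch below. This is an assumption about usage, not a pinned-down contract: the exact shape of each `docs` entry isn't specified here, so the snippet only prints whatever the runtime exposes (`my_function` is the hypothetical name registered above):

```lua
-- Sketch: discovering registered helpers from inside the sandbox.
-- `docs` is provided by the onetool runtime; the guard keeps the
-- snippet harmless if run outside it.
entry = docs and docs["my_function"]
print(entry)
```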
Full API documentation available at docs.rs/onetool.
- Lightweight: Small runtime, fast startup
- Embeddable: Designed from the ground up to be embedded in host applications
- Simple: Easy for LLMs to generate correct code
- Powerful: Full programming language, not a domain-specific language
- Safe: Straightforward to sandbox effectively
Perfect for:
- LLM agents that need computation capabilities
- AI assistants with multi-step reasoning
- Applications requiring safe user-generated code execution
- Version: 0.0.1-alpha.4
- Stability: Alpha - API may change, but core concept is stable
- Production Ready: Not yet - use at your own risk
Building:
cargo build
cargo test
cargo doc --open
Nix Support:
nix develop # Dev shell with Rust, cargo-watch, rust-analyzer
Running Examples:
# Framework examples (requires API keys for genai, rig, aisdk)
cargo run --features genai --example genai-basic
cargo run --features mistralrs --example mistralrs-basic
cargo run --features rig --example rig-basic
cargo run --features aisdk --example aisdk-basic
# Interactive REPL
cargo run --example lua-repl
For implementation details, see:
- `src/runtime/mod.rs` - Lua runtime definition
- `src/runtime/docs.rs` - Runtime documentation implementation
- `src/runtime/sandbox.rs` - Sandboxing implementation
- `src/tool_definition.rs` - Tool integration system
Key patterns:
- Nil-based sandboxing (simple, effective)
- Output capture via mpsc channels
- Persistent Lua state across invocations
- Runtime documentation system
License: MIT - Copyright 2026 Caio Augusto Araujo Oliveira
Contributing:
- Early stage project - feedback welcome!
- Issues and PRs appreciated
Built with mlua.