mlx-ruby-lm

A Ruby toolkit for LLM inference, built on the mlx gem.

Index

For full reference pages and deep dives, start at docs/index.md.

Installation

gem install mlx-ruby-lm

Or add it to a project:

bundle add mlx-ruby-lm

See docs/installation.md for requirements and source installs.

CLI Usage

Executable: mlx_lm

Commands:

  • mlx_lm generate
  • mlx_lm chat
  • mlx_lm server

Quick examples:

mlx_lm generate --model /path/to/model --prompt "Hello"
mlx_lm chat --model /path/to/model --system-prompt "You are concise."
mlx_lm server --model /path/to/model --host 127.0.0.1 --port 8080

See docs/cli.md for options, defaults, and current parser/behavior caveats.
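Once mlx_lm server is running, it can be queried over HTTP. The sketch below builds a chat-style request body in plain Ruby; note that the endpoint path and the field names (messages, max_tokens) follow the common OpenAI-style schema and are assumptions here, not confirmed behavior — check docs/cli.md for the actual routes and parameters.

```ruby
require "json"
require "net/http"

# Hypothetical request payload; field names are assumed, not confirmed.
payload = {
  messages: [{ role: "user", content: "Hello" }],
  max_tokens: 64
}

body = JSON.generate(payload)

# Uncomment to POST to a locally running `mlx_lm server`
# (endpoint path is an assumption):
# uri = URI("http://127.0.0.1:8080/v1/chat/completions")
# res = Net::HTTP.post(uri, body, "Content-Type" => "application/json")
# puts res.body
```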

High-Level Ruby API Usage

require "mlx"
require "mlx_lm"

model, tokenizer = MlxLm::LoadUtils.load("/path/to/model")
text = MlxLm::Generate.generate(model, tokenizer, "Hello", max_tokens: 64)
puts text

Streaming:

MlxLm::Generate.stream_generate(model, tokenizer, "Hello", max_tokens: 64).each do |resp|
  print resp.text
end
puts
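The loop above prints each chunk as it arrives. A small helper can also accumulate the chunks into the full completion while streaming — a minimal sketch, assuming only that each yielded response exposes #text as shown above:

```ruby
# Collect streamed chunks into one string while printing incrementally.
# Works with any enumerable of objects responding to #text, e.g. the
# enumerator returned by MlxLm::Generate.stream_generate.
def stream_to_string(stream)
  buffer = +""
  stream.each do |resp|
    print resp.text
    buffer << resp.text
  end
  buffer
end
```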

See docs/ruby-apis.md for the full API inventory.

High-Level Model Usage

LoadUtils.load expects a local model directory with files such as config.json, tokenizer.json, and model*.safetensors.
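Before calling LoadUtils.load, it can be useful to verify that a directory actually contains these files. This is a plain-Ruby sketch using only the file names listed above; the exact set of files the loader requires may differ, so treat it as a sanity check, not a definitive validation:

```ruby
# Returns true if `dir` contains the files LoadUtils.load is described
# as expecting: config.json, tokenizer.json, and at least one
# model*.safetensors shard.
def model_dir_ready?(dir)
  return false unless File.exist?(File.join(dir, "config.json"))
  return false unless File.exist?(File.join(dir, "tokenizer.json"))
  !Dir.glob(File.join(dir, "model*.safetensors")).empty?
end
```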

To inspect supported model keys at runtime:

require "mlx_lm"
puts MlxLm::Models::REGISTRY.keys.sort

See docs/models.md for full registry keys and remapping behavior.
