Blankeos/modelcli


███╗░░░███╗░█████╗░██████╗░███████╗██╗░░░░░░█████╗░██╗░░░░░██╗
████╗░████║██╔══██╗██╔══██╗██╔════╝██║░░░░░██╔══██╗██║░░░░░██║
██╔████╔██║██║░░██║██║░░██║█████╗░░██║░░░░░██║░░╚═╝██║░░░░░██║
██║╚██╔╝██║██║░░██║██║░░██║██╔══╝░░██║░░░░░██║░░██╗██║░░░░░██║
██║░╚═╝░██║╚█████╔╝██████╔╝███████╗███████╗╚█████╔╝███████╗██║
╚═╝░░░░░╚═╝░╚════╝░╚═════╝░╚══════╝╚══════╝░╚════╝░╚══════╝╚═╝

modelcli

Call any LLM from the command line via models.dev.

Install

npm install -g @blankeos/modelcli # npm
bun install -g @blankeos/modelcli # or bun
cargo install --path . # or cargo (run from inside a clone of this repo)

Quick Start

# 1. Connect to a provider (Any known provider thanks to models.dev)
modelcli connect

# 2. Browse models and set a default
modelcli models

# 3. Send a prompt
modelcli "What is the meaning of life?"

Usage

modelcli [OPTIONS] [PROMPT]

Commands

Command    Description
connect    Connect to a provider (add API key)
models     Browse and manage models

Options

Flag                           Description
--model <provider/model-id>    Model to use (overrides default)
--stream                       Stream tokens as they arrive
--thinking                     Show thinking/reasoning tokens
--reasoning-effort <level>     Reasoning effort: low, medium, or high
--format json                  Output raw JSON instead of human-readable text

Examples

# Use a specific model
modelcli --model openai/gpt-4o "Explain quicksort"

# Stream the response
modelcli --stream "Write a haiku about Rust"

# Enable reasoning
modelcli --thinking --reasoning-effort high "Prove that √2 is irrational"

# JSON output
modelcli --format json "Hello"

Custom Providers

You can add any OpenAI-compatible provider not listed on models.dev.

1. Add a credential:

modelcli connect
# Select "Other (custom provider)" → enter a provider ID and API key

2. Configure the provider in ~/.config/modelcli.jsonc:

{
  "provider": {
    "myprovider": {
      "name": "My AI Provider",
      "baseURL": "https://api.myprovider.com/v1",
      "models": {
        "my-model": {
          "name": "My Model", // optional display name
          "reasoning": false, // optional, default false
          "context": 200000, // optional context window
          "output": 65536, // optional max output tokens
        },
      },
    },
  },
}

Then use it like any other model:

modelcli --model myprovider/my-model "Hello!"

The config file is auto-created the first time you add a custom provider. Both .jsonc and .json extensions are supported, but only one of the two files may exist at a time.
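Since the per-model fields in the example above are all optional, a custom provider can be declared with just a name, a baseURL, and an empty model entry. If you hand-edit the config, one rough way to sanity-check that it still parses (a sketch using a throwaway file; the grep strips only full-line // comments):

```shell
# Create a throwaway copy of a minimal config; an empty model object is enough
cat > /tmp/modelcli-sample.jsonc <<'EOF'
{
  // comments are fine in .jsonc
  "provider": {
    "myprovider": {
      "name": "My AI Provider",
      "baseURL": "https://api.myprovider.com/v1",
      "models": { "my-model": {} }
    }
  }
}
EOF

# Strip full-line // comments, then check the rest parses as plain JSON
grep -v '^[[:space:]]*//' /tmp/modelcli-sample.jsonc \
  | python3 -m json.tool > /dev/null && echo "config OK"
```

Swap in `~/.config/modelcli.jsonc` to check your real file; note this simple filter does not handle inline comments or trailing commas, both of which modelcli's JSONC parser accepts.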

Data Storage

  • Credentials and app data: ~/.local/share/modelcli/
  • Custom provider config: ~/.config/modelcli.jsonc

Motivation

modelcli enables piping LLM calls directly from your terminal: perfect for generating commit messages in lazygit (see PR #5389), or for powering any other CLI app with AI capabilities. Quickly ask questions or pipe stdout from other tools to get instant AI-powered responses.
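For example (hypothetical invocations, assuming modelcli forwards piped stdin along with the prompt and that a default model is already configured):

```shell
# Draft a commit message from the staged diff
git diff --staged | modelcli "Write a one-line commit message for this diff"

# Triage a failing build without leaving the terminal
cargo build 2>&1 | modelcli --stream "Explain the first error in this output"
```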

Inspired by OpenCode's seamless multi-provider experience and built on models.dev's unified LLM API.

🦀 Made with Rust: a fast, minimal, and intuitive CLI.
