# ollama-cli

A TypeScript CLI application that provides an interactive chat interface with Ollama AI models and tool-calling capabilities.

## Features
- Interactive command-line chat interface
- Tool calling support with type-safe schema definitions
- Automatic model management (downloads models if not available)
- Real-time loading indicators with Ora spinner
- Built-in tools (dice rolling example)
- Extensible tool system with Zod schema validation

## Prerequisites

- Node.js (version 14 or higher)
- Ollama installed and running locally

## Installation

- Clone the repository:

  ```bash
  git clone <repository-url>
  cd ollama-cli
  ```

- Install dependencies:

  ```bash
  npm install
  ```

## Usage

Start the interactive chat:

```bash
npm run dev
```

The application will:
- Check if the required model (llama3.1) is available
- Download it automatically if needed
- Start an interactive chat session
- Type `exit` to quit
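
The availability check above boils down to comparing the requested model name against the locally installed model tags. As a hypothetical sketch (the real logic in `src/model.ts` may differ, and at runtime the installed list would come from the Ollama API, e.g. the `ollama` package's `ollama.list()`):

```typescript
// Hypothetical helper: decide whether the requested model is already
// installed. Ollama tags look like "llama3.1:latest", so a request for
// "llama3.1" should match that tag.
export function hasModel(installed: string[], wanted: string): boolean {
  return installed.some(
    (name) => name === wanted || name.startsWith(`${wanted}:`),
  );
}
```

When the check fails, the application pulls the model before starting the chat loop.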

## Project Structure

```
src/
├── agent.ts         # Main agent logic and conversation handling
├── model.ts         # Ollama model initialization and chat interface
├── ui.ts            # User interface utilities (spinners, logging)
├── index.ts         # Application entry point
├── tools/           # Tool definitions and runners
│   ├── dice.ts      # Example dice-rolling tool
│   └── index.ts     # Tool registry and execution
└── utils/
    └── schema.ts    # Zod to JSON Schema conversion utilities
```
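
To illustrate what the Zod-to-JSON-Schema conversion in `src/utils/schema.ts` might produce, here is a hypothetical tool definition for the dice tool, written out by hand. The field layout follows the OpenAI-style tool format that Ollama's chat API accepts; the project's actual output shape and field names are assumptions:

```typescript
// Hypothetical shape of a converted tool definition. The outer
// type/function/parameters nesting follows the OpenAI-style tool format
// used by Ollama's chat API; the real createToolFromZod output may differ.
interface ToolDefinition {
  type: "function";
  function: {
    name: string;
    description: string;
    parameters: {
      type: "object";
      properties: Record<string, { type: string; description: string }>;
      required: string[];
    };
  };
}

export const diceToolExample: ToolDefinition = {
  type: "function",
  function: {
    name: "rollDice",
    description: "Roll an n-sided die",
    parameters: {
      type: "object",
      properties: {
        sides: { type: "number", description: "Number of sides on the die" },
      },
      required: ["sides"],
    },
  },
};
```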

## Creating Tools

Tools are defined using Zod schemas for type safety:

```typescript
import { z } from "zod";
import { createToolFromZod } from "../utils/schema";

const myToolSchema = z.object({
  input: z.string().describe("Description of the input parameter"),
});

export const myToolDefinition = createToolFromZod({
  name: "myTool",
  description: "What this tool does",
  schema: myToolSchema,
});

type Args = z.infer<typeof myToolDefinition.schema>;

export async function myTool({ input }: Args): Promise<string> {
  // Tool implementation
  return "result";
}
```

Then register it in `src/tools/index.ts`.
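
Registration can be sketched as a name-to-runner map that the agent consults when the model emits a tool call. The shape below is hypothetical (the repository's `src/tools/index.ts` may be organized differently), with a stand-in dice runner so the sketch is self-contained:

```typescript
// Hypothetical tool registry: maps tool names to async runner functions so
// the agent can dispatch tool calls returned by the model.
type ToolRunner = (args: Record<string, unknown>) => Promise<string>;

const runners: Record<string, ToolRunner> = {
  // Stand-in entry; the real registry would reference runners like myTool.
  rollDice: async (args) =>
    String(1 + Math.floor(Math.random() * Number(args.sides ?? 6))),
};

export async function runTool(
  name: string,
  args: Record<string, unknown>,
): Promise<string> {
  const runner = runners[name];
  if (!runner) throw new Error(`Unknown tool: ${name}`);
  return runner(args);
}
```

Keeping dispatch behind a single `runTool` entry point means the agent loop never needs to know which tools exist; adding a tool is one new map entry.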

## Configuration

The application uses the llama3.1 model by default. You can change this in `src/model.ts`:

```typescript
const selectedModel = "your-preferred-model";
```

## Building

Build the project:

```bash
tsc -b
```

Run the compiled JavaScript:

```bash
node dist/index.js
```

## License

ISC