clarkmcc/chitchat


A simple LLM chat front-end that makes it easy to find, download, and experiment with models on your local machine. This is a very early-stage project, so expect bugs and missing features. On the bright side, here's what it supports today:

  • Easily download and run built-in LLM models
  • Load your own models
  • GPU support
  • Statically compiled
  • Cross-platform
  • Dark and light modes
  • Warm-up prompting
  • Upload files (.pdf, .txt, .html) and chat about the file contents
  • Chat-style context
  • Prompt templates

Downloads

See releases for more downloads.

Custom Models

All models are downloaded to and loaded from the ~/.chitchat/models directory, so you can drop your own .bin model files in there. Currently, this project only supports models in the GGML format.
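For example, adding a custom model is just a matter of placing the file in that directory (a minimal sketch; `my-model.bin` is a placeholder name, not a file this project ships):

```shell
# ChitChat loads models from this directory
MODELS_DIR="$HOME/.chitchat/models"

# Create the directory if it does not already exist
mkdir -p "$MODELS_DIR"

# Copy a downloaded GGML model into it.
# "my-model.bin" is a placeholder; substitute your actual model file.
# cp ~/Downloads/my-model.bin "$MODELS_DIR/"

# Models placed here appear in ChitChat's model list
ls "$MODELS_DIR"
```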

To download models that aren't supported natively in this project, check out the following links.

How does it work?

This is just a Tauri front-end on the incredible rustformers/llm project, which means that any bugs in model execution or performance should be reported upstream in that project.

Troubleshooting

See troubleshooting for more information.