> [!NOTE]
> This project is no longer maintained.
Raycast recently announced that you can bring your own API key, and it offers a much more polished and actively maintained interface for chatting with LLMs. That change fully addresses the original problem Prompta was designed to solve.
I created Prompta to solve my own need: a fast, keyboard-centric way to chat with LLMs using my own API key. For many months, it was my primary AI chat app and it was a joy to build something that solved my own problem and was useful to others.
For that reason, I've decided to archive the project. The code will remain available on GitHub and the web app will stay online, but I will no longer be working on it.
Thank you to everyone who used the app, filed issues, and provided feedback. If you're interested in taking over the project, please feel free to reach out.
If you want more context: https://notes.iansinnott.com/blog/posts/Sunsetting+Prompta+-+My+LLM+Chat+App
Yet another interface for chatting with LLMs via API.
Website | Downloads | Launch App
| Mobile | Search chats | Keyboard Centric | Comments |
| --- | --- | --- | --- |
| ![]() | ![]() | ![]() | ![]() |
- Search all previous conversations (full-text!)
- Sync your chat history across devices
- Keyboard-centric
- Leave notes on responses, such as "working code!" or "not working"
- Keep all your chat history stored locally
- Search previous chat threads
- Chat with the latest models (updated dynamically)
- Use local LLMs like Llama, Mistral, etc. (see the sketch after this list)
- Customize the system message
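Local models like Llama or Mistral are typically served through an OpenAI-compatible HTTP endpoint (for example, an Ollama or llama.cpp server). The sketch below is illustrative only and is not taken from Prompta's source; the base URL, port, and model name are assumptions about a locally running server.

```ts
// Minimal sketch (assumptions): send a chat completion request to a local,
// OpenAI-compatible server such as Ollama. Base URL, port, and model name
// are placeholders, not Prompta configuration.
const BASE_URL = "http://localhost:11434/v1";

async function chat(prompt: string): Promise<string> {
  const res = await fetch(`${BASE_URL}/chat/completions`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3", // any model the local server has available
      messages: [
        { role: "system", content: "You are a helpful assistant." },
        { role: "user", content: prompt },
      ],
    }),
  });
  if (!res.ok) throw new Error(`Request failed: ${res.status}`);
  const data = await res.json();
  return data.choices[0].message.content;
}

chat("Hello!").then(console.log).catch(console.error);
```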
- In your web browser: chat.prompta.dev
- Desktop app: download the latest build from the releases page
macOS users will need to right-click the app and select "Open" the first time they run it. This is because the app is signed but not notarized.
| Right-click to open | Now you can click "Open" |
| --- | --- |
| ![]() | ![]() |
`bun` is used for development. You can try using `yarn`, `npm`, etc., but other package managers have not been tested and are not deliberately supported:
bun install
bun run dev
# To develop the Tauri desktop app as well:
bun run dev:tauri
To create a production version of the app:
bun run build
If you want to build only for the browser, ignoring the desktop app:
bun run ui:build-static
The advantage is that you don't need any of the Rust dependencies required for building Tauri.
bun run release
You will be prompted to enter a new version number. New versions that don't contain a suffix such as `-beta` or `-alpha` will be published to GitHub.
- SQLite via vlcn/cr-sqlite - SQLite compiled to WASM running in the browser, using CRDTs for conflict-free replication (a sketch follows this list)
- Tauri - A Rust-based alternative to Electron (only used in desktop builds)
- Svelte - Reactive UI framework
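As an illustration of the cr-sqlite piece, the sketch below opens a WASM-backed SQLite database in the browser and stores a chat message locally. It is not taken from Prompta's source: the `@vlcn.io/crsqlite-wasm` calls reflect my understanding of the vlcn API, and the database name and schema are made up for the example.

```ts
// Minimal sketch (assumptions): a cr-sqlite database compiled to WASM,
// running in the browser, with a made-up table for chat messages.
import initWasm from "@vlcn.io/crsqlite-wasm";

async function demo() {
  const sqlite = await initWasm(); // load the WASM build of SQLite + cr-sqlite
  const db = await sqlite.open("prompta-demo.db"); // hypothetical database name

  // Hypothetical schema for illustration only.
  await db.exec(
    "CREATE TABLE IF NOT EXISTS message (id PRIMARY KEY, role TEXT, content TEXT)"
  );

  // Marking the table as a CRR (conflict-free replicated relation) is what
  // lets cr-sqlite merge changes from multiple devices without conflicts.
  await db.exec("SELECT crsql_as_crr('message')");

  await db.exec("INSERT OR REPLACE INTO message VALUES (?, ?, ?)", [
    "msg-1",
    "user",
    "hello",
  ]);

  const rows = await db.execO("SELECT * FROM message");
  console.log(rows);
}

demo().catch(console.error);
```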