Baa the AI client https://baarilliant.ai
I've crafted Baa, an LLM and generative AI client for the web and desktop.
Baa has a lot of useful features for chat, handling prompts, building functions, and sharing and experimenting with LLMs and generative AI from OpenAI, Anthropic, and models such as Stable Diffusion and Llama via Fireworks. I created Baa because I was struggling to find an existing client that fit my needs.
Like many people, I had used the OpenAI Playground occasionally for a couple of years and thought, oh, that's kinda cool. Then ChatGPT happened, and I was blown away by the capability! Not only being able to talk to an AI, but also how OpenAI had presented their LLMs in a usable way; remember, I'd been using the Playground for years.
Great, I want more! Anthropic, open models like Llama, generative models, functions: gimme gimme. But I found myself working across different apps and UIs with few or differing features for tasks such as copying, downloading, and sharing chats and code blocks, or saving the prompts I use and the functions and art I create.
That's why I built Baa.
- Modern UI with common features.
- Own the data, local and cloud.
- Any AI, local and cloud.
- Create and organise prompts, functions and models.
- Save and experiment: a playground for different models and outputs.
- No vendor lock-in.
- Self host.
- Desktop apps.
- Sharing features.
- Developer features.
- Chat: the usual chat, plus system prompts, forked chats and multiple models.
- Organise: create, edit and organise your prompts, functions and models.
- Multiple LLMs: switch between different AIs, OpenAI, Anthropic, and models such as Stable Diffusion and Llama via Fireworks.ai.
- Experiment: try different approaches, save and iterate on AI functions, completions, chains and agents.
- Share and Export: share chats publicly, download files and copy code blocks.
- Desktop Web App: use Baa as an app on your desktop; Baa will continue to use online services for data and auth.
- Bring your own keys (BYOK) and billing.
- Desktop app with everything local: auth, storage and LLMs.
- Remove cloud dependencies: local storage, auth, chats, prompts, functions and experiments.
- Talk to Baa: two-way audio, home assistant and agents.
- Multimodal and vision.
- Local models: support for local LLMs.
- Open Interpreter: integrate https://openinterpreter.com
- VS Code integration: open files and experiments in VS Code.
- OpenAI Plugins (complete).
What's Not on the Roadmap?
Note: this needs updating, as Baa now includes many more features...
Chat: The usual chat with GPT, but Baa has some useful additions such as system prompts, forked chats and multiple models.
When working with code or queries, there are useful utilities such as forking a chat, downloading files, starting experiments, and copying, of course.
Handle prompts: Organise, create, edit and share your prompts.
Construct functions: Create, edit, organise and share functions.
Experiment: Try different approaches or ideas; save and iterate on AI functions, completions, chains and agents.
Sharing Chats: You can now share chats from the chat screen. A shared chat is a point-in-time copy (the prompt and the current chat thread); the chat and its link are public and, as such, sharing cannot be undone. The chat will also be listed on the Community page (https://baarilliant.ai/share/).
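To illustrate the point-in-time idea, here is a minimal sketch of what a shared-chat snapshot could look like. The names (SharedChatSnapshot, sourceChatId and so on) are hypothetical, not Baa's actual schema.

```ts
// Hypothetical shape of a shared-chat snapshot; Baa's real schema may differ.
interface SharedChatSnapshot {
  id: string;              // public share id used in the share URL
  sourceChatId: string;    // the chat it was copied from
  systemPrompt: string;    // the prompt at the moment of sharing
  messages: { role: "user" | "assistant"; content: string }[];
  createdAt: string;       // ISO timestamp; the copy never changes after this
}

// Sharing is a one-way copy: later edits to the source chat are not reflected.
function shareChat(chat: {
  id: string;
  systemPrompt: string;
  messages: SharedChatSnapshot["messages"];
}): SharedChatSnapshot {
  return {
    id: crypto.randomUUID(),
    sourceChatId: chat.id,
    systemPrompt: chat.systemPrompt,
    messages: structuredClone(chat.messages), // deep copy = point-in-time snapshot
    createdAt: new Date().toISOString(),
  };
}
```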
Recently, Anthropic widened access to Claude 2. It took 30 minutes to integrate their API with Baa! Just imagine what we can do to improve the UI of local LLMs.
Yeah, same as Anthropic: 30 minutes to integrate the API with Baa! Excited for future LLMs using LangChain chat models.
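That speed comes from LangChain's shared chat-model interface: swapping providers is mostly a matter of constructing a different class. A minimal sketch, assuming API keys in the environment; import paths, model names and option names vary across LangChain versions:

```ts
import { ChatOpenAI } from "langchain/chat_models/openai";
import { ChatAnthropic } from "langchain/chat_models/anthropic";
import { HumanMessage, SystemMessage } from "langchain/schema";

// Both classes implement the same chat-model interface, so the calling
// code doesn't need to know which provider it is talking to.
const model =
  process.env.PROVIDER === "anthropic"
    ? new ChatAnthropic({ modelName: "claude-2" })
    : new ChatOpenAI({ modelName: "gpt-4" });

const reply = await model.call([
  new SystemMessage("You are Baa, a helpful assistant."),
  new HumanMessage("Say hello in Welsh."),
]);
console.log(reply.content);
```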
I'm not sure, but please let me know your thoughts. You can email me at hello@baarilliant.ai.
Baa is developed using the following technologies:
- Node.js 18: an open-source, cross-platform JavaScript runtime environment.
- Next.js 14: enables you to create full-stack web applications by extending the latest React features.
- NextAuth.js: a configurable authentication framework for Next.js.
- LangChain JS: an AI orchestration layer to build intelligent apps.
- Vercel AI SDK: an open-source library for building AI-powered user interfaces.
- Tailwind CSS: a utility-first CSS framework that provides a series of predefined classes for styling.
- shadcn/UI: re-usable components built using Radix UI and Tailwind CSS.
- Azure Cosmos DB: a fully managed platform-as-a-service (PaaS) NoSQL database used to store chat history.
- Helicone: an observability and monitoring platform for LLM usage at scale.
- Fireworks: a platform for running open LLMs at scale.
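As a rough idea of how these pieces fit together (a sketch, not Baa's actual code), a Next.js route handler can stream a LangChain chat model's output to the UI using the Vercel AI SDK's LangChain streaming helpers; exact imports and helper names depend on the SDK and LangChain versions you're on.

```ts
// app/api/chat/route.ts, a rough sketch, not Baa's actual source.
import { LangChainStream, StreamingTextResponse, Message } from "ai";
import { ChatOpenAI } from "langchain/chat_models/openai";
import { AIMessage, HumanMessage } from "langchain/schema";

export async function POST(req: Request) {
  const { messages } = (await req.json()) as { messages: Message[] };

  // Bridges LangChain's token callbacks to a web ReadableStream.
  const { stream, handlers } = LangChainStream();

  const llm = new ChatOpenAI({ streaming: true });
  llm
    .call(
      messages.map((m) =>
        m.role === "user" ? new HumanMessage(m.content) : new AIMessage(m.content)
      ),
      {},
      [handlers]
    )
    .catch(console.error);

  // The client (e.g. the AI SDK's useChat hook) renders tokens as they arrive.
  return new StreamingTextResponse(stream);
}
```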
For the Cloud version, there are a couple of things to solve. One is how to securely manage the API keys you provide, and the other is how to bill you for API usage.
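One plausible approach to the key-management half (a sketch of the general technique, not Baa's implementation) is to encrypt each user-supplied key with a server-side secret before it reaches the database, using Node's built-in crypto module; the BYOK_MASTER_KEY variable and function names here are assumptions for illustration.

```ts
import { randomBytes, createCipheriv, createDecipheriv } from "node:crypto";

// A 32-byte server-side secret (e.g. from a KMS or env var),
// never stored alongside the encrypted user keys.
const MASTER_KEY = Buffer.from(process.env.BYOK_MASTER_KEY!, "hex");

export function encryptApiKey(plainKey: string): string {
  const iv = randomBytes(12); // unique per encryption, required by GCM
  const cipher = createCipheriv("aes-256-gcm", MASTER_KEY, iv);
  const encrypted = Buffer.concat([cipher.update(plainKey, "utf8"), cipher.final()]);
  // Store iv + auth tag + ciphertext together; none are secret on their own.
  return [iv, cipher.getAuthTag(), encrypted].map((b) => b.toString("hex")).join(".");
}

export function decryptApiKey(stored: string): string {
  const [iv, tag, data] = stored.split(".").map((h) => Buffer.from(h, "hex"));
  const decipher = createDecipheriv("aes-256-gcm", MASTER_KEY, iv);
  decipher.setAuthTag(tag);
  return Buffer.concat([decipher.update(data), decipher.final()]).toString("utf8");
}
```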
For the Local version, we already have an Electron app and authentication in place. The main blocker is implementing a lightweight document storage solution. We are considering writing a JSON-file API for LangChain, similar to Azure Cosmos.
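A minimal sketch of what such a store could look like: a handful of Cosmos-like operations (upsert, read, query) backed by a JSON file on disk. The interface and names (JsonFileStore, Doc) are hypothetical, not an existing LangChain API.

```ts
import { promises as fs } from "node:fs";

// Hypothetical document shape; anything with an id will do.
type Doc = { id: string } & Record<string, unknown>;

export class JsonFileStore {
  constructor(private path: string) {}

  private async load(): Promise<Doc[]> {
    try {
      return JSON.parse(await fs.readFile(this.path, "utf8"));
    } catch {
      return []; // a missing or empty file behaves like an empty collection
    }
  }

  private async save(docs: Doc[]): Promise<void> {
    await fs.writeFile(this.path, JSON.stringify(docs, null, 2));
  }

  // Cosmos-style upsert: replace by id, or insert if not present.
  async upsert(doc: Doc): Promise<void> {
    const docs = await this.load();
    const i = docs.findIndex((d) => d.id === doc.id);
    if (i >= 0) docs[i] = doc;
    else docs.push(doc);
    await this.save(docs);
  }

  async read(id: string): Promise<Doc | undefined> {
    return (await this.load()).find((d) => d.id === id);
  }

  async query(predicate: (d: Doc) => boolean): Promise<Doc[]> {
    return (await this.load()).filter(predicate);
  }
}
```

The idea is that swapping this in for the Cosmos-backed store on desktop would leave the rest of the app unchanged.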
Why a Sheep? Why Baa? Well, I'm Welsh! 🏴󠁧󠁢󠁷󠁬󠁳󠁿
Why was Baa built? More information coming soon!
Is there a mobile version of Baa? No, but I have plans. It's unlikely to be text based, though; I see mobile Baa as voice only.
- Baa won't be ChatGPT.
- RAG (document chat): maybe, but probably not.
- Agents, bots, personas: these features come from prompting LLMs correctly. Perhaps agents or chaining multiple LLMs will happen.
- Local LLMs: the support is complete and I've used the features, but on consumer hardware it's a disappointing experience, hence using Fireworks.ai for the same outcome. Revisit in 2024.