Belullama

CasaOS + Ollama + Open WebUI = Belullama

Belullama is an app for CasaOS that combines the capabilities of Ollama and Open WebUI, letting you create and manage conversational AI applications with ease. This README gives an overview of Belullama and explains how to get started.

Table of Contents

  • Introduction
  • Features
  • Screenshots
  • Installation
  • Usage
  • Contributing
  • License
  • Sources

Introduction

Belullama is a custom app for CasaOS that integrates the functionalities of Ollama and Open WebUI. It provides a comprehensive solution for creating and managing conversational AI applications on your own local server. With Belullama, you can leverage the power of large language models and enjoy a user-friendly interface for seamless interaction.

Features

Belullama offers the following key features:

  • Conversational AI Platform: Belullama combines the capabilities of Ollama and Open WebUI to provide a robust conversational AI platform. You can create and manage chatbots and conversational AI applications effortlessly.

  • Extensibility: Belullama is designed to be highly extensible, allowing you to integrate additional functionalities and customize the app according to your specific requirements.

  • User-Friendly Interface: The app features a simple and intuitive interface that makes it easy to create, configure, and manage chatbots and conversational AI applications.

  • Offline Operation: Belullama operates entirely offline, ensuring data privacy and security. You can use it without relying on external services or internet connectivity. A minimal sketch of querying the bundled model locally follows this list.
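
Because everything runs on your own server, the bundled Ollama instance can be queried directly over its local HTTP API. The sketch below is illustrative only: it assumes Ollama is listening on its default port 11434 and that a model (here "llama3") has already been pulled; adjust both to match your setup.

```python
# Minimal sketch: send one prompt to the local Ollama instance over its HTTP API.
# Assumes Ollama is on its default port 11434 and "llama3" has been pulled.
import json
import urllib.request

def ask_local_model(prompt: str, model: str = "llama3") -> str:
    """Send a single prompt to the local Ollama API and return its reply."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",   # Ollama's default generate endpoint
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

print(ask_local_model("Summarize what Belullama bundles, in one sentence."))
```

No request ever leaves the machine: the call goes to localhost, which is what makes the fully offline operation possible.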

Screenshots

Screenshot 1 · Screenshot 2

Installation

To install Belullama on your CasaOS server, follow these steps:

  • Access your CasaOS server:

Open your web browser, enter the address shown at the end of the CasaOS installation, and press Enter to open the CasaOS web interface.

  • Install a customized app:

In the CasaOS web interface, click the "+" button to open a menu and select "Install a customized app". This lets you install an app from a Docker Compose file.

  • Download the Docker Compose file:

After selecting "Install a customized app", you will be prompted to choose a Docker Compose file. Download the Compose file from the link in the Belullama repository and save it in a location you can easily access later.

  • Install the customized app:

After downloading the Compose file, go back to the CasaOS web interface and click the "Install" button to start the installation. Follow any additional prompts or instructions that appear during the process.

  • Verify the installation:

Once the installation is complete, the app should appear in the CasaOS dashboard. Click on it to open its web interface and start using it. A small script for checking that the services are reachable is sketched after the note below.

Please note that the specific steps may vary slightly depending on the version of CasaOS you are using. It's always a good idea to refer to the official CasaOS documentation or seek support from the CasaOS development team for any specific issues or questions you may have.
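
As a quick post-install check, the following sketch probes the two services over HTTP. The server address and the Open WebUI port are assumptions: Ollama listens on 11434 by default, but the Open WebUI port depends on how the Compose file maps it (commonly 3000 or 8080), so adjust both values to your installation.

```python
# Quick post-install check: confirm that Ollama and Open WebUI answer over HTTP.
import urllib.error
import urllib.request

SERVER = "192.168.1.50"  # hypothetical CasaOS server address; replace with yours

ENDPOINTS = {
    "Ollama":     f"http://{SERVER}:11434",  # Ollama's default API port
    "Open WebUI": f"http://{SERVER}:3000",   # adjust to the port mapped in your Compose file
}

for name, url in ENDPOINTS.items():
    try:
        with urllib.request.urlopen(url, timeout=5) as resp:
            print(f"{name}: reachable (HTTP {resp.status})")
    except (urllib.error.URLError, OSError) as exc:
        print(f"{name}: not reachable ({exc})")
```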

Usage

Once Belullama is installed, you can start using it to create and manage conversational AI applications. Here are some basic usage instructions:

  1. Launch the Belullama app on your CasaOS server.
  2. Access the app through your preferred web browser.
  3. Follow the on-screen instructions to create a new chatbot or import an existing one.
  4. Configure the chatbot settings, including the language model, prompts, and system messages.
  5. Save your changes and start interacting with your chatbot through the user-friendly interface.

For more detailed usage instructions and examples, please refer to the documentation provided in the Belullama repository.
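
The steps above go through the Open WebUI front end. The bundled Ollama server also exposes an HTTP API, so a chatbot can be queried from scripts as well. The sketch below is illustrative: the server address, model name, and system message are placeholders, not values shipped with Belullama.

```python
# Sketch of the same workflow over Ollama's chat API instead of the web UI:
# the system message plays the role of the "system message" configured in step 4.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/chat"  # Ollama's chat endpoint (default port)

payload = {
    "model": "llama3",  # assumption: any model you have already pulled
    "stream": False,
    "messages": [
        {"role": "system", "content": "You are a concise home-lab assistant."},
        {"role": "user", "content": "How do I back up my chatbot settings?"},
    ],
}

req = urllib.request.Request(
    OLLAMA_URL,
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["message"]["content"])
```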

Contributing

Contributions to Belullama are welcome! If you have any ideas, bug reports, or feature requests, please open an issue in the repository. You can also submit pull requests to contribute code improvements or new features.

License

Belullama is released under the MIT License. Please refer to the LICENSE file in the repository for more details.

Sources

  • Ollama: Get up and running with Llama 3, Mistral, Gemma, and other large language models.
  • CasaOS: A simple, easy-to-use, elegant open-source Personal Cloud system.
  • Open WebUI: User-friendly WebUI for LLMs (formerly Ollama WebUI).