
LLM Flutter Application

This is a Flutter application that uses the Llama.cpp library to run large language models (LLMs) offline.

Features

  • Offline Model Execution: The application runs large language models entirely offline, making it ideal for environments with limited or no internet connectivity.
  • Cross-Platform: Built with Flutter, the application can be compiled and run on multiple platforms, including macOS, Linux, and iOS.
  • Efficient Performance: Llama.cpp provides efficient inference, keeping even large models responsive on local hardware.

Demo

[demo screenshots]

Getting Started

To get started, clone the repository, change into the project directory, and fetch its submodules:

git clone https://github.com/mizy/local-agent-chat.git
cd local-agent-chat
git submodule update --init --recursive

Supported Platforms

  • macOS
  • Linux
  • iOS

Building the Project

To build the project, pass the target platform to flutter build:

flutter build macos

Replace macos with linux or ios to build for the other supported platforms.

Running the Project

To run the project, use the following command:

flutter run

Adding a New Prompt Format

To support a new prompt format, edit the antiprompt_map in llama_cpp_dart/src/llm.cpp and add an entry for the new format.
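
The exact shape of antiprompt_map depends on the current llm.cpp source, so treat the following as a minimal sketch rather than the repository's actual code. It assumes the map associates a format name with the anti-prompt (stop) strings that end a model turn; the chatml entry is the hypothetical addition, and the other entries are illustrative:

#include <map>
#include <string>
#include <vector>

// Hypothetical layout: format name -> stop strings for that template.
// Check the real definition in llama_cpp_dart/src/llm.cpp before editing.
static std::map<std::string, std::vector<std::string>> antiprompt_map = {
    {"alpaca", {"### Instruction:"}},
    {"vicuna", {"USER:"}},
    // New entry: ChatML-style templates end a turn at <|im_end|>.
    {"chatml", {"<|im_end|>"}},
};

During generation, output is cut off as soon as one of the stop strings registered for the active format appears, which is what keeps the model from continuing past its own turn.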

Contributing

Contributions are welcome!

License

This project is licensed under the terms of the MIT license.