3 changes: 2 additions & 1 deletion .gitignore
@@ -292,4 +292,5 @@ XMLs
logs
wwwroot
appsettings.Production.json
*.csproj.user
*.csproj.user
env/
9 changes: 9 additions & 0 deletions docs/index.rst
@@ -96,6 +96,15 @@ The main documentation for the site is organized into the following sections:
llm/few-shot-learning
llm/provider

.. _llamasharp:

.. toctree::
:maxdepth: 2
:caption: Use Local LLM Models

llama-sharp/config-llamasharp
llama-sharp/use-llamasharp-in-ui

.. _architecture-docs:

.. toctree::
Binary file added docs/llama-sharp/assets/click-test-button.png
Binary file added docs/llama-sharp/assets/edit-agent.png
60 changes: 60 additions & 0 deletions docs/llama-sharp/config-llamasharp.md
@@ -0,0 +1,60 @@
# Config LLamaSharp

BotSharp ships with a LLamaSharp plugin that allows you to run local LLM models. To use LLamaSharp, you need to configure the BotSharp project in a few steps.

## Install LLamaSharp Backend

Before using the LLamaSharp plugin, install the LLamaSharp backend package that suits your environment:

- [`LLamaSharp.Backend.Cpu`](https://www.nuget.org/packages/LLamaSharp.Backend.Cpu): Pure CPU for Windows & Linux. Metal for Mac.
- [`LLamaSharp.Backend.Cuda11`](https://www.nuget.org/packages/LLamaSharp.Backend.Cuda11): CUDA 11 for Windows and Linux
- [`LLamaSharp.Backend.Cuda12`](https://www.nuget.org/packages/LLamaSharp.Backend.Cuda12): CUDA 12 for Windows and Linux

**Please install the same version of the LLamaSharp backend as the LLamaSharp package referenced in `BotSharp.Plugin.LLamaSharp.csproj`.**

![Check LLamaSharp Version](assets/check-llamasharp-version.png)

```shell
# move to the LLamaSharp Plugin Project
$ cd src/Plugins/BotSharp.Plugin.LLamaSharp
# Install the LLamaSharp Backend
$ dotnet add package LLamaSharp.Backend.Cpu --version 0.9.1
```
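The version passed to `--version` above should match whatever `BotSharp.Plugin.LLamaSharp.csproj` pins. A quick way to read it is a `grep` over the project file; the sketch below uses a hypothetical stand-in csproj body (an assumed SDK-style `PackageReference` layout) so it is self-contained — in practice, run the `grep` against the real file in `src/Plugins/BotSharp.Plugin.LLamaSharp/`:

```shell
# Stand-in csproj so this sketch is self-contained; the real file lives
# under src/Plugins/BotSharp.Plugin.LLamaSharp/.
cat > /tmp/BotSharp.Plugin.LLamaSharp.csproj <<'EOF'
<Project Sdk="Microsoft.NET.Sdk">
  <ItemGroup>
    <PackageReference Include="LLamaSharp" Version="0.9.1" />
  </ItemGroup>
</Project>
EOF
# Extract the pinned version so the backend package can be installed to match.
grep -o 'Include="LLamaSharp" Version="[^"]*"' /tmp/BotSharp.Plugin.LLamaSharp.csproj
```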

## Download and Config Local LLM Models

LLamaSharp supports many LLM models such as LLaMA and Alpaca. Download models in the `gguf` format and save them on your machine.

We will use a [Llama 2](https://huggingface.co/TheBloke/llama-2-7B-Guanaco-QLoRA-GGUF) model in this tutorial.
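Whichever quantization you pick, a quick sanity check after downloading is that a valid GGUF file begins with the 4-byte magic `GGUF`. The sketch below creates a tiny stand-in file so it is self-contained (the real model is several gigabytes); point `MODEL` at your actual download:

```shell
# Sanity-check a downloaded model by reading its GGUF magic bytes.
# The file written here is a stand-in; point MODEL at your real download.
MODEL=/tmp/llama-2-7b.Q2_K.gguf
printf 'GGUF\x03\x00\x00\x00' > "$MODEL"
head -c 4 "$MODEL" && echo    # a valid GGUF model prints: GGUF
```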

After downloading the model, open the `src/WebStarter/appsettings.json` file to configure the LLamaSharp models. Set the `LlmProviders` and `LlamaSharp` fields to values that match your machine. For example:

```json
{
...,
"LlmProviders": [
...,
{
"Provider": "llama-sharp",
"Models": [
{
"Name": "llama-2-7b.Q2_K.gguf",
"Type": "chat"
}
]
},
...
],
...,
"LlamaSharp": {
"Interactive": true,
"ModelDir": "/Users/wenwei/Desktop/LLM",
"DefaultModel": "llama-2-7b.Q2_K.gguf",
"MaxContextLength": 1024,
"NumberOfGpuLayer": 20
},
...
}
```
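A malformed `appsettings.json` is an easy way to break startup, so it can help to validate the file after editing. A minimal sketch using Python's built-in JSON parser; the fragment below mirrors the example above with the `...` placeholders removed, since those are not valid JSON:

```shell
# Write the LlamaSharp settings fragment (ellipses from the example removed,
# as they are not valid JSON) and validate it before starting BotSharp.
cat > /tmp/llamasharp-settings.json <<'EOF'
{
  "LlmProviders": [
    {
      "Provider": "llama-sharp",
      "Models": [
        { "Name": "llama-2-7b.Q2_K.gguf", "Type": "chat" }
      ]
    }
  ],
  "LlamaSharp": {
    "Interactive": true,
    "ModelDir": "/Users/wenwei/Desktop/LLM",
    "DefaultModel": "llama-2-7b.Q2_K.gguf",
    "MaxContextLength": 1024,
    "NumberOfGpuLayer": 20
  }
}
EOF
python3 -m json.tool /tmp/llamasharp-settings.json > /dev/null && echo "settings JSON is valid"
```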

For more details about LLamaSharp, visit [LLamaSharp - GitHub](https://github.com/SciSharp/LLamaSharp).
29 changes: 29 additions & 0 deletions docs/llama-sharp/use-llamasharp-in-ui.md
@@ -0,0 +1,29 @@
# Use LLamaSharp in BotSharp

Start the BotSharp backend and frontend services, and follow this tutorial.

## Install the LLamaSharp Plugin in the UI

Go to the Plugin page and install the LLamaSharp plugin.

![Install LlamaSharp Plugin](assets/install-llamasharp-plugin.png)

## Configure LLamaSharp as the LLM Provider for Agents

Edit or create an agent on the Agents page.

![Edit Agent](assets/edit-agent.png)

On the edit page, set the provider to `llama-sharp`.

![Choose LLamaSharp as Provider](assets/choose-llamasharp-as-provider.png)

Then test the agent.

![Click Test Agent Button](assets/click-test-button.png)

![Test Agent Example](assets/converstaion-examples.png)

If it runs successfully, you will see logs like this in the BotSharp service's console.

![Console Output](assets/console-output-in-botsharp.png)