# Trilium-chat

## Introduction

This chat plugin is tightly integrated with Trilium and lets you use ChatGPT right inside Trilium!

You can even use your own locally hosted Ollama.

This project is written in vanilla JavaScript and is frontend-only. If you are interested in developing Trilium plugins, the source contains details that may be worth referencing.

## Features

- Normal chat
- Use Ollama
- Custom prompt
  - Supports mustache syntax to render options, e.g. `{{language:English|Chinese|Czech}}` will be rendered as a select element (this can be changed in `CHAT_PROMPTS`)
  - `{{message}}` stands for your message
  - `{{activeNote}}` stands for the content of the active note
- Commands
  - Copy
  - Save to history
  - Favor
  - Save to active note
  - Append to active note
  - Save to new child note
  - Insert (for messages)
- Input shortcuts
  - `/p` - prompt
  - `/c` - command
  - `/h` - history
- Automatic history saving
- Supports light and dark themes

## Preview

https://soulsands.github.io/trilium-chat/

After you save your ChatGPT API key and refresh the page, you can use most of the features, except those that depend on Trilium.

Feel free to use this page; it calls the ChatGPT API directly.

## Start

1. Create a JS frontend note, then copy the contents of the release file `trilium-chat.js` into the note, or alternatively import the `trilium-chat.js` file.

2. Add the attribute `#run=frontendStartup` to the note.


3. Reload the frontend; an options note and a prompt note will be created as children of the script note.

4. Configure your ChatGPT API key in the options note and reload again.

5. Press `Alt+Q` (the default shortcut) to toggle the UI, or click the chat icon in the top right corner.

## Documentation

### Options

Options are stored in a JSON note with the `#CHAT_OPTIONS` label.

| Option | Description | Default |
| --- | --- | --- |
| `viewWidth` | Width of the chat interface; can be adjusted directly by dragging the left side | `400` |
| `engine` | AI provider that powers the chat; currently only ChatGPT is supported | `chatGpt` |
| `apiKey` | API key for ChatGPT (available from your OpenAI account) | `''` |
| `requestUrls` | URLs used for requests; currently only `completion` is supported, other functionality is not yet implemented | `{ completion: 'https://api.openai.com/v1/chat/completions' }` |
| `engineOptions` | Request parameters for ChatGPT ([API reference](https://platform.openai.com/docs/api-reference/completions/create)) | `{ model: 'gpt-3.5-turbo', max_tokens: 2500, temperature: 0.3, top_p: 1, presence_penalty: 0.5, frequency_penalty: 0.5, stream: true, n: 1 }` |
| `shortcut` | Keyboard shortcuts for controlling chat display | `{ toggle: 'Alt+Q', hide: 'Esc' }` |
| `faces` | Face icons displayed at random in the top left corner, using class names from the Trilium icon library | `['bx-smile', 'bx-wink-smile', 'bx-face', 'bx-happy-alt', 'bx-cool', 'bx-laugh', 'bx-upside-down']` |
| `colors` | Colors of the randomly displayed face icons | `['var(--muted-text-color)']` |
| `autoSave` | Whether to automatically save the conversation history; if set to `false`, a save command is shown in the command list | `true` |
| `systemPrompt` | Background prompt used as the system message, e.g. "You are a helpful assistant for Trilium note-taking." | `''` |
| `checkUpdates` | Whether to automatically check for updates; if enabled, a dot is shown on the face icon when an update is available | `true` |
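
For reference, a `#CHAT_OPTIONS` note assembled from the defaults in the table above might look like the following sketch; the `apiKey` value is a placeholder you must replace with your own key, and the optional `faces`/`colors` keys are omitted for brevity:

```json
{
	"viewWidth": 400,
	"engine": "chatGpt",
	"apiKey": "sk-your-api-key",
	"requestUrls": {
		"completion": "https://api.openai.com/v1/chat/completions"
	},
	"engineOptions": {
		"model": "gpt-3.5-turbo",
		"max_tokens": 2500,
		"temperature": 0.3,
		"top_p": 1,
		"presence_penalty": 0.5,
		"frequency_penalty": 0.5,
		"stream": true,
		"n": 1
	},
	"shortcut": {
		"toggle": "Alt+Q",
		"hide": "Esc"
	},
	"autoSave": true,
	"systemPrompt": "",
	"checkUpdates": true
}
```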

### Use with Ollama

If you want to use your own locally hosted Ollama, set `requestUrls.completion` to `https://<your-endpoint>/api/chat`, set `stream` to `false`, and set `model` to whichever model you want to use from your Ollama instance. Keep the rest of the settings as they would be for ChatGPT. An example of my settings can be seen below:

```json
{
	"viewWidth": 364,
	"engine": "ChatGpt",
	"apiKey": "asdfasdfasdfasdfasdf",
	"requestUrls": {
		"completion": "https://ollama.internal.network/api/chat"
	},
	"engineOptions": {
		"model": "llama3",
		"max_tokens": 2500,
		"temperature": 0.3,
		"top_p": 1,
		"presence_penalty": 0.5,
		"frequency_penalty": 0.5,
		"stream": false,
		"n": 1
	},
	"shortcut": {
		"toggle": "Alt+Q",
		"hide": "Esc"
	},
	"autoSave": true,
	"systemPrompt": "",
	"checkUpdates": true
}
```

#### Allowing the Authorization header for use with Ollama

You may also need to modify the Ollama Nginx configuration as follows so that it accepts the `Authorization` header (you can check whether this is needed by inspecting the traffic from Trilium to your Ollama instance):

```nginx
server {
    listen 80;
    server_name example.com;  # Replace with your domain or IP
    location / {
        proxy_pass http://localhost:11434;
        proxy_set_header Host localhost:11434;
        add_header Access-Control-Allow-Headers "Authorization";
    }
}
```

#### Allowing CORS for use with Ollama

If you receive a CORS error such as the following, you may need to update Ollama's `OLLAMA_ORIGINS` environment variable, as outlined in this ollama/ollama issue. You can set it to something like `OLLAMA_ORIGINS=*`.

```
Request has been blocked by CORS policy: Response to preflight request doesn't pass access control check: No 'Access-Control-Allow-Origin' header is present on the requested resource. If an opaque response serves your needs, set the request's mode to 'no-cors' to fetch the resource with CORS disabled.
```
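
For example, if you start Ollama manually from a shell (rather than via systemd or Docker), one way to apply this is to set the variable before launching the server; adjust the origin list to your own setup:

```sh
# Allow requests from any origin; narrow this down if you prefer
export OLLAMA_ORIGINS="*"
ollama serve
```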

### Prompt

The prompt supports customizable options, making it highly flexible to use.

#### Translation options

`{{language:English|Chinese|French}}` will be rendered as a dropdown component, allowing you to select the desired option directly when using the prompt.
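
As a hypothetical example (not one of the built-in prompts), a translation prompt combining these placeholders might look like this:

```
Translate the following note into {{language:English|Chinese|French}}, keeping the original structure:

{{activeNote}}
```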


`{{activeNote}}` supports two note types: text and code. For text notes, the content is rendered in the chat window as it appears in the note, with one exception: an included note is rendered as a link.

Note that because Trilium stores text notes as HTML, messages sent through `{{activeNote}}` are also sent as HTML. This gives the AI more information, and the prompt can specify how the response should be formatted.

When using `{{activeNote}}`, the AI may return HTML, but it is still rendered as plain text, because it is difficult to determine whether the returned HTML is formatting or actual code. Markdown is easier to handle, and support for markdown rendering may be added in the future.

Prompts are stored under the `#CHAT_PROMPTS` note. You can manually modify their order or content there.

### Command

#### Thread


The favor command will prioritize the entry in the chat history and display a flag indicating its importance.

The set command replaces the current note with the chat content.

The append command appends the chat content to the end of the active note.

#### Message


The new insert command inserts the current message at the cursor position in the active note.

The set, insert, and append commands support both text and code note types.

### History


The chat history is stored under the note labeled with the `CHAT_HISTORY_HOME` attribute. If no such note exists, it is stored under the default "trilium-chat" note.

When opening the history or prompts, the search bar will be focused by default. You can navigate through the options using the Tab and Shift+Tab keys, and select an option by pressing Enter.

## Contribute

```sh
yarn
# create .env.dev
yarn dev

# create .env.triliumTest
yarn build:test "{{temp file path}}" # use open note externally/custom to create a temporary file
```

This project is developed using vanilla JavaScript, making it simple and easy to understand.

Any contributions would be appreciated. If you have any questions or comments, please feel free to reach out to me.

## Shoutout

### Trilium

Grateful to zadam for developing Trilium and making it open source.


If you've read this far, you clearly have some interest in this project. If you find it helpful, please consider giving it a star; your support will motivate me to improve it further.

This README was translated by ChatGPT, so thanks to ChatGPT as well.