ScribePal is an Open Source intelligent browser extension that leverages AI to empower your web experience by providing contextual insights, efficient content summarization, and seamless interaction while you browse.
ScribePal works with local Ollama models, ensuring that all AI processing and messaging is conducted within your local network. Your private data remains on your system and is never transmitted to external servers. This design provides you with full control over your information and guarantees that nobody outside your network has access to your data.
It is compatible with all Chromium- and Gecko-based browsers: Chrome, Vivaldi, Opera, Edge, Firefox, Brave, etc.
- AI-powered assistance: Communicates with an AI service (using Ollama) to generate responses.
- It is PRIVATE: Because it communicates with a local (within your LAN) Ollama service and LLMs, all your information stays private.
- Theming: Supports light and dark themes.
- Chat Interface: A draggable chat box for sending and receiving messages.
- Model Management: Select, refresh, download, and delete models.
- Advanced Capture Tools: Options for capturing both text and images are available. Captured content is inserted directly into your chat using special tags (`@captured-text` for text and `@captured-image` for images).
- Prompt Customization: Adjust and customize prompts to instruct the AI model on how to generate responses.
Before installing ScribePal, ensure that you have Node Version Manager (nvm) installed. You can install nvm by following the instructions at nvm-sh/nvm. nvm helps you easily switch to the Node.js version specified in `.nvmrc`.
Also, ensure that the Ollama host is installed on your local machine or available on your LAN:
- Install Ollama on your host.
- Edit the systemd service file by running:

  ```shell
  sudo nano /etc/systemd/system/ollama.service
  ```

- Add the following environment variables in the `[Service]` section:

  ```
  Environment="OLLAMA_HOST=0.0.0.0"
  Environment="OLLAMA_ORIGINS=chrome-extension://*,moz-extension://*"
  ```

  > **Note:** The `OLLAMA_HOST=0.0.0.0` setting is optional if the Ollama server is running on localhost and does not need to be reachable from your LAN.

- Save the file, then reload and restart the service:

  ```shell
  sudo systemctl daemon-reload
  sudo systemctl restart ollama.service
  ```
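Alternatively, rather than editing the unit file directly, the same variables can live in a systemd drop-in override, which survives package upgrades. A minimal sketch (this is the conventional file that `sudo systemctl edit ollama.service` creates for you):

```ini
# /etc/systemd/system/ollama.service.d/override.conf
[Service]
# Listen on all interfaces (optional if only localhost access is needed)
Environment="OLLAMA_HOST=0.0.0.0"
# Accept requests originating from browser extensions
Environment="OLLAMA_ORIGINS=chrome-extension://*,moz-extension://*"
```

After saving the override, reload and restart the service as described above.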
- Install Ollama on your host.
- On the machine running Ollama, set the environment variables. You can do this via the System Properties dialog or using PowerShell:

  ```
  OLLAMA_HOST=0.0.0.0
  OLLAMA_ORIGINS=chrome-extension://*,moz-extension://*
  ```

  > **Note:** The `OLLAMA_HOST=0.0.0.0` setting is optional if the Ollama server is running on localhost and does not need to be reachable from your LAN.

- Restart the Ollama app.
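If you prefer the command line over the System Properties dialog, the variables can be set persistently from PowerShell; a sketch using the standard `setx` command (which writes user-level variables — restart the Ollama app afterwards so it picks them up):

```powershell
# Persist the variables for the current user
setx OLLAMA_HOST "0.0.0.0"
setx OLLAMA_ORIGINS "chrome-extension://*,moz-extension://*"
```

Note that `setx` affects newly started processes only, not the current session.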
- Clone the repository:

  ```shell
  git clone https://github.com/code-forge-temple/scribe-pal.git
  cd scribe-pal
  ```

- Set the Node.js version:
  - For Unix-based systems:

    ```shell
    nvm use
    ```

  - For Windows:

    ```shell
    nvm use $(cat .nvmrc)
    ```

- Install dependencies:

  ```shell
  npm install
  ```
If you're not a developer, you can choose one of the following methods:
> **Note:** Releases available in the browser stores might be slightly out of sync with the GitHub releases. This can be due to the review process, packaging delays, or manual submission requirements. For the most up-to-date version, please refer to the Releases page.
Visit the Releases page to download the latest packages:
- For Chromium-based browsers, download `chrome.zip`.
- For Gecko-based browsers, download `firefox.zip`.
After downloading, unzip the package and install the extension manually.
To build the project for development, run:
A. For Chromium-based browsers like Chrome, Vivaldi, Edge, Brave, Opera and others:
npm run dev:chrome
B. For Gecko-based browsers like Firefox, Waterfox, Pale Moon, and others:
npm run dev:firefox
To build the project for production, run:
- For Chromium-based browsers:

  ```shell
  npm run build:chrome
  ```

- For Gecko-based browsers:

  ```shell
  npm run build:firefox
  ```
To lint the project, run:
npm run lint
To install the compiled extension:

- For Chromium-based browsers, navigate to the extensions page:
  - `chrome://extensions/` (in the Chrome browser)
  - `vivaldi://extensions/` (in the Vivaldi browser)
  - `opera://extensions/` (in the Opera browser)
  - etc.

  Activate `Developer Mode`, click `Load unpacked`, then select the `<scribe-pal folder>/dist/chrome` folder.

- For Gecko-based browsers, navigate to `about:debugging#/runtime/this-firefox`, click `Load Temporary Add-on…`, then select the `<scribe-pal folder>/dist/firefox` folder.
1. Open the Extension Popup:
   - Once installed, click the extension icon in your browser’s toolbar.
   - The popup allows you to set your configuration options.
2. Configure Settings:
   - Ollama Server URL: Enter the URL for your Ollama API server in the provided text field and click “Save”.
   - Theme Selection: Use the toggle switch to activate the dark theme as desired.
3. Launch the Chat Interface:
   - Click “Show ScribePal chat” in the popup or press Ctrl+Shift+Y.
   - A responsive, draggable chat box will open on the active webpage.
   - Use the chat interface to send messages to the Ollama AI service, review conversation history, and manage models.
   - Additional features include capturing selected HTML content (which can be referenced in the discussion with the `@captured-text` tag), capturing an image of an area on the page (which can be referenced with the `@captured-image` tag, for vision LLMs), and customizing prompts (to instruct the loaded model on how to answer).
4. Interacting with the Chat:
   - Type your query in the chat input and press Enter or click the `Send` button.
   - The AI response is rendered below the input as markdown.
   - You can manage (delete or refresh) available Ollama models using the controls in the model select dropdown.
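For example, after capturing a block of text on the page, a chat message might reference it like this (a hypothetical prompt, shown only to illustrate the tag syntax):

```
Summarize @captured-text in three bullet points.
```

The extension substitutes the captured content for the tag before the message is sent to the model; `@captured-image` works the same way for vision-capable models.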
Some short video tutorials on how to use the plugin:
- Release 1.0.x:
- Release 1.2.x:
This project is licensed under the GNU General Public License v3.0. See the LICENSE file for more details.