## Local LLM Support
Xpaper can connect to local LLM servers like [Ollama](https://ollama.com/) or [LM Studio](https://lmstudio.ai/). To use a local LLM, set the provider to **Custom API Base URL** in the options.
### Ollama Setup
Launch Ollama with the `OLLAMA_ORIGINS` environment variable set so the extension can communicate with it:

```bash
OLLAMA_ORIGINS="chrome-extension://*" ollama serve
```

- **Base URL**: `http://localhost:11434/v1/chat/completions` (or use a `.local` address for cross-machine access)
- **API Key**: (leave empty)
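Before pointing Xpaper at the server, it can help to sanity-check the endpoint from a terminal. A minimal request, assuming a pulled model named `llama3` (substitute whatever `ollama list` shows):

```bash
# Hit Ollama's OpenAI-compatible endpoint directly.
# "llama3" is a placeholder -- use any model you have pulled.
curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "llama3", "messages": [{"role": "user", "content": "ping"}]}'
```

Note that `OLLAMA_ORIGINS` only affects browser-originated (CORS) requests, so curl will succeed even without it; it is still required for the extension itself.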
### LM Studio Setup

1. Open LM Studio and navigate to the **Local Server** (↔) tab.
2. Enable **CORS** and, if accessing from another machine, set the Network Address to **Local Network (0.0.0.0)**.
3. Start the server.

- **Base URL**: `http://<your-ip>:1234/v1/chat/completions`
- **API Key**: (leave empty)
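The same kind of check works for LM Studio. `<your-ip>` is the placeholder from above; use `localhost` when testing on the machine running the server:

```bash
# Confirm the server is reachable and see which models it exposes.
curl http://<your-ip>:1234/v1/models
```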
### Cross-Machine Access

If the LLM runs on a different machine (e.g., a Windows PC with a GPU), use an mDNS hostname (e.g., `http://peny.local:1234/...`). Xpaper allows requests to `.local` hostnames and private (RFC 1918) IP addresses by default.
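If the extension cannot reach a remote server, rule out name resolution and firewall issues first; `peny.local` below stands in for your machine's mDNS hostname:

```bash
# Does the mDNS name resolve? (macOS/Linux; on Windows use `ping -n 1`)
ping -c 1 peny.local
# Is the server listening on the expected port?
curl http://peny.local:1234/v1/models
```

If the name resolves but the request times out, check the serving machine's firewall rules for the server's port (1234 for LM Studio, 11434 for Ollama).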