
Local LLM Support #2

Merged
laiso merged 3 commits into main from local-llm on Feb 22, 2026

Conversation

laiso (Owner) commented on Feb 22, 2026

## Local LLM Support

Xpaper can connect to local LLM servers like [Ollama](https://ollama.com/) or [LM Studio](https://lmstudio.ai/).

To use a local LLM, set the provider to **Custom API Base URL** in the options.

### Ollama Setup

Launch Ollama with the `OLLAMA_ORIGINS` environment variable to allow the extension to communicate:

```bash
OLLAMA_ORIGINS="chrome-extension://*" ollama serve
```

- **Base URL**: `http://localhost:11434/v1/chat/completions` (or use `.local` addresses for cross-machine access)
- **API Key**: (leave empty)
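To confirm the server is reachable before configuring Xpaper, a quick terminal check against the same endpoint works (the model name below is a placeholder — substitute any model you have pulled):

```bash
# Sanity check against Ollama's OpenAI-compatible chat endpoint.
# "llama3" is a placeholder model name — use one you have pulled locally.
curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "llama3", "messages": [{"role": "user", "content": "Say hello"}]}'
```

Note that curl is not subject to CORS, so this only verifies the server itself; the `OLLAMA_ORIGINS` setting matters for requests made from the extension.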

### LM Studio Setup

1. Open LM Studio and navigate to the **Local Server** (↔) tab.
2. Enable **CORS** and set the Network Address to **Local Network (0.0.0.0)** if accessing from another machine.
3. Start the server.

- **Base URL**: `http://<your-ip>:1234/v1/chat/completions`
- **API Key**: (leave empty)
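As a quick sanity check, LM Studio's OpenAI-compatible server also exposes a model list, which confirms the server is up without sending a full completion request:

```bash
# List the models LM Studio is currently serving.
# <your-ip> is the server machine's LAN address (use localhost on the same machine).
curl http://<your-ip>:1234/v1/models
```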

### Cross-Machine Access

If running the LLM on a different machine (e.g., a Windows PC with a GPU), use mDNS hostnames (e.g., `http://peny.local:1234/...`). Xpaper is configured to allow `.local` and private IP (RFC 1918) communication by default.
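The same model-list check works across the network and confirms that the mDNS name resolves from the client machine (`peny.local` is the example hostname above — substitute your own):

```bash
# Verifies mDNS resolution and that the remote server is reachable.
# peny.local is the example hostname used above — replace with your machine's name.
curl http://peny.local:1234/v1/models
```

If this fails, double-check that the server is bound to 0.0.0.0 and that the host's firewall allows inbound connections on the port.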

laiso changed the title from Local llm to Local LLM Support on Feb 22, 2026
laiso merged commit effc44e into main on Feb 22, 2026