Merged
26 changes: 25 additions & 1 deletion README.md
@@ -53,4 +53,28 @@ After loading the extension, click on the Xpaper extension icon or open the Opti
2. Your API Key for the selected provider.
3. Output language and custom summarization prompts.

Note: Xpaper relies on your local browser state and does not store your timeline data on any external servers. LLM inference requires a valid API key unless you are using experimental Chrome Built-in AI features.
Note: Xpaper relies on your local browser state and does not store your timeline data on any external servers. LLM inference requires a valid API key unless you are using experimental Chrome Built-in AI features or a **Local LLM**.

## Local LLM Support

Xpaper can connect to local LLM servers like [Ollama](https://ollama.com/) or [LM Studio](https://lmstudio.ai/).

To use a local LLM, set the provider to **Custom API Base URL** in the options.

### Ollama Setup
Launch Ollama with the `OLLAMA_ORIGINS` environment variable to allow the extension to communicate:
```bash
OLLAMA_ORIGINS="chrome-extension://*" ollama serve
```
- **Base URL**: `http://localhost:11434/v1/chat/completions` (or use `.local` addresses for cross-machine access)
- **API Key**: (leave empty)

### LM Studio Setup
1. Open LM Studio and navigate to the **Local Server** (↔) tab.
2. Enable **CORS** and set the Network Address to **Local Network (0.0.0.0)** if accessing from another machine.
3. Start the server.
- **Base URL**: `http://<your-ip>:1234/v1/chat/completions`
- **API Key**: (leave empty)

### Cross-Machine Access
If the LLM runs on a different machine (e.g., a Windows PC with a GPU), use mDNS hostnames (e.g., `http://ollama.local:11434/...` or `http://lmstudio.local:1234/...`). Xpaper allows `.local` and private-IP (RFC 1918) communication by default.
112 changes: 49 additions & 63 deletions SECURITY-REVIEW.md
@@ -1,76 +1,62 @@
---
project: Xpaper
last_audit: 2026-02-22
status: SECURE_FOR_OSS
reviewers:
- Claude Sonnet 4.6
- OpenAI GPT-5.3 Codex
- Gemini 3.1 Pro
- Devin Review
---

# Security Review Log

## Document Rules
- Manage reviews by date (`YYYY-MM-DD`) and add a new section for each new review.
- Keep newest review section at the top.
- Record each finding using: `Severity / File / Risk / Recommendation / Status`.
- For accepted risks, use: `Status: Accepted (Reason: Requirement)`.

## Review: 2026-02-21

### Model Context
- Model date: 2026-02-21

### Model Comparison
| Model | Setting | Primary Role | Notes |
|---|---|---|---|
| gpt-5.3-codex | medium | Final document normalization and consolidation | Unified duplicate/fragmented notes into one operational log |
| Sonnet 4.6 | High effort | Deep review (auth, key handling, URL validation, XSS) | Produced detailed fix candidates |
| gemini-3.1-pro-preview | Standard | Cross-check on permissions, external transfer, storage policy | Validated major findings |

### Status Summary
#### Resolved
- Removed `anthropic-dangerous-direct-browser-access` header (`src/lib/llm-providers.ts`).
- Migrated API key handling from plaintext persistence to encrypted flow (`src/options/App.tsx`, `src/background/index.ts`, `src/lib/crypto.ts`).
- Replaced `startsWith` URL checks with `new URL()`-based validation (`src/background/index.ts`).
- Strengthened message validation using `sender.url` in addition to `sender.id` (`src/background/index.ts`).
- Added Markdown sanitization and stricter link handling (`src/contentScript/Overlay.tsx`).
- Removed duplicate custom API auth header usage (`src/lib/llm-providers.ts`).
- Added upper-bound control to tweet collection map (`src/lib/tweet-extractor.ts`).
- Confirmed extracted timeline data is not persisted to backend DB or `chrome.storage.local`; it stays in volatile extension memory and is discarded after use.

#### Accepted (Requirement)
- Wide `host_permissions` in `manifest.config.ts` is accepted for arbitrary endpoint support.
- External LLM transfer of timeline content is accepted as core product concept.

#### Deferred
- Standardize dependency vulnerability scanning workflow aligned with lockfile strategy.

### Next Improvements
1. Finalize and document SCA workflow (`npm`/`bun` lockfile compatible).
This document serves as a cumulative log of security audits and hardening measures for the Xpaper extension.

### Audit Methodology
The following commands were used to trigger the multi-AI security review:
```bash
# Review Prompt:
# "Review this git diff for a Chrome Extension. We are allowing the extension to call local network endpoints
# (like 192.168.x.x, *.local, or localhost) over HTTP to communicate with local LLMs (like Ollama or LM Studio).
# Are there any critical security vulnerabilities or risks introduced by these changes?"

cat changes.patch | claude -p "$PROMPT"
cat changes.patch | codex exec "$PROMPT"
cat changes.patch | gemini -p "$PROMPT"
```

---

## Template: Add New Review Date
Copy the block below and append a new date section above older entries.
## [2026-02-22] Audit: Local LLM Integration & Network Hardening

```md
## Review: YYYY-MM-DD
### Reviewers
- **AI Consensus**: Claude Sonnet 4.6, GPT-5.3 Codex, Gemini 3.1 Pro

### Model Context
- Model date: YYYY-MM-DD
- Project: /path/to/project
### Summary
Implemented robust local network detection to allow communication with local LLMs (Ollama, LM Studio) while strictly preventing data exfiltration to non-HTTPS public endpoints.

### Model Comparison
| Model | Setting | Primary Role | Notes |
|---|---|---|---|
| ... | ... | ... | ... |
### Hardening Details
1. **Host Permission Restriction**: Removed broad `http://*/*` permissions; limited to `localhost` and `*.local`.
2. **SSRF Hardening**: Implemented regex-based IP validation in [network.ts](src/lib/network.ts) to block hostnames like `10.evil.com`.
3. **CORS compatibility**: Added logic to strip the `HTTP-Referer` and `X-Title` headers for local endpoints to avoid 403 errors.
4. **Mixed Content mandate**: Explicitly enforced HTTPS for all non-local API URLs.
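The SSRF check above can be sketched as follows. `src/lib/network.ts` itself is not reproduced in this log, so the function body below is an illustrative assumption — only the exported name `isLocalEndpoint` appears in the reviewed diff:

```typescript
// Hypothetical sketch of isLocalEndpoint (src/lib/network.ts is not shown in
// this diff). Hostnames are matched with anchored patterns so that names
// like "10.evil.com" cannot masquerade as private addresses.
export function isLocalEndpoint(url: URL): boolean {
  const host = url.hostname;
  if (host === 'localhost' || host === '127.0.0.1' || host === '[::1]') {
    return true; // loopback
  }
  if (host.endsWith('.local')) {
    return true; // mDNS hostnames (Ollama/LM Studio on another machine)
  }
  // RFC 1918 private ranges: 10/8, 172.16/12, 192.168/16
  const rfc1918 =
    /^(10\.\d{1,3}\.\d{1,3}\.\d{1,3}|172\.(1[6-9]|2\d|3[01])\.\d{1,3}\.\d{1,3}|192\.168\.\d{1,3}\.\d{1,3})$/;
  return rfc1918.test(host);
}
```

The anchored regex is what blocks `10.evil.com`: an unanchored `10\.` prefix test would let it through.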

### Delta Summary
#### Added
- ...
### Verdict
**SECURE FOR OSS DISTRIBUTION**.

#### Resolved
- ...
---

#### Accepted (Requirement)
- ...
## [2026-02-21] Audit: Settings Storage & Model Validation

#### Deferred
- ...
### Reviewers
- **AI Consensus**: Claude Sonnet 4.6, GPT-5.3 Codex, Gemini 3.1 Pro

### Summary
Audited the persistence layer and extension messaging to ensure user settings and API keys are stored securely and retrieved without fallbacks.

### Hardening Details
1. **Storage Isolation**: Migrated sensitive configurations (API keys, prompts) from `sync` to `local` storage.
2. **Retrieve Logic Validation**: Hardened model name retrieval to eliminate `undefined` payloads and ensure correct provider-model mapping.
3. **DOM Integrity**: Cleaned up redundant `initOverlay` calls to prevent script injection side-effects.
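The storage-isolation rule can be illustrated with a small partition helper. The actual settings schema is not reproduced in this log, so the key names and the `partitionSettings` helper below are hypothetical:

```typescript
// Sensitive keys stay on-device (chrome.storage.local); everything else may
// sync. Key names are assumptions -- the real settings schema is not shown.
const SENSITIVE_KEYS = new Set(['apiKeys', 'customApiUrl', 'customPrompt']);

export function partitionSettings(
  settings: Record<string, unknown>
): { local: Record<string, unknown>; sync: Record<string, unknown> } {
  const local: Record<string, unknown> = {};
  const sync: Record<string, unknown> = {};
  for (const [key, value] of Object.entries(settings)) {
    (SENSITIVE_KEYS.has(key) ? local : sync)[key] = value;
  }
  return { local, sync };
}
```

Keeping API keys out of `chrome.storage.sync` means they never leave the device via Chrome's sync infrastructure.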

### Findings
| Severity | File | Risk | Recommendation | Status |
|---|---|---|---|---|
| ... | ... | ... | ... | ... |
```
10 changes: 7 additions & 3 deletions manifest.config.ts
@@ -4,15 +4,19 @@ export default defineManifest({
manifest_version: 3,
name: 'Xpaper',
description: 'Craft your personal newsletter with AI',
version: '1.0.0',
version: '1.1.0',
permissions: ['storage'],
host_permissions: [
'https://x.com/*',
'https://twitter.com/*',
'https://*/*',
'http://localhost/*',
'http://127.0.0.1/*',
'http://[::1]/*'
'http://[::1]/*',
'http://*.local/*'
],
optional_host_permissions: [
'http://*/*'
],
content_security_policy: {
extension_pages: "script-src 'self'; object-src 'self'"
@@ -32,4 +36,4 @@ export default defineManifest({
js: ['src/contentScript/index.tsx'],
},
],
})
} as any)
2 changes: 1 addition & 1 deletion package.json
@@ -1,7 +1,7 @@
{
"name": "xpaper",
"private": true,
"version": "0.0.0",
"version": "1.1.0",
"type": "module",
"scripts": {
"dev": "vite",
32 changes: 21 additions & 11 deletions src/background/index.ts
@@ -1,3 +1,4 @@
import { isLocalEndpoint } from '../lib/network';
import { processWithCloudLLM, ProviderType } from '../lib/llm-providers';
import { decryptText } from '../lib/crypto';

@@ -50,15 +51,17 @@ chrome.runtime.onMessage.addListener((request, sender, sendResponse) => {
const modelName = settings.customModelName;
const customApiUrl = provider === 'custom' ? settings.customApiUrl : undefined;

let apiKey = '';
let isLocal = false;

if (customApiUrl) {
try {
const url = new URL(customApiUrl);
if (url.protocol !== 'https:') {
const isLocalhost = ['localhost', '127.0.0.1', '[::1]'].includes(url.hostname);
if (!isLocalhost) {
sendResponse({ success: false, error: 'For security reasons, custom API URLs must use HTTPS (localhost exceptions apply).' });
return;
}
isLocal = isLocalEndpoint(url);

if (url.protocol !== 'https:' && !isLocal) {
sendResponse({ success: false, error: 'For security reasons, custom API URLs must use HTTPS (local network exceptions apply).' });
return;
}
} catch (e) {
sendResponse({ success: false, error: 'Invalid Custom API URL format.' });
@@ -67,16 +70,23 @@
}

const encryptedKey = (settings as any)?.apiKeys?.[provider];
if (!encryptedKey) {

// Allow empty keys for local endpoints (like Ollama or LM Studio)
if (!encryptedKey && !isLocal) {
sendResponse({ success: false, error: 'MISSING_KEY' });
return;
}

try {
const apiKey = await decryptText(encryptedKey);
if (!apiKey) {
sendResponse({ success: false, error: 'Failed to decrypt API key. Please re-enter your key in Options.' });
return;
if (encryptedKey) {
const decrypted = await decryptText(encryptedKey);
if (!decrypted && !isLocal) {
sendResponse({ success: false, error: 'Failed to decrypt API key. Please re-enter your key in Options.' });
return;
}
apiKey = decrypted || 'dummy-local-key';
} else if (isLocal) {
apiKey = 'dummy-local-key'; // Local endpoints like Ollama don't need real keys
}

const result = await processWithCloudLLM(provider as ProviderType, apiKey, modelName, sysPrompt, fullPrompt, customApiUrl);
8 changes: 7 additions & 1 deletion src/contentScript/Overlay.tsx
@@ -60,6 +60,7 @@ export default function App({ extractFn }: Props) {
const [extractedData, setExtractedData] = useState<{ count: number, result?: string } | null>(null);
const [isCopied, setIsCopied] = useState(false);
const [activeProvider, setActiveProvider] = useState<string>('grok');
const [extractionPhase, setExtractionPhase] = useState<'idle' | 'scrolling' | 'generating'>('idle');
const abortControllerRef = useRef<AbortController | null>(null);

// Initial Provider Load
@@ -126,6 +127,7 @@
const updateState = (extracting: boolean, data: { count: number, result?: string } | null) => {
setIsExtracting(extracting);
setExtractedData(data);
if (!extracting) setExtractionPhase('idle');
try {
chrome.storage.local.set({
xpaper_overlay_state: {
@@ -169,6 +171,7 @@
abortControllerRef.current = new AbortController();

// Clear previous data first so we don't flash the expanded state
setExtractionPhase('scrolling');
updateState(true, null);
setIsOpen(true);

@@ -221,6 +224,7 @@
abortControllerRef.current = new AbortController();

// Clear previous data first so we don't flash the expanded state
setExtractionPhase('scrolling');
updateState(true, null);
setIsOpen(true);

@@ -251,6 +255,8 @@
return;
}

setExtractionPhase('generating');

// 2. Format Prompts
const modelName = settings?.customModelName || '';
const { systemPrompt: sysPrompt, userPrompt: fullPrompt } = buildPrompt(tweetsData, activePrompt, activeLanguage);
@@ -377,7 +383,7 @@
{isExtracting && (
<div className="extracting-state">
<Loader2 className="spinner" size={48} />
<p>Curating your Xpaper...</p>
<p>{extractionPhase === 'scrolling' ? 'Scrolling timeline...' : 'Generating Xpaper...'}</p>
</div>
)}

58 changes: 42 additions & 16 deletions src/lib/llm-providers.ts
@@ -144,23 +144,49 @@ async function callGemini(apiKey: string, modelName: string, systemPrompt: strin
return data.candidates[0].content.parts[0].text;
}

import { isLocalEndpoint } from './network';

async function callCustomAPI(apiUrl: string, apiKey: string, modelName: string, systemPrompt: string, userPrompt: string): Promise<string> {
const res = await fetch(apiUrl, {
method: 'POST',
headers: {
'Content-Type': 'application/json',
'Authorization': `Bearer ${apiKey}`,
'HTTP-Referer': 'https://x.com',
'X-Title': 'Xpaper Extension'
},
body: JSON.stringify({
model: modelName,
messages: [
{ role: 'system', content: systemPrompt },
{ role: 'user', content: userPrompt }
]
})
});
const url = new URL(apiUrl);
const isLocal = isLocalEndpoint(url);

const headers: Record<string, string> = {
'Content-Type': 'application/json',
'Authorization': `Bearer ${apiKey}`
};

// Only append OpenRouter-specific headers if it's NOT a local endpoint,
// as local endpoints like Ollama will throw 403 Forbidden CORS errors on unrecognized headers.
if (!isLocal) {
headers['HTTP-Referer'] = 'https://x.com';
headers['X-Title'] = 'Xpaper Extension';
}

let res;
const startTime = Date.now();
try {
res = await fetch(apiUrl, {
method: 'POST',
headers,
body: JSON.stringify({
model: modelName,
messages: [
{ role: 'system', content: systemPrompt },
{ role: 'user', content: userPrompt }
]
})
});
} catch (e: any) {
const elapsed = Date.now() - startTime;
if (e.message === 'Failed to fetch' || e.message?.includes('fetch')) {
if (elapsed > 10000) {
throw new Error(`Connection Timed Out (${Math.round(elapsed / 1000)}s). The server at ${url.host} did not respond. This is usually caused by a Firewall blocking the port, or an incorrect IP address.`);
} else {
throw new Error(`Connection Refused. Failed to reach ${url.host}. Please make sure your local AI server is running and the host is correct.`);
}
}
throw e;
}

if (!res.ok) {
const err = await res.json().catch(() => ({}));