---
layout: post
title: Use Surfingkeys as an AI agent
category: en
---

{{ page.title }}
================
Several LLM providers are now integrated into Surfingkeys. Press `A` to bring up a chat popup and chat with your AI provider. The currently supported LLM providers are

* Ollama
* Bedrock
* DeepSeek
* Gemini

To use the feature, you need to set up your credentials/API keys first, for example

    settings.defaultLLMProvider = "bedrock";
    settings.llm = {
        bedrock: {
            accessKeyId: '********************',
            secretAccessKey: '****************************************',
            // model: 'anthropic.claude-3-5-sonnet-20241022-v2:0',
            model: 'us.anthropic.claude-3-7-sonnet-20250219-v1:0',
        },
        gemini: {
            apiKey: '***************************************',
        },
        ollama: {
            model: 'qwen2.5-coder:32b',
        },
        deepseek: {
            apiKey: '***********************************',
            model: 'deepseek-chat',
        }
    };

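You don't have to configure every provider. As a minimal sketch, an Ollama-only setup needs just the default provider and a model name (the model below is only an example; use any model you have pulled locally):

    // Minimal sketch: a single local provider; the ollama entry takes no API key.
    // 'llama3.2' is just an example model name; replace it with one you have pulled.
    settings.defaultLLMProvider = "ollama";
    settings.llm = {
        ollama: {
            model: 'llama3.2',
        },
    };
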
You can also use `A` in visual mode. Press `v` or `V` to enter visual mode, then `v` again to select the text you'd like to chat with the AI about, then `A` to bring up the LLM chat box. Now you can start chatting with the AI about the selected text.

Another way to select content to chat with the AI about is Regional Hints mode. Press `L` to pick an element, then `l` to bring up the LLM chat box.

### To use LLM chat with a specified system prompt

For example, you can designate your AI to be a translator with the snippet below

    api.mapkey('A', '#8Open llm chat', function() {
        api.Front.openOmnibar({type: "LLMChat", extra: {
            system: "You're a translator. Whenever you get a message in Chinese, just translate it into English, and if you get a message in English, translate it into Chinese. You don't need to answer any questions, just TRANSLATE."
        }});
    });

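The same pattern works for any other persona. As a sketch, you could add a second binding with its own system prompt; the key `;s` and the prompt text here are arbitrary examples, not anything built into Surfingkeys:

    // Sketch: an extra binding with a different system prompt.
    // The key ';s' and the prompt text are arbitrary examples.
    api.mapkey(';s', '#8Open llm chat to summarize', function() {
        api.Front.openOmnibar({type: "LLMChat", extra: {
            system: "You're a summarizer. For every message you receive, reply with a summary of at most three bullet points."
        }});
    });
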
### 403 Forbidden with Ollama

To use Ollama from a Chrome extension, you need to run ollama with `OLLAMA_ORIGINS` modified

Under Windows

    OLLAMA_ORIGINS=chrome-extension://* ollama serve

Under Mac

    launchctl setenv OLLAMA_ORIGINS chrome-extension://gfbliohnnapiefjpjlpjnehglfpaknnc

Under Mac for both Chrome and Firefox

    launchctl setenv OLLAMA_ORIGINS "chrome-extension://gfbliohnnapiefjpjlpjnehglfpaknnc,moz-extension://*"
