Add llmProvider, llmModel, and llmApiKeyEnv to the plugin config schema.
These allow users to choose which LLM Hindsight uses directly from
openclaw.json config without needing HINDSIGHT_API_LLM_* env vars.
Priority order (highest to lowest):
1. HINDSIGHT_API_LLM_PROVIDER env var (unchanged)
2. Plugin config llmProvider/llmModel (NEW)
3. Auto-detect from provider env vars (unchanged)
Backward compatible: no config = same behavior as before.
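The priority order above can be sketched as a small resolver. This is an illustrative sketch only, not the actual OpenClaw plugin code: the function name, config shape, and the specific auto-detect env vars (`OPENAI_API_KEY`, `ANTHROPIC_API_KEY`) are assumptions.

```typescript
// Hypothetical sketch of the resolution order described in the commit
// message; names and auto-detect keys are illustrative assumptions.
interface PluginConfig {
  llmProvider?: string;
  llmModel?: string;
}

function resolveLlmProvider(
  env: Record<string, string | undefined>,
  config: PluginConfig,
): string | undefined {
  // 1. Explicit env var wins (unchanged behavior).
  if (env["HINDSIGHT_API_LLM_PROVIDER"]) {
    return env["HINDSIGHT_API_LLM_PROVIDER"];
  }
  // 2. NEW: plugin config from openclaw.json.
  if (config.llmProvider) {
    return config.llmProvider;
  }
  // 3. Fall back to auto-detection from provider API-key env vars.
  if (env["OPENAI_API_KEY"]) return "openai";
  if (env["ANTHROPIC_API_KEY"]) return "anthropic";
  // No config at all = same behavior as before (nothing resolved here).
  return undefined;
}
```

With no env var and no config set, the function falls through to auto-detection, which preserves the backward-compatible default.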
hindsight-integrations/openclaw/openclaw.plugin.json (25 additions, 0 deletions)
```diff
@@ -24,6 +24,19 @@
       "type": "string",
       "description": "hindsight-embed version to use (e.g. 'latest', '0.4.2', or empty for latest)",
       "default": "latest"
+    },
+    "llmProvider": {
+      "type": "string",
+      "description": "LLM provider for Hindsight memory (e.g. 'openai', 'anthropic', 'gemini', 'groq', 'ollama'). Takes priority over auto-detection but not over HINDSIGHT_API_LLM_PROVIDER env var.",
```
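For context, a user-facing openclaw.json entry using the new keys might look like the fragment below. This is a hypothetical sketch: the exact nesting under which the plugin config lives, and the model name shown, are assumptions not confirmed by the diff.

```json
{
  "plugins": {
    "hindsight": {
      "llmProvider": "openai",
      "llmModel": "gpt-4o-mini"
    }
  }
}
```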