
Conversation


@RomneyDa RomneyDa commented Nov 22, 2025

Description


Summary by cubic

Lazily fetch and cache Ollama model info to cut repeated /api/show calls. Cleaned up JetBrains startup and logging to reduce noise.

  • Refactors

    • Added ensureModelInfo with a cached promise; model info fetched on first use, not in constructor.
    • Safer parsing of /api/show parameters; don’t override contextLength, ignore bad stop values, fail silently.
    • Consolidated JetBrains webview startup: single initial config load with polling; removed duplicate init.
    • Simplified IPC/TCP JSON parse error logs to reduce noisy stack traces.
  • Bug Fixes

    • Enable OSR context menu on macOS by removing platform check.
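The cached-promise refactor above can be sketched roughly as follows (a minimal illustration of the pattern, not the PR's exact code; names like fetchModelInfo and ModelInfo are hypothetical):

```typescript
// Minimal sketch of the cached-promise pattern: the fetch runs at most once,
// on first use, and concurrent callers share the same in-flight promise.
interface ModelInfo {
  contextLength?: number;
  capabilities: string[];
}

class LazyModelInfo {
  private modelInfoPromise?: Promise<ModelInfo>;

  constructor(private fetchModelInfo: () => Promise<ModelInfo>) {}

  // Called on first use instead of in the constructor, so merely
  // instantiating the object (e.g. for config serialization) does no I/O.
  ensureModelInfo(): Promise<ModelInfo> {
    if (!this.modelInfoPromise) {
      this.modelInfoPromise = this.fetchModelInfo();
    }
    return this.modelInfoPromise;
  }
}
```

Because the promise itself is cached (rather than a boolean flag), two callers that race on the first request still trigger only a single HTTP fetch.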

Written for commit 685acbc. Summary will update automatically on new commits.

@RomneyDa RomneyDa requested a review from a team as a code owner November 22, 2025 02:49
@RomneyDa RomneyDa requested review from tingwai and removed request for a team November 22, 2025 02:49
@dosubot dosubot bot added the size:M This PR changes 30-99 lines, ignoring generated files. label Nov 22, 2025
github-actions bot commented Nov 22, 2025

✅ Review Complete

Code Review Summary

⚠️ Continue configuration error. Please verify that the assistant exists in Continue Hub.



@cubic-dev-ai cubic-dev-ai bot left a comment


3 issues found across 4 files

Prompt for AI agents (all 3 issues)

Understand the root cause of the following 3 issues and fix them.


<file name="core/llm/llms/Ollama.ts">

<violation number="1" location="core/llm/llms/Ollama.ts:175">
Lazy-loading the model info leaves `supportsFim()` false for the first request, so the initial autocomplete call always skips the FIM path and ignores the suffix context for FIM-capable models.</violation>
</file>

<file name="binary/src/TcpMessenger.ts">

<violation number="1" location="binary/src/TcpMessenger.ts:127">
Dropping the caught error object from the console.error call removes the stack trace and error message, making JSON parsing failures much harder to debug.</violation>
</file>

<file name="binary/src/IpcMessenger.ts">

<violation number="1" location="binary/src/IpcMessenger.ts:91">
Dropping the caught error from the log removes stack/context information, making JSON parse failures hard to diagnose. Please log the error object along with the truncated line.</violation>
</file>

Reply to cubic to teach it or ask questions. Re-run a review with @cubic-dev-ai review this PR

* This is called on first use rather than in the constructor to avoid
* making HTTP requests when models are just being instantiated for config serialization.
*/
private async ensureModelInfo(): Promise<void> {

@cubic-dev-ai cubic-dev-ai bot Nov 22, 2025


Lazy-loading the model info leaves supportsFim() false for the first request, so the initial autocomplete call always skips the FIM path and ignores the suffix context for FIM-capable models.

Prompt for AI agents
Address the following comment on core/llm/llms/Ollama.ts at line 175:

<comment>Lazy-loading the model info leaves `supportsFim()` false for the first request, so the initial autocomplete call always skips the FIM path and ignores the suffix context for FIM-capable models.</comment>

<file context>
@@ -161,47 +161,61 @@ class Ollama extends BaseLLM implements ModelInstaller {
+   * This is called on first use rather than in the constructor to avoid
+   * making HTTP requests when models are just being instantiated for config serialization.
+   */
+  private async ensureModelInfo(): Promise<void> {
+    if (this.model === "AUTODETECT") {
       return;
</file context>
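One way to address this finding (a hypothetical sketch under the assumption that supportsFim can become async, not the PR's actual fix) is to await the lazily cached model info before the FIM capability is consulted, so even the first autocomplete request sees the real capability:

```typescript
// Hypothetical sketch: await the lazily cached model info before deciding
// whether to take the FIM path, so the first request is not misclassified
// as non-FIM just because the info has not been fetched yet.
class OllamaLike {
  private fimSupported = false;
  private infoPromise?: Promise<void>;

  constructor(private fetchSupportsFim: () => Promise<boolean>) {}

  private ensureModelInfo(): Promise<void> {
    if (!this.infoPromise) {
      this.infoPromise = this.fetchSupportsFim().then((fim) => {
        this.fimSupported = fim;
      });
    }
    return this.infoPromise;
  }

  // Await the info before reading the flag, instead of synchronously
  // reading a field that defaults to false on the first call.
  async supportsFim(): Promise<boolean> {
    await this.ensureModelInfo();
    return this.fimSupported;
  }
}
```

With a synchronous supportsFim(), the alternative is to kick off ensureModelInfo earlier in the request path and await it before the FIM branch is chosen.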

          line.substring(0, 100) + "..." + line.substring(line.length - 100);
      }
-     console.error("Error parsing line: ", truncatedLine, e);
+     console.error("Error parsing JSON from line: ", truncatedLine);

@cubic-dev-ai cubic-dev-ai bot Nov 22, 2025


Dropping the caught error object from the console.error call removes the stack trace and error message, making JSON parsing failures much harder to debug.

Prompt for AI agents
Address the following comment on binary/src/TcpMessenger.ts at line 127:

<comment>Dropping the caught error object from the console.error call removes the stack trace and error message, making JSON parsing failures much harder to debug.</comment>

<file context>
@@ -124,7 +124,7 @@ export class TcpMessenger<
           line.substring(0, 100) + "..." + line.substring(line.length - 100);
       }
-      console.error("Error parsing line: ", truncatedLine, e);
+      console.error("Error parsing JSON from line: ", truncatedLine);
       return;
     }
</file context>
Suggested change
console.error("Error parsing JSON from line: ", truncatedLine);
console.error("Error parsing JSON from line: ", truncatedLine, e);
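The surrounding catch block, reconstructed from the diff context as a self-contained sketch (the parseLine wrapper and the 200-character threshold are illustrative), shows the pattern the suggestion preserves: truncate very long lines for readability, but keep the caught error so the stack trace survives:

```typescript
// Sketch of the parse-and-log pattern: truncate very long lines for the log,
// but pass the caught error object through so its message and stack survive.
function parseLine(line: string): unknown | undefined {
  try {
    return JSON.parse(line);
  } catch (e) {
    let truncatedLine = line;
    if (line.length > 200) {
      // Keep the first and last 100 characters of an overlong line.
      truncatedLine =
        line.substring(0, 100) + "..." + line.substring(line.length - 100);
    }
    console.error("Error parsing JSON from line: ", truncatedLine, e);
    return undefined;
  }
}
```

The same fix applies verbatim to the IpcMessenger call site flagged below, since both messengers share this parse-and-log structure.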

          line.substring(0, 100) + "..." + line.substring(line.length - 100);
      }
-     console.error("Error parsing line: ", truncatedLine, e);
+     console.error("Error parsing JSON from line: ", truncatedLine);

@cubic-dev-ai cubic-dev-ai bot Nov 22, 2025


Dropping the caught error from the log removes stack/context information, making JSON parse failures hard to diagnose. Please log the error object along with the truncated line.

Prompt for AI agents
Address the following comment on binary/src/IpcMessenger.ts at line 91:

<comment>Dropping the caught error from the log removes stack/context information, making JSON parse failures hard to diagnose. Please log the error object along with the truncated line.</comment>

<file context>
@@ -88,7 +88,7 @@ class IPCMessengerBase<
           line.substring(0, 100) + "..." + line.substring(line.length - 100);
       }
-      console.error("Error parsing line: ", truncatedLine, e);
+      console.error("Error parsing JSON from line: ", truncatedLine);
       return;
     }
</file context>
Suggested change
console.error("Error parsing JSON from line: ", truncatedLine);
console.error("Error parsing JSON from line: ", truncatedLine, e);

@dosubot dosubot bot added size:L This PR changes 100-499 lines, ignoring generated files. and removed size:M This PR changes 30-99 lines, ignoring generated files. labels Nov 22, 2025
@RomneyDa RomneyDa marked this pull request as draft November 24, 2025 18:38

Labels

size:L This PR changes 100-499 lines, ignoring generated files.

Projects

Status: Todo


2 participants