The `systemMessage` property is not being applied to built-in commands (`/edit`, `/comment`, etc.) when using models with the provider set to `openai`. This issue assumes that including the `systemMessage` in those cases is desired/expected.

The `systemMessage` is included with general queries in the sidebar.

In my particular case I'm using a VLLM server rather than OpenAI directly, but the issue should be the same regardless.
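For context, my configuration looks roughly like the following (field names follow Continue's `config.json` schema as I understand it; the model name and `apiBase` are placeholders for my VLLM setup):

```json
{
  "models": [
    {
      "title": "Local VLLM",
      "provider": "openai",
      "model": "my-model",
      "apiBase": "http://localhost:8000/v1",
      "systemMessage": "You are a concise coding assistant."
    }
  ]
}
```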
To reproduce
There is nothing in the Continue output window that shows the issue directly. For each "Look at the messages sent" step in the write-up below, you can either:

- Run the extension in a debugger with a breakpoint set at this point in the OpenAI LLM class and examine `body.messages`, or
- Look at the messages received by the LLM server (possible in my case, since I can inspect the VLLM logs).
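Either way, the check amounts to verifying that `body.messages` contains a system-role entry. A minimal sketch of that check (the `ChatMessage` shape is an assumption based on the OpenAI-style request body, not Continue's actual types):

```typescript
interface ChatMessage {
  role: string;
  content: string;
}

// True when the outgoing request body carries the configured systemMessage.
function hasSystemMessage(messages: ChatMessage[]): boolean {
  return messages.some((m) => m.role === "system");
}

// Example: a sidebar query currently passes this check; /edit and /comment do not.
console.log(hasSystemMessage([
  { role: "system", content: "You are a concise coding assistant." },
  { role: "user", content: "Explain this function" },
]));
```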
To reproduce:

1. Set up the configuration to have a model with `provider` set to `openai`.
2. Include a `systemMessage` for that model.
3. Open the Continue output window.
4. Ask a question using the sidebar.
5. Look at the messages sent to see if the `systemMessage` is present (working).
6. Select some code.
7. Invoke the `/comment` command.
8. Look at the messages sent to see if the `systemMessage` is present (not working).
9. Select some code.
10. Invoke the `/edit` command.
11. Look at the messages sent to see if the `systemMessage` is present (not working).
Log output
No response
This changes the behavior to what I'd expect for my use case. However, I don't know whether a similar change should apply in the `_complete` function as well (I haven't tracked down which code pathway(s) use `_complete` instead of `_streamComplete`).
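For reference, the behavior I'd expect can be sketched as follows. `withSystemMessage` is a hypothetical helper, not Continue's actual code; it assumes OpenAI-style chat messages and that an explicit system message in the prompt should win over the configured one:

```typescript
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Prepend the configured systemMessage unless the prompt already
// includes its own system message.
function withSystemMessage(
  messages: ChatMessage[],
  systemMessage?: string
): ChatMessage[] {
  if (!systemMessage || messages.some((m) => m.role === "system")) {
    return messages;
  }
  return [{ role: "system", content: systemMessage }, ...messages];
}
```

Applying something like this in `_streamComplete` (and possibly `_complete`) would make the built-in commands behave the same as sidebar queries.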