Fix: Incorrect chat title generation #4775
Conversation
@sh4shii would it be possible to implement the same fix using chat instead of complete? Something in the latter lines you updated is likely the culprit?
Specifically wondering what the [object Object] actually is in your case. Is it an error because the model doesn't support chat? Is it not parsed correctly?
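For context on what the [object Object] text itself is: when JavaScript coerces an object, or an array of objects, into a string, the default toString() produces the literal text [object Object], so appending two such chunks back to back yields exactly the broken title. A standalone sketch (not code from this repo) that reproduces the symptom:

```typescript
// Standalone reproduction of the coercion; not code from the Continue repo.
type MessagePart = { type: "text"; text: string };

interface ChatChunk {
  role: string;
  content: MessagePart[]; // an array of parts, not a plain string
}

const chunks: ChatChunk[] = [
  { role: "assistant", content: [{ type: "text", text: "New " }] },
  { role: "assistant", content: [{ type: "text", text: "Session" }] },
];

let title = "";
for (const chunk of chunks) {
  // Because content is an array of objects, the default toString() kicks in
  // and each iteration appends "[object Object]" instead of the text.
  title += chunk.content;
}

console.log(title); // "[object Object][object Object]"
```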
@RomneyDa I tried implementing the fix using model.chat, but it still returns [object Object][object Object] instead of the expected string. It seems like the response from model.chat is not being parsed correctly. This suggests that either the model's chat response structure differs, or additional parsing is needed when using model.chat. Do you have any insights on why this might be happening? I can explore further based on that.
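One hedged way to do that additional parsing, assuming the chat response content can be either a plain string or an array of text/image parts (the MessagePart shape and field names below are assumptions rather than the repo's actual types), is to flatten the content to text before it is used for the title:

```typescript
// Sketch only: MessagePart and its fields are assumed shapes, not repo types.
type MessagePart =
  | { type: "text"; text: string }
  | { type: "imageUrl"; imageUrl: { url: string } };

type MessageContent = string | MessagePart[];

// Flatten whatever the model returned into a plain string.
function contentToString(content: MessageContent): string {
  if (typeof content === "string") {
    return content;
  }
  return content
    .map((part) => (part.type === "text" ? part.text : ""))
    .join("");
}

// Usage idea: title += contentToString(message.content)
// instead of appending message.content directly.
```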
I don't have an immediate answer to why, but it might be worth stepping through with breakpoints in the Gemini model class to see where the content gets turned into a string.
@RomneyDa I've investigated with breakpoints in the Gemini model class as suggested.
Would you be able to point me to which specific files/methods I should add logging or breakpoints to in order to track this transformation?
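While narrowing that down, one low-tech option is to log the runtime shape of the content at every point it is touched; a tiny helper like the one below (the call sites in the comments are hypothetical) makes it obvious where a string turns into an object or an array of parts:

```typescript
// Hypothetical debug helper; the example call sites are illustrative only.
function describeContent(label: string, content: unknown): void {
  const shape = Array.isArray(content) ? "array" : typeof content;
  console.log(`${label} (${shape}):`, JSON.stringify(content));
}

// e.g. describeContent("gemini chunk content", chunk.content);
// e.g. describeContent("accumulated title", title);
```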
@sh4shii looks like it could be one of two lines. Probably BaseLLM's chat function. The same issue actually shows up in another place as well.
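To make that concrete, the pattern to look for is an accumulation loop over streamed chat chunks that appends the chunk's content directly onto a string. A rough sketch of what a fixed loop could look like, with illustrative names only (this is not the actual BaseLLM code):

```typescript
// Illustrative only; not the actual BaseLLM implementation.
type StreamedChunk = {
  role: string;
  content: string | { type: "text"; text: string }[];
};

async function titleFromStream(
  stream: AsyncIterable<StreamedChunk>
): Promise<string> {
  let title = "";
  for await (const chunk of stream) {
    // Appending chunk.content directly is where "[object Object]" sneaks in
    // when the content is an array of parts, so render it to text first.
    title +=
      typeof chunk.content === "string"
        ? chunk.content
        : chunk.content.map((part) => part.text).join("");
  }
  return title.trim();
}
```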
Description
fixes: #4774
The chat title was previously being generated as [object Object][object Object]. This issue has been fixed by ensuring the correct string representation is used for the title.
Checklist
Screenshots
As you can see in the attached screenshot, the chat title is now generated correctly.

Testing instructions
[ For new or modified features, provide step-by-step testing instructions to validate the intended behavior of the change, including any relevant tests to run. ]