Issue
With 0.91.3, I am now getting this when interacting with Copilot:
> hi
Processing...
litellm.ServiceUnavailableError: litellm.MidStreamFallbackError:
litellm.APIConnectionError: APIConnectionError: OpenAIException - 'ChatCompletionChunk'
object is not subscriptable Original exception: APIConnectionError:
litellm.APIConnectionError: APIConnectionError: OpenAIException - 'ChatCompletionChunk'
object is not subscriptable
0.91.2 works.
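For context, the `'ChatCompletionChunk' object is not subscriptable` message is the TypeError Python raises when dict-style indexing (`chunk["choices"]`) is used on a typed object that only supports attribute access (`chunk.choices`). The sketch below reproduces that failure mode with a hypothetical `FakeChunk` stand-in; it is a guess at the error class, not a claim about where the bug lives in 0.91.3.

```python
from dataclasses import dataclass, field

@dataclass
class FakeChunk:
    # Hypothetical stand-in for the OpenAI SDK's ChatCompletionChunk,
    # which is a typed model, not a mapping.
    choices: list = field(default_factory=list)

chunk = FakeChunk(choices=["delta"])

try:
    chunk["choices"]          # dict-style access fails on a typed object
except TypeError as e:
    print(e)                  # 'FakeChunk' object is not subscriptable

print(chunk.choices[0])       # attribute access works
```

If a streaming handler changed between 0.91.2 and 0.91.3 to index chunks like dicts, it would produce exactly this error on every response.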
Version and model info
cecli v0.91.3
Main model: github_copilot/gpt-4 with diff edit format
Weak model: github_copilot/gpt-4o-mini
Git repo: none
Repo-map: disabled
MCP servers configured: beads