[BUG] Exception when running an OpenAI streaming model with complex tool parameters #601
Comments
@Crokoking thanks a lot for reporting!

Has this been fixed?
Can you help me with the code below? I am getting the same error — what should I change, and how can I replace the tokenizer? ("Replacing the tokenizer with one that uses base Gson fixed the issue for me but that is probably not the 'proper' solution.")
I made a copy of the OpenAiTokenizer class, added a gson field, and replaced the two calls to Json.fromJson() with calls to gson.fromJson(). Then I just set that new class as the tokenizer when creating my ChatModel.
… JSON (#918)

## Context
Fixes #601

## Change
Do not restrict Map key/value types when deserializing from JSON.

## Checklist
Before submitting this PR, please check the following points:
- [X] I have added unit and integration tests for my change
- [X] All unit and integration tests in the module I have added/changed are green
- [X] All unit and integration tests in the [core](https://github.com/langchain4j/langchain4j/tree/main/langchain4j-core) and [main](https://github.com/langchain4j/langchain4j/tree/main/langchain4j) modules are green
- [ ] I have added/updated the [documentation](https://github.com/langchain4j/langchain4j/tree/main/docs/docs)
- [ ] I have added an example in the [examples repo](https://github.com/langchain4j/langchain4j-examples) (only for "big" features)
- [ ] I have added my new module in the [BOM](https://github.com/langchain4j/langchain4j/blob/main/langchain4j-bom/pom.xml) (only when a new module is added)

## Checklist for adding new embedding store integration
- [ ] I have added a {NameOfIntegration}EmbeddingStoreIT that extends from either EmbeddingStoreIT or EmbeddingStoreWithFilteringIT
Describe the bug
When using the OpenAI streaming model with tool parameters more complex than a string,
the token-estimation system throws an exception. This seems to be caused by the default JSON parser being hard-coded to decode Map.class as Map<String, String> instead of following the default Gson behavior.
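A minimal plain-Java sketch of this failure mode (the class and method names are hypothetical illustrations, not langchain4j code): forcing a generically typed map into Map<String, String> compiles, but throws a ClassCastException as soon as a value is itself a nested map — which is exactly what a tool parameter more complex than a string produces.

```java
import java.util.Map;

public class MapTypeDemo {
    // Simulates a parser hard-coded to Map<String, String>: fine for flat
    // string-to-string tool arguments, but it throws as soon as a value is
    // itself a map (i.e. a tool parameter more complex than a string).
    static String readAsString(Map<String, Object> args, String key) {
        @SuppressWarnings("unchecked")
        Map<String, String> forced = (Map<String, String>) (Map<?, ?>) args;
        return forced.get(key); // ClassCastException for non-String values
    }

    public static void main(String[] argv) {
        Map<String, Object> toolArgs =
                Map.of("location", Map.of("city", "Berlin", "unit", "celsius"));
        try {
            readAsString(toolArgs, "location");
        } catch (ClassCastException e) {
            System.out.println("ClassCastException: nested value is not a String");
        }
    }
}
```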
Log and Stack trace
To Reproduce
Expected behavior
There should not be an exception. It works fine with the non-streaming model.
Please complete the following information:
Additional context
Replacing the tokenizer with one that uses base Gson fixed the issue for me but that is probably not the "proper" solution.
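To sketch why the base-Gson approach works (a hypothetical helper for illustration, not actual langchain4j code): Gson's default deserialization of Map.class yields values typed as Object, so code that walks tool arguments can recurse into nested maps instead of assuming every value is a String.

```java
import java.util.Map;

public class GenericMapWalk {
    // Hypothetical helper: token estimation over tool arguments must be able
    // to recurse into nested structures, which is only possible when map
    // values are typed as Object (Gson's default for Map.class), not String.
    static int countLeafValues(Object value) {
        if (value instanceof Map<?, ?> map) {
            int total = 0;
            for (Object v : map.values()) {
                total += countLeafValues(v);
            }
            return total;
        }
        return 1; // leaf value: string, number, boolean, ...
    }

    public static void main(String[] argv) {
        Map<String, Object> toolArgs = Map.of(
                "query", "weather",
                "location", Map.of("city", "Berlin", "unit", "celsius"));
        System.out.println(countLeafValues(toolArgs)); // 3 leaf values
    }
}
```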