
[BUG] Custom models do not work with OpenAiStreamingChatModel because Model 'xxx' is unknown to jtokkit #804

Closed
nhannht opened this issue Mar 22, 2024 · 1 comment
Labels
bug Something isn't working

nhannht commented Mar 22, 2024

Custom models do not work with OpenAiStreamingChatModel because `Model 'xxx' is unknown to jtokkit`

Describe the bug

This bug is similar to this old bug.

Log and Stack trace

java.lang.IllegalArgumentException: Model 'undi95/toppy-m-7b:free' is unknown to jtokkit
	at dev.langchain4j.internal.Exceptions.illegalArgument(Exceptions.java:19) ~[langchain4j-core-0.28.0.jar:na]
	at dev.langchain4j.model.openai.OpenAiTokenizer.lambda$unknownModelException$0(OpenAiTokenizer.java:266) ~[langchain4j-open-ai-0.28.0.jar:na]
	at java.base/java.util.Optional.orElseThrow(Optional.java:403) ~[na:na]
	at dev.langchain4j.model.openai.OpenAiTokenizer.estimateTokenCountInText(OpenAiTokenizer.java:56) ~[langchain4j-open-ai-0.28.0.jar:na]
	at dev.langchain4j.model.openai.OpenAiTokenizer.estimateTokenCountIn(OpenAiTokenizer.java:81) ~[langchain4j-open-ai-0.28.0.jar:na]
	at dev.langchain4j.model.openai.OpenAiTokenizer.estimateTokenCountInMessage(OpenAiTokenizer.java:66) ~[langchain4j-open-ai-0.28.0.jar:na]
	at dev.langchain4j.model.openai.OpenAiTokenizer.estimateTokenCountInMessages(OpenAiTokenizer.java:168) ~[langchain4j-open-ai-0.28.0.jar:na]
	at dev.langchain4j.model.openai.OpenAiStreamingChatModel.generate(OpenAiStreamingChatModel.java:137) ~[langchain4j-open-ai-0.28.0.jar:na]
	at dev.langchain4j.model.openai.OpenAiStreamingChatModel.generate(OpenAiStreamingChatModel.java:104) ~[langchain4j-open-ai-0.28.0.jar:na]

Expected behavior
It should work with any custom model, because the non-streaming version (`OpenAiChatModel`) does.
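
A minimal sketch of the setup that triggers this (the base URL, API key handling, and `main` wrapper are placeholders I've filled in for illustration, not my exact configuration):

```java
import dev.langchain4j.data.message.AiMessage;
import dev.langchain4j.model.StreamingResponseHandler;
import dev.langchain4j.model.openai.OpenAiChatModel;
import dev.langchain4j.model.openai.OpenAiStreamingChatModel;
import dev.langchain4j.model.output.Response;

public class StreamingCustomModelRepro {

    public static void main(String[] args) {

        // Non-streaming model: works fine with a custom (non-OpenAI) model name.
        OpenAiChatModel chatModel = OpenAiChatModel.builder()
                .baseUrl("https://openrouter.ai/api/v1")          // placeholder base URL
                .apiKey(System.getenv("OPENROUTER_API_KEY"))      // placeholder key lookup
                .modelName("undi95/toppy-m-7b:free")
                .build();
        System.out.println(chatModel.generate("Hello"));

        // Streaming model: on 0.28.0 this throws
        // IllegalArgumentException: Model 'undi95/toppy-m-7b:free' is unknown to jtokkit
        // while estimating the token count of the input messages.
        OpenAiStreamingChatModel streamingModel = OpenAiStreamingChatModel.builder()
                .baseUrl("https://openrouter.ai/api/v1")
                .apiKey(System.getenv("OPENROUTER_API_KEY"))
                .modelName("undi95/toppy-m-7b:free")
                .build();

        streamingModel.generate("Hello", new StreamingResponseHandler<AiMessage>() {
            @Override
            public void onNext(String token) {
                System.out.print(token);
            }

            @Override
            public void onComplete(Response<AiMessage> response) {
                System.out.println();
            }

            @Override
            public void onError(Throwable error) {
                error.printStackTrace();
            }
        });
    }
}
```

Per the stack trace above, the streaming call fails inside `OpenAiTokenizer.estimateTokenCountInMessages`, before any tokens are streamed back.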

Please complete the following information:

  • LangChain4j version: 0.28.0
  • LLM provider: openrouter.ai
  • Java version: java 21
  • Spring Boot version (if applicable): 3.2.3
@nhannht nhannht added the bug Something isn't working label Mar 22, 2024
langchain4j added commits that referenced this issue Mar 28, 2024
langchain4j mentioned this issue in a pull request Mar 28, 2024. The PR description follows:

## Context
See #804

## Change
- `OpenAiStreamingChatModel`: if `modelName` is not one of the known OpenAI models, do not return `TokenUsage` in the `Response`. This is done for cases when `OpenAiStreamingChatModel` is used to connect to other OpenAI-API-compatible LLM providers such as Ollama and Groq; in such cases it is better to return no `TokenUsage` than a wrong one (a rough sketch of this guard appears after this list).
- For all OpenAI models, default `Tokenizer` will now use
"gpt-3.5-turbo" model name instead of the one provided by the user in
the `modelName` parameter. This is done to avoid crashing with "Model
'ft:gpt-3.5-turbo:my-org:custom_suffix:id' is unknown to jtokkit" for
fine-tuned OpenAI models. It should be safe to use "gpt-3.5-turbo" by
default with all current OpenAI models, as they all use the same
cl100k_base encoding.
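
A rough, hand-written sketch of the guard described above (not the actual diff; it only illustrates the idea using jtokkit's `Optional`-returning model lookup):

```java
import java.util.Optional;

import com.knuddels.jtokkit.Encodings;
import com.knuddels.jtokkit.api.Encoding;
import com.knuddels.jtokkit.api.EncodingRegistry;

class TokenCountingSketch {

    private static final EncodingRegistry REGISTRY = Encodings.newDefaultEncodingRegistry();

    // Custom model names such as "undi95/toppy-m-7b:free" are unknown to jtokkit,
    // so the lookup returns an empty Optional; in that case the streaming model
    // can simply skip TokenUsage instead of throwing.
    static Optional<Encoding> encodingFor(String modelName) {
        return REGISTRY.getEncodingForModel(modelName);
    }

    // Fine-tuned OpenAI models ("ft:gpt-3.5-turbo:my-org:custom_suffix:id") are
    // counted with the plain "gpt-3.5-turbo" encoding, which is safe because all
    // current OpenAI chat models share cl100k_base.
    static Encoding fallbackOpenAiEncoding() {
        return REGISTRY.getEncodingForModel("gpt-3.5-turbo")
                .orElseThrow(() -> new IllegalStateException("gpt-3.5-turbo should be known to jtokkit"));
    }
}
```

In practice this means that, when `OpenAiStreamingChatModel` is pointed at Ollama, Groq, or openrouter.ai, callers should expect `response.tokenUsage()` to possibly be `null`.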


## Checklist
Before submitting this PR, please check the following points:
- [X] I have added unit and integration tests for my change
- [X] All unit and integration tests in the module I have added/changed
are green
- [X] All unit and integration tests in the
[core](https://github.com/langchain4j/langchain4j/tree/main/langchain4j-core)
and
[main](https://github.com/langchain4j/langchain4j/tree/main/langchain4j)
modules are green
- [ ] I have added/updated the
[documentation](https://github.com/langchain4j/langchain4j/tree/main/docs/docs)
- [ ] I have added an example in the [examples
repo](https://github.com/langchain4j/langchain4j-examples) (only for
"big" features)
- [ ] I have added my new module in the
[BOM](https://github.com/langchain4j/langchain4j/blob/main/langchain4j-bom/pom.xml)
(only when a new module is added)
langchain4j (Owner) commented:
Fixed in 0.29.1
