invalid option provided option="" #251
Comments
Hello, I cannot find anywhere in the code where the property |
Hmm, not really sure where else it would be coming from. At the time I collected these logs, there shouldn't have been anything else hitting the Ollama API. Seeing as it seems to be working in most cases, I think we can close this out. Thanks for looking into it.
Running into a similar issue:
|
I get the same error via a manual POST with the Python requests lib to /api/generate. I pass |
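For reference, a minimal sketch of that kind of manual request (assuming a local Ollama instance on the default port 11434; the prompt and option values are illustrative, not the ones from the comment above):

```python
import requests

# Rough sketch of a manual /api/generate request against a local Ollama instance.
# The prompt and option values are placeholders for illustration only.
payload = {
    "model": "codellama:7b-code-q4_0",   # model name taken from this issue
    "prompt": "def fibonacci(n):",
    "stream": False,
    "options": {
        "temperature": 0.2,
        "num_predict": 128,
    },
}

resp = requests.post("http://localhost:11434/api/generate", json=payload, timeout=120)
resp.raise_for_status()
print(resp.json()["response"])
```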
Has anyone figured this out? Happens to me too. This is the dictionary that's sent:
data = {
"model": model,
"messages": [{
"role": "user",
"content": message
}],
"stream": False,
"options": {
"max_tokens": max_tokens,
"temperature": temperature
}
}
Both
This is the log entry:
|
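One possible explanation for the payload above (an assumption, not something confirmed in this thread): Ollama's option set has no max_tokens key, and it appears to log the invalid-option warning for keys it does not recognize; the comparable knob is num_predict. A sketch of the same request using names Ollama recognizes:

```python
import requests

# Same shape of request as above, but with "max_tokens" swapped for "num_predict",
# Ollama's option for limiting the number of generated tokens.
# Model name and message content are placeholders.
data = {
    "model": "codellama:7b-code-q4_0",
    "messages": [{"role": "user", "content": "Write a haiku about logs."}],
    "stream": False,
    "options": {
        "num_predict": 256,     # instead of "max_tokens"
        "temperature": 0.7,
    },
}

resp = requests.post("http://localhost:11434/api/chat", json=data, timeout=120)
resp.raise_for_status()
print(resp.json()["message"]["content"])
```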
Describe the bug
I'm seeing this error in my Ollama server.log with every auto-complete request from twinny. Auto-complete does appear to be working and giving valid completion suggestions, but I'm confused as to why it generates this error, as the options seem to be submitted properly.
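One way to check from outside the extension whether the options are actually applied or silently discarded (a sketch, assuming a local Ollama on the default port and that a tiny num_predict should visibly truncate the completion when options are honored):

```python
import requests

URL = "http://localhost:11434/api/generate"  # adjust the host if Ollama runs remotely

def completion_length(options):
    """Send the same prompt and return the length of the completion text."""
    payload = {
        "model": "codellama:7b-code-q4_0",
        "prompt": "def add(a, b):",
        "stream": False,
        "options": options,
    }
    resp = requests.post(URL, json=payload, timeout=120)
    resp.raise_for_status()
    return len(resp.json()["response"])

# If options are honored, capping num_predict should give a much shorter completion.
print("no options:        ", completion_length({}))
print("num_predict capped:", completion_length({"num_predict": 1}))
```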
To Reproduce
Paste this short script.
Wait for an autocomplete request.
Expected behavior
Just trying to determine if Twinny is leveraging the options properly or if the errors mean
options: {}
is being discarded entirely.
Logging
API Provider
ollama -v
Warning: could not connect to a running Ollama instance
Warning: client version is 0.1.38
Chat or Auto Complete?
auto complete
Model Name
codellama:7b-code-q4_0
Desktop (please complete the following information):
Additional context
Both Ollama and VS Code are running on Windows 10, though I have tried this with VS Code using a remote Linux container and got the same result.
I tried resetting all twinny settings back to their defaults (except for the host IP), as I'm sometimes using this in remote containers, so it needs to be network accessible.