Description
Following docs:
- https://learn.microsoft.com/en-us/microsoft-cloud/dev/dev-proxy/how-to/generate-openapi-spec
- https://github.com/ollama/ollama/blob/main/README.md#ollama
When recording & saving a session, I received the error: `OllamaLanguageModelClient: Language model availability is not checked. Call IsEnabled first.`
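For context on the failure, the message reads as if the plugin is using the language model client without first performing the availability check the client requires. A minimal Python sketch of that guard pattern (the class, method, and error names here are hypothetical illustrations, not Dev Proxy's actual API):

```python
class LanguageModelClient:
    """Sketch of a client that must have its availability checked before use."""

    def __init__(self, base_url="http://localhost:11434"):
        self.base_url = base_url
        self._availability_checked = False
        self._available = False

    def is_enabled(self):
        # The real client would probe the local Ollama endpoint here;
        # this sketch only records that the check was performed.
        self._availability_checked = True
        self._available = True
        return self._available

    def generate(self, prompt):
        # Guard: refuse to run before the availability check, mirroring
        # the error text seen in the Dev Proxy console output.
        if not self._availability_checked:
            raise RuntimeError(
                "Language model availability is not checked. Call IsEnabled first."
            )
        return f"completion for: {prompt}"
```

Under this reading, the bug would be a missing `IsEnabled`-style call on the plugin's code path rather than a misconfiguration on the user's side.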
Expected behaviour
Dev Proxy creates the OpenAPI spec file using the local language model support shown in this post: https://devblogs.microsoft.com/microsoft365dev/dev-proxy-v0-19-with-simulating-llm-apis-and-new-azure-api-center-integrations/?ocid=microsoft365dev_eml_tnp_autoid134_title
Actual behaviour
Error when saving the OpenAPI spec from the recorded requests:
```
◉ Recording...
req ╭ GET https://api.nasa.gov/mars-photos/api/v1/rovers/curiosity/photos?api_key=KzzpfOrja1LWM5QEoExF1d3CQNfGRjXv6WL8I7iw&sol=1000&camera=MAST
api ╰ Passed through
○ Stopped recording
info Creating OpenAPI spec from recorded requests...
fail OllamaLanguageModelClient: Language model availability is not checked. Call IsEnabled first.
fail OllamaLanguageModelClient: Language model availability is not checked. Call IsEnabled first.
info Created OpenAPI spec file api.nasa.gov-20240627153754.json
info DONE
```
Steps to reproduce
- Install Dev Proxy and do the initial run to trust the certificate:
  ```
  brew tap microsoft/dev-proxy
  brew install dev-proxy
  ```
- Install Ollama and start the service:
  ```
  brew install ollama
  brew services start ollama
  ```
- Verify Ollama is running and listening on the default port by trying to start it again:
  ```
  ollama serve
  Error: listen tcp 127.0.0.1:11434: bind: address already in use
  ```
  NOTE: after I got the Actual behaviour error above, I repeated the process, but at this point I had Ollama download and run the Phi3 model (`ollama run phi3`). This had no effect; I still got the same error.
- Update the Dev Proxy config file to add the OpenApiSpecGeneratorPlugin plugin, update the `urlsToWatch`, and enable the local language model.
- Start Dev Proxy, then press `r` to start recording:
  ```
  devproxy --failure-rate 0
  ```
- Navigate to the following URL: https://api.nasa.gov/mars-photos/api/v1/rovers/spirit/photos?api_key=DEMO_KEY&sol=1&page=1
- Press `s` to stop recording.
- Observe the error in the console... but the OpenAPI spec file is successfully created.
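As a sanity check that does not depend on re-running `ollama serve`, the listening port can also be probed directly. A minimal Python sketch, assuming Ollama's default bind address of 127.0.0.1:11434 (the function name is illustrative):

```python
import socket


def ollama_listening(host="127.0.0.1", port=11434, timeout=1.0):
    """Return True if something accepts TCP connections on Ollama's default port."""
    try:
        # A successful connect means a server is bound to the port;
        # it does not prove the server is Ollama, only that the port is taken.
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

If this returns `False` while Dev Proxy reports the language model error, the availability check may be failing for the mundane reason that the service is not actually reachable.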
Dev Proxy Version
0.19.0
Operating system (environment)
macOS
Shell
zsh
Configuration file
```json
{
  "$schema": "https://raw.githubusercontent.com/microsoft/dev-proxy/main/schemas/v0.19.0/rc.schema.json",
  "plugins": [
    {
      "name": "RetryAfterPlugin",
      "enabled": true,
      "pluginPath": "~appFolder/plugins/dev-proxy-plugins.dll"
    },
    {
      "name": "GenericRandomErrorPlugin",
      "enabled": true,
      "pluginPath": "~appFolder/plugins/dev-proxy-plugins.dll",
      "configSection": "genericRandomErrorPlugin"
    },
    {
      "name": "OpenApiSpecGeneratorPlugin",
      "enabled": true,
      "pluginPath": "~appFolder/plugins/dev-proxy-plugins.dll"
    }
  ],
  "urlsToWatch": [
    "https://jsonplaceholder.typicode.com/*",
    "https://api.nasa.gov/*"
  ],
  "genericRandomErrorPlugin": {
    "errorsFile": "devproxy-errors.json"
  },
  "rate": 50,
  "logLevel": "information",
  "newVersionNotification": "stable",
  "languageModel": { "enabled": true }
}
```
Additional Info
No response