
[BUG]: Error when using local lang. model support for OpenAPI spec generation #810

@andrewconnell

Description

Following docs:

When recording & saving a session, I received the error: OllamaLanguageModelClient: Language model availability is not checked. Call IsEnabled first.

◉ Recording...

 req   ╭ GET https://api.nasa.gov/mars-photos/api/v1/rovers/curiosity/photos?api_key=KzzpfOrja1LWM5QEoExF1d3CQNfGRjXv6WL8I7iw&sol=1000&camera=MAST
 api   ╰ Passed through
○ Stopped recording
 info    Creating OpenAPI spec from recorded requests...
 fail    OllamaLanguageModelClient: Language model availability is not checked. Call IsEnabled first.
 fail    OllamaLanguageModelClient: Language model availability is not checked. Call IsEnabled first.
 info    Created OpenAPI spec file api.nasa.gov-20240627153754.json
 info    DONE

Expected behaviour

It creates the OpenAPI spec file using the local language model support, as shown in this post: https://devblogs.microsoft.com/microsoft365dev/dev-proxy-v0-19-with-simulating-llm-apis-and-new-azure-api-center-integrations/?ocid=microsoft365dev_eml_tnp_autoid134_title

Actual behaviour

Error when saving the OpenAPI spec from the recorded requests:

◉ Recording...

 req   ╭ GET https://api.nasa.gov/mars-photos/api/v1/rovers/curiosity/photos?api_key=KzzpfOrja1LWM5QEoExF1d3CQNfGRjXv6WL8I7iw&sol=1000&camera=MAST
 api   ╰ Passed through
○ Stopped recording
 info    Creating OpenAPI spec from recorded requests...
 fail    OllamaLanguageModelClient: Language model availability is not checked. Call IsEnabled first.
 fail    OllamaLanguageModelClient: Language model availability is not checked. Call IsEnabled first.
 info    Created OpenAPI spec file api.nasa.gov-20240627153754.json
 info    DONE

Steps to reproduce

  1. install Dev Proxy & do the initial run to trust cert

    brew tap microsoft/dev-proxy
    brew install dev-proxy
  2. install Ollama & start service

    brew install ollama
    brew services start ollama
  3. verify Ollama is running and listening on the default port by trying to start it again

    ollama serve
    Error: listen tcp 127.0.0.1:11434: bind: address already in use

    NOTE: after I got the Actual behaviour error above, I repeated the process, but at this point I had Ollama download & run the Phi3 model (ollama run phi3). This had no effect - I still got the same error.

  4. update Dev Proxy config file to add the OpenApiSpecGeneratorPlugin plugin, update the urlsToWatch, & enable the local language model
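    For reference, the relevant additions for this step were (a minimal fragment, mirroring the full configuration file pasted later in this issue):

    ```json
    {
      "plugins": [
        {
          "name": "OpenApiSpecGeneratorPlugin",
          "enabled": true,
          "pluginPath": "~appFolder/plugins/dev-proxy-plugins.dll"
        }
      ],
      "urlsToWatch": [
        "https://api.nasa.gov/*"
      ],
      "languageModel": { "enabled": true }
    }
    ```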

  5. start Dev Proxy, start recording

    devproxy --failure-rate 0
    r
  6. navigate to the following URL: https://api.nasa.gov/mars-photos/api/v1/rovers/spirit/photos?api_key=DEMO_KEY&sol=1&page=1

  7. stop recording

    s
  8. observe the error in the console... but the OpenAPI spec file is successfully created

Dev Proxy Version

0.19.0

Operating system (environment)

macOS

Shell

zsh

Configuration file

{
  "$schema": "https://raw.githubusercontent.com/microsoft/dev-proxy/main/schemas/v0.19.0/rc.schema.json",
  "plugins": [
    {
      "name": "RetryAfterPlugin",
      "enabled": true,
      "pluginPath": "~appFolder/plugins/dev-proxy-plugins.dll"
    },
    {
      "name": "GenericRandomErrorPlugin",
      "enabled": true,
      "pluginPath": "~appFolder/plugins/dev-proxy-plugins.dll",
      "configSection": "genericRandomErrorPlugin"
    },
    {
      "name": "OpenApiSpecGeneratorPlugin",
      "enabled": true,
      "pluginPath": "~appFolder/plugins/dev-proxy-plugins.dll"
    }
  ],
  "urlsToWatch": [
    "https://jsonplaceholder.typicode.com/*",
    "https://api.nasa.gov/*"
  ],
  "genericRandomErrorPlugin": {
    "errorsFile": "devproxy-errors.json"
  },
  "rate": 50,
  "logLevel": "information",
  "newVersionNotification": "stable",
  "languageModel": { "enabled": true }
}

Additional Info

No response
