Remove --dmr flag and consolidate preset flags into --url-alias #36

Draft
Copilot wants to merge 2 commits into add-options-to-docker-model from copilot/remove-dmr-and-consolidate-options

Conversation


Copilot AI commented Oct 16, 2025

Overview

This PR consolidates the server preset flags into a single --url-alias parameter and removes the deprecated --dmr flag, as requested in the issue.

Changes

Removed --dmr flag

The --dmr flag has been removed from all commands (list, run) and from the backend configuration. It was redundant and is superseded by the consolidated --url-alias approach below.

Consolidated preset flags into --url-alias

Previously, users had to use separate boolean flags for different OpenAI-compatible server presets:

  • --llamacpp - Use llama.cpp server
  • --ollama - Use ollama server
  • --openrouter - Use openrouter server

These have been consolidated into a single --url-alias string parameter that accepts the values: llamacpp, ollama, or openrouter.

Before:

docker model list --llamacpp
docker model run --ollama model-name

After:

docker model list --url-alias llamacpp
docker model run --url-alias ollama model-name

Benefits

  1. Cleaner API: Single parameter instead of multiple boolean flags
  2. Better error messages: Clear indication of valid options when an invalid alias is provided
  3. More extensible: Easy to add new preset aliases in the future
  4. Consistent behavior: Maintains the same functionality as the previous flags

Implementation Details

  • Updated resolveServerURL() function signature to accept urlAlias string parameter instead of individual boolean flags
  • Added validation for url-alias values with helpful error messages
  • Maintained all existing functionality and behavior
  • Updated all test cases to reflect the new parameter structure
  • All tests pass successfully
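
The alias-to-URL resolution described above can be sketched as follows. This is a hypothetical illustration, not the PR's actual code: the function name resolveServerURL comes from the description, but its exact signature and the presetURLs map are assumptions; the default URLs are taken from the original prompt.

```go
package main

import "fmt"

// presetURLs maps each --url-alias value to its default base URL
// (defaults taken from the original issue prompt).
var presetURLs = map[string]string{
	"llamacpp":   "http://127.0.0.1:8080/v1",
	"ollama":     "http://127.0.0.1:11434/v1",
	"openrouter": "https://openrouter.ai/api/v1",
}

// resolveServerURL returns the preset base URL for a --url-alias value,
// or a descriptive error listing the valid options.
func resolveServerURL(urlAlias string) (string, error) {
	url, ok := presetURLs[urlAlias]
	if !ok {
		return "", fmt.Errorf(
			"invalid url-alias '%s'. Valid options are: llamacpp, ollama, openrouter",
			urlAlias)
	}
	return url, nil
}

func main() {
	url, err := resolveServerURL("ollama")
	fmt.Println(url, err)

	_, err = resolveServerURL("invalid")
	fmt.Println(err)
}
```

A map lookup like this is what makes the design "more extensible": adding a new preset is a one-line map entry rather than a new boolean flag threaded through every command.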

Error Handling

The implementation provides clear error messages:

# Invalid alias value
$ docker model list --url-alias invalid
Error: invalid url-alias 'invalid'. Valid options are: llamacpp, ollama, openrouter

# Old flags no longer work
$ docker model list --dmr
Error: unknown flag: --dmr

# Conflicting flags
$ docker model list --url http://test.com --url-alias llamacpp
Error: only one of --url or --url-alias can be specified
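
The conflicting-flags check shown in the last example could look like the sketch below. The flag names match the PR description, but the helper name and signature are hypothetical.

```go
package main

import (
	"errors"
	"fmt"
)

// validateURLFlags rejects the case where both --url and --url-alias
// are set, mirroring the error message in the PR description.
// An empty string means the flag was not provided.
func validateURLFlags(url, urlAlias string) error {
	if url != "" && urlAlias != "" {
		return errors.New("only one of --url or --url-alias can be specified")
	}
	return nil
}

func main() {
	fmt.Println(validateURLFlags("http://test.com", "llamacpp"))
	fmt.Println(validateURLFlags("", "llamacpp"))
}
```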

Testing

  • All existing tests updated and passing
  • Added new test cases for invalid url-alias values
  • Verified CLI help output shows correct usage
  • Manually tested flag behavior and error messages
Original prompt

Remove this option everywhere:

  --dmr           Use docker model runner (default: http://127.0.0.1:12434/engines/llama.cpp/v1)

Consolidate these options:

  --llamacpp      Use llama.cpp server (default: http://127.0.0.1:8080/v1)
  --ollama        Use ollama server (default: http://127.0.0.1:11434/v1)
  --openrouter    Use openrouter server (default: https://openrouter.ai/api/v1)

Everywhere to:

--url-alias string Use openai alias server output (llamacpp|ollama|openrouter)

When we run:

docker model list --url-alias llamacpp

We should list a table in the same format as:

docker model list


Co-authored-by: ericcurtin <1694275+ericcurtin@users.noreply.github.com>
Copilot AI changed the title [WIP] Remove docker model runner option and consolidate server options Remove --dmr flag and consolidate preset flags into --url-alias Oct 16, 2025
Copilot AI requested a review from ericcurtin October 16, 2025 12:25
