
Fix actions/ai-inference field name from max_tokens to max-tokens and remove unsupported temperature field #489

Merged: pelikhan merged 2 commits into main from copilot/fix-ai-inference-max-tokens on Sep 6, 2025

Conversation

Contributor

Copilot AI commented Sep 6, 2025

Fixes the GitHub Actions workflow failure caused by incorrect field names in the actions/ai-inference@v1 action configuration.

Problem

The workflow was failing with the following error:

Warning: Unexpected input(s) 'max_tokens', 'temperature', valid inputs are ['prompt', 'prompt-file', 'input', 'file_input', 'model', 'endpoint', 'system-prompt', 'system-prompt-file', 'max-tokens', 'token', 'enable-github-mcp', 'github-mcp-token']

The issue was in .github/workflows/ai-inference-github-models.md where the AI inference action was configured with:

  • max_tokens (incorrect field name with underscore)
  • temperature (unsupported field)
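For illustration, the misconfigured step likely looked similar to the following. This is a sketch: the offending input names and values come from the error and the fix description above, while the step name and the `prompt-file` input are assumptions about the surrounding configuration.

```yaml
# Hypothetical sketch of the misconfigured step.
# The 'prompt-file' input and step name are assumptions; the two
# flagged inputs come from the warning reported in the workflow run.
- name: AI inference
  uses: actions/ai-inference@v1
  with:
    prompt-file: ${{ env.PROMPT_FILE }}  # assumed input
    max_tokens: 1000     # wrong: underscore form is not a valid input
    temperature: 0.7     # wrong: not supported by actions/ai-inference@v1
```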

Solution

  1. Fixed field name: Changed max_tokens: 1000 to max-tokens: 1000 to match the action's expected input format
  2. Removed unsupported field: Removed temperature: 0.7 as this parameter is not supported by actions/ai-inference@v1
  3. Regenerated workflow: Updated the corresponding .lock.yml file by recompiling the workflow
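The corrected configuration, sketched under the same assumptions, uses the hyphenated input name and drops the unsupported field:

```yaml
# Corrected sketch: hyphenated input name, no temperature field.
# The 'prompt-file' input and step name remain assumptions.
- name: AI inference
  uses: actions/ai-inference@v1
  with:
    prompt-file: ${{ env.PROMPT_FILE }}  # assumed input
    max-tokens: 1000     # matches the action's declared 'max-tokens' input
```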

Files Changed

  • .github/workflows/ai-inference-github-models.md - Source workflow with corrected configuration
  • .github/workflows/ai-inference-github-models.lock.yml - Auto-generated workflow file with proper field names

Validation

  • All existing tests continue to pass
  • Workflow compilation works correctly with the fixed configuration
  • Manual verification confirms the generated YAML uses the correct field names
  • Code formatting and linting validation passes

This resolves the workflow execution error and ensures the AI inference action receives properly formatted inputs.

Warning

Firewall rules blocked me from connecting to one or more addresses.

I tried to connect to the following addresses, but was blocked by firewall rules:

  • api.contoso.com
    • Triggering command: /tmp/go-build2394807079/b278/cli.test -test.testlogfile=/tmp/go-build2394807079/b278/testlog.txt -test.paniconexit0 -test.timeout=10m0s -test.v=true (dns block)
    • Triggering command: /tmp/go-build3515359546/b278/cli.test -test.testlogfile=/tmp/go-build3515359546/b278/testlog.txt -test.paniconexit0 -test.timeout=10m0s -test.v=true (dns block)
    • Triggering command: /tmp/go-build2367723421/b278/cli.test -test.testlogfile=/tmp/go-build2367723421/b278/testlog.txt -test.paniconexit0 -test.timeout=10m0s -test.v=true (dns block)


… remove unsupported temperature field

Co-authored-by: pelikhan <4175913+pelikhan@users.noreply.github.com>
pelikhan marked this pull request as ready for review September 6, 2025 16:18
Copilot AI changed the title from "[WIP] fix configuration of actions/ai-inference. fieldname 'max_tokens' should be 'max-tokens' Warning: Unexpected input(s) 'max_tokens', 'temperature', valid inputs are ['prompt', 'prompt-file', 'input', 'file_input', 'model', 'endpoint', 'system-prompt', ..." to "Fix actions/ai-inference field name from max_tokens to max-tokens and remove unsupported temperature field" on Sep 6, 2025
Copilot AI requested a review from pelikhan September 6, 2025 16:19
pelikhan merged commit ec8c409 into main Sep 6, 2025
22 checks passed
pelikhan deleted the copilot/fix-ai-inference-max-tokens branch September 6, 2025 16:21
