
Azure Arc Run Command Function App

This Azure Function executes Run Commands on Azure Arc-enabled servers. It reads a list of servers from a CSV file in Azure Blob Storage, runs the specified command on each server, and writes the outputs and errors back to Blob Storage.

Features

  • HTTP-triggered Azure Function
  • Reads server list from CSV file in Azure Blob Storage
  • Executes Run Commands on Azure Arc-enabled servers
  • Stores individual outputs and errors in Blob Storage
  • Creates a summary of all executions
  • Organized output storage with timestamps

Prerequisites

  • Python 3.9 or later
  • Azure Functions Core Tools
  • Azure subscription with:
    • Azure Arc-enabled servers
    • Azure Storage Account
    • Appropriate permissions to execute Run Commands

Project Structure

function-app-arc-run-command/
├── .github/
│   └── copilot-instructions.md    # Development guidelines
├── ArcRunCommand/
│   ├── __init__.py                # Main function code
│   └── function.json              # Function binding configuration
├── .gitignore
├── host.json                      # Function app configuration
├── local.settings.json            # Local environment variables
├── requirements.txt               # Python dependencies
├── sample-servers.csv             # Sample CSV file
└── README.md                      # This file

Setup

1. Configure Environment Variables

Update local.settings.json with your Azure configuration:

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "python",
    "AZURE_SUBSCRIPTION_ID": "<your-subscription-id>",
    "AZURE_STORAGE_CONNECTION_STRING": "<your-storage-connection-string>",
    "BLOB_CONTAINER_NAME": "arc-run-command-outputs",
    "CSV_BLOB_NAME": "servers.csv"
  }
}

Required Settings:

  • AZURE_SUBSCRIPTION_ID: Your Azure subscription ID
  • AZURE_STORAGE_CONNECTION_STRING: Connection string for your storage account
  • BLOB_CONTAINER_NAME: Container name for outputs (default: arc-run-command-outputs)
  • CSV_BLOB_NAME: Name of the CSV file containing server list (default: servers.csv)
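The settings above can be loaded with plain environment-variable lookups. A minimal sketch, assuming the function reads configuration this way (the name `load_config` is illustrative, not from the actual function code):

```python
import os


def load_config():
    """Read the app settings, applying the documented defaults.

    The two Azure identifiers are required; the blob names fall back
    to the defaults listed above.
    """
    config = {
        "subscription_id": os.environ.get("AZURE_SUBSCRIPTION_ID"),
        "storage_connection_string": os.environ.get("AZURE_STORAGE_CONNECTION_STRING"),
        "container_name": os.environ.get("BLOB_CONTAINER_NAME", "arc-run-command-outputs"),
        "csv_blob_name": os.environ.get("CSV_BLOB_NAME", "servers.csv"),
    }
    # Fail fast with a clear message if a required setting is absent.
    missing = [key for key, value in config.items() if value is None]
    if missing:
        raise ValueError(f"Missing required configuration: {missing}")
    return config
```

Failing at startup with the names of the missing settings makes the "Missing required configuration" error in the Troubleshooting section easy to act on.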

2. Install Dependencies

pip install -r requirements.txt

3. Prepare CSV File

Create a CSV file with your Arc-enabled server names, resource groups, locations, and commands, then upload it to your Blob Storage container:

server_name,resource_group,location,command
arc-server-01,rg-arc-servers-prod,eastus,hostname
arc-server-02,rg-arc-servers-prod,eastus,"Get-ComputerInfo | Select-Object CsName,OsName"
arc-server-03,rg-arc-servers-dev,westus,uptime

The CSV file must have columns named server_name, resource_group, location, and command.
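Parsing and validating the file is a one-liner with `csv.DictReader`, which also handles the quoted command in row two correctly. A sketch of how the downloaded blob text could be validated (the helper name is hypothetical):

```python
import csv
import io

REQUIRED_COLUMNS = {"server_name", "resource_group", "location", "command"}


def parse_server_csv(csv_text):
    """Parse the CSV blob contents and check the header row."""
    reader = csv.DictReader(io.StringIO(csv_text))
    if reader.fieldnames is None or not REQUIRED_COLUMNS.issubset(reader.fieldnames):
        raise ValueError(f"CSV must contain columns: {sorted(REQUIRED_COLUMNS)}")
    # Skip blank lines that sneak in at the end of hand-edited files.
    return [row for row in reader if row["server_name"].strip()]
```

Because `DictReader` respects CSV quoting, a command containing commas (like the `Get-ComputerInfo` example above) is returned as a single field.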

4. Authentication

The function uses DefaultAzureCredential for authentication. Ensure you're authenticated using one of these methods:

  • Azure CLI: az login
  • Environment variables
  • Managed Identity (when deployed to Azure)

Usage

Local Testing

  1. Start the function locally:
func start
  2. Send an HTTP POST request to trigger the function:
Invoke-RestMethod -Method Post -Uri "http://localhost:7071/api/ArcRunCommand"

Or simply:

curl -X POST http://localhost:7071/api/ArcRunCommand

Request Format

Endpoint: POST /api/ArcRunCommand

Request Body: None required. All configuration is read from the CSV file.

Note: The CSV file in blob storage contains all necessary information: server names, resource groups, and commands to execute.

Response Format

{
  "message": "Successfully processed 3 servers",
  "timestamp": "20250118_143022",
  "results": [
    {
      "server_name": "arc-server-01",
      "resource_group": "rg-arc-servers-prod",
      "command": "hostname",
      "timestamp": "20250118_143022",
      "status": "Success",
      "output": "command output here",
      "error": ""
    }
  ]
}

Output Storage

The function stores results in Azure Blob Storage with the following structure:

outputs/
└── {timestamp}/
    ├── {server-name}_output.json    # Individual server results
    ├── {server-name}_error.json     # Individual server errors (if any)
    └── summary.json                 # Summary of all executions
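The blob names can be derived from the run timestamp and server name alone. A small sketch of that layout, assuming the `YYYYMMDD_HHMMSS` timestamp format shown in the response example (the helper name is illustrative):

```python
from datetime import datetime, timezone


def result_blob_paths(server_name, timestamp=None):
    """Build the blob paths for one server's results under outputs/{timestamp}/."""
    if timestamp is None:
        timestamp = datetime.now(timezone.utc).strftime("%Y%m%d_%H%M%S")
    prefix = f"outputs/{timestamp}"
    return {
        "output": f"{prefix}/{server_name}_output.json",
        "error": f"{prefix}/{server_name}_error.json",
        "summary": f"{prefix}/summary.json",
    }
```

Sharing one timestamp across all servers in a run keeps each execution's files grouped under a single folder.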

Deployment to Azure

  1. Create a Function App in Azure:
az functionapp create --resource-group <resource-group> --consumption-plan-location <location> --runtime python --runtime-version 3.9 --functions-version 4 --name <function-app-name> --storage-account <storage-account>
  2. Configure application settings:
az functionapp config appsettings set --name <function-app-name> --resource-group <resource-group> --settings "AZURE_SUBSCRIPTION_ID=<subscription-id>" "AZURE_STORAGE_CONNECTION_STRING=<connection-string>" "BLOB_CONTAINER_NAME=arc-run-command-outputs" "CSV_BLOB_NAME=servers.csv"
  3. Enable system-assigned managed identity:
az functionapp identity assign --name <function-app-name> --resource-group <resource-group>
  4. Grant the managed identity permissions to your Arc servers:
az role assignment create --assignee <principal-id> --role "Azure Connected Machine Resource Administrator" --scope /subscriptions/<subscription-id>/resourceGroups/<resource-group>
  5. Deploy the function:
func azure functionapp publish <function-app-name>

Required Azure RBAC Permissions

The identity running this function needs:

  • Azure Connected Machine Resource Administrator role on the resource group containing Arc servers
  • Storage Blob Data Contributor role on the storage account

Error Handling

  • Individual server failures are logged and stored separately
  • The function continues processing remaining servers even if one fails
  • All errors are captured in both the response and stored in blob storage
  • Failed executions create {server-name}_error.json files
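The continue-on-failure behavior amounts to a per-server try/except that records the error instead of raising. A sketch under the assumption that the run-command call is injected as a callable (both names here are illustrative):

```python
def process_servers(servers, run_command):
    """Run the command on each server, capturing failures without aborting.

    `run_command(server)` stands in for the Arc Run Command call; it
    returns the command's output or raises on failure.
    """
    results = []
    for server in servers:
        result = {
            "server_name": server["server_name"],
            "resource_group": server["resource_group"],
            "command": server["command"],
            "status": "Success",
            "output": "",
            "error": "",
        }
        try:
            result["output"] = run_command(server)
        except Exception as exc:
            # One failure must not stop the batch: record it and move on.
            result["status"] = "Failed"
            result["error"] = str(exc)
        results.append(result)
    return results
```

Each result dict mirrors the per-server entries in the Response Format section, so the same objects can feed both the HTTP response and the stored JSON files.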

Logging

The function logs to Application Insights (if configured) and includes:

  • Function trigger events
  • Server processing status
  • Individual command execution results
  • Error details

Security Considerations

  • Use Azure Key Vault for sensitive configuration values
  • Enable managed identity for authentication
  • Implement proper RBAC permissions
  • Review and validate commands before execution
  • Consider using Azure Private Endpoints for storage

Troubleshooting

Issue: "Missing required configuration" error

  • Solution: Ensure AZURE_SUBSCRIPTION_ID and AZURE_STORAGE_CONNECTION_STRING are set

Issue: "Error reading CSV file" error

  • Solution: Verify the CSV file exists in blob storage and has the correct format

Issue: Authentication failures

  • Solution: Run az login or configure managed identity properly

Issue: Permission denied errors

  • Solution: Verify RBAC role assignments for Arc servers and storage account

License

MIT License

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.
