This Azure Function executes Run Commands on Azure Arc-enabled servers. It reads a list of servers from a CSV file stored in Azure Blob Storage, executes the specified command on each server, and writes the outputs and errors back to Blob Storage.
- HTTP-triggered Azure Function
- Reads server list from CSV file in Azure Blob Storage
- Executes Run Commands on Azure Arc-enabled servers
- Stores individual outputs and errors in Blob Storage
- Creates a summary of all executions
- Organized output storage with timestamps
- Python 3.9 or later
- Azure Functions Core Tools
- Azure subscription with:
  - Azure Arc-enabled servers
  - Azure Storage Account
  - Appropriate permissions to execute Run Commands
```
function-app-arc-run-command/
├── .github/
│   └── copilot-instructions.md  # Development guidelines
├── ArcRunCommand/
│   ├── __init__.py              # Main function code
│   └── function.json            # Function binding configuration
├── .gitignore
├── host.json                    # Function app configuration
├── local.settings.json          # Local environment variables
├── requirements.txt             # Python dependencies
├── sample-servers.csv           # Sample CSV file
└── README.md                    # This file
```
Update `local.settings.json` with your Azure configuration:

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "python",
    "AZURE_SUBSCRIPTION_ID": "<your-subscription-id>",
    "AZURE_STORAGE_CONNECTION_STRING": "<your-storage-connection-string>",
    "BLOB_CONTAINER_NAME": "arc-run-command-outputs",
    "CSV_BLOB_NAME": "servers.csv"
  }
}
```

Required Settings:

- `AZURE_SUBSCRIPTION_ID`: Your Azure subscription ID
- `AZURE_STORAGE_CONNECTION_STRING`: Connection string for your storage account
- `BLOB_CONTAINER_NAME`: Container name for outputs (default: `arc-run-command-outputs`)
- `CSV_BLOB_NAME`: Name of the CSV file containing the server list (default: `servers.csv`)
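As a sketch of how the function might read and validate these settings at startup, here is a small helper using only the standard library. The helper name `load_settings` and the placeholder values are illustrative, not part of the actual function code:

```python
import os

def load_settings() -> dict:
    """Read required settings from environment variables, applying the
    documented defaults for the optional ones (hypothetical helper)."""
    required = ["AZURE_SUBSCRIPTION_ID", "AZURE_STORAGE_CONNECTION_STRING"]
    missing = [name for name in required if not os.environ.get(name)]
    if missing:
        raise RuntimeError(f"Missing required configuration: {', '.join(missing)}")
    return {
        "subscription_id": os.environ["AZURE_SUBSCRIPTION_ID"],
        "connection_string": os.environ["AZURE_STORAGE_CONNECTION_STRING"],
        "container_name": os.environ.get("BLOB_CONTAINER_NAME", "arc-run-command-outputs"),
        "csv_blob_name": os.environ.get("CSV_BLOB_NAME", "servers.csv"),
    }

# Demo values only; in the Function App these come from application settings.
os.environ.setdefault("AZURE_SUBSCRIPTION_ID", "00000000-0000-0000-0000-000000000000")
os.environ.setdefault("AZURE_STORAGE_CONNECTION_STRING", "UseDevelopmentStorage=true")
settings = load_settings()
```

Failing fast on missing settings is what produces the "Missing required configuration" error described under Troubleshooting.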
Install dependencies:

```bash
pip install -r requirements.txt
```

Create a CSV file with your Arc-enabled server names, resource groups, locations, and commands, then upload it to your Blob Storage container:
```csv
server_name,resource_group,location,command
arc-server-01,rg-arc-servers-prod,eastus,hostname
arc-server-02,rg-arc-servers-prod,eastus,"Get-ComputerInfo | Select-Object CsName,OsName"
arc-server-03,rg-arc-servers-dev,westus,uptime
```

The CSV file must have columns named `server_name`, `resource_group`, `location`, and `command`.
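A minimal sketch of validating that format with the standard library (`parse_server_csv` is a hypothetical helper, not the function's actual code):

```python
import csv
import io

REQUIRED_COLUMNS = {"server_name", "resource_group", "location", "command"}

def parse_server_csv(csv_text: str):
    """Parse the server list and fail fast if a required column is missing."""
    reader = csv.DictReader(io.StringIO(csv_text))
    missing = REQUIRED_COLUMNS - set(reader.fieldnames or [])
    if missing:
        raise ValueError(f"CSV is missing required columns: {sorted(missing)}")
    return list(reader)

sample = (
    "server_name,resource_group,location,command\n"
    'arc-server-02,rg-arc-servers-prod,eastus,"Get-ComputerInfo | Select-Object CsName,OsName"\n'
)
servers = parse_server_csv(sample)
```

Note the quoting in the sample row: it keeps the comma inside the PowerShell command from being split into a fifth column.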
The function uses `DefaultAzureCredential` for authentication. Ensure you're authenticated using one of these methods:
- Azure CLI: `az login`
- Environment variables
- Managed Identity (when deployed to Azure)
- Start the function locally:

```bash
func start
```

- Send an HTTP POST request to trigger the function:

```powershell
Invoke-RestMethod -Method Post -Uri "http://localhost:7071/api/ArcRunCommand"
```

Or simply:

```bash
curl -X POST http://localhost:7071/api/ArcRunCommand
```

Endpoint: `POST /api/ArcRunCommand`
Request Body: None required. All configuration is read from the CSV file.
Note: The CSV file in blob storage contains all necessary information: server names, resource groups, and commands to execute.
```json
{
  "message": "Successfully processed 3 servers",
  "timestamp": "20250118_143022",
  "results": [
    {
      "server_name": "arc-server-01",
      "resource_group": "rg-arc-servers-prod",
      "command": "hostname",
      "timestamp": "20250118_143022",
      "status": "Success",
      "output": "command output here",
      "error": ""
    }
  ]
}
```

The function stores results in Azure Blob Storage with the following structure:
```
outputs/
└── {timestamp}/
    ├── {server-name}_output.json  # Individual server results
    ├── {server-name}_error.json   # Individual server errors (if any)
    └── summary.json               # Summary of all executions
```
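A sketch of how those blob names could be derived (the helper `build_blob_paths` is illustrative; the actual function may construct them differently):

```python
from datetime import datetime, timezone
from typing import Optional

def build_blob_paths(server_name: str, timestamp: Optional[str] = None) -> dict:
    """Build the per-run blob names following the layout above.
    (Hypothetical helper; shown for illustration.)"""
    ts = timestamp or datetime.now(timezone.utc).strftime("%Y%m%d_%H%M%S")
    prefix = f"outputs/{ts}"
    return {
        "output": f"{prefix}/{server_name}_output.json",
        "error": f"{prefix}/{server_name}_error.json",
        "summary": f"{prefix}/summary.json",
    }
```

Sharing one timestamp prefix per invocation keeps all results from a single run grouped under the same folder.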
- Create a Function App in Azure:

```bash
az functionapp create --resource-group <resource-group> --consumption-plan-location <location> --runtime python --runtime-version 3.9 --functions-version 4 --name <function-app-name> --storage-account <storage-account>
```

- Configure application settings:

```bash
az functionapp config appsettings set --name <function-app-name> --resource-group <resource-group> --settings "AZURE_SUBSCRIPTION_ID=<subscription-id>" "AZURE_STORAGE_CONNECTION_STRING=<connection-string>" "BLOB_CONTAINER_NAME=arc-run-command-outputs" "CSV_BLOB_NAME=servers.csv"
```

- Enable system-assigned managed identity:

```bash
az functionapp identity assign --name <function-app-name> --resource-group <resource-group>
```

- Grant the managed identity permissions to your Arc servers:

```bash
az role assignment create --assignee <principal-id> --role "Azure Connected Machine Resource Administrator" --scope /subscriptions/<subscription-id>/resourceGroups/<resource-group>
```

- Deploy the function:

```bash
func azure functionapp publish <function-app-name>
```

The identity running this function needs:
- Azure Connected Machine Resource Administrator role on the resource group containing Arc servers
- Storage Blob Data Contributor role on the storage account
- Individual server failures are logged and stored separately
- The function continues processing remaining servers even if one fails
- All errors are captured in both the response and stored in blob storage
- Failed executions create `{server-name}_error.json` files
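The continue-on-failure behavior described above can be sketched like this (`run_command` stands in for the real Arc Run Command call; the structure, not the names, is the point):

```python
def process_servers(servers, run_command):
    """Run the command on each server, recording failures without stopping."""
    results = []
    for server in servers:
        result = {"server_name": server["server_name"], "command": server["command"]}
        try:
            result.update(status="Success", output=run_command(server), error="")
        except Exception as exc:  # an individual failure must not stop the batch
            result.update(status="Failed", output="", error=str(exc))
        results.append(result)
    return results

def fake_run_command(server):
    """Stand-in executor for demonstration: fails for one server."""
    if server["server_name"] == "arc-server-02":
        raise RuntimeError("extension timed out")
    return "ok"

demo = [
    {"server_name": "arc-server-01", "command": "hostname"},
    {"server_name": "arc-server-02", "command": "uptime"},
]
results = process_servers(demo, fake_run_command)
```

Catching the exception per server is what lets the batch finish and still surface every error in both the HTTP response and the stored `_error.json` blobs.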
The function logs to Application Insights (if configured) and includes:
- Function trigger events
- Server processing status
- Individual command execution results
- Error details
- Use Azure Key Vault for sensitive configuration values
- Enable managed identity for authentication
- Implement proper RBAC permissions
- Review and validate commands before execution
- Consider using Azure Private Endpoints for storage
Issue: "Missing required configuration" error
- Solution: Ensure `AZURE_SUBSCRIPTION_ID` and `AZURE_STORAGE_CONNECTION_STRING` are set
Issue: "Error reading CSV file" error
- Solution: Verify the CSV file exists in blob storage and has the correct format
Issue: Authentication failures
- Solution: Run `az login` or configure managed identity properly
Issue: Permission denied errors
- Solution: Verify RBAC role assignments for Arc servers and storage account
MIT License
Contributions are welcome! Please feel free to submit a Pull Request.