Submits or triggers Kubernetes CronJobs and Argo Workflows. This action handles the actual execution of jobs: it creates Job resources from CronJobs, or submits Argo Workflows from templates.
## Usage

```yaml
- name: Submit job
  uses: skyhook-io/submit-job@v1
  id: submit
  with:
    job_type: kubernetes-cronjob
    resource_name: my-cronjob
    namespace: production

- name: Check created job
  run: |
    echo "Created: ${{ steps.submit.outputs.created_name }}"
```

## Inputs

| Input | Description | Required | Default |
|---|---|---|---|
| `job_type` | Type of job | ✅ | - |
| `resource_name` | Name of the resource to trigger | ✅ | - |
| `namespace` | Kubernetes namespace | ✅ | - |
| `parameters` | JSON object of parameters (Argo only) | ❌ | `''` |
| `argo_version` | Argo CLI version to install | ❌ | `3.5.3` |
Supported `job_type` values:

- `kubernetes-cronjob` - Creates a Job from a CronJob
- `argo-workflow` - Submits a Workflow from a WorkflowTemplate
- `argo-cronworkflow` - Submits a Workflow from a CronWorkflow
## Outputs

| Output | Description | Example |
|---|---|---|
| `created_name` | Name of the created Job or Workflow | `my-job-manual-1234567890` |
| `created_kind` | Kind of created resource | `job` or `workflow` |
## Examples

### Trigger a Kubernetes CronJob

```yaml
- name: Trigger CronJob
  uses: skyhook-io/submit-job@v1
  with:
    job_type: kubernetes-cronjob
    resource_name: nightly-backup
    namespace: production
```

### Submit an Argo Workflow with parameters

```yaml
- name: Submit workflow
  uses: skyhook-io/submit-job@v1
  with:
    job_type: argo-workflow
    resource_name: data-processing-template
    namespace: workflows
    parameters: '{"input_file":"s3://bucket/data.csv","output_path":"s3://bucket/results/"}'
```

### Submit from a CronWorkflow

```yaml
- name: Submit from CronWorkflow
  uses: skyhook-io/submit-job@v1
  with:
    job_type: argo-cronworkflow
    resource_name: scheduled-report
    namespace: workflows
    parameters: '{"report_date":"2024-01-15"}'
```

### Full workflow example

```yaml
name: Execute Job

on:
  workflow_dispatch:
    inputs:
      job_name:
        description: "Logical job name"
        required: true
      namespace:
        description: "Namespace"
        required: true
      job_type:
        description: "Job type"
        required: true
      parameters:
        description: "Parameters (JSON)"
        required: false

jobs:
  execute:
    runs-on: ubuntu-latest
    steps:
      - name: Authenticate to cluster
        uses: skyhook-io/cloud-login@v1
        with:
          provider: gcp
          account: my-project
          location: us-central1
          cluster: production

      - name: Resolve job resource
        id: resolve
        uses: skyhook-io/resolve-job-template@v1
        with:
          job_name: ${{ inputs.job_name }}
          namespace: ${{ inputs.namespace }}
          job_type: ${{ inputs.job_type }}

      - name: Submit job
        id: submit
        uses: skyhook-io/submit-job@v1
        with:
          job_type: ${{ inputs.job_type }}
          resource_name: ${{ steps.resolve.outputs.name }}
          namespace: ${{ inputs.namespace }}
          parameters: ${{ inputs.parameters }}

      - name: Report
        run: |
          echo "Submitted: ${{ steps.submit.outputs.created_name }}"
```

## How it works

### Kubernetes CronJobs

- Generates a unique job name with a timestamp suffix
- Creates a Job from the CronJob using `kubectl create job --from`
- Ensures the name doesn't exceed 63 characters (the Kubernetes limit)
- Outputs the created Job name
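The name-generation step can be sketched roughly as follows. This is an illustrative sketch, not the action's actual code: the `CRONJOB` and `NAMESPACE` values are placeholders, and the exact suffix format and truncation logic inside the action may differ.

```shell
#!/usr/bin/env bash
set -euo pipefail

# Hypothetical inputs -- substitute your own resource and namespace.
CRONJOB="my-cronjob"
NAMESPACE="production"

# Append a timestamp suffix so repeated triggers get unique Job names.
SUFFIX="-manual-$(date +%s)"

# Kubernetes object names are capped at 63 characters, so truncate the
# base name to leave room for the suffix.
MAX_BASE=$((63 - ${#SUFFIX}))
JOB_NAME="$(printf '%s' "$CRONJOB" | cut -c1-"$MAX_BASE")$SUFFIX"

# The action would then run something like:
echo "kubectl create job ${JOB_NAME} --from=cronjob/${CRONJOB} -n ${NAMESPACE}"
```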
### Argo Workflows

- Installs the Argo CLI (if not already installed)
- Parses the JSON parameters and converts them to `-p key=value` flags
- Submits the workflow using `argo submit --from`
- Captures the created Workflow name from the output
- Outputs the workflow name
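The parameter-parsing step can be sketched with jq (which, as noted under Requirements, must be available). This is a simplified sketch under assumed input shapes, not the action's actual code; values containing spaces would need array-based flag handling rather than a flat string.

```shell
#!/usr/bin/env bash
set -euo pipefail

# Hypothetical parameters input, as it would arrive from the `parameters` input.
PARAMETERS='{"input_file":"s3://bucket/data.csv","output_path":"s3://bucket/results/"}'

# Convert each top-level key/value pair into an Argo "-p key=value" flag.
FLAGS=$(printf '%s' "$PARAMETERS" \
  | jq -r 'to_entries[] | "-p \(.key)=\(.value)"' \
  | tr '\n' ' ')

# The action would then run something like (template name is a placeholder):
echo "argo submit --from workflowtemplate/my-template ${FLAGS}"
```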
## Requirements

- kubectl must be installed and configured
- For Argo workflows, jq must be available (for parameter parsing)
- Cluster authentication must be completed before using this action
- The specified resource (CronJob/WorkflowTemplate/CronWorkflow) must exist in the namespace
## Error handling

The action will fail if:

- An invalid `job_type` is specified
- The resource doesn't exist in the namespace
- kubectl/argo commands fail
- The parameter JSON is malformed (for Argo workflows)
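The `job_type` check can be pictured as a simple case statement over the three supported values. This is an illustrative guess at the validation, not the action's actual code:

```shell
#!/usr/bin/env bash
set -euo pipefail

# Hypothetical validation helper -- mirrors the three supported job types.
validate_job_type() {
  case "$1" in
    kubernetes-cronjob|argo-workflow|argo-cronworkflow)
      return 0 ;;
    *)
      echo "Invalid job_type: $1" >&2
      return 1 ;;
  esac
}

validate_job_type "kubernetes-cronjob" && echo "ok"
```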
## Notes

- Job names for CronJobs include a timestamp to ensure uniqueness
- Workflow names for Argo are generated automatically by Argo
- Parameters are only used for Argo workflows (ignored for CronJobs)
- The action automatically installs the Argo CLI when needed
- All commands run with `set -euo pipefail` for safety
## License

MIT