en/latest/walk-through/output-parameters/ #9476
Replies: 7 comments 21 replies
-
I am getting the following error when submitting this workflow; any help would be appreciated.
-
What if the value of the parameter is an object? How can all the fields of the object be mapped automatically?
inputs:
  parameters:
    - name: message
      value: |
        {
          "name": "Alif",
          "age": "2"
        }
How can it be accessed like this? Because originally it generates an error.
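One possible approach (a minimal, untested sketch): pass the object as a JSON string and pick out individual fields with Argo's expression-template tag. The `jsonpath()` function and the `{{=...}}` tag syntax are assumed to be available in your Argo version; the template name and image are made up for illustration.

```yaml
# Hypothetical template: the object travels as a JSON string, and an
# expression tag extracts a single field from it at substitution time.
- name: print-field
  inputs:
    parameters:
      - name: message
        value: '{"name": "Alif", "age": "2"}'
  container:
    image: alpine:3.19
    command: [echo]
    # jsonpath() is Argo's expression-template function for querying
    # JSON text; here it would resolve to "Alif".
    args: ["{{=jsonpath(inputs.parameters.message, '$.name')}}"]
```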
-
Hi, is there a simple way to save all the parameters in workflow.parameters to a text file, for audit purposes?
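A rough sketch of one way to do this, assuming the global `{{workflow.parameters}}` variable (which renders all workflow parameters as a JSON string) is supported by your Argo version; the template name, image, and artifact name are illustrative only:

```yaml
# Hypothetical audit step: dump all workflow parameters to a file and
# export it as an output artifact for later inspection.
- name: audit-parameters
  container:
    image: alpine:3.19
    command: [sh, -c]
    # {{workflow.parameters}} expands to the workflow's parameters as
    # a JSON string before the container starts.
    args: ['echo "{{workflow.parameters}}" > /tmp/parameters.txt']
  outputs:
    artifacts:
      - name: parameters-audit
        path: /tmp/parameters.txt
```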
-
I am getting the error below when running a workflow that uses the above template.
-
Is there a way to pass secrets between two steps? For example, step 1 creates a user with a password and step 2 sends it to the end user. How would you pass the password between the two steps without exposing it?
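One hedged sketch of a workaround: instead of passing the password as an output parameter (which would appear in the workflow status and archive), step 1 could store it in a Kubernetes Secret and step 2 could read it back via `secretKeyRef`. Everything here is hypothetical (template names, images, the `user-password` secret name), and the workflow's service account would need RBAC permission to create Secrets:

```yaml
# Step 1: generate a password and store it in a Secret rather than
# emitting it as a plain-text output parameter.
- name: create-user
  container:
    image: bitnami/kubectl:1.29
    command: [sh, -c]
    args:
      - |
        PASSWORD="$(head -c 16 /dev/urandom | base64)"
        kubectl create secret generic user-password \
          --from-literal=password="$PASSWORD"
# Step 2: read the password back as an environment variable; it never
# transits through Argo's parameter machinery.
- name: notify-user
  container:
    image: alpine:3.19
    command: [sh, -c]
    args: ['notify-end-user "$USER_PASSWORD"']  # hypothetical tool
    env:
      - name: USER_PASSWORD
        valueFrom:
          secretKeyRef:
            name: user-password
            key: password
```

A cleanup step deleting the Secret afterwards would usually accompany this.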
-
I have a tricky use case with outputs: I am using an Argo manifest with a SparkApplication section, and a JSON in raw form, to submit a simple Python script as a task. In the Python file I launch, can I access the output parameter files? That is, can I write to /tmp/response.txt from my prog.py, or are outputs purely an Argo concept?
- name: spark-main-app-template
inputs:
parameters:
- name: my-user
- name: my-password
- name: my-namespace
- name: my-app-name
artifacts:
- name: sparkapp
path: /tmp/spark-app.json
raw:
data: |
{
"apiVersion": "sparkoperator.k8s.io/v1beta2",
"kind": "SparkApplication",
"metadata": {
"name": "my-main-python",
"namespace": "ns2023"
},
"spec": {
"type": "Python",
"mode": "cluster",
"image": "repo-0.repos.corp.com/my-user/my-img:0.0",
"imagePullSecrets": "docker-secret"
"imagePullPolicy": "Always",
"mainApplicationFile": "local:///opt/spark/workdir/scripts/prog.py",
"sparkVersion": "3.4",
"restartPolicy": {
"type": "Never"
},
"driver": {
"cores": 1,
"coreLimit": "1200m",
"memory": "512m",
"labels": {
"version": "3.4"
},
"env": [
{
"name": "AWS_ACCESS_KEY_ID",
"valueFrom": {
"secretKeyRef": {
"name": "secret-s3",
"key": "aws-access-key-id"
}
}
},
{
"name": "AWS_SECRET_ACCESS_KEY",
"valueFrom": {
"secretKeyRef": {
"name": "secret-s3",
"key": "aws-secret-access-key"
}
}
}
]
},
"executor": {
"cores": 1,
"coreLimit": "1200m",
"instances": 1,
"memory": "512m",
"labels": {
"version": "3.4"
},
"env": [
{
"name": "AWS_ACCESS_KEY_ID",
"valueFrom": {
"secretKeyRef": {
"name": "secret-s3",
"key": "aws-access-key-id"
}
}
},
{
"name": "AWS_SECRET_ACCESS_KEY",
"valueFrom": {
"secretKeyRef": {
"name": "secret-s3",
"key": "aws-secret-access-key"
}
}
}
]
},
"arguments": [
 /tmp/resp_status_code.txt"">
"> /tmp/response_status_code.txt"
],
"hadoopConf": {
"fs.s3a.endpoint": "s3.endpoint.mycorp.com",
"fs.defaultFS": "s3a://my-bucket"
},
"sparkConf": {
"spark.ui.view.acls": "acl0"
}
}
}
outputs:
parameters:
- name: response-http-status-code
valueFrom:
path: /tmp/response_status_code.txt
container:
image: repo.repos.mycorp.com/submit-spark-app-image:1.1.0
command:
- python3
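For comparison, a minimal sketch of how output parameters normally work (image and values are illustrative): `valueFrom.path` is read from the template's own main container when it completes, so the file has to be written inside that container, not inside a pod that the container merely submits (such as a Spark driver launched by the operator).

```yaml
# Minimal output-parameter sketch: the main container writes the file,
# and Argo reads it from that same container's filesystem on completion.
- name: emit-status
  container:
    image: alpine:3.19
    command: [sh, -c]
    args: ['printf "%s" 200 > /tmp/response_status_code.txt']
  outputs:
    parameters:
      - name: response-http-status-code
        valueFrom:
          path: /tmp/response_status_code.txt
```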
-
How can environment-variable values from the container be retrieved as output parameters?
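A hedged sketch of one common pattern: have the container itself echo the variable into a file, then expose that file through `valueFrom.path`. The template name, image, and variable are made up for illustration:

```yaml
# Hypothetical template: the container writes $MY_ENV to a file, and the
# file's contents become the output parameter's value.
- name: env-to-output
  container:
    image: alpine:3.19
    command: [sh, -c]
    args: ['printf "%s" "$MY_ENV" > /tmp/my-env.txt']
    env:
      - name: MY_ENV
        value: some-value
  outputs:
    parameters:
      - name: my-env
        valueFrom:
          path: /tmp/my-env.txt
```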
-
https://argo-workflows.readthedocs.io/en/latest/walk-through/output-parameters/