It was working fine with steps.step-name.outputs.result while the step was just trivially printing a dummy JSON structure. However, the real step runs a bunch of scripts that write a lot of output to stdout, so I switched to a named output with a valueFrom path, keeping the useful logs while exporting only the specific JSON I need for the aggregation step.
However, instead of getting a list of JSON objects from steps.step-name.outputs.parameters.parameter-name that I can drop straight into a Python or Node script in the next step, I get an array of strings, so the next step has to start by parsing those strings. That's a simple enough workaround, but it seems strange that the default result and named output parameters are treated differently.
You can see that at the point of output, the value is indeed JSON.
But then, when it's aggregated as an input in the next step, each JSON object is an escaped string.
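For reference, the workaround mentioned above can be sketched roughly like this (the variable names are mine, and the aggregated value is hard-coded to the shape observed in the repro below, not read from a live workflow):

```python
import json

# What '{{steps.execute-parallel-steps.outputs.parameters.content}}' renders
# as after the loop: a list whose elements are JSON *strings*, not objects.
aggregated = ['{"a":1,"b":2}', '{"a":1,"b":2}', '{"a":1,"b":2}']

# Workaround: parse each element before using it in the reduce step.
results = [json.loads(item) for item in aggregated]

# Now the values are usable as plain dicts.
total_a = sum(r["a"] for r in results)
print(results)  # [{'a': 1, 'b': 2}, {'a': 1, 'b': 2}, {'a': 1, 'b': 2}]
print(total_a)  # 3
```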
Version
v3.5.5
Paste a small workflow that reproduces the issue. We must be able to run the workflow; don't enter a workflow that uses private images.
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: loop-test-
spec:
  entrypoint: main
  templates:
    - name: main
      steps:
        - - name: execute-parallel-steps
            template: print-json-entry
            arguments:
              parameters:
                - name: index
                  value: '{{item}}'
            withParam: '[1, 2, 3]'
        - - name: call-print-aggregate-output
            template: print-aggregate-output
            arguments:
              parameters:
                - name: aggregate-results
                  # If the value of each loop iteration isn't a valid JSON,
                  # you get a JSON parse error:
                  # value: '{{steps.execute-parallel-steps.outputs.result}}'
                  value: '{{steps.execute-parallel-steps.outputs.parameters.content}}'

    - name: print-json-entry
      inputs:
        parameters:
          - name: index
      # The output must be a valid JSON
      script:
        image: node
        command: [node]
        source: |
          const fs = require('fs');
          const content = JSON.stringify({a: 1, b: 2});
          fs.writeFileSync('/tmp/outputs.json', content);
      outputs:
        parameters:
          - name: content
            valueFrom:
              path: /tmp/outputs.json

    - name: print-aggregate-output
      inputs:
        parameters:
          - name: aggregate-results
      script:
        image: python
        command: [python]
        source: |
          print({{inputs.parameters.aggregate-results}})
          # prints ['{"a":1,"b":2}', '{"a":1,"b":2}', '{"a":1,"b":2}']
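To make the reduce step tolerant of both shapes described above (already-parsed objects from outputs.result, escaped strings from outputs.parameters), the aggregate script could normalize its input first. This is only a sketch under my own assumptions — the normalize helper is hypothetical, and the two sample lists hard-code the shapes observed in this report:

```python
import json

def normalize(items):
    # Parse any elements that arrive as JSON strings; pass dicts through.
    return [json.loads(i) if isinstance(i, str) else i for i in items]

# Shape seen when aggregating outputs.result (parsed objects):
from_result = [{"a": 1, "b": 2}, {"a": 1, "b": 2}]
# Shape seen when aggregating outputs.parameters.content (escaped strings):
from_params = ['{"a":1,"b":2}', '{"a":1,"b":2}']

# After normalizing, both aggregations look the same to the reduce logic.
assert normalize(from_result) == normalize(from_params)
```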
Pre-requisites
:latest
What happened/what did you expect to happen?
I was looking at the documentation for looping (https://argo-workflows.readthedocs.io/en/latest/walk-through/loops/#accessing-the-aggregate-results-of-a-loop), as I'm essentially doing some map-reduce-style work in my pipeline.
Logs from the workflow controller
Logs from in your workflow's wait container