This repository has been archived by the owner on Dec 13, 2023. It is now read-only.
External payload storage is not reducing the workflow & task size #2899
Labels: type: bug
Describe the bug
I recently migrated from 2.x version and bumped into this bug on 3.6 (and even in the main branch). I had to implement a workaround to make this functionality work for our product.
When the payload size crosses the threshold value, the affected task's and workflow's input/output data are uploaded to the S3 bucket successfully. But when the object is serialized to JSON for storage in the DB, that large JSON is still present, so the data exceeds MySQL's size limit. Although we are using MySQL, this issue applies to all data stores.
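The behavior described above can be illustrated with a minimal sketch. The `TaskModel` class and `externalizeInputBuggy` method here are hypothetical, simplified stand-ins for Conductor's actual model classes, written only to show the effect of recording the external storage path without clearing the in-memory payload:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical, simplified stand-in for Conductor's TaskModel,
// illustrating the reported bug: the payload is externalized but
// the in-memory field is never cleared.
public class PayloadBugSketch {
    static class TaskModel {
        Map<String, Object> inputData = new HashMap<>();
        String externalInputPayloadStoragePath;

        // Illustrative externalization step: the external path is
        // recorded, but inputData is NOT reset -- so any later
        // object-to-JSON conversion still contains the large payload.
        void externalizeInputBuggy(String storagePath) {
            this.externalInputPayloadStoragePath = storagePath;
            // missing: this.inputData = new HashMap<>();
        }
    }

    public static void main(String[] args) {
        TaskModel task = new TaskModel();
        task.inputData.put("bigField", "x".repeat(1_000_000)); // ~1 MB payload
        task.externalizeInputBuggy("s3://bucket/task-input.json");

        // Serializing this model (e.g. via Jackson in Conductor) would
        // still include the large inputData map, so the DB row stays huge.
        System.out.println("external path set: "
                + (task.externalInputPayloadStoragePath != null));
        System.out.println("inputData still large: "
                + (!task.inputData.isEmpty()));
    }
}
```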
I believe the code issue is in both the TaskModel and WorkflowModel classes, shown below. Because `inputPayload` is populated with the large data, the `inputData` field will still contain that data after Jackson's object-to-JSON conversion.

Details
Conductor version: 3.6
Persistence implementation: MySQL
To Reproduce
Steps to reproduce the behavior:
`externalInputPayloadStoragePath` and `externalOutputPayloadStoragePath` for TaskModel and WorkflowModel populate correctly.

Expected behavior
Once the large payload has been uploaded to external storage, the corresponding field should be set to an empty map, so only the small remaining data is stored in the DB.
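The expected behavior can be sketched as follows. Again, the `TaskModel` class and `externalizeInput` method are hypothetical simplifications, not Conductor's actual implementation; the point is the one extra step of resetting the field after the external path is set:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical, simplified sketch of the expected behavior: once the
// payload has been uploaded to external storage, the in-memory field
// is reset to an empty map before the model is serialized to the DB.
public class PayloadFixSketch {
    static class TaskModel {
        Map<String, Object> inputData = new HashMap<>();
        String externalInputPayloadStoragePath;

        // After recording the external storage path, drop the large
        // payload so the serialized JSON stays small.
        void externalizeInput(String storagePath) {
            this.externalInputPayloadStoragePath = storagePath;
            this.inputData = new HashMap<>(); // the key step
        }
    }

    public static void main(String[] args) {
        TaskModel task = new TaskModel();
        task.inputData.put("bigField", "x".repeat(1_000_000)); // ~1 MB payload
        task.externalizeInput("s3://bucket/task-input.json");

        // The large map is gone; only the external path remains, so a
        // later object-to-JSON conversion produces a small document.
        System.out.println("inputData empty after externalizing: "
                + task.inputData.isEmpty());
    }
}
```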