
Postgres indexing problem with task_type length #111

Closed
vinize opened this issue Mar 27, 2024 · 5 comments

Comments


vinize commented Mar 27, 2024

Describe the bug
When changing the indexing database from elastic to postgres, an error occurs related to the length of the task name.

Caused by: com.netflix.conductor.core.exception.NonTransientException: ERROR: value too long for type character varying(32)
	at com.netflix.conductor.postgres.dao.PostgresBaseDAO.getWithRetriedTransactions(PostgresBaseDAO.java:148) ~[conductor-postgres-persistence.jar!/:?]
	at com.netflix.conductor.postgres.dao.PostgresBaseDAO.queryWithTransaction(PostgresBaseDAO.java:210) ~[conductor-postgres-persistence.jar!/:?]
	at com.netflix.conductor.postgres.dao.PostgresIndexDAO.indexTask(PostgresIndexDAO.java:151) ~[conductor-postgres-persistence.jar!/:?]
	at com.netflix.conductor.core.dal.ExecutionDAOFacade.updateTask(ExecutionDAOFacade.java:517) ~[conductor-core.jar!/:?]
	... 18 more

It turns out the problem is the length of the task_type field. According to the database schema, it can hold at most 32 characters. However, for SIMPLE tasks the task_name value is written into that column instead of the task type.
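For illustration, the same Postgres error can be reproduced outside Conductor with a minimal standalone JDBC sketch against a hypothetical varchar(32) column; the connection URL, credentials and table are placeholders (not the actual Conductor schema), and the Postgres JDBC driver is assumed to be on the classpath.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.sql.Statement;

public class TaskTypeLengthRepro {
    public static void main(String[] args) throws SQLException {
        // Placeholder connection details; adjust for your environment.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/conductor", "postgres", "postgres")) {
            try (Statement st = conn.createStatement()) {
                // Same width as the task_type column described above.
                st.execute("CREATE TEMP TABLE repro (task_type varchar(32))");
            }
            try (PreparedStatement ps =
                    conn.prepareStatement("INSERT INTO repro (task_type) VALUES (?)")) {
                // The task name from the workflow definition below is longer than 32 characters.
                ps.setString(1, "TaskWithLongNameInTestWorkflowWithLongTaskName");
                ps.executeUpdate(); // fails with: value too long for type character varying(32)
            }
        }
    }
}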

Details
Conductor version: 3.18
Persistence implementation: Postgres
Queue implementation: Redis
Lock: Redis
Workflow definition:

{
  "accessPolicy": {},
  "name": "TestWorkflowWithLongTaskName",
  "description": "Test Workflow to Index long task name in postgres",
  "version": 1,
  "tasks": [
    {
      "name": "TaskWithLongNameInTestWorkflowWithLongTaskName",
      "taskReferenceName": "TaskWithLongNameInTestWorkflowWithLongTaskName",
      "inputParameters": {},
      "type": "SIMPLE",
      "startDelay": 0,
      "optional": false,
      "asyncComplete": false,
      "permissive": false
    }
  ],
  "inputParameters": [],
  "outputParameters": {},
  "schemaVersion": 2,
  "restartable": true,
  "workflowStatusListenerEnabled": false,
  "ownerEmail": "example@email.com",
  "timeoutPolicy": "ALERT_ONLY",
  "timeoutSeconds": 0,
  "variables": {},
  "inputTemplate": {}
}

To Reproduce
Steps to reproduce the behavior:

  1. Use the Postgres index
  2. Create a workflow with a task whose name is longer than 32 characters
  3. Run it and then terminate it (since no worker picks up the task)

Expected behavior
For SIMPLE tasks, the value 'SIMPLE' should be inserted into the task_type column, or the task_type column should be extended to a length of 255 (like task_name).
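As a rough sketch of the second option (widening the column), a one-off JDBC migration could look like the snippet below; the index table name task_index is an assumption here, so verify it against the actual Conductor Postgres schema before running anything like this.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;
import java.sql.Statement;

public class WidenTaskTypeColumn {
    public static void main(String[] args) throws SQLException {
        // Placeholder connection details; the table name task_index is an assumption.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/conductor", "postgres", "postgres");
             Statement st = conn.createStatement()) {
            // Widen task_type to match task_name (255 characters), as proposed above.
            st.execute("ALTER TABLE task_index ALTER COLUMN task_type TYPE varchar(255)");
        }
    }
}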

Additional context
Bug initially reported at: Netflix/conductor-community#252

@Prebiusta

I have faced exactly the same issue and had to revert back to ES indexing.


bjpirt commented Apr 2, 2024

OK - should be pretty straightforward to change the column length in the database schema for this

@Prebiusta

I think this issue can be closed


vinize commented May 13, 2024

Looks good in version 3.19.0.

@vinize vinize closed this as completed May 13, 2024
@Robban1980

@vinize was the code updated to insert the type instead of the name in the task_type column? I only see a PR for making the field length longer.
