Is it possible to schedule queries and upload the resultset to a remote file server or S3-like storage? #240
Replies: 3 comments
-
This could alternatively be solved by other tools that are good at scheduling and workflow management, like Apache Airflow. Airflow and its HTTP provider operators could be used in an Airflow DAG to schedule a call to DB2REST at a point in time, or even trigger it when an upstream task updates a dataset. Apache DolphinScheduler would be my pick instead if you had complex task dependencies or conditions that decide whether to fire on a particular schedule. I'm not sure about this feature request: it already seems solved by many other tools that are robust, with time-zone support, state handling, HTTP calls, and handling of results, statuses, and conditions.
-
As for resultset formats and handling them as files: it depends on the needs and environment, really. I can say that PostgreSQL is very powerful in this regard; through libpq it exposes things like PGresult and PQexecParams, which give access to the details of a query result.
-
It is indeed a good value addition. We may also use the scheduler provided by Spring; since we are already using Spring Boot, it may be easier to use it within the existing framework.
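Along the lines of using Spring's scheduler, here is a minimal sketch of what such a job could look like inside a Spring Boot app: an @Scheduled method calls a DB2REST read endpoint over HTTP and uploads the JSON resultset to an S3-compatible bucket with the AWS SDK v2. The endpoint path, cron expression, bucket name, and object key are illustrative placeholders, not DB2REST's actual API.

```java
// Minimal sketch (not DB2REST code): a Spring Boot scheduled job that fetches a
// resultset from a hypothetical DB2REST endpoint and uploads it to S3.
import java.time.LocalDate;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.scheduling.annotation.EnableScheduling;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;
import org.springframework.web.client.RestTemplate;

import software.amazon.awssdk.core.sync.RequestBody;
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.PutObjectRequest;

@SpringBootApplication
@EnableScheduling
public class ScheduledExportApplication {

    public static void main(String[] args) {
        SpringApplication.run(ScheduledExportApplication.class, args);
    }

    @Component
    static class ResultSetExporter {

        private final RestTemplate restTemplate = new RestTemplate();
        private final S3Client s3 = S3Client.create();

        // Runs every day at 02:00; the cron expression is just an example.
        @Scheduled(cron = "0 0 2 * * *")
        public void exportQueryResult() {
            // Placeholder URL for a DB2REST read endpoint returning JSON.
            String json = restTemplate.getForObject(
                    "http://localhost:8080/v1/rdbms/mydb/employee", String.class);

            // Upload the resultset as a dated object to an S3-compatible store.
            String key = "exports/employee-" + LocalDate.now() + ".json";
            s3.putObject(
                    PutObjectRequest.builder()
                            .bucket("my-report-bucket")
                            .key(key)
                            .build(),
                    RequestBody.fromString(json == null ? "[]" : json));
        }
    }
}
```

Keeping the schedule in application code like this stays within the existing framework, though it leaves out the retry, state, and dependency handling that the external-orchestrator suggestion above would provide.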
-
(a question from banking and telecom friends)
Ultimately this raises the question of whether DB2REST should have state management, which I think is needed for scheduling purposes.