Tags: goto/dagger
[chore] Exclude okhttp3 (#58)
* chore: Exclude com.squareup.okhttp3 and com.squareup.okhttp, which break InfluxDB functionality
* fix: exclude com.squareup.okhttp3 group from depot
* fix: revert old group exclusion
Co-authored-by: Mayank Rai <mayank.rai@gojek.com>
chore: Get OSS/COS configurations from arguments via paramTool instead of env_vars (#57)
* chore: Get OSS/COS configurations from arguments via paramTool instead of env_vars
* github actions cache@v4
* bump dagger version to 0.11.6
Co-authored-by: rajuGT <raju.gt@gojek.com>
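The entry above moves OSS/COS credentials from environment variables to program arguments (dagger reads them through Flink's paramTool). A minimal sketch of that pattern, assuming nothing about dagger's actual classes — the stand-in parser below mimics `--key value` argument parsing, and the key names are taken from the env vars listed elsewhere in this changelog:

```java
import java.util.HashMap;
import java.util.Map;

// Stand-in for ParameterTool-style argument parsing: credentials arrive
// as "--key value" program arguments instead of environment variables.
// The class name and key names are illustrative, not dagger's API.
public final class ArgConfigSketch {
    public static Map<String, String> fromArgs(String[] args) {
        Map<String, String> params = new HashMap<>();
        for (int i = 0; i < args.length - 1; i += 2) {
            if (args[i].startsWith("--")) {
                params.put(args[i].substring(2), args[i + 1]);
            }
        }
        return params;
    }

    public static void main(String[] args) {
        Map<String, String> conf = fromArgs(new String[]{
            "--OSS_ACCESS_KEY_ID", "key-id",
            "--COS_REGION", "ap-singapore"});
        System.out.println(conf.get("COS_REGION")); // prints "ap-singapore"
    }
}
```

Passing secrets as arguments keeps them out of the container environment, at the cost of requiring them in the job submission.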
chore: COS client get credentials using OIDC providers for dart & python udf objects (#54)
* chore: COS client get credentials using OIDC providers for dart & python udf objects
* Refactor Cos client to use OIDCRoleArnProvider
* checkstyle fixes
* checkstyle fix
* bump up version to 0.11.3
Co-authored-by: rajuGT <raju.gt@gojek.com>
cos-lib-upgrade: upgraded flink-cos-fs-hadoop for HA and hadoop-cos for parquet source reading (#53)
* cos-lib-upgrade: upgraded flink-cos-fs-hadoop for HA and hadoop-cos for parquet source reading
  Note: the hadoop-cos implementation classes change:
    <property>
      <name>fs.cosn.impl</name>
      <value>org.apache.hadoop.fs.CosFileSystem</value>
    </property>
    <property>
      <name>fs.AbstractFileSystem.cosn.impl</name>
      <value>org.apache.hadoop.fs.CosN</value>
    </property>
    <property>
      <name>fs.cosn.credentials.provider</name>
      <value>org.apache.hadoop.fs.auth.OIDCRoleArnCredentialsProvider</value>
    </property>
* Use hadoop-cos 8.3.17, which has the workload identity (keyless) feature
Co-authored-by: rajuGT <raju.gt@gojek.com>
Bump up version 0.11.0 (#51) -- Enable Dagger Parquet Source feature using Ali OSS Service (#49)
* Add gradle tasks to minimal and dependencies to maven local
* Add capability for dagger to read python udfs from Ali (oss) and Tencent (cosn) storage services, given the configuration is provided correctly. Set the below environment variables to access the files stored in the respective bucket:
  Ali (oss):
    - OSS_ACCESS_KEY_ID
    - OSS_ACCESS_KEY_SECRET
  Tencent (cos):
    - COS_SECRET_ID
    - COS_SECRET_KEY
    - COS_REGION
* OSS client endpoint should be configurable via ENV variable
* COS filesystem high availability support. To use the COS filesystem for dagger, provide the cos bucket/key configuration in state.backend.fs.checkpointdir, state.savepoints.dir, and high-availability.storageDir in the flinkdeployment manifest. If the filesystem protocol for these configurations begins with cosn, dagger uses the below configurations from the flinkdeployment manifest file:
    fs.cosn.impl: org.apache.hadoop.fs.CosFileSystem
    fs.AbstractFileSystem.cosn.impl: org.apache.hadoop.fs.CosN
    fs.cosn.userinfo.secretId: <secretID>
    fs.cosn.userinfo.secretKey: <secretKey>
    fs.cosn.bucket.region: <region>
    fs.cosn.bucket.endpoint_suffix: <tencent-provided-prefix.xyz.com>
* Fix checkstyle and make constants static variables
* Refactor Dart feature to plug in other object storage service providers
* test checkstyle fix
* Dart support for OSS service provider
* fix checkstyle
* Dart support for COS service provider
* Dart implementation fix: the object storage clients aren't serializable. Most client implementations, including GCS, do not implement java.io.Serializable, so the fix keeps the client out of serialization; when the provider is shipped over the wire and the client doesn't exist yet, it is initialized as and when required:
    // In a distributed system, we don't intend the client to be serialized, and most of the implementations, like
    // the GCP Storage implementation, don't implement the java.io.Serializable interface, so you may see the below error:
    // Caused by: org.apache.flink.api.common.InvalidProgramException: com.google.api.services.storage.Storage@1c666a8f
    // is not serializable. The object probably contains or references non serializable fields.
    // Caused by: java.io.NotSerializableException: com.google.api.services.storage.Storage
* checkstyle fix
* Add unit tests for DartDataStoreClientProvider
* Enable Dagger Parquet Source feature using Ali OSS Service
Co-authored-by: rajuGT <raju.gt@gojek.com>
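The serialization fix described in that entry can be sketched as follows — a provider that is itself Serializable but holds the storage client in a transient field and rebuilds it lazily after deserialization. The class and method names here are illustrative, not dagger's actual DartDataStoreClientProvider API:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

// Sketch: the non-serializable storage client (GCS/OSS/COS) is excluded
// from serialization via `transient` and re-created on first use on the
// worker, avoiding NotSerializableException when Flink ships the operator.
public class DartClientProviderSketch implements Serializable {
    private transient Object storageClient; // stand-in for a real client

    public Object getClient() {
        if (storageClient == null) {        // null after deserialization,
            storageClient = buildClient();  // so rebuild on demand
        }
        return storageClient;
    }

    private Object buildClient() {
        return new Object(); // stand-in for real client construction
    }

    public static void main(String[] args) throws Exception {
        DartClientProviderSketch provider = new DartClientProviderSketch();
        provider.getClient();

        // Round-trip through Java serialization, as Flink would:
        // the transient client is dropped in transit...
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        ObjectOutputStream out = new ObjectOutputStream(bytes);
        out.writeObject(provider);
        out.flush();
        DartClientProviderSketch copy = (DartClientProviderSketch)
            new ObjectInputStream(
                new ByteArrayInputStream(bytes.toByteArray())).readObject();

        // ...and lazily re-created on first use.
        System.out.println(copy.getClient() != null); // prints "true"
    }
}
```

The lazy null-check on the worker side is what makes "initialized as and when required" work after the object crosses the wire.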
feat: Add dynamic source and sink kafka properties (#31)
* add kafka security module
* Add SASL class callback config for producer and consumer
* Add config map
* remove build.gradle
* Add dynamic props
* Update regex
* rename var
* Remove redundant imports
* Rename prefix
* Remove unused import
* Update test
* Add implementation for sink dynamic props
* Add null checking for the additional props
* Added validations on source config
* Add docs and refactor pattern to enum
* Checkstyle
* Add readme
* Make the pattern more specific and embedded in the enum
* Add more tests
* bump version
* Add unit tests
* Use expected annotation
* Assert exception message; add fail mechanism in case no exception is thrown
* Use rule for asserting exceptions
* Add more test cases
* add more unit tests
* feat: Enable multiple underscore parsing
* test: Add test on multiple underscore parsing
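A sketch of what "dynamic kafka properties" with "multiple underscore parsing" plausibly looks like, inferred from the commit titles only — the prefix constant and the strip-lowercase-dots mapping below are assumptions, not dagger's exact convention:

```java
import java.util.Map;
import java.util.Properties;

// Assumed convention: config keys with a known prefix are forwarded to the
// Kafka consumer after stripping the prefix, lower-casing, and turning
// underscores into dots. Multiple-underscore keys map to multi-segment
// Kafka names, e.g. ..._SASL_JAAS_CONFIG -> sasl.jaas.config.
public final class DynamicKafkaPropsSketch {
    static final String CONSUMER_PREFIX = "SOURCE_KAFKA_CONSUMER_CONFIG_";

    public static String toKafkaKey(String configKey) {
        return configKey.substring(CONSUMER_PREFIX.length())
                        .toLowerCase()
                        .replace('_', '.'); // handles multiple underscores
    }

    public static Properties extract(Map<String, String> conf) {
        Properties props = new Properties();
        for (Map.Entry<String, String> e : conf.entrySet()) {
            if (e.getKey().startsWith(CONSUMER_PREFIX)) { // null-safe: skips others
                props.setProperty(toKafkaKey(e.getKey()), e.getValue());
            }
        }
        return props;
    }

    public static void main(String[] args) {
        System.out.println(toKafkaKey("SOURCE_KAFKA_CONSUMER_CONFIG_SASL_MECHANISM"));
        // prints "sasl.mechanism"
    }
}
```

Keying the forwarding on a specific prefix (rather than forwarding everything) matches the "make the pattern more specific and embedded in the enum" commit: only explicitly marked properties reach the Kafka client.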