[AIRFLOW-4479] ImapAttachmentToS3Operator s3_overwrite arg not wired up (#5311)
Status: Closed
- [AIRFLOW-4455] Add an assert to test_sync_to_db to confirm subdag.fileloc == dag.fileloc; add a new test, test_dag_details_subdag, to check the dag_details endpoint for subdags.
- Add replace=self.s3_overwrite to s3_hook.load_bytes.
- [AIRFLOW-4479] Imap-to-S3 overwrite not wired up.
- Revert "Imap to s3 overwrite not wired up".
- (apache#5233) Our new environment already sets this env var, but we aren't using it yet.
- Removing __future__ imports should have been done in apache#5020, but in the meantime our code base refactored models and moved some code out of __init__.py. This PR removes __future__ imports in models.
- (apache#5242) The issue raised in https://issues.apache.org/jira/browse/AIRFLOW-2522 has been resolved: scope is not ignored when default credentials are used, so this note should be deleted. Scope is read in https://github.com/apache/airflow/blob/master/airflow/contrib/hooks/gcp_api_base_hook.py#L88-L92; when default credentials are used, the code at https://github.com/apache/airflow/blob/master/airflow/contrib/hooks/gcp_api_base_hook.py#L94-L97 passes the scope to the external library.
- (apache#5147) models.baseoperator gets relatives by attribute, but _downstream_task_ids and _upstream_task_ids are private members. This patch gets relatives from the existing function get_direct_relative_ids instead.
- (apache#5248) [AIRFLOW-4467] Add dataproc_jars to templated fields in Dataproc operators. apache#5192 edited the docs for dataproc_pig_jars in DataProcPigOperator, dataproc_hive_jars in DataProcHiveOperator, dataproc_spark_jars in DataProcSparkSqlOperator, and dataproc_hadoop_jars in DataProcHadoopOperator, saying these fields are templated when they are not; this PR makes them templated as the docs suggest. Also fixes flake8.
- (apache#4050) If we run tasks via sudo, the AIRFLOW__ config env vars are no longer visible (without them showing up in `ps`), and we likely don't have permission to run the _cmd's specified to find the passwords. But if we run as the same user, there is no need to "bake" those options into the temporary config file: if the operator decided they didn't want those values appearing in a config file on disk, let's do our best to respect that.
- (apache#5253) jinja2 cannot use dicts/lists as templates, so converting them to JSON solves this while keeping complexity down.
- [AIRFLOW-4468] Add sql_alchemy_max_overflow parameter.
- (apache#4551) [AIRFLOW-2955] Fix the Kubernetes pod operator to set requests and limits on task pods; remove a bare except to satisfy flake8; remove an unused library; resolve conflicts; clean up commits; resolve nits from @galuszkak and @dimberman.
- Ensure that backfill respects task_concurrency; that is, the number of concurrently running tasks across DAG runs should not exceed task_concurrency.
- Added an example of a function returning a DAG object.
- Add a Papermill operator to productize Python notebooks.
- (apache#5261) Move the k8s executor out of the contrib folder. Given that the k8s executor is now fully supported by core committers, we should move it from contrib to the primary executor directory.
- [AIRFLOW-3888] HA for metastore connection. Creating a connection to a metastore with two hosts for high availability (e.g. connection 1, connection 2) is not possible because the entire entered value is taken as a single host. For our needs, it is necessary to go through the hosts in turn and connect to the first one that works; this change allows checking and then connecting to a working metastore. Adds a helper function to base_hook, updates webhdfs_hook, adds tests, and includes flake8, typo, and test fixes.
- (apache#4923) [AIRFLOW-4092] Add gRPCOperator with a unit test and an auto-doc entry; fix documentation errors; remove the hook dispatcher and auth_type, as they are not used for now.
- [AIRFLOW-4174] Fix run with backoff; fix flake8 issues.
- Fully support Pig options.
- (apache#5264) [AIRFLOW-4457] Enhance task logs by providing the task context.
- Update test: pass in the s3_overwrite kwarg.
Codecov Report
@@            Coverage Diff            @@
##           master    #5311     +/-  ##
=========================================
+ Coverage   78.92%   78.92%   +<.01%
=========================================
  Files         479      479
  Lines       30098    30098
=========================================
+ Hits        23755    23756       +1
+ Misses       6343     6342       -1
Continue to review full report at Codecov.
Make sure you have checked all steps below.
Jira
Description
Fixes the ImapAttachmentToS3Operator s3_overwrite argument: it was accepted by the operator but never passed through to the S3 hook.
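A minimal sketch of the wiring described above. The classes below are simplified stand-ins for Airflow's S3Hook and ImapAttachmentToS3Operator (attachment fetching is omitted, and the fake hook just records its call arguments); they only illustrate how the operator's s3_overwrite flag must be forwarded as load_bytes' replace parameter, which otherwise defaults to not overwriting.

```python
class FakeS3Hook:
    """Stand-in for S3Hook; records the kwargs that load_bytes receives."""

    def __init__(self):
        self.calls = []

    def load_bytes(self, bytes_data, key, bucket_name=None, replace=False):
        # Before the fix, callers that omitted `replace` always got the
        # default (no overwrite), regardless of the operator's setting.
        self.calls.append({"key": key, "replace": replace})


class ImapAttachmentToS3Operator:
    """Simplified operator showing the fixed upload step."""

    def __init__(self, imap_attachment_name, s3_key, s3_overwrite=False):
        self.imap_attachment_name = imap_attachment_name
        self.s3_key = s3_key
        self.s3_overwrite = s3_overwrite

    def execute(self, s3_hook, attachment_payload):
        # The fix: forward the operator's s3_overwrite flag as the hook's
        # `replace` argument instead of silently dropping it.
        s3_hook.load_bytes(
            bytes_data=attachment_payload,
            key=self.s3_key,
            replace=self.s3_overwrite,
        )


hook = FakeS3Hook()
op = ImapAttachmentToS3Operator("report.csv", "attachments/report.csv",
                                s3_overwrite=True)
op.execute(hook, b"col1,col2")
print(hook.calls[0]["replace"])  # → True
```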
Tests
Adjusts the existing tests.
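A hedged sketch of the kind of test adjustment this implies: mock the S3 hook and assert that load_bytes is called with replace equal to the operator's s3_overwrite flag. The names and payload here are illustrative, not the PR's actual test module.

```python
from unittest import mock

# Mocked hook standing in for S3Hook in the operator's execute path.
s3_hook = mock.Mock()
s3_overwrite = True

# Stand-in for the operator's upload step after the fix.
s3_hook.load_bytes(bytes_data=b"payload", key="attachment.csv",
                   replace=s3_overwrite)

# The adjusted assertion: the s3_overwrite kwarg must be wired through.
s3_hook.load_bytes.assert_called_once_with(
    bytes_data=b"payload", key="attachment.csv", replace=True
)
print("assertion passed")
```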
Commits
Documentation
Code Quality
flake8