
[AIRFLOW-2747] Explicit re-schedule of sensors #3596

Merged
merged 7 commits into apache:master Sep 21, 2018

Conversation

6 participants
@seelmann
Member

seelmann commented Jul 12, 2018

Make sure you have checked all steps below.

JIRA

Description

  • Here are some details about my PR, including screenshots of any UI changes:
    • Please see Jira for detailed description of the proposal and screenshots

Tests

  • My PR adds the following unit tests
    • test_ready_to_reschedule_dep.py
    • New tests in test_base_sensor.py

Commits

  • My commits all reference JIRA issues in their subject lines, and I have squashed multiple commits if they address the same issue. In addition, my commits follow the guidelines from "How to write a good git commit message":
    1. Subject is separated from body by a blank line
    2. Subject is limited to 50 characters
    3. Subject does not end with a period
    4. Subject uses the imperative mood ("add", not "adding")
    5. Body wraps at 72 characters
    6. Body explains "what" and "why", not "how"

Documentation

  • In case of new functionality, my PR adds documentation that describes how to use it.
    • When adding new operators/hooks/sensors, the autoclass documentation generation needs to be added.

Code Quality

  • Passes git diff upstream/master -u -- "*.py" | flake8 --diff
@codecov-io


codecov-io commented Jul 12, 2018

Codecov Report

Merging #3596 into master will increase coverage by 0.05%.
The diff coverage is 88.57%.

Impacted file tree graph

@@            Coverage Diff             @@
##           master    #3596      +/-   ##
==========================================
+ Coverage   77.43%   77.49%   +0.05%     
==========================================
  Files         203      204       +1     
  Lines       15882    15906      +24     
==========================================
+ Hits        12299    12326      +27     
+ Misses       3583     3580       -3
Impacted Files Coverage Δ
airflow/sensors/base_sensor_operator.py 97.61% <100%> (+0.95%) ⬆️
airflow/exceptions.py 100% <100%> (ø) ⬆️
airflow/ti_deps/dep_context.py 100% <100%> (ø) ⬆️
airflow/ti_deps/deps/ready_to_reschedule.py 100% <100%> (ø)
airflow/models.py 88.97% <100%> (+0.22%) ⬆️
airflow/www/views.py 69.13% <40%> (-0.18%) ⬇️
airflow/www_rbac/views.py 72.39% <40%> (-0.23%) ⬇️
airflow/www_rbac/utils.py 67.1% <0%> (-1.84%) ⬇️
airflow/bin/cli.py 64.78% <0%> (-0.7%) ⬇️
airflow/www/utils.py 88.75% <0%> (-0.6%) ⬇️
... and 15 more

Continue to review full report at Codecov.

Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 5c7ecbb...f69b6b4. Read the comment docs.

@Fokko

Contributor

Fokko commented Aug 24, 2018

@seelmann Can you base onto master?

@Fokko

Contributor

Fokko commented Aug 24, 2018

Maybe we could even make the rescheduling default behaviour for Airflow 2.0, and get rid of the blocking tasks. That would also simplify the code/logic.

@seelmann seelmann force-pushed the seelmann:AIRFLOW-2747-explicit-reschedule-of-sensors branch from 80e9c47 to fbd2714 Aug 25, 2018

@seelmann

Member Author

seelmann commented Aug 25, 2018

Rebased. 66 tests failed; they all seem AWS (S3, Redshift, Dynamo) related. Is this a known problem?

@Fokko

Contributor

Fokko commented Aug 26, 2018

@seelmann I think this might have to do with a new release of Boto3: https://pypi.org/project/boto3/#history

@Fokko Fokko referenced this pull request Aug 26, 2018

Merged

[AIRFLOW-2960] Pin boto3 to <1.8 #3810

1 of 1 task complete

@seelmann seelmann force-pushed the seelmann:AIRFLOW-2747-explicit-reschedule-of-sensors branch from fbd2714 to f69b6b4 Aug 26, 2018

[AIRFLOW-2747] Explicit re-schedule of sensors
Add `mode` property to sensors. If set to `reschedule` an
AirflowRescheduleException is raised instead of sleeping which sets
the task back to state `NONE`. Reschedules are recorded in new
`task_schedule` table and visualized in the Gantt view. New TI
dependency checks if a sensor task is ready to be re-scheduled.
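The loop change this commit message describes can be sketched as follows. This is an illustrative stand-in, not the actual Airflow source (the real code lives in airflow/sensors/base_sensor_operator.py and airflow/exceptions.py), and the class names here are made up:

```python
from datetime import datetime, timedelta, timezone

# Stand-in for airflow.exceptions.AirflowRescheduleException; illustrative only.
class AirflowRescheduleException(Exception):
    def __init__(self, reschedule_date):
        super().__init__("reschedule until %s" % reschedule_date)
        self.reschedule_date = reschedule_date

class RescheduleModeSensor:
    """Toy sensor showing the poke-vs-reschedule branch described above."""

    def __init__(self, poke_interval=60, mode='poke'):
        self.poke_interval = poke_interval
        self.mode = mode

    def poke(self, context):
        # Real sensors override this with the actual condition check.
        return False

    def run_once(self, context):
        if self.poke(context):
            return True
        if self.mode == 'reschedule':
            # Instead of sleeping (and blocking a worker slot), hand the
            # task back to the scheduler with a date to try again.
            raise AirflowRescheduleException(
                datetime.now(timezone.utc)
                + timedelta(seconds=self.poke_interval))
        return False  # 'poke' mode would sleep(self.poke_interval) here
```

The key point is that raising frees the worker slot between pokes, whereas the old behaviour kept the slot occupied for the whole sleep.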

@seelmann seelmann force-pushed the seelmann:AIRFLOW-2747-explicit-reschedule-of-sensors branch from f69b6b4 to a1f4b52 Sep 16, 2018

@seelmann

Member Author

seelmann commented Sep 16, 2018

@Fokko I rebased again and made the following changes:

  • Instead of using a reschedule boolean flag I switched to a mode property with possible values poke (default) and reschedule, to allow other modes in the future (e.g. in_scheduler as mentioned by Maxime).
  • Changed the Gantt view based on feedback from Pedro, see screenshots in the Jira issue.
  • Changed the CSS class in Gantt for the NONE state to display as white instead of black, consistent with the graph and tree views.
  • Added more tests and updated the documentation.

From my PoV it's ready to be merged.
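The first bullet can be illustrated with a toy version of the mode property. Names here are hypothetical; the real validation sits in BaseSensorOperator:

```python
# Toy illustration of the `mode` property described above; not Airflow source.
VALID_MODES = ('poke', 'reschedule')  # e.g. 'in_scheduler' could join later

class SensorConfig:
    def __init__(self, poke_interval=60, timeout=7 * 24 * 60 * 60, mode='poke'):
        if mode not in VALID_MODES:
            raise ValueError(
                "The mode must be one of %s, received %r" % (VALID_MODES, mode))
        self.poke_interval = poke_interval
        self.timeout = timeout
        self.mode = mode

    @property
    def reschedule(self):
        # Convenience flag consulted by the execute loop and the TI dependency.
        return self.mode == 'reschedule'
```

A string-valued mode keeps the public API stable if more scheduling strategies are added later, which a boolean flag could not do.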

@feng-tao

Contributor

feng-tao commented Sep 17, 2018

I will try to spend some time on the pr tomorrow.

@feng-tao
Contributor

feng-tao left a comment

Done a quick pass. Overall it makes sense. Great work :)

@@ -56,8 +56,8 @@

 from sqlalchemy import (
     Column, Integer, String, DateTime, Text, Boolean, ForeignKey, PickleType,
-    Index, Float, LargeBinary, UniqueConstraint)
+    Index, Float, LargeBinary, UniqueConstraint, ForeignKeyConstraint)
 from sqlalchemy import func, or_, and_, true as sqltrue


@feng-tao

feng-tao Sep 17, 2018

Contributor

small nit: it would be good to keep the import list sorted (given you are touching this line :))


@seelmann

seelmann Sep 18, 2018

Author Member

Hm, the line is already not sorted (Index, Float) ;). Should I then sort the whole 3-line import statement?

-    Index, Float, LargeBinary, UniqueConstraint)
-from sqlalchemy import func, or_, and_, true as sqltrue
+    Index, Float, LargeBinary, UniqueConstraint, ForeignKeyConstraint)
+from sqlalchemy import func, or_, and_, true as sqltrue, asc


@feng-tao

feng-tao Sep 17, 2018

Contributor

how does this line work (`import func, or_, and_, true as sqltrue, asc`)? I wonder whether you want to do `from sqlalchemy import asc` instead?


@seelmann

seelmann Sep 18, 2018

Author Member

There are already two from sqlalchemy import ... statements: the first imports types, the second imports (SQL) expressions.
I can introduce a third one.
Or combine all into one like this (lexicographically sorted):

from sqlalchemy import (
    Boolean, Column, DateTime, Float, ForeignKey, ForeignKeyConstraint, Index,
    Integer, LargeBinary, PickleType, String, Text, UniqueConstraint, 
    and_, asc, func, or_, true as sqltrue
)


@seelmann

seelmann Sep 18, 2018

Author Member

I changed it like this, let me know what you think.

(Resolved review thread: airflow/models.py, outdated)

if not test_mode:
    session.merge(self)
session.commit()


@feng-tao

feng-tao Sep 17, 2018

Contributor

do we need some exception handling / rollback in case the db write doesn't go through?


@feng-tao

feng-tao Sep 17, 2018

Contributor

and do we want to put session.commit() under the if not test_mode block as well?


@seelmann

seelmann Sep 18, 2018

Author Member

It's the same pattern as in handle_failure. I didn't think much about it; I'll give it some more thought...


@seelmann

seelmann Sep 18, 2018

Author Member

I tried to figure out what that test_mode is about. If I understand it correctly, it has nothing to do with "unit_test_mode" but only with the CLI's "test" subcommand [1]. It says "... without ... recording its state in the database.". I wasn't aware of that.
In handle_failure() the TaskFail is written to the DB even in test mode, but the task instance itself is not updated.
For handle_reschedule() I don't think it makes sense to record that event, but I'd like to verify it and probably also add some unit tests...
[1] https://airflow.apache.org/cli.html#test


@seelmann

seelmann Sep 20, 2018

Author Member

I changed it to skip reschedule handling completely when in test_mode, added a test, and verified the behaviour via the command line.
Regarding the first question about exception handling and rollback: that's covered by the surrounding @provide_session decorator.
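The rollback point can be sketched like this. Note this is an illustrative re-implementation of the idea, not Airflow's actual @provide_session from airflow/utils/db.py, and FakeSession is a made-up stand-in for a SQLAlchemy session:

```python
import functools

class FakeSession:
    """Minimal stand-in for a SQLAlchemy session, for illustration."""
    def __init__(self):
        self.committed = False
        self.rolled_back = False
        self.closed = False
    def commit(self):
        self.committed = True
    def rollback(self):
        self.rolled_back = True
    def close(self):
        self.closed = True

def provide_session(func):
    # Illustrative: supply a session, commit on success, and roll back if
    # the wrapped call fails, so a failed DB write is not left half-applied.
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        session = kwargs.pop('session', None) or FakeSession()
        try:
            result = func(*args, session=session, **kwargs)
            session.commit()
            return result
        except Exception:
            session.rollback()
            raise
        finally:
            session.close()
    return wrapper
```

With this shape, any exception raised inside the decorated function (a failed _handle_reschedule DB write included) triggers a rollback before the exception propagates.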

return self.mode == 'reschedule'

@property
def deps(self):


@feng-tao

feng-tao Sep 17, 2018

Contributor

could you add a comment on what this dependency is for?


@seelmann

seelmann Sep 18, 2018

Author Member

Done

IS_TASK_DEP = True

@provide_session
def _get_dep_statuses(self, ti, session, dep_context):


@feng-tao

feng-tao Sep 17, 2018

Contributor

maybe add a comment on the logic flow, i.e. how it determines whether it can reschedule or not?


@seelmann

seelmann Sep 18, 2018

Author Member

Done

(Resolved review threads: airflow/www/views.py and airflow/www_rbac/views.py, outdated)
# If reschedule, use first start date of current try
task_reschedules = TaskReschedule.find_for_task_instance(context['ti'])
if task_reschedules:
    started_at = task_reschedules[0].start_date
while not self.poke(context):
    if (timezone.utcnow() - started_at).total_seconds() > self.timeout:


@feng-tao

feng-tao Sep 17, 2018

Contributor

maybe a dumb question, but is it now possible for (timezone.utcnow() - started_at) to be a negative timedelta? Will (timezone.utcnow() - started_at).total_seconds() throw an exception?


@seelmann

seelmann Sep 18, 2018

Author Member

Normally that should not be the case because started_at is always in the past. Of course clocks are never in sync (except at Google) so it may happen. But even in such a case datetime.timedelta.total_seconds() doesn't throw an exception; it returns a negative number, which is still smaller than the configured timeout (if the user configured a positive timeout, lol)
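The total_seconds() claim is easy to check directly (timestamps below are made up for illustration):

```python
from datetime import datetime, timedelta, timezone

# Simulate clock skew: started_at lies 30 seconds "in the future".
now = datetime(2018, 9, 18, 12, 0, 0, tzinfo=timezone.utc)
started_at = now + timedelta(seconds=30)

elapsed = (now - started_at).total_seconds()  # no exception, just negative
timeout = 7 * 24 * 60 * 60                    # a typical positive timeout

print(elapsed)            # -30.0
print(elapsed > timeout)  # False: the sensor simply keeps poking
```

So even with skewed clocks the timeout check degrades gracefully rather than crashing the task.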

@@ -75,11 +104,24 @@ def execute(self, context):
                 raise AirflowSkipException('Snap. Time is OUT.')
             else:
                 raise AirflowSensorTimeout('Snap. Time is OUT.')
-        sleep(self.poke_interval)
+        if self.reschedule:
+            reschedule_date = timezone.utcnow() + timedelta(


@feng-tao

feng-tao Sep 17, 2018

Contributor

and should it be `reschedule_date = started_at + timedelta(` instead?


@seelmann

seelmann Sep 18, 2018

Author Member

No. In "reschedule" mode started_at is always set to the initial schedule time (when the task instance was scheduled the first time). started_at is only used to determine if timeout is reached.
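Worked through with made-up numbers, the distinction between the two dates is:

```python
from datetime import datetime, timedelta, timezone

poke_interval = timedelta(minutes=5)
first_schedule = datetime(2018, 9, 18, 12, 0, tzinfo=timezone.utc)

# Some pokes later; "now" is 17 minutes after the first try started.
now = first_schedule + timedelta(minutes=17)

started_at = first_schedule            # timeout base: pinned to the first try
reschedule_date = now + poke_interval  # next poke: always relative to "now"

elapsed = (now - started_at).total_seconds()
print(elapsed)           # 1020.0 seconds counted toward the timeout
print(reschedule_date)   # 2018-09-18 12:22:00+00:00
```

Basing reschedule_date on started_at instead would schedule the next poke in the past once the first interval has elapsed, which is why the two dates use different bases.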

@feng-tao

Contributor

feng-tao commented Sep 17, 2018

@seelmann , I put some small comments / questions on the pr. Great work!

@codecov-io


codecov-io commented Sep 18, 2018

Codecov Report

Merging #3596 into master will increase coverage by 0.1%.
The diff coverage is 82.53%.

Impacted file tree graph

@@            Coverage Diff            @@
##           master    #3596     +/-   ##
=========================================
+ Coverage   77.52%   77.63%   +0.1%     
=========================================
  Files         198      199      +1     
  Lines       15842    16042    +200     
=========================================
+ Hits        12282    12454    +172     
- Misses       3560     3588     +28
Impacted Files Coverage Δ
airflow/models.py 89.01% <100%> (+0.2%) ⬆️
airflow/sensors/base_sensor_operator.py 97.87% <100%> (+1.2%) ⬆️
airflow/exceptions.py 100% <100%> (ø) ⬆️
airflow/ti_deps/dep_context.py 100% <100%> (ø) ⬆️
airflow/ti_deps/deps/ready_to_reschedule.py 100% <100%> (ø)
airflow/www_rbac/views.py 72.04% <35.29%> (-0.59%) ⬇️
airflow/www/views.py 68.85% <35.29%> (-0.46%) ⬇️
airflow/bin/cli.py 64.74% <0%> (ø) ⬆️
airflow/www/app.py 100% <0%> (ø) ⬆️
... and 5 more

Continue to review full report at Codecov.

Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 0e5eee8...1c0ee70. Read the comment docs.

@seelmann

Member Author

seelmann commented Sep 20, 2018

@feng-tao I replied to and resolved all your questions and comments, PTAL. For now I added additional commits on top; when everything is fine I'll squash and rebase.

@feng-tao

Contributor

feng-tao commented Sep 21, 2018

Will take a look tonight.

@feng-tao

Contributor

feng-tao commented Sep 21, 2018

Thanks @seelmann . LGTM.

@feng-tao feng-tao merged commit dc59d7e into apache:master Sep 21, 2018

1 check passed

continuous-integration/travis-ci/pr The Travis CI build passed
@Fokko

Contributor

Fokko commented Sep 25, 2018

@mistercrunch PTAL, you can shut down some idling (or should I say poking) machines now ;)

xnuinside added a commit to xnuinside/incubator-airflow that referenced this pull request Oct 1, 2018

[AIRFLOW-2747] Explicit re-schedule of sensors (apache#3596)
* [AIRFLOW-2747] Explicit re-schedule of sensors

Add `mode` property to sensors. If set to `reschedule` an
AirflowRescheduleException is raised instead of sleeping which sets
the task back to state `NONE`. Reschedules are recorded in new
`task_schedule` table and visualized in the Gantt view. New TI
dependency checks if a sensor task is ready to be re-scheduled.

* Reformat sqlalchemy imports

* Make `_handle_reschedule` private

* Remove print

* Add comment

* Add comment

* Don't record reschedule request in test mode

wmorris75 pushed a commit to modmed/incubator-airflow that referenced this pull request Oct 5, 2018

wayne.morris
[AIRFLOW-2993] s3_to_sftp and sftp_to_s3 operators apache#3828
Rearranged input parameters for sftp_to_s3_operator.

[AIRFLOW-2988] Run specifically python2 for dataflow (apache#3826)

Apache beam does not yet support python3, so it's best to run dataflow
jobs with python2 specifically until python3 support is complete
(BEAM-1251), in case the user's 'python' in PATH is python3.

[AIRFLOW-3035] Allow custom 'job_error_states' in dataproc ops (apache#3884)

Allow caller to pass in custom list of Dataproc job states into the
DataProc*Operator classes that should result in the
_DataProcJob.raise_error() method raising an Exception.

[AIRFLOW-3034]: Readme updates : Add Slack & Twitter, remove Gitter

[AIRFLOW-3056] Add happn to Airflow user list

[AIRFLOW-3052] Add logo options to Airflow (apache#3892)

[AIRFLOW-3060] DAG context manager fails to exit properly in certain circumstances

[AIRFLOW-2524] Add SageMaker Batch Inference (apache#3767)

* Fix for comments
* Fix sensor test
* Update non_terminal_states and failed_states to static variables of SageMakerHook

Add SageMaker Transform Operator & Sensor
Co-authored-by: srrajeev-aws <srrajeev@amazon.com>

[AIRFLOW-2772] Fix Bug in BigQuery hook for Partitioned Table  (apache#3901)

[AIRFLOW-XXX] Added Jeitto as one of happy Airflow users! (apache#3902)

[AIRFLOW-XXX] Add Jeitto as one happy Airflow user!

[AIRFLOW-3044] Dataflow operators accept templated job_name param (apache#3887)

* Default value of new job_name param is templated task_id, to match the
existing behavior as much as possible.
* Change expected value in test_mlengine_operator_utils.py to match
default for new job_name param.

[AIRFLOW-2707] Validate task_log_reader on upgrade from <=1.9 (apache#3881)

We changed the default logging config and config from 1.9 to 1.10, but
anyone who upgrades and has an existing airflow.cfg won't know they need
to change this value - instead they will get nothing displayed in the UI
(ajax request fails) and see "'NoneType' object has no attribute 'read'"
in the error log.

This validates that config section at start up, and seamlessly upgrades
the old previous value.

[AIRFLOW-3025] Enable specifying dns and dns_search options for DockerOperator (apache#3860)

Enable specifying dns and dns_search options for DockerOperator

[AIRFLOW-1298] Clear UPSTREAM_FAILED using the clean cli (apache#3886)

* [AIRFLOW-1298] Fix 'clear only_failed'

* [AIRFLOW-1298] Fix 'clear only_failed'

[AIRFLOW-3059] Log how many rows are read from Postgres (apache#3905)

To know how much data is being read from Postgres, it is nice to log
this to the Airflow log.

Previously when there was no data, it would still create a single file.
This is not something that we want, and therefore we've changed this
behaviour.

Refactored the tests to make use of Postgres itself since we have it
running. This makes the tests more realistic, instead of mocking
everything.

[AIRFLOW-XXX] Fix typo in docs/timezone.rst (apache#3904)

[AIRFLOW-3070] Refine web UI authentication-related docs (apache#3863)

[AIRFLOW-3068] Remove deprecated imports

[AIRFLOW-3036] Add relevant ECS options to ECS operator. (apache#3908)

The ECS operator currently supports only a subset of available options
for running ECS tasks. This patch adds all ECS options that could be
relevant to airflow; options that wouldn't make sense here, like
`count`, were skipped.

[AIRFLOW-1195] Add feature to clear tasks in Parent Dag (apache#3907)

[AIRFLOW-3073] Add note-Profiling feature not supported in new webserver (apache#3909)

Adhoc queries and Charts features are no longer supported in new
FAB-based webserver and UI. But this is not mentioned at all in the doc
"Data Profiling" (https://airflow.incubator.apache.org/profiling.html)

This commit adds a note to remind users for this.

[AIRFLOW-XXX] Fix SlackWebhookOperator docs (apache#3915)

The docs refer to `conn_id` while the actual argument is `http_conn_id`.

[AIRFLOW-1441] Fix inconsistent tutorial code (apache#2466)

[AIRFLOW-XXX] Add 90 Seconds to companies

[AIRFLOW-3096] Reduce DaysUntilStale for probot/stale

[AIRFLOW-3096] Further reduce DaysUntilStale for probo/stale

[AIRFLOW-3072] Assign permission get_logs_with_metadata to viewer role (apache#3913)

[AIRFLOW-3090] Demote dag start/stop log messages to debug (apache#3920)

[AIRFLOW-2407] Use feature detection for reload() (apache#3298)

* [AIRFLOW-2407] Use feature detection for reload()

[Use feature detection instead of version detection](https://docs.python.org/3/howto/pyporting.html#use-feature-detection-instead-of-version-detection) is a Python porting best practice that avoids a flake8 undefined name error...

flake8 testing of https://github.com/apache/incubator-airflow on Python 3.6.3


[AIRFLOW-XXX] Fix a wrong sample bash command, a display issue & a few typos (apache#3924)

[AIRFLOW-3090] Make No tasks to consider for execution debug (apache#3923)

During normal operation, it is not necessary to see the message.  This
can only be useful when debugging an issue.

AIRFLOW-2952 Fix Kubernetes CI (apache#3922)

The current dockerised CI pipeline doesn't run minikube and the
Kubernetes integration tests. This starts a Kubernetes cluster
using minikube and runs k8s integration tests using docker-compose.

[AIRFLOW-2918] Fix Flake8 violations (apache#3931)

[AIRFLOW-3076] Remove preloading of MySQL testdata (apache#3911)

One of the things for tests is being self contained. This means that
it should not depend on anything external, such as loading data.

This PR will use the setUp and tearDown to load the data into MySQL
and remove it afterwards. This removes the actual bash mysql commands
and will make it easier to dockerize the whole testsuite in the future

[AIRFLOW-2887] Added BigQueryCreateEmptyDatasetOperator and create_empty_dataset to bigquery_hook (apache#3876)

[AIRFLOW-2918] Remove unused imports

[AIRFLOW-3099] Stop Missing Section Errors for optional sections (apache#3934)

[AIRFLOW-3090] Specify path of key file in log message (apache#3921)

[AIRFLOW-3067] Display www_rbac Flask flash msg properly (apache#3903)

The Flask flash messages are not displayed properly.

When we don't give a category for a flash message, the default
value will be 'message'. In some cases, we specify the 'error'
category.

Using Flask-AppBuilder, the flash message will be given
a CSS class 'alert-[category]'. But We don't have
'alert-message' or 'alert-error' in the current
'bootstrap-theme.css' file.

This makes the flash messages in the www_rbac UI come with
no background color.

This commit addresses this issue by adding 'alert-message'
(using specs of existing CSS class 'alert-info') and
'alert-error' (using specs of existing CSS class 'alert-danger')
into 'bootstrap-theme.css'.

[AIRFLOW-3109] Bugfix to allow user/op roles to clear task instance via UI by default

add show statements to hql filtering.

[AIRFLOW-3051] Change CLI to make users ops similar to connections

The ability to manipulate users from the  command line is a bit clunky.  Currently 'airflow create_user' and 'airflow delete_user' and 'airflow list_users'.  It seems that these ought to be made more like connections, so that it becomes 'airflow users list ...', 'airflow users delete ...' and 'airflow users create ...'

[AIRFLOW-3009] Import Hashable from collection.abc to fix Python 3.7 deprecation warning (apache#3849)

[AIRFLOW-XXX] Add Tesla as an Apache Airflow user (apache#3947)

[AIRFLOW-3111] Fix instructions in UPDATING.md and remove comment (apache#3944)

artifacts in default_airflow.cfg

- fixed incorrect instructions in UPDATING.md regarding core.log_filename_template and elasticsearch.elasticsearch_log_id_template
- removed comments referencing "additional curly braces" from
default_airflow.cfg since they're irrelevant to the rendered airflow.cfg

[AIRFLOW-3117] Add instructions to allow GPL dependency (apache#3949)

The installation instructions failed to mention how to proceed with the GPL dependency. For those who are not concerned by GPL, it is useful to know how to proceed with GPL dependency.

[AIRFLOW-XXX] Add Square to the companies lists

[AIRFLOW-XXX] Add Fathom Health to readme

[AIRFLOW-XXX] Pin Click to 6.7 to Fix CI (apache#3962)

[AIRFLOW-XXX] Fix SlackWebhookOperator execute method comment (apache#3963)

[AIRFLOW-3100][AIRFLOW-3101] Improve docker compose local testing (apache#3933)

[AIRFLOW-3127] Fix out-dated doc for Celery SSL (apache#3967)

Now in `airflow.cfg`, for Celery-SSL, the item names are
"ssl_active", "ssl_key", "ssl_cert", and "ssl_cacert".
(since PR https://github.com/apache/incubator-airflow/pull/2806/files)

But in the documentation
https://airflow.incubator.apache.org/security.html?highlight=celery
or
https://github.com/apache/incubator-airflow/blob/master/docs/security.rst,
it's "CELERY_SSL_ACTIVE", "CELERY_SSL_KEY", "CELERY_SSL_CERT", and
"CELERY_SSL_CACERT", which is out-dated and may confuse readers.

[AIRFLOW-XXX] Fix PythonVirtualenvOperator tests (apache#3968)

The recent update to the CI image changed the default
python from python2 to python3. The PythonVirtualenvOperator
tests expected python2 as default and fail due to
serialisation errors.

[AIRFLOW-2952] Fix Kubernetes CI (apache#3957)

- Update outdated cli command to create user
- Remove `airflow/example_dags_kubernetes` as the dag already exists in `contrib/example_dags/`
- Update the path to copy K8s dags

[AIRFLOW-3104] Add .airflowignore info into doc (apache#3939)

.airflowignore is a nice feature, but it was not mentioned at all in the documentation.

[AIRFLOW-3130] Add CLI docs for users command

[AIRFLOW-XXX] Add Delete for CLI Example in UPDATING.md

[AIRFLOW-3123] Use a stack for DAG context management (apache#3956)

[AIRFLOW-3125] Monitor Task Instances creation rates (apache#3966)

Monitor Task Instance creation rates by Operator type.
These stats can provide some visibility on how much workload Airflow is
getting. They can be used for resource allocation in the long run (i.e.
to determine when we should scale up workers) and debugging in scenarios
like the creation rate of certain type of Task Instances spikes.

[AIRFLOW-3129] Backfill mysql hook unit tests. (apache#3970)

[AIRFLOW-3124] Fix RBAC webserver debug mode (apache#3958)

[AIRFLOW-XXX] Add Compass to companies list (apache#3972)

We're using Airflow at Compass now.

[AIRFLOW-XXX] Speed up DagBagTest cases (apache#3974)

I noticed that many of the tests of DagBags operate on a specific DAG
only, and don't need to load the example or test dags. By not loading
the dags we don't need, this shaves about 10-20s off the test time.

[AIRFLOW-2912] Add Deploy and Delete operators for GCF (apache#3969)

Both Deploy and Delete operators interact with Google
Cloud Functions to manage functions. Both are idempotent
and make use of GcfHook - hook that encapsulates
communication with GCP over GCP API.

[AIRFLOW-1390] Update Alembic to 0.9 (apache#3935)

[AIRFLOW-2238] Update PR tool to remove outdated info (apache#3978)

[AIRFLOW-XXX] Don't spam test logs with "bad cron expression" messages (apache#3973)

We needed these test dags to check the behaviour of invalid cron
expressions, but by default we were loading them every time we create a
DagBag (which many, many tests do).

Instead we ignore these known-bad dags by default, and the test checking
those (tests/models.py:DagBagTest.test_process_file_cron_validity_check)
is already explicitly processing those DAGs directly, so it remains
tested.

[AIRFLOW-XXX] Fix undocumented params in S3_hook

Some function parameters were undocumented. Additional docstrings
were added for clarity.

[AIRFLOW-3079] Improve migration scripts to support MSSQL Server (apache#3964)

There were two problems for MSSQL.  First, 'timestamp' data type in MSSQL Server
is essentially a row-id, and not a timezone enabled date/time stamp. Second, alembic
creates invalid SQL when applying the 0/1 constraint to boolean values. MSSQL should
enforce this constraint by simply asserting a boolean value.

[AIRFLOW-XXX] Add DoorDash to README.md (apache#3980)

DoorDash uses Airflow https://softwareengineeringdaily.com/2018/09/28/doordash/

[AIRFLOW-3062] Add Qubole in integration docs (apache#3946)

[AIRFLOW-3129] Improve test coverage of airflow.models. (apache#3982)

[AIRFLOW-2574] Cope with '%' in SQLA DSN when running migrations (apache#3787)

Alembic uses a ConfigParser like Airflow does, and "%% is a special
value in there, so we need to escape it. As per the Alembic docs:

> Note that this value is passed to ConfigParser.set, which supports
> variable interpolation using pyformat (e.g. `%(some_value)s`). A raw
> percent sign not part of an interpolation symbol must therefore be
> escaped, e.g. `%%`

[AIRFLOW-3137] Make ProxyFix middleware optional. (apache#3983)

The ProxyFix middleware should only be used when airflow is running
behind a trusted proxy. This patch adds a `USE_PROXY_FIX` flag that
defaults to `False`.

[AIRFLOW-3004] Add config disabling scheduler cron (apache#3899)

[AIRFLOW-3103][AIRFLOW-3147] Update flask-appbuilder (apache#3937)

[AIRFLOW-2993] s3_to_sftp and sftp_to_s3 operators apache#3828
Added apply_default decorator.

Added test for operators

 [AIRFLOW-XXX] Fixing the issue in Documentation (apache#3998)

Fixing the operator name from DataFlowOperation  to DataFlowJavaOperator  in Documentation

[AIRFLOW-3088] Include slack-compatible emoji image

[AIRFLOW-3161] fix TaskInstance log link in RBAC UI

[AIRFLOW-3148] Remove unnecessary arg "parameters" in RedshiftToS3Transfer (apache#3995)

"Parameters" are used to help render the SQL command.
But in this operator, only "schema" and "table" are needed.
There is no SQL command to render.

By checking the code,we can also find argument
"parameters" is never really used.

(Fix a minor issue in the docstring as well)

[AIRFLOW-3159] Update GCS logging docs for latest code (apache#3952)

Formatted code


wmorris75 pushed a commit to modmed/incubator-airflow that referenced this pull request Oct 5, 2018

[AIRFLOW-2993] s3_to_sftp and sftp_to_s3 operators (apache#3828)
[AIRFLOW-2709] Improve error handling in Databricks hook (apache#3570)

* Use float for default value
* Use status code to determine whether an error is retryable
* Fix wrong type in assertion
* Fix style to prevent lines from exceeding 90 characters
* Fix wrong way of checking exception type

[AIRFLOW-2854] kubernetes_pod_operator add more configuration items (apache#3697)

* kubernetes_pod_operator add more configuration items
* fix test_kubernetes_pod_operator test_faulty_service_account failure case
* fix review comment issues
* pod_operator add hostnetwork config
* add doc example

[AIRFLOW-2994] Fix command status check in Qubole Check operator (apache#3790)

[AIRFLOW-2928] Use uuid4 instead of uuid1 (apache#3779)

for better randomness.

[AIRFLOW-2949] Add syntax highlight for single quote strings (apache#3795)

* AIRFLOW-2949: Add syntax highlight for single quote strings

* AIRFLOW-2949: Also updated new UI main.css

[AIRFLOW-2948] Arg check & better doc - SSHOperator & SFTPOperator (apache#3793)

There may be different combinations of arguments, and
some processing is done 'silently', while users
may not be fully aware of it.

For example
- User only needs to provide either `ssh_hook`
  or `ssh_conn_id`, while this is not clear in doc
- if both provided, `ssh_conn_id` will be ignored.
- if `remote_host` is provided, it will replace
the `remote_host` which was defined in `ssh_hook`
  or predefined in the connection of `ssh_conn_id`

These should be documented clearly to ensure it's
transparent to the users. log.info() should also be
used to remind users and provide clear logs.

In addition, add instance check for ssh_hook to ensure
it is of the correct type (SSHHook).

Tests are updated for this PR.

[AIRFLOW-XXX] Fix Broken Link in CONTRIBUTING.md

[AIRFLOW-2980] ReadTheDocs - Fix Missing API Reference

[AIRFLOW-2984] Convert operator dates to UTC (apache#3822)

Tasks can have start_dates or end_dates separately
from the DAG. These need to be converted to UTC otherwise
we cannot use them for calculation the next execution
date.
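A minimal stdlib sketch of that conversion (Airflow itself uses its own timezone utilities; `to_utc` is illustrative):

```python
from datetime import datetime, timedelta, timezone

def to_utc(dt):
    """Normalize an aware datetime to UTC so schedule math is consistent."""
    if dt.tzinfo is None:
        raise ValueError("naive datetime: cannot convert safely")
    return dt.astimezone(timezone.utc)

cet = timezone(timedelta(hours=2))  # e.g. CEST
start = datetime(2018, 9, 1, 8, 0, tzinfo=cet)
print(to_utc(start).isoformat())  # 2018-09-01T06:00:00+00:00
```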

[AIRFLOW-2779] Make GHE auth third party licensed (apache#3803)

This reinstates the original license.

[AIRFLOW-XXX] Add Format to list of companies (apache#3824)

[AIRFLOW-2900] Show code for packaged DAGs (apache#3749)

[AIRFLOW-2983] Add prev_ds_nodash and next_ds_nodash macro (apache#3821)

[AIRFLOW-XXX] Fix Docstrings for Operators (apache#3820)

[AIRFLOW-2951] Update dag_run table end_date when state change (apache#3798)

The existing Airflow only changes the dag_run table's end_date value when
a user terminates a dag in the web UI. The end_date will not be updated
if Airflow detects a dag finished and updates its state.

This commit adds an end_date update in DagRun's set_state function to
fix the problem mentioned above.

[AIRFLOW-2145] fix deadlock on clearing running TI (apache#3657)

a `shutdown` task is not considered to be `unfinished`, so a dag run can
deadlock when all `unfinished` downstreams are all waiting on a task
that's in the `shutdown` state. fix this by considering `shutdown` to
be `unfinished`, since it's not truly a terminal state

[AIRFLOW-XXX] Fix typo in docstring of gcs_to_bq (apache#3833)

[AIRFLOW-2476] Allow tabulate up to 0.8.2 (apache#3835)

[AIRFLOW-XXX] Fix typos in faq.rst (apache#3837)

[AIRFLOW-2979] Make celery_result_backend conf Backwards compatible (apache#3832)

(apache#2806) Renamed `celery_result_backend` to `result_backend` and broke backwards compatibility.

[AIRFLOW-2866] Fix missing CSRF token head when using RBAC UI (apache#3804)

[AIRFLOW-491] Add feature to pass extra api configs to BQ Hook (apache#3733)

[AIRFLOW-208] Add badge to show supported Python versions (apache#3839)

[AIRFLOW-3007] Update backfill example in Scheduler docs

The scheduler docs at https://airflow.apache.org/scheduler.html#backfill-and-catchup use a deprecated way of passing `schedule_interval`. `schedule_interval` should be passed to the DAG as a separate parameter and not as a default arg.

[AIRFLOW-3005] Replace 'Airbnb Airflow' with 'Apache Airflow' (apache#3845)

[AIRFLOW-3002] Fix variable & tests in GoogleCloudBucketHelper (apache#3843)

[AIRFLOW-2991] Log path to driver output after Dataproc job (apache#3827)

[AIRFLOW-XXX] Fix python3 and flake8 errors in dev/airflow-jira

This is a script that checks if the Jiras marked as fixed in a release
are actually merged in - getting this working is helpful to me in
preparing 1.10.1

[AIRFLOW-3006] Add note on using None for schedule_interval

[AIRFLOW-3003] Pull the krb5 image instead of building (apache#3844)

Pull the image instead of building it, this will speed up the CI
process since we don't have to build it every time.

[AIRFLOW-2883] Add import and export for pool cli using JSON

[AIRFLOW-2847] Remove legacy imports support for plugins (apache#3692)

[AIRFLOW-1998] Implemented DatabricksRunNowOperator for jobs/run-now … (apache#3813)

Add functionality to kick off a Databricks job right away.

* Per feedback: fixed a documentation error,
  reintegrated the execute and on_kill onto the objects.
* Fixed a documentation issue.

[AIRFLOW-3021] Add Censys to who uses Airflow list

> Censys
> Find and analyze every reachable server and device on the Internet
> https://censys.io/

closes AIRFLOW-3021 https://issues.apache.org/jira/browse/AIRFLOW-3021

[AIRFLOW-3018] Fix Minor issues in Documentation

Add Branch to Company List

[AIRFLOW-3023] Fix docstring datatypes

[AIRFLOW-3008] Move Kubernetes example DAGs to contrib

[AIRFLOW-2997] Support cluster fields in bigquery (apache#3838)

This adds a cluster_fields argument to the bigquery hook, GCS to
bigquery operator and bigquery query operators. This field requests that
bigquery store the result of the query/load operation sorted according
to the specified fields (the order of fields given is significant).
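A load-job configuration carrying a clustering spec looks roughly like this (table and field names are examples; clustering is used together with a partitioning spec):

```python
# Illustrative BigQuery load-job configuration fragment; the clustering
# spec asks BigQuery to sort stored data by the listed fields, in order.
configuration = {
    "load": {
        "destinationTable": {"projectId": "p", "datasetId": "d", "tableId": "t"},
        "timePartitioning": {"type": "DAY"},
        "clustering": {"fields": ["customer_id", "event_date"]},
    }
}
```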

[AIRFLOW-XXX] Redirect FAQ `airflow[crypto]` to How-to Guides.

[AIRFLOW-XXX] Remove redundant space in Kerberos (apache#3866)

[AIRFLOW-3028] Update Text & Images in Readme.md

[AIRFLOW-1917] Trim extra newline and trailing whitespace from log (apache#3862)

[AIRFLOW-2985] Operators for S3 object copying/deleting (apache#3823)

1. Copying:
Under the hood, it's `boto3.client.copy_object()`.
It can only handle the situation in which the
S3 connection used can access both source and
destination bucket/key.

2. Deleting:
2.1 Under the hood, it's `boto3.client.delete_objects()`.
It supports either deleting one single object or
multiple objects.
2.2 If users try to delete a non-existent object, the
request will still succeed, but there will be an
entry 'Errors' in the response. There may also be
other reasons which may cause similar 'Errors' (the
request itself would succeed without an explicit
exception). So an argument `silent_on_errors` is added
to let users decide if this sort of 'Errors' should
fail the operator.

The corresponding methods are added into S3Hook, and
these two operators are 'wrappers' of these methods.
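The 'Errors' handling in 2.2 can be sketched as response-checking logic. `check_delete_response` is a hypothetical helper; the response shape follows what `boto3.client.delete_objects()` returns.

```python
# Hypothetical sketch of the 'Errors' handling described above; the
# operator's real implementation lives in S3Hook / the delete operator.
def check_delete_response(response, silent_on_errors=False):
    """Raise unless the delete succeeded or errors are explicitly tolerated."""
    errors = response.get("Errors") or []
    if errors and not silent_on_errors:
        raise RuntimeError("S3 delete reported errors: %s" % errors)
    return len(response.get("Deleted", []))
```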

[AIRFLOW-3030] Fix CLI docs (apache#3872)

[AIRFLOW-XXX] Update kubernetes.rst docs (apache#3875)

Update kubernetes.rst with correct KubernetesPodOperator inputs
for the volumes.

[AIRFLOW-XXX] Add Enigma to list of companies

[AIRFLOW-2965] CLI tool to show the next execution datetime

Cover different cases

- schedule_interval is "@once" or None, then following_schedule
  method would always return None
- If dag is paused, print reminder
- If latest_execution_date is not found, print warning saying
  not applicable.

[AIRFLOW-XXX] Add Bombora Inc using Airflow

[AIRFLOW-2156] Parallelize Celery Executor task state fetching (apache#3830)

[AIRFLOW-XXX] Move Dag level access control out of 1.10 section (apache#3882)

It isn't in 1.10 (and wasn't in this section when the PR was created).

[AIRFLOW-3040] Enable ProBot to clean up stale Pull Requests (apache#3883)

[AIRFLOW-3012] Fix Bug when passing emails for SLA

[AIRFLOW-2797] Create Google Dataproc cluster with custom image (apache#3871)

[AIRFLOW-XXX] Updated README  to include CAVA

Addressed comments in PR with appropriate refactoring of s3-sftp operators.
Added s3-sftp operator links

[AIRFLOW-2993] s3_to_sftp and sftp_to_s3 operators apache#3828

Rearranged input parameters for sftp_to_s3_operator.

[AIRFLOW-2988] Run specifically python2 for dataflow (apache#3826)

Apache Beam does not yet support python3, so it's best to run dataflow
jobs with python2 specifically until python3 support is complete
(BEAM-1251), in case the user's 'python' in PATH is python3.

[AIRFLOW-3035] Allow custom 'job_error_states' in dataproc ops (apache#3884)

Allow caller to pass in custom list of Dataproc job states into the
DataProc*Operator classes that should result in the
_DataProcJob.raise_error() method raising an Exception.

[AIRFLOW-3034]: Readme updates : Add Slack & Twitter, remove Gitter

[AIRFLOW-3056] Add happn to Airflow user list

[AIRFLOW-3052] Add logo options to Airflow (apache#3892)

[AIRFLOW-3060] DAG context manager fails to exit properly in certain circumstances

[AIRFLOW-2524] Add SageMaker Batch Inference (apache#3767)

* Fix for comments
* Fix sensor test
* Update non_terminal_states and failed_states to static variables of SageMakerHook

Add SageMaker Transform Operator & Sensor
Co-authored-by: srrajeev-aws <srrajeev@amazon.com>

[AIRFLOW-2772] Fix Bug in BigQuery hook for Partitioned Table  (apache#3901)

[AIRFLOW-XXX] Added Jeitto as one of happy Airflow users! (apache#3902)

[AIRFLOW-XXX] Add Jeitto as one happy Airflow user!

[AIRFLOW-3044] Dataflow operators accept templated job_name param (apache#3887)

* Default value of new job_name param is templated task_id, to match the
existing behavior as much as possible.
* Change expected value in test_mlengine_operator_utils.py to match
default for new job_name param.

[AIRFLOW-2707] Validate task_log_reader on upgrade from <=1.9 (apache#3881)

We changed the default logging config and config from 1.9 to 1.10, but
anyone who upgrades and has an existing airflow.cfg won't know they need
to change this value - instead they will get nothing displayed in the UI
(ajax request fails) and see "'NoneType' object has no attribute 'read'"
in the error log.

This validates the config section at start up, and seamlessly upgrades
the old value.

[AIRFLOW-3025] Enable specifying dns and dns_search options for DockerOperator (apache#3860)

Enable specifying dns and dns_search options for DockerOperator

[AIRFLOW-1298] Clear UPSTREAM_FAILED using the clean cli (apache#3886)

* [AIRFLOW-1298] Fix 'clear only_failed'

* [AIRFLOW-1298] Fix 'clear only_failed'

[AIRFLOW-3059] Log how many rows are read from Postgres (apache#3905)

To know how much data is being read from Postgres, it is nice to log
this to the Airflow log.

Previously when there was no data, it would still create a single file.
This is not something that we want, and therefore we've changed this
behaviour.

Refactored the tests to make use of Postgres itself since we have it
running. This makes the tests more realistic, instead of mocking
everything.

[AIRFLOW-XXX] Fix typo in docs/timezone.rst (apache#3904)

[AIRFLOW-3070] Refine web UI authentication-related docs (apache#3863)

[AIRFLOW-3068] Remove deprecated imports

[AIRFLOW-3036] Add relevant ECS options to ECS operator. (apache#3908)

The ECS operator currently supports only a subset of available options
for running ECS tasks. This patch adds all ECS options that could be
relevant to airflow; options that wouldn't make sense here, like
`count`, were skipped.

[AIRFLOW-1195] Add feature to clear tasks in Parent Dag (apache#3907)

[AIRFLOW-3073] Add note-Profiling feature not supported in new webserver (apache#3909)

Adhoc queries and Charts features are no longer supported in new
FAB-based webserver and UI. But this is not mentioned at all in the doc
"Data Profiling" (https://airflow.incubator.apache.org/profiling.html)

This commit adds a note to remind users of this.

[AIRFLOW-XXX] Fix SlackWebhookOperator docs (apache#3915)

The docs refer to `conn_id` while the actual argument is `http_conn_id`.

[AIRFLOW-1441] Fix inconsistent tutorial code (apache#2466)

[AIRFLOW-XXX] Add 90 Seconds to companies

[AIRFLOW-3096] Reduce DaysUntilStale for probot/stale

[AIRFLOW-3096] Further reduce DaysUntilStale for probot/stale

[AIRFLOW-3072] Assign permission get_logs_with_metadata to viewer role (apache#3913)

[AIRFLOW-3090] Demote dag start/stop log messages to debug (apache#3920)

[AIRFLOW-2407] Use feature detection for reload() (apache#3298)

* [AIRFLOW-2407] Use feature detection for reload()

[Use feature detection instead of version detection](https://docs.python.org/3/howto/pyporting.html#use-feature-detection-instead-of-version-detection) is a Python porting best practice that avoids a flake8 undefined name error...

flake8 testing of https://github.com/apache/incubator-airflow on Python 3.6.3

[AIRFLOW-2747] Explicit re-schedule of sensors (apache#3596)

* [AIRFLOW-2747] Explicit re-schedule of sensors

Add `mode` property to sensors. If set to `reschedule` an
AirflowRescheduleException is raised instead of sleeping which sets
the task back to state `NONE`. Reschedules are recorded in new
`task_schedule` table and visualized in the Gantt view. New TI
dependency checks if a sensor task is ready to be re-scheduled.

* Reformat sqlalchemy imports

* Make `_handle_reschedule` private

* Remove print

* Add comment

* Add comment

* Don't record reschedule request in test mode
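The mechanism described above can be sketched as a poke loop. This is a rough stand-in only: `RescheduleException` and `run_sensor` are illustrative, not Airflow's actual classes.

```python
import time

class RescheduleException(Exception):
    """Stand-in for Airflow's AirflowRescheduleException."""
    def __init__(self, reschedule_date):
        self.reschedule_date = reschedule_date

def run_sensor(poke, poke_interval=60, mode="poke", now=time.time):
    """Sketch of the two sensor modes: block in-process, or hand the slot back."""
    while not poke():
        if mode == "reschedule":
            # Free the worker slot; the scheduler re-queues the task later.
            raise RescheduleException(now() + poke_interval)
        time.sleep(poke_interval)
    return True
```

In `poke` mode the worker slot stays occupied for the sensor's whole lifetime; in `reschedule` mode the exception hands control back to the scheduler between pokes.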

[AIRFLOW-XXX] Fix a wrong sample bash command, a display issue & a few typos (apache#3924)

[AIRFLOW-3090] Make No tasks to consider for execution debug (apache#3923)

During normal operation, it is not necessary to see the message.  This
can only be useful when debugging an issue.

[AIRFLOW-2952] Fix Kubernetes CI (apache#3922)

The current dockerised CI pipeline doesn't run minikube and the
Kubernetes integration tests. This starts a Kubernetes cluster
using minikube and runs k8s integration tests using docker-compose.

[AIRFLOW-2918] Fix Flake8 violations (apache#3931)

[AIRFLOW-3076] Remove preloading of MySQL testdata (apache#3911)

One of the key properties of tests is being self-contained. This means
they should not depend on anything external, such as preloaded data.

This PR will use the setUp and tearDown to load the data into MySQL
and remove it afterwards. This removes the actual bash mysql commands
and will make it easier to dockerize the whole testsuite in the future

[AIRFLOW-2887] Added BigQueryCreateEmptyDatasetOperator and create_emty_dataset to bigquery_hook (apache#3876)

[AIRFLOW-2918] Remove unused imports

[AIRFLOW-3099] Stop Missing Section Errors for optional sections (apache#3934)

[AIRFLOW-3090] Specify path of key file in log message (apache#3921)

[AIRFLOW-3067] Display www_rbac Flask flash msg properly (apache#3903)

The Flask flash messages are not displayed properly.

When we don't give a category for a flash message, the default
value will be 'message'. In some cases, we specify the 'error'
category.

Using Flask-AppBuilder, the flash message will be given
a CSS class 'alert-[category]'. But we don't have
'alert-message' or 'alert-error' in the current
'bootstrap-theme.css' file.

This makes the flash messages in the www_rbac UI come with
no background color.

This commit addresses this issue by adding 'alert-message'
(using specs of existing CSS class 'alert-info') and
'alert-error' (using specs of existing CSS class 'alert-danger')
into 'bootstrap-theme.css'.
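The added classes might look like this (the exact colors are assumptions based on Bootstrap's alert-info and alert-danger specs):

```css
/* alert-message reuses the alert-info look, alert-error the alert-danger look */
.alert-message { background-color: #d9edf7; border-color: #bce8f1; color: #31708f; }
.alert-error   { background-color: #f2dede; border-color: #ebccd1; color: #a94442; }
```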

[AIRFLOW-3109] Bugfix to allow user/op roles to clear task instances via UI by default

add show statements to hql filtering.

[AIRFLOW-3051] Change CLI to make users ops similar to connections

The ability to manipulate users from the command line is a bit clunky. Currently there are 'airflow create_user', 'airflow delete_user' and 'airflow list_users'. It seems these ought to be made more like connections, so that it becomes 'airflow users list ...', 'airflow users delete ...' and 'airflow users create ...'

[AIRFLOW-3009] Import Hashable from collection.abc to fix Python 3.7 deprecation warning (apache#3849)

[AIRFLOW-XXX] Add Tesla as an Apache Airflow user (apache#3947)

[AIRFLOW-3111] Fix instructions in UPDATING.md and remove comment (apache#3944)

artifacts in default_airflow.cfg

- fixed incorrect instructions in UPDATING.md regarding core.log_filename_template and elasticsearch.elasticsearch_log_id_template
- removed comments referencing "additional curly braces" from
default_airflow.cfg since they're irrelevant to the rendered airflow.cfg

[AIRFLOW-3117] Add instructions to allow GPL dependency (apache#3949)

The installation instructions failed to mention how to proceed with the GPL dependency. For those who are not concerned by GPL, it is useful to know how to proceed with the GPL dependency.

[AIRFLOW-XXX] Add Square to the companies lists

[AIRFLOW-XXX] Add Fathom Health to readme

[AIRFLOW-XXX] Pin Click to 6.7 to Fix CI (apache#3962)

[AIRFLOW-XXX] Fix SlackWebhookOperator execute method comment (apache#3963)

[AIRFLOW-3100][AIRFLOW-3101] Improve docker compose local testing (apache#3933)

[AIRFLOW-3127] Fix out-dated doc for Celery SSL (apache#3967)

Now in `airflow.cfg`, for Celery-SSL, the item names are
"ssl_active", "ssl_key", "ssl_cert", and "ssl_cacert".
(since PR https://github.com/apache/incubator-airflow/pull/2806/files)

But in the documentation
https://airflow.incubator.apache.org/security.html?highlight=celery
or
https://github.com/apache/incubator-airflow/blob/master/docs/security.rst,
it's "CELERY_SSL_ACTIVE", "CELERY_SSL_KEY", "CELERY_SSL_CERT", and
"CELERY_SSL_CACERT", which is out-dated and may confuse readers.
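For reference, the current item names sit under the `[celery]` section of `airflow.cfg` (the paths shown are placeholders):

```ini
[celery]
ssl_active = True
ssl_key = /path/to/worker.key
ssl_cert = /path/to/worker.pem
ssl_cacert = /path/to/ca.pem
```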

[AIRFLOW-XXX] Fix PythonVirtualenvOperator tests (apache#3968)

The recent update to the CI image changed the default
python from python2 to python3. The PythonVirtualenvOperator
tests expected python2 as default and fail due to
serialisation errors.

[AIRFLOW-2952] Fix Kubernetes CI (apache#3957)

- Update outdated cli command to create user
- Remove `airflow/example_dags_kubernetes` as the dag already exists in `contrib/example_dags/`
- Update the path to copy K8s dags

[AIRFLOW-3104] Add .airflowignore info into doc (apache#3939)

.airflowignore is a nice feature, but it was not mentioned at all in the documentation.
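A minimal example of the file, where each line is a pattern matched against file paths under the DAGs folder (the patterns here are illustrative):

```
# skip everything under project_a
project_a
# skip backup copies of dag files
.*_backup\.py
```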

[AIRFLOW-3130] Add CLI docs for users command

[AIRFLOW-XXX] Add Delete for CLI Example in UPDATING.md

[AIRFLOW-3123] Use a stack for DAG context management (apache#3956)

[AIRFLOW-3125] Monitor Task Instances creation rates (apache#3966)

Monitor Task Instance creation rates by Operator type.
These stats can provide some visibility on how much workload Airflow is
getting. They can be used for resource allocation in the long run (i.e.
to determine when we should scale up workers) and for debugging scenarios
like the creation rate of a certain type of Task Instance spiking.

[AIRFLOW-3129] Backfill mysql hook unit tests. (apache#3970)

[AIRFLOW-3124] Fix RBAC webserver debug mode (apache#3958)

[AIRFLOW-XXX] Add Compass to companies list (apache#3972)

We're using Airflow at Compass now.

[AIRFLOW-XXX] Speed up DagBagTest cases (apache#3974)

I noticed that many of the DagBag tests operate on a specific DAG
only, and don't need to load the example or test dags. By not loading
the dags we don't need, this shaves about 10-20s off test time.

[AIRFLOW-2912] Add Deploy and Delete operators for GCF (apache#3969)

Both Deploy and Delete operators interact with Google
Cloud Functions to manage functions. Both are idempotent
and make use of GcfHook - hook that encapsulates
communication with GCP over GCP API.

[AIRFLOW-1390] Update Alembic to 0.9 (apache#3935)

[AIRFLOW-2238] Update PR tool to remove outdated info (apache#3978)

[AIRFLOW-XXX] Don't spam test logs with "bad cron expression" messages (apache#3973)

We needed these test dags to check the behaviour of invalid cron
expressions, but by default we were loading them every time we create a
DagBag (which many, many tests do).

Instead we ignore these known-bad dags by default, and the test checking
those (tests/models.py:DagBagTest.test_process_file_cron_validity_check)
is already explicitly processing those DAGs directly, so it remains
tested.

[AIRFLOW-XXX] Fix undocumented params in S3_hook

Some function parameters were undocumented. Additional docstrings
were added for clarity.

[AIRFLOW-3079] Improve migration scripts to support MSSQL Server (apache#3964)

There were two problems for MSSQL. First, the 'timestamp' data type in MSSQL Server
is essentially a row-id, and not a timezone enabled date/time stamp. Second, alembic
creates invalid SQL when applying the 0/1 constraint to boolean values. MSSQL should
enforce this constraint by simply asserting a boolean value.

[AIRFLOW-XXX] Add DoorDash to README.md (apache#3980)

DoorDash uses Airflow https://softwareengineeringdaily.com/2018/09/28/doordash/

[AIRFLOW-3062] Add Qubole in integration docs (apache#3946)

[AIRFLOW-3129] Improve test coverage of airflow.models. (apache#3982)

[AIRFLOW-2574] Cope with '%' in SQLA DSN when running migrations (apache#3787)

Alembic uses a ConfigParser like Airflow does, and `%` is a special
value there, so we need to escape it as `%%`. As per the Alembic docs:

> Note that this value is passed to ConfigParser.set, which supports
> variable interpolation using pyformat (e.g. `%(some_value)s`). A raw
> percent sign not part of an interpolation symbol must therefore be
> escaped, e.g. `%%`
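The escaping can be demonstrated with Python's stdlib `configparser`, which Alembic's config handling is built on (the DSN here is made up):

```python
from configparser import ConfigParser

# A raw '%' in a SQLAlchemy DSN must be doubled before handing it to a
# ConfigParser-based tool like Alembic; interpolation turns '%%' back
# into '%' on read.
dsn = "mysql://user:p%ss@localhost/airflow"
cfg = ConfigParser()
cfg.add_section("alembic")
cfg.set("alembic", "sqlalchemy.url", dsn.replace("%", "%%"))
assert cfg.get("alembic", "sqlalchemy.url") == dsn
```

Setting the unescaped value directly would raise a ValueError ("invalid interpolation syntax"), which is exactly the failure this fix copes with.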

[AIRFLOW-3137] Make ProxyFix middleware optional. (apache#3983)

The ProxyFix middleware should only be used when airflow is running
behind a trusted proxy. This patch adds a `USE_PROXY_FIX` flag that
defaults to `False`.

[AIRFLOW-3004] Add config disabling scheduler cron (apache#3899)

[AIRFLOW-3103][AIRFLOW-3147] Update flask-appbuilder (apache#3937)

[AIRFLOW-2993] s3_to_sftp and sftp_to_s3 operators apache#3828
Added apply_default decorator.

Added test for operators

[AIRFLOW-XXX] Fixing the issue in Documentation (apache#3998)

Fixing the operator name from DataFlowOperation to DataFlowJavaOperator in Documentation

[AIRFLOW-3088] Include slack-compatible emoji image

[AIRFLOW-3161] fix TaskInstance log link in RBAC UI

[AIRFLOW-3148] Remove unnecessary arg "parameters" in RedshiftToS3Transfer (apache#3995)

"Parameters" are used to help render the SQL command.
But in this operator, only "schema" and "table" are needed.
There is no SQL command to render.

By checking the code, we can also find that the argument
"parameters" is never really used.

(Fix a minor issue in the docstring as well)

[AIRFLOW-3159] Update GCS logging docs for latest code (apache#3952)

Formatted code

wmorris75 pushed a commit to modmed/incubator-airflow that referenced this pull request Oct 5, 2018

[AIRFLOW-XXX] Remove residual line in Changelog (apache#3814)
[AIRFLOW-2993] s3_to_sftp and sftp_to_s3 operators (apache#3828)

[AIRFLOW-2709] Improve error handling in Databricks hook (apache#3570)

* Use float for default value
* Use status code to determine whether an error is retryable
* Fix wrong type in assertion
* Fix style to prevent lines from exceeding 90 characters
* Fix wrong way of checking exception type

[AIRFLOW-2854] kubernetes_pod_operator add more configuration items (apache#3697)

* kubernetes_pod_operator add more configuration items
* fix test_kubernetes_pod_operator test_faulty_service_account failure case
* fix review comment issues
* pod_operator add hostnetwork config
* add doc example

[AIRFLOW-2994] Fix command status check in Qubole Check operator (apache#3790)

[AIRFLOW-2928] Use uuid4 instead of uuid1 (apache#3779)

for better randomness.

[AIRFLOW-2949] Add syntax highlight for single quote strings (apache#3795)

* AIRFLOW-2949: Add syntax highlight for single quote strings

* AIRFLOW-2949: Also updated new UI main.css

[AIRFLOW-2948] Arg check & better doc - SSHOperator & SFTPOperator (apache#3793)

There may be different combinations of arguments, and
some processings are being done 'silently', while users
may not be fully aware of them.

For example
- User only needs to provide either `ssh_hook`
  or `ssh_conn_id`, while this is not clear in doc
- if both provided, `ssh_conn_id` will be ignored.
- if `remote_host` is provided, it will replace
  the `remote_host` which wasndefined in `ssh_hook`
  or predefined in the connection of `ssh_conn_id`

These should be documented clearly to ensure it's
transparent to the users. log.info() should also be
used to remind users and provide clear logs.

In addition, add instance check for ssh_hook to ensure
it is of the correct type (SSHHook).

Tests are updated for this PR.

[AIRFLOW-XXX] Fix Broken Link in CONTRIBUTING.md

[AIRFLOW-2980] ReadTheDocs - Fix Missing API Reference

[AIRFLOW-2984] Convert operator dates to UTC (apache#3822)

Tasks can have start_dates or end_dates separately
from the DAG. These need to be converted to UTC otherwise
we cannot use them for calculation the next execution
date.

[AIRFLOW-2779] Make GHE auth third party licensed (apache#3803)

This reinstates the original license.

[AIRFLOW-XXX] Add Format to list of companies (apache#3824)

[AIRFLOW-2900] Show code for packaged DAGs (apache#3749)

[AIRFLOW-2983] Add prev_ds_nodash and next_ds_nodash macro (apache#3821)

[AIRFLOW-XXX] Fix Docstrings for Operators (apache#3820)

[AIRFLOW-2951] Update dag_run table end_date when state change (apache#3798)

The existing airflow only change dag_run table end_date value when
a user teminate a dag in web UI. The end_date will not be updated
if airflow detected a dag finished and updated its state.

This commit add end_date update in DagRun's set_state function to
make up tho problem mentioned above.

[AIRFLOW-2145] fix deadlock on clearing running TI (apache#3657)

a `shutdown` task is not considered be `unfinished`, so a dag run can
deadlock when all `unfinished` downstreams are all waiting on a task
that's in the `shutdown` state. fix this by considering `shutdown` to
be `unfinished`, since it's not truly a terminal state

[AIRFLOW-XXX] Fix typo in docstring of gcs_to_bq (apache#3833)

[AIRFLOW-2476] Allow tabulate up to 0.8.2 (apache#3835)

[AIRFLOW-XXX] Fix typos in faq.rst (apache#3837)

[AIRFLOW-2979] Make celery_result_backend conf Backwards compatible (apache#3832)

(apache#2806) Renamed `celery_result_backend` to `result_backend` and broke backwards compatibility.

[AIRFLOW-2866] Fix missing CSRF token head when using RBAC UI (apache#3804)

[AIRFLOW-491] Add feature to pass extra api configs to BQ Hook (apache#3733)

[AIRFLOW-208] Add badge to show supported Python versions (apache#3839)

[AIRFLOW-3007] Update backfill example in Scheduler docs

The scheduler docs at https://airflow.apache.org/scheduler.html#backfill-and-catchup use deprecated way of passing `schedule_interval`. `schedule_interval` should be pass to DAG as a separate parameter and not as a default arg.

[AIRFLOW-3005] Replace 'Airbnb Airflow' with 'Apache Airflow' (apache#3845)

[AIRFLOW-3002] Fix variable & tests in GoogleCloudBucketHelper (apache#3843)

[AIRFLOW-2991] Log path to driver output after Dataproc job (apache#3827)

[AIRFLOW-XXX] Fix python3 and flake8 errors in dev/airflow-jira

This is a script that checks if the Jira's marked as fixed in a release
are actually merged in - getting this working is helpful to me in
preparing 1.10.1

[AIRFLOW-3006] Add note on using None for schedule_interval

[AIRFLOW-3003] Pull the krb5 image instead of building (apache#3844)

Pull the image instead of building it, this will speed up the CI
process since we don't have to build it every time.

[AIRFLOW-2883] Add import and export for pool cli using JSON

[AIRFLOW-2847] Remove legacy imports support for plugins (apache#3692)

[AIRFLOW-1998] Implemented DatabricksRunNowOperator for jobs/run-now … (apache#3813)

Add functionality to kick of a Databricks job right away.

* Per feedback: fixed a documentation error,
  reintegrated the execute and on_kill onto the objects.
* Fixed a  documentation issue.

[AIRFLOW-3021] Add Censys to who uses Airflow list

> Censys
> Find and analyze every reachable server and device on the Internet
> https://censys.io/

closes AIRFLOW-3021 https://issues.apache.org/jira/browse/AIRFLOW-3021

[AIRFLOW-3018] Fix Minor issues in Documentation

Add Branch to Company List

[AIRFLOW-3023] Fix docstring datatypes

[AIRFLOW-3008] Move Kubernetes example DAGs to contrib

[AIRFLOW-2997] Support cluster fields in bigquery (apache#3838)

This adds a cluster_fields argument to the bigquery hook, GCS to
bigquery operator and bigquery query operators. This field requests that
bigquery store the result of the query/load operation sorted according
to the specified fields (the order of fields given is significant).

[AIRFLOW-XXX] Redirect FAQ `airflow[crypto]` to How-to Guides.

[AIRFLOW-XXX] Remove redundant space in Kerberos (apache#3866)

[AIRFLOW-3028] Update Text & Images in Readme.md

[AIRFLOW-1917] Trim extra newline and trailing whitespace from log (apache#3862)

[AIRFLOW-2985] Operators for S3 object copying/deleting (apache#3823)

1. Copying:
Under the hood, it's `boto3.client.copy_object()`.
It can only handle the situation in which the
S3 connection used can access both source and
destination bucket/key.

2. Deleting:
2.1 Under the hood, it's `boto3.client.delete_objects()`.
It supports either deleting one single object or
multiple objects.
2.2 If users try to delete a non-existent object, the
request will still succeed, but there will be an
entry 'Errors' in the response. There may also be
other reasons which may cause similar 'Errors' (
request itself would succeed without explicit
exception). So an argument `silent_on_errors` is added
to let users decide if this sort of 'Errors' should
fail the operator.

The corresponding methods are added into S3Hook, and
these two operators are 'wrappers' of these methods.

[AIRFLOW-3030] Fix CLI docs (apache#3872)

[AIRFLOW-XXX] Update kubernetes.rst docs (apache#3875)

Update kubernetes.rst with correct KubernetesPodOperator inputs
for the volumes.

[AIRFLOW-XXX] Add Enigma to list of companies

[AIRFLOW-2965] CLI tool to show the next execution datetime

Cover different cases

- schedule_interval is "@once" or None, then following_schedule
  method would always return None
- If dag is paused, print reminder
- If latest_execution_date is not found, print warning saying
  not applicable.

[AIRFLOW-XXX] Add Bombora Inc using Airflow

[AIRFLOW-2156] Parallelize Celery Executor task state fetching (apache#3830)

[AIRFLOW-XXX] Move Dag level access control out of 1.10 section (apache#3882)

It isn't in 1.10 (and wasn't in this section when the PR was created).

[AIRFLOW-3040] Enable ProBot to clean up stale Pull Requests (apache#3883)

[AIRFLOW-3012] Fix Bug when passing emails for SLA

[AIRFLOW-2797] Create Google Dataproc cluster with custom image (apache#3871)

[AIRFLOW-XXX] Updated README  to include CAVA

Addressed comments in PR with appropriate refactoring of s3-sftp operators.
Added s3-sftp operator links

[AIRFLOW-2993] s3_to_sftp and sftp_to_s3 operators apache#3828

Rearranged input parameters for sftp_to_s3_operator.

[AIRFLOW-2988] Run specifically python2 for dataflow (apache#3826)

Apache beam does not yet support python3, so it's best to run dataflow
jobs with python2 specifically until python3 support is complete
(BEAM-1251), in case if the user's 'python' in PATH is python3.

[AIRFLOW-3035] Allow custom 'job_error_states' in dataproc ops (apache#3884)

Allow caller to pass in custom list of Dataproc job states into the
DataProc*Operator classes that should result in the
_DataProcJob.raise_error() method raising an Exception.

[AIRFLOW-3034]: Readme updates : Add Slack & Twitter, remove Gitter

[AIRFLOW-3056] Add happn to Airflow user list

[AIRFLOW-3052] Add logo options to Airflow (apache#3892)

[AIRFLOW-3060] DAG context manager fails to exit properly in certain circumstances

[AIRFLOW-2524] Add SageMaker Batch Inference (apache#3767)

* Fix for comments
* Fix sensor test
* Update non_terminal_states and failed_states to static variables of SageMakerHook

Add SageMaker Transform Operator & Sensor
Co-authored-by: srrajeev-aws <srrajeev@amazon.com>

[AIRFLOW-2772] Fix Bug in BigQuery hook for Partitioned Table  (apache#3901)

[AIRFLOW-XXX] Added Jeitto as one of happy Airflow users! (apache#3902)

[AIRFLOW-XXX] Add Jeitto as one happy Airflow user!

[AIRFLOW-3044] Dataflow operators accept templated job_name param (apache#3887)

* Default value of new job_name param is templated task_id, to match the
existing behavior as much as possible.
* Change expected value in test_mlengine_operator_utils.py to match
default for new job_name param.

[AIRFLOW-2707] Validate task_log_reader on upgrade from <=1.9 (apache#3881)

We changed the default logging config and config from 1.9 to 1.10, but
anyone who upgrades and has an existing airflow.cfg won't know they need
to change this value - instead they will get nothing displayed in the UI
(ajax request fails) and see "'NoneType' object has no attribute 'read'"
in the error log.

This validates that config section at start up, and seamlessly upgrades
the old previous value.

[AIRFLOW-3025] Enable specifying dns and dns_search options for DockerOperator (apache#3860)

Enable specifying dns and dns_search options for DockerOperator

[AIRFLOW-1298] Clear UPSTREAM_FAILED using the clean cli (apache#3886)

* [AIRFLOW-1298] Fix 'clear only_failed'

* [AIRFLOW-1298] Fix 'clear only_failed'

[AIRFLOW-3059] Log how many rows are read from Postgres (apache#3905)

To know how much data is being read from Postgres, it is nice to log
this to the Airflow log.

Previously when there was no data, it would still create a single file.
This is not something that we want, and therefore we've changed this
behaviour.

Refactored the tests to make use of Postgres itself since we have it
running. This makes the tests more realistic, instead of mocking
everything.

[AIRFLOW-XXX] Fix typo in docs/timezone.rst (apache#3904)

[AIRFLOW-3070] Refine web UI authentication-related docs (apache#3863)

[AIRFLOW-3068] Remove deprecated imports

[AIRFLOW-3036] Add relevant ECS options to ECS operator. (apache#3908)

The ECS operator currently supports only a subset of available options
for running ECS tasks. This patch adds all ECS options that could be
relevant to airflow; options that wouldn't make sense here, like
`count`, were skipped.

[AIRFLOW-1195] Add feature to clear tasks in Parent Dag (apache#3907)

[AIRFLOW-3073] Add note-Profiling feature not supported in new webserver (apache#3909)

Adhoc queries and Charts features are no longer supported in new
FAB-based webserver and UI. But this is not mentioned at all in the doc
"Data Profiling" (https://airflow.incubator.apache.org/profiling.html)

This commit adds a note to remind users for this.

[AIRFLOW-XXX] Fix SlackWebhookOperator docs (apache#3915)

The docs refer to `conn_id` while the actual argument is `http_conn_id`.

[AIRFLOW-1441] Fix inconsistent tutorial code (apache#2466)

[AIRFLOW-XXX] Add 90 Seconds to companies

[AIRFLOW-3096] Reduce DaysUntilStale for probot/stale

[AIRFLOW-3096] Further reduce DaysUntilStale for probot/stale

[AIRFLOW-3072] Assign permission get_logs_with_metadata to viewer role (apache#3913)

[AIRFLOW-3090] Demote dag start/stop log messages to debug (apache#3920)

[AIRFLOW-2407] Use feature detection for reload() (apache#3298)

* [AIRFLOW-2407] Use feature detection for reload()

[Use feature detection instead of version detection](https://docs.python.org/3/howto/pyporting.html#use-feature-detection-instead-of-version-detection) is a Python porting best practice that avoids a flake8 undefined name error...

flake8 testing of https://github.com/apache/incubator-airflow on Python 3.6.3
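The porting pattern referenced above can be sketched as follows (a minimal illustration of feature detection; the actual diff may differ):

```python
# Feature detection, not version detection: pick whichever reload() exists.
try:
    reload  # builtin in Python 2
except NameError:
    try:
        from importlib import reload  # Python 3.4+
    except ImportError:
        from imp import reload  # Python 3.0 - 3.3

import json

# reload() re-executes the module in place and returns the same module object.
assert reload(json) is json
```

Because the code probes for the name instead of checking `sys.version_info`, flake8 no longer flags `reload` as an undefined name on Python 3.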

[AIRFLOW-2747] Explicit re-schedule of sensors (apache#3596)

* [AIRFLOW-2747] Explicit re-schedule of sensors

Add `mode` property to sensors. If set to `reschedule` an
AirflowRescheduleException is raised instead of sleeping which sets
the task back to state `NONE`. Reschedules are recorded in the new
`task_reschedule` table and visualized in the Gantt view. A new TI
dependency checks if a sensor task is ready to be re-scheduled.

* Reformat sqlalchemy imports

* Make `_handle_reschedule` private

* Remove print

* Add comment

* Add comment

* Don't record reschedule request in test mode
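As a usage sketch of the new sensor mode (a minimal illustration only; the import path follows the 1.10-era layout, and `check_data_ready` and `dag` are hypothetical names, not code from this PR):

```python
from airflow.sensors.base_sensor_operator import BaseSensorOperator


class DataReadySensor(BaseSensorOperator):
    def poke(self, context):
        # Return True once the external condition is met.
        return check_data_ready(context)  # hypothetical helper


wait = DataReadySensor(
    task_id="wait_for_data",
    mode="reschedule",  # release the worker slot between pokes
    poke_interval=300,  # seconds between re-schedules
    dag=dag,            # an existing DAG object
)
```

With `mode="reschedule"` the task exits between pokes instead of sleeping in the worker, which is what the `AirflowRescheduleException` path described above enables.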

[AIRFLOW-XXX] Fix a wrong sample bash command, a display issue & a few typos (apache#3924)

[AIRFLOW-3090] Make No tasks to consider for execution debug (apache#3923)

During normal operation, it is not necessary to see the message.  This
can only be useful when debugging an issue.

[AIRFLOW-2952] Fix Kubernetes CI (apache#3922)

The current dockerised CI pipeline doesn't run minikube and the
Kubernetes integration tests. This starts a Kubernetes cluster
using minikube and runs k8s integration tests using docker-compose.

[AIRFLOW-2918] Fix Flake8 violations (apache#3931)

[AIRFLOW-3076] Remove preloading of MySQL testdata (apache#3911)

One of the requirements for tests is that they be self-contained. This
means they should not depend on anything external, such as loading data.

This PR uses setUp and tearDown to load the data into MySQL
and remove it afterwards. This removes the actual bash mysql commands
and will make it easier to dockerize the whole test suite in the future.

[AIRFLOW-2887] Added BigQueryCreateEmptyDatasetOperator and create_empty_dataset to bigquery_hook (apache#3876)

[AIRFLOW-2918] Remove unused imports

[AIRFLOW-3099] Stop Missing Section Errors for optional sections (apache#3934)

[AIRFLOW-3090] Specify path of key file in log message (apache#3921)

[AIRFLOW-3067] Display www_rbac Flask flash msg properly (apache#3903)

The Flask flash messages are not displayed properly.

When we don't give a category for a flash message, the default
value will be 'message'. In some cases, we specify the 'error'
category.

Using Flask-AppBuilder, the flash message will be given
a CSS class 'alert-[category]'. But we don't have
'alert-message' or 'alert-error' in the current
'bootstrap-theme.css' file.

This makes the flash messages in the www_rbac UI come with
no background color.

This commit addresses this issue by adding 'alert-message'
(using specs of existing CSS class 'alert-info') and
'alert-error' (using specs of existing CSS class 'alert-danger')
into 'bootstrap-theme.css'.

[AIRFLOW-3109] Bugfix to allow user/op roles to clear task instance via UI by default

add show statements to hql filtering.

[AIRFLOW-3051] Change CLI to make users ops similar to connections

The ability to manipulate users from the command line is a bit clunky: currently there are 'airflow create_user', 'airflow delete_user' and 'airflow list_users'. It seems these ought to be made more like connections, so that they become 'airflow users list ...', 'airflow users delete ...' and 'airflow users create ...'

[AIRFLOW-3009] Import Hashable from collections.abc to fix Python 3.7 deprecation warning (apache#3849)
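The deprecation-safe import uses a feature-detection fallback (a minimal sketch of the pattern, not the exact diff):

```python
# Prefer the new location; fall back for Python 2.
try:
    from collections.abc import Hashable  # Python 3.3+
except ImportError:
    from collections import Hashable  # Python 2

# Strings and tuples are hashable; lists and dicts are not.
assert isinstance("airflow", Hashable)
assert not isinstance([], Hashable)
```

Importing from `collections` directly emits a DeprecationWarning on Python 3.7 and stops working entirely in Python 3.10.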

[AIRFLOW-XXX] Add Tesla as an Apache Airflow user (apache#3947)

[AIRFLOW-3111] Fix instructions in UPDATING.md and remove comment (apache#3944)

artifacts in default_airflow.cfg

- fixed incorrect instructions in UPDATING.md regarding core.log_filename_template and elasticsearch.elasticsearch_log_id_template
- removed comments referencing "additional curly braces" from
default_airflow.cfg since they're irrelevant to the rendered airflow.cfg

[AIRFLOW-3117] Add instructions to allow GPL dependency (apache#3949)

The installation instructions failed to mention how to proceed with the GPL dependency. For those who are not concerned by GPL, it is useful to know how to proceed with GPL dependency.

[AIRFLOW-XXX] Add Square to the companies lists

[AIRFLOW-XXX] Add Fathom Health to readme

[AIRFLOW-XXX] Pin Click to 6.7 to Fix CI (apache#3962)

[AIRFLOW-XXX] Fix SlackWebhookOperator execute method comment (apache#3963)

[AIRFLOW-3100][AIRFLOW-3101] Improve docker compose local testing (apache#3933)

[AIRFLOW-3127] Fix out-dated doc for Celery SSL (apache#3967)

Now in `airflow.cfg`, for Celery-SSL, the item names are
"ssl_active", "ssl_key", "ssl_cert", and "ssl_cacert".
(since PR https://github.com/apache/incubator-airflow/pull/2806/files)

But in the documentation
https://airflow.incubator.apache.org/security.html?highlight=celery
or
https://github.com/apache/incubator-airflow/blob/master/docs/security.rst,
it's "CELERY_SSL_ACTIVE", "CELERY_SSL_KEY", "CELERY_SSL_CERT", and
"CELERY_SSL_CACERT", which is out-dated and may confuse readers.

[AIRFLOW-XXX] Fix PythonVirtualenvOperator tests (apache#3968)

The recent update to the CI image changed the default
python from python2 to python3. The PythonVirtualenvOperator
tests expected python2 as default and fail due to
serialisation errors.

[AIRFLOW-2952] Fix Kubernetes CI (apache#3957)

- Update outdated cli command to create user
- Remove `airflow/example_dags_kubernetes` as the dag already exists in `contrib/example_dags/`
- Update the path to copy K8s dags

[AIRFLOW-3104] Add .airflowignore info into doc (apache#3939)

.airflowignore is a nice feature, but it was not mentioned at all in the documentation.

[AIRFLOW-3130] Add CLI docs for users command

[AIRFLOW-XXX] Add Delete for CLI Example in UPDATING.md

[AIRFLOW-3123] Use a stack for DAG context management (apache#3956)

[AIRFLOW-3125] Monitor Task Instances creation rates (apache#3966)

Monitor Task Instance creation rates by Operator type.
These stats can provide some visibility on how much workload Airflow is
getting. They can be used for resource allocation in the long run (i.e.
to determine when we should scale up workers) and for debugging scenarios
in which the creation rate of a certain type of Task Instance spikes.

[AIRFLOW-3129] Backfill mysql hook unit tests. (apache#3970)

[AIRFLOW-3124] Fix RBAC webserver debug mode (apache#3958)

[AIRFLOW-XXX] Add Compass to companies list (apache#3972)

We're using Airflow at Compass now.

[AIRFLOW-XXX] Speed up DagBagTest cases (apache#3974)

I noticed that many of the DagBag tests operate on a specific DAG
only, and don't need to load the example or test dags. By not loading
the dags we don't need, this shaves about 10-20s off the test time.

[AIRFLOW-2912] Add Deploy and Delete operators for GCF (apache#3969)

Both Deploy and Delete operators interact with Google
Cloud Functions to manage functions. Both are idempotent
and make use of GcfHook - hook that encapsulates
communication with GCP over GCP API.

[AIRFLOW-1390] Update Alembic to 0.9 (apache#3935)

[AIRFLOW-2238] Update PR tool to remove outdated info (apache#3978)

[AIRFLOW-XXX] Don't spam test logs with "bad cron expression" messages (apache#3973)

We needed these test dags to check the behaviour of invalid cron
expressions, but by default we were loading them every time we created a
DagBag (which many, many tests do).

Instead we ignore these known-bad dags by default, and the test checking
those (tests/models.py:DagBagTest.test_process_file_cron_validity_check)
is already explicitly processing those DAGs directly, so it remains
tested.

[AIRFLOW-XXX] Fix undocumented params in S3_hook

Some function parameters were undocumented. Additional docstrings
were added for clarity.

[AIRFLOW-3079] Improve migration scripts to support MSSQL Server (apache#3964)

There were two problems for MSSQL.  First, 'timestamp' data type in MSSQL Server
is essentially a row-id, and not a timezone enabled date/time stamp. Second, alembic
creates invalid SQL when applying the 0/1 constraint to boolean values. MSSQL should
enforce this constraint by simply asserting a boolean value.

[AIRFLOW-XXX] Add DoorDash to README.md (apache#3980)

DoorDash uses Airflow https://softwareengineeringdaily.com/2018/09/28/doordash/

[AIRFLOW-3062] Add Qubole in integration docs (apache#3946)

[AIRFLOW-3129] Improve test coverage of airflow.models. (apache#3982)

[AIRFLOW-2574] Cope with '%' in SQLA DSN when running migrations (apache#3787)

Alembic uses a ConfigParser like Airflow does, and "%" is a special
character there, so we need to escape it. As per the Alembic docs:

> Note that this value is passed to ConfigParser.set, which supports
> variable interpolation using pyformat (e.g. `%(some_value)s`). A raw
> percent sign not part of an interpolation symbol must therefore be
> escaped, e.g. `%%`
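The escaping requirement can be demonstrated with the stdlib `ConfigParser` alone (a minimal illustration, not the Alembic migration code; the DSN value is made up):

```python
from configparser import ConfigParser

parser = ConfigParser()
parser.add_section("alembic")

# A raw '%' in the value would raise InterpolationSyntaxError when read,
# because '%(name)s' is the interpolation syntax; escape '%' as '%%'.
dsn = "mysql://user:p%ssword@host/db"
parser.set("alembic", "sqlalchemy.url", dsn.replace("%", "%%"))

# On read, '%%' is collapsed back to '%', recovering the original DSN.
assert parser.get("alembic", "sqlalchemy.url") == dsn
```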

[AIRFLOW-3137] Make ProxyFix middleware optional. (apache#3983)

The ProxyFix middleware should only be used when airflow is running
behind a trusted proxy. This patch adds a `USE_PROXY_FIX` flag that
defaults to `False`.

[AIRFLOW-3004] Add config disabling scheduler cron (apache#3899)

[AIRFLOW-3103][AIRFLOW-3147] Update flask-appbuilder (apache#3937)

[AIRFLOW-2993] s3_to_sftp and sftp_to_s3 operators apache#3828
Added apply_default decorator.

Added test for operators

[AIRFLOW-XXX] Fixing the issue in Documentation (apache#3998)

Fixing the operator name from DataFlowOperation to DataFlowJavaOperator in the documentation

[AIRFLOW-3088] Include slack-compatible emoji image

[AIRFLOW-3161] fix TaskInstance log link in RBAC UI

[AIRFLOW-3148] Remove unnecessary arg "parameters" in RedshiftToS3Transfer (apache#3995)

"Parameters" are used to help render the SQL command.
But in this operator, only "schema" and "table" are needed.
There is no SQL command to render.

By checking the code, we can also find that the argument
"parameters" is never really used.

(Fix a minor issue in the docstring as well)

[AIRFLOW-3159] Update GCS logging docs for latest code (apache#3952)

Formatted code

[AIRFLOW-2930] Fix celery executor scheduler crash (apache#3784)

Caused by an update in PR apache#3740:
execute_command.apply_async(args=command, ...)
`command` is a list of short unicode strings, and the above code passes
multiple arguments to a function defined as taking only one argument.
With command = ["airflow", "run", "dag323", ...], args=command results in
execute_command("airflow", "run", "dag323", ...), which errors and exits.
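The root cause is ordinary Python argument unpacking and can be reproduced without Celery (a stdlib-only sketch; `execute_command` here stands in for the real Celery task):

```python
def execute_command(command):
    """Celery-style task that expects the full command as ONE argument."""
    return command


command = ["airflow", "run", "dag323", "task", "2018-09-01"]

# apply_async(args=command) effectively did execute_command(*command):
# the list is spread into several positional arguments and the call fails.
try:
    execute_command(*command)
    spread_ok = True
except TypeError:
    spread_ok = False
assert spread_ok is False

# The fix wraps the command so it arrives as a single argument,
# i.e. apply_async(args=[command]).
assert execute_command(*[command]) == command
```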

[AIRFLOW-2854] kubernetes_pod_operator add more configuration items (apache#3697)

* kubernetes_pod_operator add more configuration items
* fix test_kubernetes_pod_operator test_faulty_service_account failure case
* fix review comment issues
* pod_operator add hostnetwork config
* add doc example

[AIRFLOW-2994] Fix command status check in Qubole Check operator (apache#3790)

[AIRFLOW-2949] Add syntax highlight for single quote strings (apache#3795)

* AIRFLOW-2949: Add syntax highlight for single quote strings

* AIRFLOW-2949: Also updated new UI main.css

[AIRFLOW-2948] Arg check & better doc - SSHOperator & SFTPOperator (apache#3793)

There may be different combinations of arguments, and
some processing is done 'silently', while users
may not be fully aware of it.

For example
- User only needs to provide either `ssh_hook`
  or `ssh_conn_id`, while this is not clear in doc
- if both provided, `ssh_conn_id` will be ignored.
- if `remote_host` is provided, it will replace
  the `remote_host` which was defined in `ssh_hook`
  or predefined in the connection of `ssh_conn_id`

These should be documented clearly to ensure it's
transparent to the users. log.info() should also be
used to remind users and provide clear logs.

In addition, add instance check for ssh_hook to ensure
it is of the correct type (SSHHook).

Tests are updated for this PR.

[AIRFLOW-XXX] Fix Broken Link in CONTRIBUTING.md

[AIRFLOW-2980] ReadTheDocs - Fix Missing API Reference

[AIRFLOW-2779] Make GHE auth third party licensed (apache#3803)

This reinstates the original license.

[AIRFLOW-XXX] Add Format to list of companies (apache#3824)

[AIRFLOW-2900] Show code for packaged DAGs (apache#3749)

[AIRFLOW-2983] Add prev_ds_nodash and next_ds_nodash macro (apache#3821)

[AIRFLOW-2951] Update dag_run table end_date when state change (apache#3798)

The existing Airflow only changes the dag_run table's end_date value when
a user terminates a DAG in the web UI. The end_date will not be updated
if Airflow detects that a DAG finished and updates its state.

This commit adds an end_date update in DagRun's set_state function to
fix the problem mentioned above.

[AIRFLOW-2145] fix deadlock on clearing running TI (apache#3657)

a `shutdown` task is not considered to be `unfinished`, so a dag run can
deadlock when all `unfinished` downstreams are waiting on a task
that's in the `shutdown` state. Fix this by considering `shutdown` to
be `unfinished`, since it's not truly a terminal state.

[AIRFLOW-XXX] Fix typo in docstring of gcs_to_bq (apache#3833)

[AIRFLOW-2476] Allow tabulate up to 0.8.2 (apache#3835)

[AIRFLOW-XXX] Fix typos in faq.rst (apache#3837)

[AIRFLOW-2979] Make celery_result_backend conf Backwards compatible (apache#3832)

(apache#2806) Renamed `celery_result_backend` to `result_backend` and broke backwards compatibility.

[AIRFLOW-2866] Fix missing CSRF token head when using RBAC UI (apache#3804)

[AIRFLOW-3007] Update backfill example in Scheduler docs

The scheduler docs at https://airflow.apache.org/scheduler.html#backfill-and-catchup use a deprecated way of passing `schedule_interval`. `schedule_interval` should be passed to the DAG as a separate parameter, not as a default arg.

[AIRFLOW-3005] Replace 'Airbnb Airflow' with 'Apache Airflow' (apache#3845)

[AIRFLOW-3002] Fix variable & tests in GoogleCloudBucketHelper (apache#3843)

[AIRFLOW-2991] Log path to driver output after Dataproc job (apache#3827)

[AIRFLOW-XXX] Fix python3 and flake8 errors in dev/airflow-jira

This is a script that checks if the Jira's marked as fixed in a release
are actually merged in - getting this working is helpful to me in
preparing 1.10.1

[AIRFLOW-2883] Add import and export for pool cli using JSON

[AIRFLOW-3021] Add Censys to who uses Airflow list

> Censys
> Find and analyze every reachable server and device on the Internet
> https://censys.io/

closes AIRFLOW-3021 https://issues.apache.org/jira/browse/AIRFLOW-3021

Add Branch to Company List

[AIRFLOW-3008] Move Kubernetes example DAGs to contrib

[AIRFLOW-2997] Support cluster fields in bigquery (apache#3838)

This adds a cluster_fields argument to the bigquery hook, GCS to
bigquery operator and bigquery query operators. This field requests that
bigquery store the result of the query/load operation sorted according
to the specified fields (the order of fields given is significant).

[AIRFLOW-XXX] Redirect FAQ `airflow[crypto]` to How-to Guides.

[AIRFLOW-XXX] Remove redundant space in Kerberos (apache#3866)

[AIRFLOW-3028] Update Text & Images in Readme.md

[AIRFLOW-1917] Trim extra newline and trailing whitespace from log (apache#3862)

[AIRFLOW-2985] Operators for S3 object copying/deleting (apache#3823)

1. Copying:
Under the hood, it's `boto3.client.copy_object()`.
It can only handle the situation in which the
S3 connection used can access both source and
destination bucket/key.

2. Deleting:
2.1 Under the hood, it's `boto3.client.delete_objects()`.
It supports either deleting one single object or
multiple objects.
2.2 If users try to delete a non-existent object, the
request will still succeed, but there will be an
entry 'Errors' in the response. There may also be
other reasons which may cause similar 'Errors' (
request itself would succeed without explicit
exception). So an argument `silent_on_errors` is added
to let users decide if this sort of 'Errors' should
fail the operator.

The corresponding methods are added into S3Hook, and
these two operators are 'wrappers' of these methods.

[AIRFLOW-3030] Fix CLI docs (apache#3872)

[AIRFLOW-XXX] Update kubernetes.rst docs (apache#3875)

Update kubernetes.rst with correct KubernetesPodOperator inputs
for the volumes.

[AIRFLOW-XXX] Add Enigma to list of companies

[AIRFLOW-2965] CLI tool to show the next execution datetime

Cover different cases

- schedule_interval is "@once" or None, then following_schedule
  method would always return None
- If dag is paused, print reminder
- If latest_execution_date is not found, print warning saying
  not applicable.

[AIRFLOW-XXX] Add Bombora Inc using Airflow

[AIRFLOW-XXX] Move Dag level access control out of 1.10 section (apache#3882)

It isn't in 1.10 (and wasn't in this section when the PR was created).

[AIRFLOW-3012] Fix Bug when passing emails for SLA

[AIRFLOW-2797] Create Google Dataproc cluster with custom image (apache#3871)

[AIRFLOW-XXX] Updated README  to include CAVA

[AIRFLOW-3035] Allow custom 'job_error_states' in dataproc ops (apache#3884)

Allow caller to pass in custom list of Dataproc job states into the
DataProc*Operator classes that should result in the
_DataProcJob.raise_error() method raising an Exception.

[AIRFLOW-3034]: Readme updates : Add Slack & Twitter, remove Gitter

[AIRFLOW-3056] Add happn to Airflow user list

[AIRFLOW-3052] Add logo options to Airflow (apache#3892)

[AIRFLOW-2524] Add SageMaker Batch Inference (apache#3767)

* Fix for comments
* Fix sensor test
* Update non_terminal_states and failed_states to static variables of SageMakerHook

Add SageMaker Transform Operator & Sensor
Co-authored-by: srrajeev-aws <srrajeev@amazon.com>

[AIRFLOW-XXX] Added Jeitto as one of happy Airflow users! (apache#3902)

[AIRFLOW-XXX] Add Jeitto as one happy Airflow user!

[AIRFLOW-3044] Dataflow operators accept templated job_name param (apache#3887)

* Default value of new job_name param is templated task_id, to match the
existing behavior as much as possible.
* Change expected value in test_mlengine_operator_utils.py to match
default for new job_name param.

[AIRFLOW-2707] Validate task_log_reader on upgrade from <=1.9 (apache#3881)

We changed the default logging config and config from 1.9 to 1.10, but
anyone who upgrades and has an existing airflow.cfg won't know they need
to change this value - instead they will get nothing displayed in the UI
(ajax request fails) and see "'NoneType' object has no attribute 'read'"
in the error log.

This validates that config section at start up, and seamlessly upgrades
the old previous value.

[AIRFLOW-3025] Enable specifying dns and dns_search options for DockerOperator (apache#3860)

Enable specifying dns and dns_search options for DockerOperator

[AIRFLOW-1298] Clear UPSTREAM_FAILED using the clean cli (apache#3886)

* [AIRFLOW-1298] Fix 'clear only_failed'

* [AIRFLOW-1298] Fix 'clear only_failed'

[AIRFLOW-3059] Log how many rows are read from Postgres (apache#3905)

To know how many data is being read from Postgres, it is nice to log
this to the Airflow log.

Previously when there was no data, it would still create a single file.
This is not something that we want, and therefore we've changed this
behaviour.

Refactored the tests to make use of Postgres itself since we have it
running. This makes the tests more realistic, instead of mocking
everything.

[AIRFLOW-XXX] Fix typo in docs/timezone.rst (apache#3904)

[AIRFLOW-3068] Remove deprecated imports

[AIRFLOW-3036] Add relevant ECS options to ECS operator. (apache#3908)

The ECS operator currently supports only a subset of available options
for running ECS tasks. This patch adds all ECS options that could be
relevant to airflow; options that wouldn't make sense here, like
`count`, were skipped.

[AIRFLOW-1195] Add feature to clear tasks in Parent Dag (apache#3907)

[AIRFLOW-3073] Add note-Profiling feature not supported in new webserver (apache#3909)

Adhoc queries and Charts features are no longer supported in new
FAB-based webserver and UI. But this is not mentioned at all in the doc
"Data Profiling" (https://airflow.incubator.apache.org/profiling.html)

This commit adds a note to remind users for this.

[AIRFLOW-XXX] Fix SlackWebhookOperator docs (apache#3915)

The docs refer to `conn_id` while the actual argument is `http_conn_id`.

[AIRFLOW-1441] Fix inconsistent tutorial code (apache#2466)

[AIRFLOW-XXX] Add 90 Seconds to companies

[AIRFLOW-3096] Further reduce DaysUntilStale for probo/stale

[AIRFLOW-3072] Assign permission get_logs_with_metadata to viewer role (apache#3913)

[AIRFLOW-3090] Demote dag start/stop log messages to debug (apache#3920)

[AIRFLOW-2407] Use feature detection for reload() (apache#3298)

* [AIRFLOW-2407] Use feature detection for reload()

[Use feature detection instead of version detection](https://docs.python.org/3/howto/pyporting.html#use-feature-detection-instead-of-version-detection) is a Python porting best practice that avoids a flake8 undefined name error...

flake8 testing of https://github.com/apache/incubator-airflow on Python 3.6.3

[AIRFLOW-XXX] Fix a wrong sample bash command, a display issue & a few typos (apache#3924)

[AIRFLOW-3090] Make No tasks to consider for execution debug (apache#3923)

During normal operation, it is not necessary to see the message.  This
can only be useful when debugging an issue.

AIRFLOW-2952 Fix Kubernetes CI (apache#3922)

The current dockerised CI pipeline doesn't run minikube and the
Kubernetes integration tests. This starts a Kubernetes cluster
using minikube and runs k8s integration tests using docker-compose.

[AIRFLOW-2918] Fix Flake8 violations (apache#3931)

[AIRFLOW-3076] Remove preloading of MySQL testdata (apache#3911)

One of the things for tests is being self contained. This means that
it should not depend on anything external, such as loading data.

This PR will use the setUp and tearDown to load the data into MySQL
and remove it afterwards. This removes the actual bash mysql commands
and will make it easier to dockerize the whole testsuite in the future

[AIRFLOW-2918] Remove unused imports

[AIRFLOW-3099] Stop Missing Section Errors for optional sections (apache#3934)

[AIRFLOW-3090] Specify path of key file in log message (apache#3921)

[AIRFLOW-3067] Display www_rbac Flask flash msg properly (apache#3903)

The Flask flash messages are not displayed properly.

When we don't give a category for a flash message, defautl
value will be 'message'. In some cases, we specify 'error'
category.

Using Flask-AppBuilder, the flash message will be given
a CSS class 'alert-[category]'. But We don't have
'alert-message' or 'alert-error' in the current
'bootstrap-theme.css' file.

This makes the the flash messages in www_rbac UI come with
no background color.

This commit addresses this issue by adding 'alert-message'
(using specs of existing CSS class 'alert-info') and
'alert-error' (using specs of existing CSS class 'alert-danger')
into 'bootstrap-theme.css'.

[AIRFLOW-3109] Bugfix to allow user/op roles to clear task intance via UI by default

add show statements to hql filtering.

[AIRFLOW-3051] Change CLI to make users ops similar to connections

The ability to manipulate users from the  command line is a bit clunky.  Currently 'airflow create_user' and 'airflow delete_user' and 'airflow list_users'.  It seems that these ought to be made more like connections, so that it becomes 'airflow users list ...', 'airflow users delete ...' and 'airflow users create ...'

[AIRFLOW-3009] Import Hashable from collection.abc to fix Python 3.7 deprecation warning (apache#3849)

[AIRFLOW-XXX] Add Tesla as an Apache Airflow user (apache#3947)

[AIRFLOW-3111] Fix instructions in UPDATING.md and remove comment (apache#3944)

artifacts in default_airflow.cfg

- fixed incorrect instructions in UPDATING.md regarding core.log_filename_template and elasticsearch.elasticsearch_log_id_template
- removed comments referencing "additional curly braces" from
default_airflow.cfg since they're irrelevant to the rendered airflow.cfg

[AIRFLOW-3117] Add instructions to allow GPL dependency (apache#3949)

The installation instructions failed to mention how to proceed with the GPL dependency. For those who are not concerned by GPL, it is useful to know how to proceed with GPL dependency.

[AIRFLOW-XXX] Add Square to the companies lists

[AIRFLOW-XXX] Add Fathom Health to readme

[AIRFLOW-XXX] Pin Click to 6.7 to Fix CI (apache#3962)

[AIRFLOW-XXX] Fix SlackWebhookOperator execute method comment (apache#3963)

[AIRFLOW-3100][AIRFLOW-3101] Improve docker compose local testing (apache#3933)

[AIRFLOW-3127] Fix out-dated doc for Celery SSL (apache#3967)

Now in `airflow.cfg`, for Celery-SSL, the item names are
"ssl_active", "ssl_key", "ssl_cert", and "ssl_cacert".
(since PR https://github.com/apache/incubator-airflow/pull/2806/files)

But in the documentation
https://airflow.incubator.apache.org/security.html?highlight=celery
or
https://github.com/apache/incubator-airflow/blob/master/docs/security.rst,
it's "CELERY_SSL_ACTIVE", "CELERY_SSL_KEY", "CELERY_SSL_CERT", and
"CELERY_SSL_CACERT", which is out-dated and may confuse readers.

[AIRFLOW-XXX] Fix PythonVirtualenvOperator tests (apache#3968)

The recent update to the CI image changed the default
python from python2 to python3. The PythonVirtualenvOperator
tests expected python2 as default and fail due to
serialisation errors.

[AIRFLOW-2952] Fix Kubernetes CI (apache#3957)

- Update outdated cli command to create user
- Remove `airflow/example_dags_kubernetes` as the dag already exists in `contrib/example_dags/`
- Update the path to copy K8s dags

[AIRFLOW-3104] Add .airflowignore info into doc (apache#3939)

.airflowignore is a nice feature, but it was not mentioned at all in the documentation.

[AIRFLOW-XXX] Add Delete for CLI Example in UPDATING.md

[AIRFLOW-3123] Use a stack for DAG context management (apache#3956)

[AIRFLOW-3125] Monitor Task Instances creation rates (apache#3966)

Montor Task Instances creation rates by Operator type.
These stats can provide some visibility on how much workload Airflow is
getting. They can be used for resource allocation in the long run (i.e.
to determine when we should scale up workers) and debugging in scenarios
like the creation rate of certain type of Task Instances spikes.

[AIRFLOW-3129] Backfill mysql hook unit tests. (apache#3970)

[AIRFLOW-3124] Fix RBAC webserver debug mode (apache#3958)

[AIRFLOW-XXX] Add Compass to companies list (apache#3972)

We're using Airflow at Compass now.

[AIRFLOW-XXX] Speed up DagBagTest cases (apache#3974)

I noticed that many of the tests of DagBags operate on a specific DAG
only, and don't need to load the example or test dags. By not loading
the dags we don't need to this shaves about 10-20s of test time.

[AIRFLOW-2912] Add Deploy and Delete operators for GCF (apache#3969)

Both Deploy and Delete operators interact with Google
Cloud Functions to manage functions. Both are idempotent
and make use of GcfHook - hook that encapsulates
communication with GCP over GCP API.

[AIRFLOW-1390] Update Alembic to 0.9 (apache#3935)

[AIRFLOW-2238] Update PR tool to remove outdated info (apache#3978)

[AIRFLOW-XXX] Don't spam test logs with "bad cron expression" messages (apache#3973)

We needed these test dags to check the behaviour of invalid cron
expressions, but by default we were loading them every time we create a
DagBag (which many, many tests to).

Instead we ignore these known-bad dags by default, and the test checking
those (tests/models.py:DagBagTest.test_process_file_cron_validity_check)
is already explicitly processing those DAGs directly, so it remains
tested.

[AIRFLOW-XXX] Fix undocumented params in S3_hook

Some function parameters were undocumented. Additional docstrings
were added for clarity.

[AIRFLOW-3079] Improve migration scripts to support MSSQL Server (apache#3964)

There were two problems for MSSQL.  First, 'timestamp' data type in MSSQL Server
is essentially a row-id, and not a timezone enabled date/time stamp. Second, alembic
creates invalid SQL when applying the 0/1 constraint to boolean values. MSSQL should
enforce this constraint by simply asserting a boolean value.

[AIRFLOW-XXX] Add DoorDash to README.md (apache#3980)

DoorDash uses Airflow https://softwareengineeringdaily.com/2018/09/28/doordash/

[AIRFLOW-3062] Add Qubole in integration docs (apache#3946)

[AIRFLOW-3129] Improve test coverage of airflow.models. (apache#3982)

[AIRFLOW-2574] Cope with '%' in SQLA DSN when running migrations (apache#3787)

Alembic uses a ConfigParser like Airflow does, and "%% is a special
value in there, so we need to escape it. As per the Alembic docs:

> Note that this value is passed to ConfigParser.set, which supports
> variable interpolation using pyformat (e.g. `%(some_value)s`). A raw
> percent sign not part of an interpolation symbol must therefore be
> escaped, e.g. `%%`

[AIRFLOW-3137] Make ProxyFix middleware optional. (apache#3983)

The ProxyFix middleware should only be used when airflow is running
behind a trusted proxy. This patch adds a `USE_PROXY_FIX` flag that
defaults to `False`.

[AIRFLOW-3004] Add config disabling scheduler cron (apache#3899)

[AIRFLOW-3103][AIRFLOW-3147] Update flask-appbuilder (apache#3937)

[AIRFLOW-XXX] Fixing the issue in Documentation (apache#3998)

Fixing the operator name from DataFlowOperation to DataFlowJavaOperator in Documentation

[AIRFLOW-3088] Include slack-compatible emoji image

[AIRFLOW-3161] fix TaskInstance log link in RBAC UI

[AIRFLOW-3148] Remove unnecessary arg "parameters" in RedshiftToS3Transfer (apache#3995)

"Parameters" are used to help render the SQL command.
But in this operator, only "schema" and "table" are needed.
There is no SQL command to render.

By checking the code, we can also find that the argument
"parameters" is never really used.

(Fix a minor issue in the docstring as well)

[AIRFLOW-3159] Update GCS logging docs for latest code (apache#3952)

Reformatted to meet flake8 requirements.

wmorris75 pushed a commit to modmed/incubator-airflow that referenced this pull request Oct 7, 2018

[AIRFLOW-2993] s3_to_sftp and sftp_to_s3 operators (#3828)
[AIRFLOW-XXX] Remove residual line in Changelog (#3814)

[AIRFLOW-2709] Improve error handling in Databricks hook (#3570)

* Use float for default value
* Use status code to determine whether an error is retryable
* Fix wrong type in assertion
* Fix style to prevent lines from exceeding 90 characters
* Fix wrong way of checking exception type

[AIRFLOW-2854] kubernetes_pod_operator add more configuration items (#3697)

* kubernetes_pod_operator add more configuration items
* fix test_kubernetes_pod_operator test_faulty_service_account failure case
* fix review comment issues
* pod_operator add hostnetwork config
* add doc example

[AIRFLOW-2994] Fix command status check in Qubole Check operator (#3790)

[AIRFLOW-2928] Use uuid4 instead of uuid1 (#3779)

for better randomness.
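The difference motivating this change can be seen directly with the stdlib `uuid` module: `uuid1()` embeds the host MAC address and a timestamp, so ids generated on one machine are highly correlated, while `uuid4()` is (pseudo)random throughout.

```python
import uuid

u1 = uuid.uuid1()
u4 = uuid.uuid4()

# The version field records which generation scheme was used.
assert u1.version == 1
assert u4.version == 4

# uuid1 leaks the node (MAC address): it is identical across calls
# in the same process, which is why uuid4 gives better randomness.
assert uuid.uuid1().node == uuid.uuid1().node
```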

[AIRFLOW-2949] Add syntax highlight for single quote strings (#3795)

* AIRFLOW-2949: Add syntax highlight for single quote strings

* AIRFLOW-2949: Also updated new UI main.css

[AIRFLOW-2948] Arg check & better doc - SSHOperator & SFTPOperator (#3793)

There may be different combinations of arguments, and
some processing is done 'silently', while users
may not be fully aware of it.

For example
- User only needs to provide either `ssh_hook`
  or `ssh_conn_id`, while this is not clear in doc
- if both provided, `ssh_conn_id` will be ignored.
- if `remote_host` is provided, it will replace
  the `remote_host` which was defined in `ssh_hook`
  or predefined in the connection of `ssh_conn_id`

These should be documented clearly to ensure it's
transparent to the users. log.info() should also be
used to remind users and provide clear logs.

In addition, add instance check for ssh_hook to ensure
it is of the correct type (SSHHook).

Tests are updated for this PR.
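The precedence rules above can be sketched as follows; `SSHHook` and `resolve_hook` here are local stand-ins to illustrate the described behaviour, not Airflow's actual operator code.

```python
class SSHHook:
    """Stand-in for airflow.contrib.hooks.ssh_hook.SSHHook."""
    def __init__(self, ssh_conn_id=None, remote_host=None):
        self.ssh_conn_id = ssh_conn_id
        self.remote_host = remote_host

def resolve_hook(ssh_hook=None, ssh_conn_id=None, remote_host=None):
    if ssh_hook is not None:
        # Instance check added by this PR: fail early on wrong type.
        if not isinstance(ssh_hook, SSHHook):
            raise TypeError("ssh_hook must be an SSHHook instance")
        hook = ssh_hook  # ssh_conn_id is silently ignored in this case
    elif ssh_conn_id is not None:
        hook = SSHHook(ssh_conn_id=ssh_conn_id)
    else:
        raise ValueError("either ssh_hook or ssh_conn_id must be provided")
    if remote_host is not None:
        hook.remote_host = remote_host  # explicit argument wins
    return hook
```

For example, an explicit `remote_host` overrides whatever the hook or connection defined.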

[AIRFLOW-XXX] Fix Broken Link in CONTRIBUTING.md

[AIRFLOW-2980] ReadTheDocs - Fix Missing API Reference

[AIRFLOW-2984] Convert operator dates to UTC (#3822)

Tasks can have start_dates or end_dates separately
from the DAG. These need to be converted to UTC otherwise
we cannot use them for calculation the next execution
date.

[AIRFLOW-2779] Make GHE auth third party licensed (#3803)

This reinstates the original license.

[AIRFLOW-XXX] Add Format to list of companies (#3824)

[AIRFLOW-2900] Show code for packaged DAGs (#3749)

[AIRFLOW-2983] Add prev_ds_nodash and next_ds_nodash macro (#3821)

[AIRFLOW-XXX] Fix Docstrings for Operators (#3820)

[AIRFLOW-2951] Update dag_run table end_date when state change (#3798)

The existing airflow only changes the dag_run table end_date value when
a user terminates a dag in the web UI. The end_date will not be updated
if airflow detects that a dag finished and updates its state.

This commit adds an end_date update in DagRun's set_state function to
fix the problem mentioned above.

[AIRFLOW-2145] fix deadlock on clearing running TI (#3657)

a `shutdown` task is not considered to be `unfinished`, so a dag run can
deadlock when all `unfinished` downstreams are waiting on a task
that's in the `shutdown` state. Fix this by considering `shutdown` to
be `unfinished`, since it's not truly a terminal state
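The state-set change can be sketched with plain Python sets; the state names follow Airflow's, but the sets and the deadlock check here are illustrative stand-ins, not the real `State` helper.

```python
FINISHED = {"success", "failed", "skipped", "upstream_failed"}
UNFINISHED = {"none", "scheduled", "queued", "running", "up_for_retry",
              "shutdown"}  # the fix: 'shutdown' now counts as unfinished

def dagrun_deadlocked(task_states):
    # Before the fix, 'shutdown' was in neither set, so downstreams waiting
    # on a shutdown task left the run looking permanently stuck.
    return bool(task_states) and not any(s in UNFINISHED for s in task_states)
```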

[AIRFLOW-XXX] Fix typo in docstring of gcs_to_bq (#3833)

[AIRFLOW-2476] Allow tabulate up to 0.8.2 (#3835)

[AIRFLOW-XXX] Fix typos in faq.rst (#3837)

[AIRFLOW-2979] Make celery_result_backend conf Backwards compatible (#3832)

(#2806) Renamed `celery_result_backend` to `result_backend` and broke backwards compatibility.
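A common way to restore backwards compatibility after such a rename is to read the new key and fall back to the deprecated one with a warning; this is a sketch of that pattern using stdlib ConfigParser, not Airflow's actual `conf` code.

```python
import warnings
from configparser import ConfigParser

def get_result_backend(conf):
    """Prefer the new key, fall back to the deprecated one."""
    if conf.has_option("celery", "result_backend"):
        return conf.get("celery", "result_backend")
    if conf.has_option("celery", "celery_result_backend"):
        warnings.warn("celery_result_backend is deprecated; use "
                      "result_backend", DeprecationWarning)
        return conf.get("celery", "celery_result_backend")
    raise KeyError("no result backend configured")

conf = ConfigParser()
conf.read_string("[celery]\ncelery_result_backend = db+mysql://airflow\n")
```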

[AIRFLOW-2866] Fix missing CSRF token head when using RBAC UI (#3804)

[AIRFLOW-491] Add feature to pass extra api configs to BQ Hook (#3733)

[AIRFLOW-208] Add badge to show supported Python versions (#3839)

[AIRFLOW-3007] Update backfill example in Scheduler docs

The scheduler docs at https://airflow.apache.org/scheduler.html#backfill-and-catchup use a deprecated way of passing `schedule_interval`. `schedule_interval` should be passed to the DAG as a separate parameter and not as a default arg.

[AIRFLOW-3005] Replace 'Airbnb Airflow' with 'Apache Airflow' (#3845)

[AIRFLOW-3002] Fix variable & tests in GoogleCloudBucketHelper (#3843)

[AIRFLOW-2991] Log path to driver output after Dataproc job (#3827)

[AIRFLOW-XXX] Fix python3 and flake8 errors in dev/airflow-jira

This is a script that checks if the Jiras marked as fixed in a release
are actually merged in - getting this working is helpful to me in
preparing 1.10.1

[AIRFLOW-3006] Add note on using None for schedule_interval

[AIRFLOW-3003] Pull the krb5 image instead of building (#3844)

Pull the image instead of building it, this will speed up the CI
process since we don't have to build it every time.

[AIRFLOW-2883] Add import and export for pool cli using JSON

[AIRFLOW-2847] Remove legacy imports support for plugins (#3692)

[AIRFLOW-1998] Implemented DatabricksRunNowOperator for jobs/run-now … (#3813)

Add functionality to kick off a Databricks job right away.

* Per feedback: fixed a documentation error,
  reintegrated the execute and on_kill onto the objects.
* Fixed a documentation issue.

[AIRFLOW-3021] Add Censys to who uses Airflow list

> Censys
> Find and analyze every reachable server and device on the Internet
> https://censys.io/

closes AIRFLOW-3021 https://issues.apache.org/jira/browse/AIRFLOW-3021

[AIRFLOW-3018] Fix Minor issues in Documentation

Add Branch to Company List

[AIRFLOW-3023] Fix docstring datatypes

[AIRFLOW-3008] Move Kubernetes example DAGs to contrib

[AIRFLOW-2997] Support cluster fields in bigquery (#3838)

This adds a cluster_fields argument to the bigquery hook, GCS to
bigquery operator and bigquery query operators. This field requests that
bigquery store the result of the query/load operation sorted according
to the specified fields (the order of fields given is significant).

[AIRFLOW-XXX] Redirect FAQ `airflow[crypto]` to How-to Guides.

[AIRFLOW-XXX] Remove redundant space in Kerberos (#3866)

[AIRFLOW-3028] Update Text & Images in Readme.md

[AIRFLOW-1917] Trim extra newline and trailing whitespace from log (#3862)

[AIRFLOW-2985] Operators for S3 object copying/deleting (#3823)

1. Copying:
Under the hood, it's `boto3.client.copy_object()`.
It can only handle the situation in which the
S3 connection used can access both source and
destination bucket/key.

2. Deleting:
2.1 Under the hood, it's `boto3.client.delete_objects()`.
It supports either deleting one single object or
multiple objects.
2.2 If users try to delete a non-existent object, the
request will still succeed, but there will be an
entry 'Errors' in the response. There may also be
other reasons which cause similar 'Errors' (the
request itself would succeed without an explicit
exception). So an argument `silent_on_errors` is added
to let users decide if this sort of 'Errors' should
fail the operator.

The corresponding methods are added into S3Hook, and
these two operators are 'wrappers' of these methods.
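The `Errors` handling described in 2.2 can be sketched like this; the response dicts mimic the shape `boto3.client.delete_objects()` returns, but `check_delete_response` is an illustrative helper, not the actual S3Hook code.

```python
def check_delete_response(response, silent_on_errors=False):
    """Raise on partial failures unless the caller opted to silence them."""
    errors = response.get("Errors", [])
    if errors and not silent_on_errors:
        raise RuntimeError("S3 reported errors: %s" % errors)
    return [d["Key"] for d in response.get("Deleted", [])]

# Example response shapes, as returned by boto3's delete_objects:
ok = {"Deleted": [{"Key": "a.csv"}, {"Key": "b.csv"}]}
bad = {"Deleted": [], "Errors": [{"Key": "c.csv", "Code": "AccessDenied"}]}
```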

[AIRFLOW-3030] Fix CLI docs (#3872)

[AIRFLOW-XXX] Update kubernetes.rst docs (#3875)

Update kubernetes.rst with correct KubernetesPodOperator inputs
for the volumes.

[AIRFLOW-XXX] Add Enigma to list of companies

[AIRFLOW-2965] CLI tool to show the next execution datetime

Cover different cases

- schedule_interval is "@once" or None, then following_schedule
  method would always return None
- If dag is paused, print reminder
- If latest_execution_date is not found, print warning saying
  not applicable.

[AIRFLOW-XXX] Add Bombora Inc using Airflow

[AIRFLOW-2156] Parallelize Celery Executor task state fetching (#3830)

[AIRFLOW-XXX] Move Dag level access control out of 1.10 section (#3882)

It isn't in 1.10 (and wasn't in this section when the PR was created).

[AIRFLOW-3040] Enable ProBot to clean up stale Pull Requests (#3883)

[AIRFLOW-3012] Fix Bug when passing emails for SLA

[AIRFLOW-2797] Create Google Dataproc cluster with custom image (#3871)

[AIRFLOW-XXX] Updated README  to include CAVA

Addressed comments in PR with appropriate refactoring of s3-sftp operators.
Added s3-sftp operator links

[AIRFLOW-2993] s3_to_sftp and sftp_to_s3 operators #3828

Rearranged input parameters for sftp_to_s3_operator.

[AIRFLOW-2988] Run specifically python2 for dataflow (#3826)

Apache Beam does not yet support python3, so it's best to run dataflow
jobs with python2 specifically until python3 support is complete
(BEAM-1251), in case the user's 'python' in PATH is python3.

[AIRFLOW-3035] Allow custom 'job_error_states' in dataproc ops (#3884)

Allow caller to pass in custom list of Dataproc job states into the
DataProc*Operator classes that should result in the
_DataProcJob.raise_error() method raising an Exception.
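A minimal sketch of a configurable error-state check, assuming the semantics described above; `DEFAULT_ERROR_STATES` and `raise_if_failed` are illustrative names, not the actual DataProc operator API.

```python
DEFAULT_ERROR_STATES = {"ERROR"}

def raise_if_failed(job_state, job_error_states=None):
    """Raise when the final job state is in the caller's error set."""
    error_states = set(job_error_states or DEFAULT_ERROR_STATES)
    if job_state in error_states:
        raise Exception("Dataproc job finished in state %s" % job_state)

raise_if_failed("DONE")       # fine with the default error set
raise_if_failed("CANCELLED")  # also fine by default...
```

A caller that considers `CANCELLED` a failure would pass `job_error_states=["ERROR", "CANCELLED"]` instead.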

[AIRFLOW-3034]: Readme updates : Add Slack & Twitter, remove Gitter

[AIRFLOW-3056] Add happn to Airflow user list

[AIRFLOW-3052] Add logo options to Airflow (#3892)

[AIRFLOW-3060] DAG context manager fails to exit properly in certain circumstances

[AIRFLOW-2524] Add SageMaker Batch Inference (#3767)

* Fix for comments
* Fix sensor test
* Update non_terminal_states and failed_states to static variables of SageMakerHook

Add SageMaker Transform Operator & Sensor
Co-authored-by: srrajeev-aws <srrajeev@amazon.com>

[AIRFLOW-2772] Fix Bug in BigQuery hook for Partitioned Table  (#3901)

[AIRFLOW-XXX] Added Jeitto as one of happy Airflow users! (#3902)

[AIRFLOW-XXX] Add Jeitto as one happy Airflow user!

[AIRFLOW-3044] Dataflow operators accept templated job_name param (#3887)

* Default value of new job_name param is templated task_id, to match the
existing behavior as much as possible.
* Change expected value in test_mlengine_operator_utils.py to match
default for new job_name param.

[AIRFLOW-2707] Validate task_log_reader on upgrade from <=1.9 (#3881)

We changed the default logging config and config from 1.9 to 1.10, but
anyone who upgrades and has an existing airflow.cfg won't know they need
to change this value - instead they will get nothing displayed in the UI
(ajax request fails) and see "'NoneType' object has no attribute 'read'"
in the error log.

This validates the config section at startup, and seamlessly upgrades
the old value.

[AIRFLOW-3025] Enable specifying dns and dns_search options for DockerOperator (#3860)

Enable specifying dns and dns_search options for DockerOperator

[AIRFLOW-1298] Clear UPSTREAM_FAILED using the clean cli (#3886)

* [AIRFLOW-1298] Fix 'clear only_failed'

* [AIRFLOW-1298] Fix 'clear only_failed'

[AIRFLOW-3059] Log how many rows are read from Postgres (#3905)

To know how much data is being read from Postgres, it is nice to log
this to the Airflow log.

Previously when there was no data, it would still create a single file.
This is not something that we want, and therefore we've changed this
behaviour.

Refactored the tests to make use of Postgres itself since we have it
running. This makes the tests more realistic, instead of mocking
everything.

[AIRFLOW-XXX] Fix typo in docs/timezone.rst (#3904)

[AIRFLOW-3070] Refine web UI authentication-related docs (#3863)

[AIRFLOW-3068] Remove deprecated imports

[AIRFLOW-3036] Add relevant ECS options to ECS operator. (#3908)

The ECS operator currently supports only a subset of available options
for running ECS tasks. This patch adds all ECS options that could be
relevant to airflow; options that wouldn't make sense here, like
`count`, were skipped.

[AIRFLOW-1195] Add feature to clear tasks in Parent Dag (#3907)

[AIRFLOW-3073] Add note-Profiling feature not supported in new webserver (#3909)

Adhoc queries and Charts features are no longer supported in new
FAB-based webserver and UI. But this is not mentioned at all in the doc
"Data Profiling" (https://airflow.incubator.apache.org/profiling.html)

This commit adds a note to remind users of this.

[AIRFLOW-XXX] Fix SlackWebhookOperator docs (#3915)

The docs refer to `conn_id` while the actual argument is `http_conn_id`.

[AIRFLOW-1441] Fix inconsistent tutorial code (#2466)

[AIRFLOW-XXX] Add 90 Seconds to companies

[AIRFLOW-3096] Reduce DaysUntilStale for probot/stale

[AIRFLOW-3096] Further reduce DaysUntilStale for probot/stale

[AIRFLOW-3072] Assign permission get_logs_with_metadata to viewer role (#3913)

[AIRFLOW-3090] Demote dag start/stop log messages to debug (#3920)

[AIRFLOW-2407] Use feature detection for reload() (#3298)

* [AIRFLOW-2407] Use feature detection for reload()

[Use feature detection instead of version detection](https://docs.python.org/3/howto/pyporting.html#use-feature-detection-instead-of-version-detection) is a Python porting best practice that avoids a flake8 undefined name error...

flake8 testing of https://github.com/apache/incubator-airflow on Python 3.6.3
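The feature-detection pattern for `reload()` looks like this: try the modern location first instead of branching on `sys.version_info`, so flake8 never sees an undefined name.

```python
# Feature detection, not version detection:
try:
    from importlib import reload  # Python 3.4+
except ImportError:
    try:
        from imp import reload    # Python 3.0-3.3 (deprecated module)
    except ImportError:
        pass                      # Python 2: reload() is a builtin

# Whichever branch ran, the name is now bound and flake8 is satisfied.
assert callable(reload)
```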

[AIRFLOW-2747] Explicit re-schedule of sensors (#3596)

* [AIRFLOW-2747] Explicit re-schedule of sensors

Add `mode` property to sensors. If set to `reschedule` an
AirflowRescheduleException is raised instead of sleeping which sets
the task back to state `NONE`. Reschedules are recorded in the new
`task_reschedule` table and visualized in the Gantt view. New TI
dependency checks if a sensor task is ready to be re-scheduled.

* Reformat sqlalchemy imports

* Make `_handle_reschedule` private

* Remove print

* Add comment

* Add comment

* Don't record reschedule request in test mode
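A simplified sketch of the two sensor modes this PR describes; the class, exception, and loop here are local stand-ins for Airflow's `BaseSensorOperator` machinery, not the real implementation.

```python
import time

class AirflowRescheduleException(Exception):
    """Signals the executor to free the slot and retry the task later."""
    def __init__(self, reschedule_date):
        self.reschedule_date = reschedule_date

def run_sensor(poke, mode="poke", poke_interval=0.01, timeout=1.0):
    start = time.time()
    while time.time() - start < timeout:
        if poke():
            return True
        if mode == "reschedule":
            # Give up the worker slot: the task goes back to state NONE
            # and a reschedule is recorded for `now + poke_interval`.
            raise AirflowRescheduleException(time.time() + poke_interval)
        time.sleep(poke_interval)  # mode == "poke": hold the slot and sleep
    raise TimeoutError("sensor timed out")
```

In `poke` mode the worker slot stays occupied between pokes; in `reschedule` mode the slot is released and the scheduler re-runs the sensor when the reschedule date is reached.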

[AIRFLOW-XXX] Fix a wrong sample bash command, a display issue & a few typos (#3924)

[AIRFLOW-3090] Make No tasks to consider for execution debug (#3923)

During normal operation, it is not necessary to see the message. This
can only be useful when debugging an issue.

[AIRFLOW-2952] Fix Kubernetes CI (#3922)

The current dockerised CI pipeline doesn't run minikube and the
Kubernetes integration tests. This starts a Kubernetes cluster
using minikube and runs k8s integration tests using docker-compose.

[AIRFLOW-2918] Fix Flake8 violations (#3931)

[AIRFLOW-3076] Remove preloading of MySQL testdata (#3911)

One of the things for tests is being self contained. This means that
it should not depend on anything external, such as loading data.

This PR will use the setUp and tearDown to load the data into MySQL
and remove it afterwards. This removes the actual bash mysql commands
and will make it easier to dockerize the whole test suite in the future

[AIRFLOW-2887] Added BigQueryCreateEmptyDatasetOperator and create_empty_dataset to bigquery_hook (#3876)

[AIRFLOW-2918] Remove unused imports

[AIRFLOW-3099] Stop Missing Section Errors for optional sections (#3934)

[AIRFLOW-3090] Specify path of key file in log message (#3921)

[AIRFLOW-3067] Display www_rbac Flask flash msg properly (#3903)

The Flask flash messages are not displayed properly.

When we don't give a category for a flash message, the default
value will be 'message'. In some cases, we specify the 'error'
category.

Using Flask-AppBuilder, the flash message will be given
a CSS class 'alert-[category]'. But we don't have
'alert-message' or 'alert-error' in the current
'bootstrap-theme.css' file.

This makes the flash messages in the www_rbac UI come with
no background color.

This commit addresses this issue by adding 'alert-message'
(using specs of existing CSS class 'alert-info') and
'alert-error' (using specs of existing CSS class 'alert-danger')
into 'bootstrap-theme.css'.

[AIRFLOW-3109] Bugfix to allow user/op roles to clear task instance via UI by default

Add SHOW statements to HQL filtering.

[AIRFLOW-3051] Change CLI to make users ops similar to connections

The ability to manipulate users from the command line is a bit clunky. Currently we have 'airflow create_user', 'airflow delete_user' and 'airflow list_users'. It seems that these ought to be made more like connections, so that it becomes 'airflow users list ...', 'airflow users delete ...' and 'airflow users create ...'

[AIRFLOW-3009] Import Hashable from collection.abc to fix Python 3.7 deprecation warning (#3849)

[AIRFLOW-XXX] Add Tesla as an Apache Airflow user (#3947)

[AIRFLOW-3111] Fix instructions in UPDATING.md and remove comment artifacts in default_airflow.cfg (#3944)

- fixed incorrect instructions in UPDATING.md regarding core.log_filename_template and elasticsearch.elasticsearch_log_id_template
- removed comments referencing "additional curly braces" from
default_airflow.cfg since they're irrelevant to the rendered airflow.cfg

[AIRFLOW-3117] Add instructions to allow GPL dependency (#3949)

The installation instructions failed to mention how to proceed with the GPL dependency. For those who are not concerned by GPL, it is useful to know how to proceed with GPL dependency.

[AIRFLOW-XXX] Add Square to the companies lists

[AIRFLOW-XXX] Add Fathom Health to readme

[AIRFLOW-XXX] Pin Click to 6.7 to Fix CI (#3962)

[AIRFLOW-XXX] Fix SlackWebhookOperator execute method comment (#3963)

[AIRFLOW-3100][AIRFLOW-3101] Improve docker compose local testing (#3933)

[AIRFLOW-3127] Fix out-dated doc for Celery SSL (#3967)

Now in `airflow.cfg`, for Celery-SSL, the item names are
"ssl_active", "ssl_key", "ssl_cert", and "ssl_cacert".
(since PR https://github.com/apache/incubator-airflow/pull/2806/files)

But in the documentation
https://airflow.incubator.apache.org/security.html?highlight=celery
or
https://github.com/apache/incubator-airflow/blob/master/docs/security.rst,
it's "CELERY_SSL_ACTIVE", "CELERY_SSL_KEY", "CELERY_SSL_CERT", and
"CELERY_SSL_CACERT", which is out-dated and may confuse readers.

[AIRFLOW-XXX] Fix PythonVirtualenvOperator tests (#3968)

The recent update to the CI image changed the default
python from python2 to python3. The PythonVirtualenvOperator
tests expected python2 as default and fail due to
serialisation errors.

[AIRFLOW-2952] Fix Kubernetes CI (#3957)

- Update outdated cli command to create user
- Remove `airflow/example_dags_kubernetes` as the dag already exists in `contrib/example_dags/`
- Update the path to copy K8s dags

[AIRFLOW-3104] Add .airflowignore info into doc (#3939)

.airflowignore is a nice feature, but it was not mentioned at all in the documentation.

[AIRFLOW-3130] Add CLI docs for users command

[AIRFLOW-XXX] Add Delete for CLI Example in UPDATING.md

[AIRFLOW-3123] Use a stack for DAG context management (#3956)

[AIRFLOW-3125] Monitor Task Instances creation rates (#3966)

Monitor Task Instance creation rates by Operator type.
These stats can provide some visibility on how much workload Airflow is
getting. They can be used for resource allocation in the long run (i.e.
to determine when we should scale up workers) and for debugging in
scenarios where the creation rate of a certain type of Task Instance spikes.

[AIRFLOW-3129] Backfill mysql hook unit tests. (#3970)

[AIRFLOW-3124] Fix RBAC webserver debug mode (#3958)

[AIRFLOW-XXX] Add Compass to companies list (#3972)

We're using Airflow at Compass now.

[AIRFLOW-XXX] Speed up DagBagTest cases (#3974)

I noticed that many of the tests of DagBags operate on a specific DAG
only, and don't need to load the example or test dags. By not loading
the dags we don't need, this shaves about 10-20s off the test time.

[AIRFLOW-2993] s3_to_sftp and sftp_to_s3 operators #3828
Added apply_default decorator.

Added test for operators

[AIRFLOW-2930] Fix celery excecutor scheduler crash (#3784)

Caused by an update in PR #3740.
execute_command.apply_async(args=command, ...)
- command is a list of short unicode strings, and the above code passes multiple
arguments to a function defined as taking only one argument.
- command = ["airflow", "run", "dag323", ...]
- args = command = ["airflow", "run", "dag323", ...]
- execute_command("airflow", "run", "dag323", ...) will error and exit.
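The crash comes from how Celery splats `args` into the task call (`task(*args)`); plain functions reproduce the difference without needing Celery itself, and `execute_command` below is a local stand-in for the Celery task.

```python
def execute_command(command):
    """Takes ONE argument: the full airflow command as a list."""
    return " ".join(command)

command = ["airflow", "run", "dag323"]

# apply_async(args=command) effectively calls execute_command(*command),
# i.e. three positional arguments -> TypeError at the worker.
try:
    execute_command(*command)
    broken = False
except TypeError:
    broken = True

# The fix wraps the list: apply_async(args=[command]) calls
# execute_command(command) with the single expected argument.
result = execute_command(*[command])
```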

[AIRFLOW-2854] kubernetes_pod_operator add more configuration items (#3697)

* kubernetes_pod_operator add more configuration items
* fix test_kubernetes_pod_operator test_faulty_service_account failure case
* fix review comment issues
* pod_operator add hostnetwork config
* add doc example

[AIRFLOW-2994] Fix command status check in Qubole Check operator (#3790)

[AIRFLOW-2949] Add syntax highlight for single quote strings (#3795)

* AIRFLOW-2949: Add syntax highlight for single quote strings

* AIRFLOW-2949: Also updated new UI main.css

[AIRFLOW-2948] Arg check & better doc - SSHOperator & SFTPOperator (#3793)

There may be different combinations of arguments, and
some processings are being done 'silently', while users
may not be fully aware of them.

For example
- User only needs to provide either `ssh_hook`
  or `ssh_conn_id`, while this is not clear in doc
- if both provided, `ssh_conn_id` will be ignored.
- if `remote_host` is provided, it will replace
  the `remote_host` which wasndefined in `ssh_hook`
  or predefined in the connection of `ssh_conn_id`

These should be documented clearly to ensure it's
transparent to the users. log.info() should also be
used to remind users and provide clear logs.

In addition, add instance check for ssh_hook to ensure
it is of the correct type (SSHHook).

Tests are updated for this PR.

[AIRFLOW-XXX] Fix Broken Link in CONTRIBUTING.md

[AIRFLOW-2980] ReadTheDocs - Fix Missing API Reference

[AIRFLOW-2779] Make GHE auth third party licensed (#3803)

This reinstates the original license.

[AIRFLOW-XXX] Add Format to list of companies (#3824)

[AIRFLOW-2900] Show code for packaged DAGs (#3749)

[AIRFLOW-2983] Add prev_ds_nodash and next_ds_nodash macro (#3821)

[AIRFLOW-2951] Update dag_run table end_date when state change (#3798)

The existing airflow only change dag_run table end_date value when
a user teminate a dag in web UI. The end_date will not be updated
if airflow detected a dag finished and updated its state.

This commit add end_date update in DagRun's set_state function to
make up tho problem mentioned above.

[AIRFLOW-2145] fix deadlock on clearing running TI (#3657)

a `shutdown` task is not considered be `unfinished`, so a dag run can
deadlock when all `unfinished` downstreams are all waiting on a task
that's in the `shutdown` state. fix this by considering `shutdown` to
be `unfinished`, since it's not truly a terminal state

[AIRFLOW-XXX] Fix typo in docstring of gcs_to_bq (#3833)

[AIRFLOW-2476] Allow tabulate up to 0.8.2 (#3835)

[AIRFLOW-XXX] Fix typos in faq.rst (#3837)

[AIRFLOW-2979] Make celery_result_backend conf Backwards compatible (#3832)

(#2806) Renamed `celery_result_backend` to `result_backend` and broke backwards compatibility.

[AIRFLOW-2866] Fix missing CSRF token head when using RBAC UI (#3804)

[AIRFLOW-3007] Update backfill example in Scheduler docs

The scheduler docs at https://airflow.apache.org/scheduler.html#backfill-and-catchup use deprecated way of passing `schedule_interval`. `schedule_interval` should be pass to DAG as a separate parameter and not as a default arg.

[AIRFLOW-3005] Replace 'Airbnb Airflow' with 'Apache Airflow' (#3845)

[AIRFLOW-3002] Fix variable & tests in GoogleCloudBucketHelper (#3843)

[AIRFLOW-2991] Log path to driver output after Dataproc job (#3827)

[AIRFLOW-XXX] Fix python3 and flake8 errors in dev/airflow-jira

This is a script that checks if the Jira's marked as fixed in a release
are actually merged in - getting this working is helpful to me in
preparing 1.10.1

[AIRFLOW-2883] Add import and export for pool cli using JSON

[AIRFLOW-3021] Add Censys to who uses Airflow list

> Censys
> Find and analyze every reachable server and device on the Internet
> https://censys.io/

closes AIRFLOW-3021 https://issues.apache.org/jira/browse/AIRFLOW-3021

Add Branch to Company List

[AIRFLOW-3008] Move Kubernetes example DAGs to contrib

[AIRFLOW-2997] Support cluster fields in bigquery (#3838)

This adds a cluster_fields argument to the bigquery hook, GCS to
bigquery operator and bigquery query operators. This field requests that
bigquery store the result of the query/load operation sorted according
to the specified fields (the order of fields given is significant).

[AIRFLOW-XXX] Redirect FAQ `airflow[crypto]` to How-to Guides.

[AIRFLOW-XXX] Remove redundant space in Kerberos (#3866)

[AIRFLOW-3028] Update Text & Images in Readme.md

[AIRFLOW-1917] Trim extra newline and trailing whitespace from log (#3862)

[AIRFLOW-2985] Operators for S3 object copying/deleting (#3823)

1. Copying:
Under the hood, it's `boto3.client.copy_object()`.
It can only handle the situation in which the
S3 connection used can access both source and
destination bucket/key.

2. Deleting:
2.1 Under the hood, it's `boto3.client.delete_objects()`.
It supports either deleting one single object or
multiple objects.
2.2 If users try to delete a non-existent object, the
request will still succeed, but there will be an
entry 'Errors' in the response. There may also be
other reasons which may cause similar 'Errors' (
request itself would succeed without explicit
exception). So an argument `silent_on_errors` is added
to let users decide if this sort of 'Errors' should
fail the operator.

The corresponding methods are added into S3Hook, and
these two operators are 'wrappers' of these methods.

[AIRFLOW-3030] Fix CLI docs (#3872)

[AIRFLOW-XXX] Update kubernetes.rst docs (#3875)

Update kubernetes.rst with correct KubernetesPodOperator inputs
for the volumes.

[AIRFLOW-XXX] Add Enigma to list of companies

[AIRFLOW-2965] CLI tool to show the next execution datetime

Cover different cases

- schedule_interval is "@once" or None, then following_schedule
  method would always return None
- If dag is paused, print reminder
- If latest_execution_date is not found, print warning saying
  not applicable.

[AIRFLOW-XXX] Add Bombora Inc using Airflow

[AIRFLOW-XXX] Move Dag level access control out of 1.10 section (#3882)

It isn't in 1.10 (and wasn't in this section when the PR was created).

[AIRFLOW-3012] Fix Bug when passing emails for SLA

[AIRFLOW-2797] Create Google Dataproc cluster with custom image (#3871)

[AIRFLOW-XXX] Updated README  to include CAVA

[AIRFLOW-3035] Allow custom 'job_error_states' in dataproc ops (#3884)

Allow caller to pass in custom list of Dataproc job states into the
DataProc*Operator classes that should result in the
_DataProcJob.raise_error() method raising an Exception.

[AIRFLOW-3034]: Readme updates : Add Slack & Twitter, remove Gitter

[AIRFLOW-3056] Add happn to Airflow user list

[AIRFLOW-3052] Add logo options to Airflow (#3892)

[AIRFLOW-2524] Add SageMaker Batch Inference (#3767)

* Fix for comments
* Fix sensor test
* Update non_terminal_states and failed_states to static variables of SageMakerHook

Add SageMaker Transform Operator & Sensor
Co-authored-by: srrajeev-aws <srrajeev@amazon.com>

[AIRFLOW-XXX] Added Jeitto as one of happy Airflow users! (#3902)

[AIRFLOW-XXX] Add Jeitto as one happy Airflow user!

[AIRFLOW-3044] Dataflow operators accept templated job_name param (#3887)

* Default value of new job_name param is templated task_id, to match the
existing behavior as much as possible.
* Change expected value in test_mlengine_operator_utils.py to match
default for new job_name param.

[AIRFLOW-2707] Validate task_log_reader on upgrade from <=1.9 (#3881)

We changed the default logging config and config from 1.9 to 1.10, but
anyone who upgrades and has an existing airflow.cfg won't know they need
to change this value - instead they will get nothing displayed in the UI
(ajax request fails) and see "'NoneType' object has no attribute 'read'"
in the error log.

This validates that config section at start up, and seamlessly upgrades
the old value.

[AIRFLOW-3025] Enable specifying dns and dns_search options for DockerOperator (#3860)

Enable specifying dns and dns_search options for DockerOperator

[AIRFLOW-1298] Clear UPSTREAM_FAILED using the clean cli (#3886)

* [AIRFLOW-1298] Fix 'clear only_failed'

* [AIRFLOW-1298] Fix 'clear only_failed'

[AIRFLOW-3059] Log how many rows are read from Postgres (#3905)

To know how much data is being read from Postgres, it is nice to log
this to the Airflow log.

Previously when there was no data, it would still create a single file.
This is not something that we want, and therefore we've changed this
behaviour.

Refactored the tests to make use of Postgres itself since we have it
running. This makes the tests more realistic, instead of mocking
everything.

[AIRFLOW-XXX] Fix typo in docs/timezone.rst (#3904)

[AIRFLOW-3068] Remove deprecated imports

[AIRFLOW-3036] Add relevant ECS options to ECS operator. (#3908)

The ECS operator currently supports only a subset of available options
for running ECS tasks. This patch adds all ECS options that could be
relevant to airflow; options that wouldn't make sense here, like
`count`, were skipped.

[AIRFLOW-1195] Add feature to clear tasks in Parent Dag (#3907)

[AIRFLOW-3073] Add note-Profiling feature not supported in new webserver (#3909)

Adhoc queries and Charts features are no longer supported in new
FAB-based webserver and UI. But this is not mentioned at all in the doc
"Data Profiling" (https://airflow.incubator.apache.org/profiling.html)

This commit adds a note to remind users of this.

[AIRFLOW-XXX] Fix SlackWebhookOperator docs (#3915)

The docs refer to `conn_id` while the actual argument is `http_conn_id`.

[AIRFLOW-1441] Fix inconsistent tutorial code (#2466)

[AIRFLOW-XXX] Add 90 Seconds to companies

[AIRFLOW-3096] Further reduce DaysUntilStale for probo/stale

[AIRFLOW-3072] Assign permission get_logs_with_metadata to viewer role (#3913)

[AIRFLOW-3090] Demote dag start/stop log messages to debug (#3920)

[AIRFLOW-2407] Use feature detection for reload() (#3298)

* [AIRFLOW-2407] Use feature detection for reload()

[Use feature detection instead of version detection](https://docs.python.org/3/howto/pyporting.html#use-feature-detection-instead-of-version-detection) is a Python porting best practice that avoids a flake8 undefined name error...

flake8 testing of https://github.com/apache/incubator-airflow on Python 3.6.3
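
The feature-detection pattern described above can be sketched like this (a
minimal illustration, not the PR's exact diff):

```python
# Feature detection for reload(): try the name first (a builtin on Python 2),
# fall back to importlib.reload on Python 3. No version check needed.
try:
    reload  # noqa -- builtin on Python 2, NameError on Python 3
except NameError:
    from importlib import reload

import json

# importlib.reload re-executes the module in place and returns it
assert reload(json) is json
```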

[AIRFLOW-XXX] Fix a wrong sample bash command, a display issue & a few typos (#3924)

[AIRFLOW-3090] Make No tasks to consider for execution debug (#3923)

During normal operation, it is not necessary to see the message.  This
can only be useful when debugging an issue.

AIRFLOW-2952 Fix Kubernetes CI (#3922)

The current dockerised CI pipeline doesn't run minikube and the
Kubernetes integration tests. This starts a Kubernetes cluster
using minikube and runs k8s integration tests using docker-compose.

[AIRFLOW-2918] Fix Flake8 violations (#3931)

[AIRFLOW-3076] Remove preloading of MySQL testdata (#3911)

Tests should be self-contained. This means that
they should not depend on anything external, such as pre-loaded data.

This PR uses setUp and tearDown to load the data into MySQL
and remove it afterwards. This removes the actual bash mysql commands
and will make it easier to dockerize the whole test suite in the future.

[AIRFLOW-2918] Remove unused imports

[AIRFLOW-3099] Stop Missing Section Errors for optional sections (#3934)

[AIRFLOW-3090] Specify path of key file in log message (#3921)

[AIRFLOW-3067] Display www_rbac Flask flash msg properly (#3903)

The Flask flash messages are not displayed properly.

When we don't give a category for a flash message, the default
value will be 'message'. In some cases, we specify the 'error'
category.

Using Flask-AppBuilder, the flash message will be given
a CSS class 'alert-[category]'. But We don't have
'alert-message' or 'alert-error' in the current
'bootstrap-theme.css' file.

This makes the flash messages in the www_rbac UI come with
no background color.

This commit addresses this issue by adding 'alert-message'
(using specs of existing CSS class 'alert-info') and
'alert-error' (using specs of existing CSS class 'alert-danger')
into 'bootstrap-theme.css'.

[AIRFLOW-3109] Bugfix to allow user/op roles to clear task instance via UI by default

add show statements to hql filtering.

[AIRFLOW-3051] Change CLI to make users ops similar to connections

The ability to manipulate users from the command line is a bit clunky. Currently there are 'airflow create_user', 'airflow delete_user' and 'airflow list_users'. These ought to be made more like connections, so that they become 'airflow users list ...', 'airflow users delete ...' and 'airflow users create ...'
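
A minimal sketch of the grouped sub-command style proposed here, using
argparse (the command names mirror the proposal, not necessarily the final
CLI):

```python
import argparse

parser = argparse.ArgumentParser(prog="airflow")
sub = parser.add_subparsers(dest="group")

# 'airflow users <action>' replaces 'airflow create_user' etc.
users = sub.add_parser("users").add_subparsers(dest="action")
for action in ("list", "create", "delete"):
    users.add_parser(action)

args = parser.parse_args(["users", "delete"])
assert (args.group, args.action) == ("users", "delete")
```

The same pattern extends naturally to other groups such as 'airflow
connections'.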

[AIRFLOW-3009] Import Hashable from collections.abc to fix Python 3.7 deprecation warning (#3849)
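
The compatible import looks roughly like this (a sketch of the fix described,
not the exact diff):

```python
try:
    # Python 3.3+: importing Hashable from `collections` is deprecated
    # (and removed in 3.10)
    from collections.abc import Hashable
except ImportError:
    # Python 2 fallback
    from collections import Hashable

# strings and tuples are hashable; lists and dicts are not
assert isinstance("airflow", Hashable)
assert not isinstance([], Hashable)
```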

[AIRFLOW-XXX] Add Tesla as an Apache Airflow user (#3947)

[AIRFLOW-3111] Fix instructions in UPDATING.md and remove comment (#3944)

artifacts in default_airflow.cfg

- fixed incorrect instructions in UPDATING.md regarding core.log_filename_template and elasticsearch.elasticsearch_log_id_template
- removed comments referencing "additional curly braces" from
default_airflow.cfg since they're irrelevant to the rendered airflow.cfg

[AIRFLOW-3117] Add instructions to allow GPL dependency (#3949)

The installation instructions failed to mention how to proceed with the GPL dependency. For those who are not concerned by GPL, it is useful to know how to proceed with GPL dependency.

[AIRFLOW-XXX] Add Square to the companies lists

[AIRFLOW-XXX] Add Fathom Health to readme

[AIRFLOW-XXX] Pin Click to 6.7 to Fix CI (#3962)

[AIRFLOW-XXX] Fix SlackWebhookOperator execute method comment (#3963)

[AIRFLOW-3100][AIRFLOW-3101] Improve docker compose local testing (#3933)

[AIRFLOW-3127] Fix out-dated doc for Celery SSL (#3967)

Now in `airflow.cfg`, for Celery-SSL, the item names are
"ssl_active", "ssl_key", "ssl_cert", and "ssl_cacert".
(since PR https://github.com/apache/incubator-airflow/pull/2806/files)

But in the documentation
https://airflow.incubator.apache.org/security.html?highlight=celery
or
https://github.com/apache/incubator-airflow/blob/master/docs/security.rst,
it's "CELERY_SSL_ACTIVE", "CELERY_SSL_KEY", "CELERY_SSL_CERT", and
"CELERY_SSL_CACERT", which is out-dated and may confuse readers.

[AIRFLOW-XXX] Fix PythonVirtualenvOperator tests (#3968)

The recent update to the CI image changed the default
python from python2 to python3. The PythonVirtualenvOperator
tests expected python2 as default and fail due to
serialisation errors.

[AIRFLOW-2952] Fix Kubernetes CI (#3957)

- Update outdated cli command to create user
- Remove `airflow/example_dags_kubernetes` as the dag already exists in `contrib/example_dags/`
- Update the path to copy K8s dags

[AIRFLOW-3104] Add .airflowignore info into doc (#3939)

.airflowignore is a nice feature, but it was not mentioned at all in the documentation.

[AIRFLOW-XXX] Add Delete for CLI Example in UPDATING.md

[AIRFLOW-3123] Use a stack for DAG context management (#3956)

[AIRFLOW-3125] Monitor Task Instances creation rates (#3966)

Monitor Task Instance creation rates by Operator type.
These stats can provide some visibility on how much workload Airflow is
getting. They can be used for resource allocation in the long run (i.e.
to determine when we should scale up workers) and for debugging scenarios
where the creation rate of a certain type of Task Instance spikes.

[AIRFLOW-3129] Backfill mysql hook unit tests. (#3970)

[AIRFLOW-3124] Fix RBAC webserver debug mode (#3958)

[AIRFLOW-XXX] Add Compass to companies list (#3972)

We're using Airflow at Compass now.

[AIRFLOW-XXX] Speed up DagBagTest cases (#3974)

I noticed that many of the tests of DagBags operate on a specific DAG
only, and don't need to load the example or test dags. By not loading
the dags we don't need, this shaves about 10-20s off the test time.

[AIRFLOW-2912] Add Deploy and Delete operators for GCF (#3969)

Both Deploy and Delete operators interact with Google
Cloud Functions to manage functions. Both are idempotent
and make use of GcfHook, a hook that encapsulates
communication with GCP over GCP API.

[AIRFLOW-1390] Update Alembic to 0.9 (#3935)

[AIRFLOW-2238] Update PR tool to remove outdated info (#3978)

[AIRFLOW-XXX] Don't spam test logs with "bad cron expression" messages (#3973)

We needed these test dags to check the behaviour of invalid cron
expressions, but by default we were loading them every time we create a
DagBag (which many, many tests do).

Instead we ignore these known-bad dags by default, and the test checking
those (tests/models.py:DagBagTest.test_process_file_cron_validity_check)
is already explicitly processing those DAGs directly, so it remains
tested.

[AIRFLOW-XXX] Fix undocumented params in S3_hook

Some function parameters were undocumented. Additional docstrings
were added for clarity.

[AIRFLOW-3079] Improve migration scripts to support MSSQL Server (#3964)

There were two problems for MSSQL.  First, 'timestamp' data type in MSSQL Server
is essentially a row-id, and not a timezone enabled date/time stamp. Second, alembic
creates invalid SQL when applying the 0/1 constraint to boolean values. MSSQL should
enforce this constraint by simply asserting a boolean value.

[AIRFLOW-XXX] Add DoorDash to README.md (#3980)

DoorDash uses Airflow https://softwareengineeringdaily.com/2018/09/28/doordash/

[AIRFLOW-3062] Add Qubole in integration docs (#3946)

[AIRFLOW-3129] Improve test coverage of airflow.models. (#3982)

[AIRFLOW-2574] Cope with '%' in SQLA DSN when running migrations (#3787)

Alembic uses a ConfigParser like Airflow does, and "%" is a special
character in there, so we need to escape it. As per the Alembic docs:

> Note that this value is passed to ConfigParser.set, which supports
> variable interpolation using pyformat (e.g. `%(some_value)s`). A raw
> percent sign not part of an interpolation symbol must therefore be
> escaped, e.g. `%%`
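
The escaping rule quoted above can be checked with ConfigParser directly
(illustrative DSN and section names only):

```python
from configparser import ConfigParser

dsn = "mysql://user:p%40ssword@localhost/airflow"  # raw '%' in the password

cp = ConfigParser()
cp.add_section("alembic")
# A raw '%' must be doubled before ConfigParser stores it, because get()
# applies pyformat interpolation (%(name)s) on read.
cp.set("alembic", "sqlalchemy.url", dsn.replace("%", "%%"))

# The doubled '%%' is collapsed back to '%' on read
assert cp.get("alembic", "sqlalchemy.url") == dsn
```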

[AIRFLOW-3137] Make ProxyFix middleware optional. (#3983)

The ProxyFix middleware should only be used when airflow is running
behind a trusted proxy. This patch adds a `USE_PROXY_FIX` flag that
defaults to `False`.

[AIRFLOW-3004] Add config disabling scheduler cron (#3899)

[AIRFLOW-3103][AIRFLOW-3147] Update flask-appbuilder (#3937)

 [AIRFLOW-XXX] Fixing the issue in Documentation (#3998)

Fixing the operator name from DataFlowOperation to DataFlowJavaOperator in Documentation

[AIRFLOW-3088] Include slack-compatible emoji image

[AIRFLOW-3161] fix TaskInstance log link in RBAC UI

[AIRFLOW-3148] Remove unnecessary arg "parameters" in RedshiftToS3Transfer (#3995)

"Parameters" are used to help render the SQL command.
But in this operator, only "schema" and "table" are needed.
There is no SQL command to render.

By checking the code, we can also find that the argument
"parameters" is never really used.

(Fix a minor issue in the docstring as well)

[AIRFLOW-3159] Update GCS logging docs for latest code (#3952)

Reformatted to meet flake8 diff requirements.

[AIRFLOW-XXX] Remove residual line in Changelog (#3814)

[AIRFLOW-2930] Fix celery executor scheduler crash (#3784)

Caused by an update in PR #3740.
execute_command.apply_async(args=command, ...)
- command is a list of short unicode strings, and the above code passes
  multiple arguments to a function defined as taking only one argument.
- command = ["airflow", "run", "dag323", ...]
- args = command = ["airflow", "run", "dag323", ...]
- execute_command("airflow", "run", "dag323", ...) raises an error and exits.
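
The crash mechanism can be illustrated without Celery: passing the command
list as `args` spreads it into positional arguments (simplified sketch, not
Celery's code):

```python
def execute_command(command):
    """Stand-in for the worker task: takes exactly one argument,
    the full command list."""
    return command

command = ["airflow", "run", "dag323"]

# Buggy: args=command behaves like execute_command("airflow", "run", "dag323")
crashed = False
try:
    execute_command(*command)
except TypeError:
    crashed = True
assert crashed

# Fixed: wrap the list so the worker receives it as a single argument
assert execute_command(*[command]) == command
```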

[AIRFLOW-2854] kubernetes_pod_operator add more configuration items (#3697)

* kubernetes_pod_operator add more configuration items
* fix test_kubernetes_pod_operator test_faulty_service_account failure case
* fix review comment issues
* pod_operator add hostnetwork config
* add doc example

[AIRFLOW-2994] Fix command status check in Qubole Check operator (#3790)

[AIRFLOW-2949] Add syntax highlight for single quote strings (#3795)

* AIRFLOW-2949: Add syntax highlight for single quote strings

* AIRFLOW-2949: Also updated new UI main.css

[AIRFLOW-2948] Arg check & better doc - SSHOperator & SFTPOperator (#3793)

There may be different combinations of arguments, and
some processing is done 'silently', while users
may not be fully aware of it.

For example
- User only needs to provide either `ssh_hook`
  or `ssh_conn_id`, while this is not clear in doc
- if both provided, `ssh_conn_id` will be ignored.
- if `remote_host` is provided, it will replace
  the `remote_host` which was defined in `ssh_hook`
  or predefined in the connection of `ssh_conn_id`

These should be documented clearly to ensure it's
transparent to the users. log.info() should also be
used to remind users and provide clear logs.

In addition, add instance check for ssh_hook to ensure
it is of the correct type (SSHHook).

Tests are updated for this PR.

[AIRFLOW-XXX] Fix Broken Link in CONTRIBUTING.md

[AIRFLOW-2980] ReadTheDocs - Fix Missing API Reference

[AIRFLOW-2779] Make GHE auth third party licensed (#3803)

This reinstates the original license.

[AIRFLOW-XXX] Add Format to list of companies (#3824)

[AIRFLOW-2900] Show code for packaged DAGs (#3749)

[AIRFLOW-2983] Add prev_ds_nodash and next_ds_nodash macro (#3821)

[AIRFLOW-2951] Update dag_run table end_date when state change (#3798)

Previously Airflow only changed the dag_run table end_date value when
a user terminated a dag in the web UI. The end_date would not be updated
if Airflow detected that a dag finished and updated its state.

This commit adds an end_date update in DagRun's set_state function to
fix the problem mentioned above.

[AIRFLOW-2145] fix deadlock on clearing running TI (#3657)

a `shutdown` task is not considered to be `unfinished`, so a dag run can
deadlock when all `unfinished` downstreams are waiting on a task
that's in the `shutdown` state. fix this by considering `shutdown` to
be `unfinished`, since it's not truly a terminal state

[AIRFLOW-XXX] Fix typo in docstring of gcs_to_bq (#3833)

[AIRFLOW-2476] Allow tabulate up to 0.8.2 (#3835)

[AIRFLOW-XXX] Fix typos in faq.rst (#3837)

[AIRFLOW-2979] Make celery_result_backend conf Backwards compatible (#3832)

(#2806) Renamed `celery_result_backend` to `result_backend` and broke backwards compatibility.

[AIRFLOW-2866] Fix missing CSRF token head when using RBAC UI (#3804)

[AIRFLOW-3007] Update backfill example in Scheduler docs

The scheduler docs at https://airflow.apache.org/scheduler.html#backfill-and-catchup use a deprecated way of passing `schedule_interval`. `schedule_interval` should be passed to DAG as a separate parameter and not as a default arg.

[AIRFLOW-3005] Replace 'Airbnb Airflow' with 'Apache Airflow' (#3845)

[AIRFLOW-3002] Fix variable & tests in GoogleCloudBucketHelper (#3843)

[AIRFLOW-2991] Log path to driver output after Dataproc job (#3827)

[AIRFLOW-XXX] Fix python3 and flake8 errors in dev/airflow-jira

This is a script that checks if the Jira's marked as fixed in a release
are actually merged in - getting this working is helpful to me in
preparing 1.10.1

[AIRFLOW-2883] Add import and export for pool cli using JSON

[AIRFLOW-3021] Add Censys to who uses Airflow list

> Censys
> Find and analyze every reachable server and device on the Internet
> https://censys.io/

closes AIRFLOW-3021 https://issues.apache.org/jira/browse/AIRFLOW-3021

Add Branch to Company List

[AIRFLOW-3008] Move Kubernetes example DAGs to contrib

[AIRFLOW-2997] Support cluster fields in bigquery (#3838)

This adds a cluster_fields argument to the bigquery hook, GCS to
bigquery operator and bigquery query operators. This field requests that
bigquery store the result of the query/load operation sorted according
to the specified fields (the order of fields given is significant).

[AIRFLOW-XXX] Redirect FAQ `airflow[crypto]` to How-to Guides.

[AIRFLOW-XXX] Remove redundant space in Kerberos (#3866)

[AIRFLOW-3028] Update Text & Images in Readme.md

[AIRFLOW-1917] Trim extra newline and trailing whitespace from log (#3862)

[AIRFLOW-2985] Operators for S3 object copying/deleting (#3823)

1. Copying:
Under the hood, it's `boto3.client.copy_object()`.
It can only handle the situation in which the
S3 connection used can access both source and
destination bucket/key.

2. Deleting:
2.1 Under the hood, it's `boto3.client.delete_objects()`.
It supports either deleting one single object or
multiple objects.
2.2 If users try to delete a non-existent object, the
request will still succeed, but there will be an
entry 'Errors' in the response. There may also be
other reasons which may cause similar 'Errors' (
request itself would succeed without explicit
exception). So an argument `silent_on_errors` is added
to let users decide if this sort of 'Errors' should
fail the operator.

The corresponding methods are added into S3Hook, and
these two operators are 'wrappers' of these methods.
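
The 'Errors' handling described above can be sketched against a fake
`delete_objects`-style response (the response shape follows boto3;
`silent_on_errors` is the argument named in this entry, and the helper name
is hypothetical):

```python
def check_delete_response(response, silent_on_errors=False):
    """Raise if a delete_objects-style response contains 'Errors',
    unless silent_on_errors is set. Returns the number of deletions."""
    errors = response.get("Errors", [])
    if errors and not silent_on_errors:
        raise RuntimeError("delete_objects reported errors: %r" % errors)
    return len(response.get("Deleted", []))

# Fake responses mimicking boto3's shape -- no AWS call is made here
ok = {"Deleted": [{"Key": "a"}, {"Key": "b"}]}
bad = {"Deleted": [], "Errors": [{"Key": "c", "Code": "AccessDenied"}]}

assert check_delete_response(ok) == 2
assert check_delete_response(bad, silent_on_errors=True) == 0
```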

[AIRFLOW-3030] Fix CLI docs (#3872)

[AIRFLOW-XXX] Fix airflow.models.DAG docstring mistake

Closes #4004 from Sambeth/sambeth

Updated the tests written for s3/sftp operators

Fixed the flake8 diff errors.

Fixed aws connection test

Fixed flake8 diff errors.

Updated test_s3_to_sftp_operator with correct class name.

Fixed test_s3_to_sftp_operator error reported in travis.

Fixed test_s3_to_sftp_operator error reported in travis.

Changed default values for s3_to_sftp_operator

Updated test for checking for sftp file content.

Fixed flake8 diff error.

[AIRFLOW-XXX] Adding Home Depot as users of Apache airflow (#4013)

* Adding Home Depot as users of Apache airflow

[AIRFLOW-XXX] Added ThoughtWorks as user of Airflow in README (#4012)

[AIRFLOW-XXX] Added DataCamp to list of companies in README (#4009)

[AIRFLOW-3165] Document interpolation of '%' and warn (#4007)

[AIRFLOW-3099] Complete list of optional airflow.cfg sections (#4002)

[AIRFLOW-3162] Fix HttpHook URL parse error when port is specified (#4001)

[AIRFLOW-3055] add get_dataset and get_datasets_list to bigquery_hook (#3894)

* [AIRFLOW-3055] add get_dataset and get_datasets_list to bigquery_hook
@gsilk (Contributor) commented Oct 24, 2018

Will this work equally well on sensors as well as regular tasks?

@seelmann (Member, Author) commented Oct 25, 2018

To make it work for regular operators, I think the only thing to do is to override the `deps` property in your operator and add the ReadyToRescheduleDep, just like in https://github.com/apache/incubator-airflow/pull/3596/files#diff-81901d44cd480538f1f35b641b9ede0cR126. But I haven't tested that yet.
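The pattern seelmann describes can be sketched in plain Python. The class names below mirror Airflow's (`BaseOperator`, `ReadyToRescheduleDep`, `TriggerRuleDep`) but are minimal stand-ins, not the real API; the point is only the shape of the override: extend the inherited `deps` set rather than replace it.

```python
# Minimal stand-ins for Airflow's dep classes -- NOT the real implementations.
class TriggerRuleDep:
    pass

class ReadyToRescheduleDep:
    pass

class BaseOperator:
    @property
    def deps(self):
        # Airflow's BaseOperator exposes its task-instance dependency
        # checks as a set via this property.
        return {TriggerRuleDep()}

class MyReschedulableOperator(BaseOperator):
    @property
    def deps(self):
        # Extend the inherited deps with the reschedule check
        # instead of replacing them.
        return super().deps | {ReadyToRescheduleDep()}

dep_names = {type(d).__name__ for d in MyReschedulableOperator().deps}
assert dep_names == {"TriggerRuleDep", "ReadyToRescheduleDep"}
```

With this in place, the scheduler's dependency check would see the extra dep on the custom operator while plain operators keep the default set.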

Fokko added a commit to Fokko/incubator-airflow that referenced this pull request Dec 6, 2018

[AIRFLOW-2747] Explicit re-schedule of sensors (apache#3596)
* [AIRFLOW-2747] Explicit re-schedule of sensors

Add `mode` property to sensors. If set to `reschedule` an
AirflowRescheduleException is raised instead of sleeping which sets
the task back to state `NONE`. Reschedules are recorded in new
`task_schedule` table and visualized in the Gantt view. New TI
dependency checks if a sensor task is ready to be re-scheduled.

* Reformat sqlalchemy imports

* Make `_handle_reschedule` private

* Remove print

* Add comment

* Add comment

* Don't record reschedule request in test mode
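The behavioural difference the commit message describes can be illustrated with a self-contained sketch. `FakeSensor` and `RescheduleRequested` are hypothetical stand-ins for `BaseSensorOperator` and `AirflowRescheduleException`: in poke mode the sensor sleeps between pokes and blocks its worker slot, while in reschedule mode it raises an exception so the slot is freed and the scheduler re-runs the task later.

```python
import time

class RescheduleRequested(Exception):
    """Stand-in for Airflow's AirflowRescheduleException (hypothetical name)."""

class FakeSensor:
    def __init__(self, mode="poke", poke_interval=0.01,
                 answers=(False, False, True)):
        self.mode = mode
        self.poke_interval = poke_interval
        self._answers = list(answers)  # scripted poke results for the demo

    def poke(self):
        return self._answers.pop(0)

    def execute(self):
        while not self.poke():
            if self.mode == "reschedule":
                # Give the worker slot back; the scheduler re-runs us later.
                raise RescheduleRequested()
            # Poke mode: keep the slot and sleep until the next poke.
            time.sleep(self.poke_interval)
        return "done"

# Poke mode blocks in-process until the condition is met.
assert FakeSensor(mode="poke").execute() == "done"

# Reschedule mode surrenders the slot on the first unmet poke.
try:
    FakeSensor(mode="reschedule").execute()
except RescheduleRequested:
    pass
```

The trade-off discussed later in this thread follows directly from the sketch: reschedule mode frees worker slots but makes each poke a full task run, so very short poke intervals become impractical.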

Fokko added a commit to Fokko/incubator-airflow that referenced this pull request Dec 6, 2018

[AIRFLOW-2747] Explicit re-schedule of sensors (apache#3596)

ashb added a commit to ashb/airflow that referenced this pull request Dec 15, 2018

[AIRFLOW-2747] Explicit re-schedule of sensors (apache#3596)
@ashb (Member) commented Dec 15, 2018

I think this feature needs some work

  1. the tasks that are in "reschedule" just show up as "None" in the tree view:
    [screenshot: tree view, 2018-12-15 18:59:07, sensor task shown as "None"]

This makes it hard-to-impossible to know what is going on, and they probably need to show up as a new state, or just show up as running, queued, or something? Showing as None isn't helpful. (I even knew what was going on and was still confused!)

  1. The scheduler (when using SequentialExecutor, but that isn't relevant) logs this task as Success!

    [2018-12-15 18:59:13,635] {jobs.py:1100} INFO - 1 tasks up for execution:
         <TaskInstance: hello_world.wait 2018-12-15 18:50:00+00:00 [scheduled]>
    [2018-12-15 18:59:13,649] {jobs.py:1135} INFO - Figuring out tasks to run in Pool(name=None) with 128 open slots and 1 task instances in queue
    [2018-12-15 18:59:13,656] {jobs.py:1171} INFO - DAG hello_world has 0/16 running and queued tasks
    [2018-12-15 18:59:13,656] {jobs.py:1209} INFO - Setting the follow tasks to queued state:
         <TaskInstance: hello_world.wait 2018-12-15 18:50:00+00:00 [scheduled]>
    [2018-12-15 18:59:13,698] {jobs.py:1293} INFO - Setting the following 1 tasks to queued state:
         <TaskInstance: hello_world.wait 2018-12-15 18:50:00+00:00 [queued]>
    [2018-12-15 18:59:13,699] {jobs.py:1335} INFO - Sending ('hello_world', 'wait', datetime.datetime(2018, 12, 15, 18, 50, tzinfo=<Timezone [UTC]>), 1) to executor with priority 2 and queue default
    [2018-12-15 18:59:13,701] {base_executor.py:56} INFO - Adding to queue: airflow run hello_world wait 2018-12-15T18:50:00+00:00 --local -sd /Users/ash/airflow/dags/foo.py
    [2018-12-15 18:59:13,742] {sequential_executor.py:45} INFO - Executing command: airflow run hello_world wait 2018-12-15T18:50:00+00:00 --local -sd /Users/ash/airflow/dags/foo.py
    [2018-12-15 18:59:15,558] {__init__.py:51} INFO - Using executor SequentialExecutor
    [2018-12-15 18:59:15,755] {models.py:273} INFO - Filling up the DagBag from /Users/ash/airflow/dags/foo.py
    [2018-12-15 18:59:15,833] {cli.py:530} INFO - Running <TaskInstance: hello_world.wait 2018-12-15T18:50:00+00:00 [queued]> on host themisto.localdomain
    [2018-12-15 18:59:21,427] {jobs.py:1439} INFO - Executor reports hello_world.wait execution_date=2018-12-15 18:50:00+00:00 as success for try_number 1
    

The DAG I was testing this with was:

from airflow import DAG
import datetime
import airflow
from airflow.operators.dummy_operator import DummyOperator
from airflow.sensors.time_sensor import TimeSensor

start = (
    airflow.utils.dates.days_ago(2)
)

dag2 = DAG('hello_world',
           schedule_interval='*/5 * * * *',
           start_date=start,
           catchup=False)

with dag2:
    (
        TimeSensor(
            task_id='wait',
            target_time=datetime.time(20),
            mode='reschedule',
        ) >>
        DummyOperator(task_id='dummy')
    )

@seelmann Are you able to work on fixing these two issues?

ashb added a commit that referenced this pull request Dec 15, 2018

[AIRFLOW-2747] Explicit re-schedule of sensors (#3596)
@seelmann (Member, Author) commented Dec 15, 2018

@ashb I can work on those during the holidays.

Regarding 1 (the "None" state): I agree it's not optimal that there is no indication of what's going on. Do you want a new dedicated state (in state.py)? Or is it just about the visualization in the tree view (other views also make sense IMHO)? Adding a new state was discussed but decided against. Changing the visualization in the views should be possible; in the Gantt view it's already done.

Regarding 2 (the success log): What do you expect to be logged instead of success? If I look into sequential_executor.py, it always sets success or failed, depending on whether the command execution was successful or not. Should we make all executors aware of the reschedule state?

PS: We've been running 1.10.0 with this patch successfully in production since November :)

@ashb (Member) commented Dec 15, 2018

@seelmann Thanks, that would be ace! This feature is very nice.

I'm mostly concerned with the visualisation on the Graph, Tree, and Task Instance Detail pages. Gantt isn't one I look at very often, but looking at it now I don't see anything indicating retries? (That, or I cherry-picked it onto 1-10-test wrong.) Could you give some screenshots?

As for the log: hmm, there are only a few executors, so updating them to be aware of it and not log something incorrect shouldn't be too much work?

@seelmann (Member, Author) commented Dec 15, 2018

Screenshots are attached at the Jira.

The Gantt view should show a white bar when it's rescheduled and thus in the "None" state, see https://issues.apache.org/jira/browse/AIRFLOW-2747#comment-16616842

In a previous version every rescheduled execution was visible, but that ended up producing too many small bars: https://issues.apache.org/jira/browse/AIRFLOW-2747#comment-16541539

@seelmann (Member, Author) commented Dec 15, 2018

Another thing that should be improved: the start date, end date, and duration in the tooltip when hovering over the task only show the values of the last run; they should span from the first to the last run.

@Fokko (Contributor) commented Dec 16, 2018

@ashb @seelmann To simplify things, I'm all in for making re-schedule the default scheduling method for sensors and getting rid of the blocking sensors. This will also enable the use of sensors on the SequentialExecutor (apart from the reschedule state). But more importantly from my perspective, it will greatly simplify the logic, since we won't have to maintain two branches of the sensor execution (legacy and re-scheduling). Curious what you guys think.

@seelmann (Member, Author) commented Dec 16, 2018

@Fokko If the intention is to cherry-pick into 1.10.2, I'd not make it the default yet. It would change the behaviour of sensors: e.g. short poke intervals below 5 seconds would no longer be possible, and scheduler load would likely increase. For 2.0 I think it's ok; let's just add a note to UPDATING.md.

@Fokko (Contributor) commented Dec 20, 2018

I fully agree, @seelmann. My suggestion was to make it default for Airflow 2.0

@seelmann (Member, Author) commented Dec 29, 2018

@Fokko (Contributor) commented Jan 2, 2019

Thanks @seelmann

aliceabe pushed a commit to aliceabe/incubator-airflow that referenced this pull request Jan 3, 2019

[AIRFLOW-2747] Explicit re-schedule of sensors (apache#3596)

kaxil added a commit that referenced this pull request Jan 9, 2019

[AIRFLOW-2747] Explicit re-schedule of sensors (#3596)

ashb added a commit to ashb/airflow that referenced this pull request Jan 10, 2019

[AIRFLOW-2747] Explicit re-schedule of sensors (apache#3596)

cfei18 pushed a commit to cfei18/incubator-airflow that referenced this pull request Jan 23, 2019

Explicit re-schedule of sensors (apache#3596)