apache-airflow-providers-apache-hive
Bring back mysql-connector-python as required dependency (#32989)
Fix Pandas2 compatibility for Hive (#32752)
Add more accurate typing for DbApiHook.run method (#31846)
Move Hive configuration to Apache Hive provider (#32777)
Add proxy_user template check (#32334)
Note
This release dropped support for Python 3.7
Sanitize beeline principal parameter (#31983)
Replace unicodecsv with standard csv library (#31693)
Note
This release of provider is only available for Airflow 2.4+ as explained in the Apache Airflow providers support policy.
Bump minimum Airflow version in providers (#30917)
Update return types of 'get_key' methods on 'S3Hook' (#30923)
The auth option has been moved from the extra field to the auth parameter of the Hook. If you have auth defined in your connection extras, you should move it to the DAG where your HiveOperator or other Hive-related operators are used.
Move auth parameter from extra to Hook parameter (#30212)
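As an illustration of this migration (the connection extras and the kerberos value below are hypothetical, and the operator call is shown only in a comment since it needs a live Airflow deployment), the auth entry can be pulled out of the old connection extras and passed where the operator is instantiated:

```python
import json

# Hypothetical old-style connection extras: "auth" used to live in the extra field.
old_connection_extra = json.dumps({"auth": "kerberos", "use_beeline": True})

# After this change, "auth" is a parameter of the hook/operator instead, so pull it
# out of the extras and pass it where HiveOperator (or the hook) is instantiated,
# e.g. HiveOperator(task_id="run_hql", hql="...", **operator_kwargs)
extra = json.loads(old_connection_extra)
operator_kwargs = {"auth": extra.pop("auth")}

print(operator_kwargs)  # {'auth': 'kerberos'}
print(extra)            # remaining extras stay on the connection
```

Any other extras (such as use_beeline above) are unaffected and remain on the connection.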
Validate Hive Beeline parameters (#29502)
Fixed MyPy errors introduced by new mysql-connector-python (#28995)
Move local_infile option from extra to hook parameter (#28811)
As of version 5.1.0 of the apache.hive provider, the Hive macros that used to be provided by Airflow core are provided by the provider.
Move Hive macros to the provider (#28538)
Make pandas dependency optional for Amazon Provider (#28505)
The hive_cli_params connection setting was moved to the Hook. If you have hive_cli_params defined in your connection extras, you should move them to the DAG where your HiveOperator is used.
Move hive_cli_params to hook parameters (#28101)
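A minimal sketch of this migration (the extras value below is made up, and the operator call is shown only in a comment): read hive_cli_params out of the old connection extras and pass it directly at the usage site.

```python
import json

# Hypothetical old-style connection extras: hive_cli_params lived in the extra field.
old_extra = json.dumps({"hive_cli_params": "--hiveconf mapred.job.queue.name=default"})

# After this change, pull hive_cli_params out of the connection extras and pass it
# where HiveOperator is used, e.g.
#   HiveOperator(task_id="run_hql", hql="...", hive_cli_params=hive_cli_params)
extra = json.loads(old_extra)
hive_cli_params = extra.pop("hive_cli_params", "")

print(hive_cli_params)  # --hiveconf mapred.job.queue.name=default
```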
Improve filtering for invalid schemas in Hive hook (#27808)
Bump common.sql provider to 1.3.1 (#27888)
Note
This release of provider is only available for Airflow 2.3+ as explained in the Apache Airflow providers support policy.
Move min airflow version to 2.3.0 for all providers (#27196)
Filter out invalid schemas in Hive hook (#27647)
Add common-sql lower bound for common-sql (#25789)
The hql parameter in get_records of HiveServer2Hook has been renamed to sql to match the get_records signature of DbApiHook. If you used it as a positional parameter, nothing changes for you, but if you used it as a keyword argument, you need to rename it. The hive_conf parameter has been renamed to parameters and is now the second parameter, to match the get_records signature from DbApiHook. You need to rename it if you used it. The schema parameter in get_records is an optional keyword argument that you can add, to match the schema of get_records from DbApiHook.
Deprecate hql parameters and synchronize DBApiHook method APIs (#25299)
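The renames above can be sketched with a small helper; migrate_get_records_kwargs is a hypothetical illustration for updating call sites, not part of the provider, and the hook calls appear only in comments since they need a live Hive connection.

```python
# Old keyword-style call:
#   hook.get_records(hql="SELECT * FROM t", hive_conf={"k": "v"})
# New call, matching the DbApiHook.get_records signature:
#   hook.get_records(sql="SELECT * FROM t", parameters={"k": "v"}, schema="default")

def migrate_get_records_kwargs(kwargs: dict) -> dict:
    """Hypothetical helper: rename the deprecated keywords to their new names."""
    renames = {"hql": "sql", "hive_conf": "parameters"}
    return {renames.get(key, key): value for key, value in kwargs.items()}

old_call = {"hql": "SELECT * FROM my_table", "hive_conf": {"hive.exec.parallel": "true"}}
print(migrate_get_records_kwargs(old_call))
```

Positional calls need no change, since sql and parameters keep the first and second positions.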
Remove Smart Sensors (#25507)
Move all SQL classes to common-sql provider (#24836)
fix connection extra parameter 'auth_mechanism' in 'HiveMetastoreHook' and 'HiveServer2Hook' (#24713)
Note
This release of provider is only available for Airflow 2.2+ as explained in the Apache Airflow providers support policy.
chore: Refactoring and Cleaning Apache Providers (#24219)
AIP-47 - Migrate hive DAGs to new design #22439 (#24204)
Fix HiveToMySqlOperator's wrong docstring (#23316)
Fix mistakenly added install_requires for all providers (#22382)
Add Trove classifiers in PyPI (Framework :: Apache Airflow :: Provider)
Set larger limit get_partitions_by_filter in HiveMetastoreHook (#21504)
Fix Python 3.9 support in Hive (#21893)
Fix key typo in 'template_fields_renderers' for 'HiveOperator' (#21525)
Support for Python 3.10
Add how-to guide for hive operator (#21590)
Add more SQL template fields renderers (#21237)
Add conditional 'template_fields_renderers' check for new SQL lexers (#21403)
hive provider: restore HA support for metastore (#19777)
fix get_connections deprecation warn in hivemetastore hook (#18854)
HiveHook fix get_pandas_df() failure when it tries to read an empty table (#17777)
Optimise connection importing for Airflow 2.2.0
Add Python 3.9 support (#15515)
Auto-apply apply_default decorator (#15667)
Warning
Due to apply_default decorator removal, this version of the provider requires Airflow 2.1.0+. If your Airflow version is < 2.1.0, and you want to install this provider version, first upgrade Airflow to at least version 2.1.0. Otherwise your Airflow package version will be upgraded automatically and you will have to manually run airflow upgrade db to complete the migration.
Fix mistake and typos in doc/docstrings (#15180)
Fix grammar and remove duplicate words (#14647)
Resolve issue related to HiveCliHook kill (#14542)
Corrections in docs and tools after releasing provider RCs (#14082)
Updated documentation and readme files.
Remove password if in LDAP or CUSTOM mode HiveServer2Hook (#11767)
Initial version of the provider.