Apache Airflow Provider(s)
sftp
Versions of Apache Airflow Providers
apache-airflow-providers-amazon==2.6.0
apache-airflow-providers-apache-hdfs==2.2.0
apache-airflow-providers-apache-hive==2.1.0
apache-airflow-providers-apache-livy==2.1.0
apache-airflow-providers-apache-spark==2.0.3
apache-airflow-providers-celery==2.1.0
apache-airflow-providers-elasticsearch==2.1.0
apache-airflow-providers-ftp==2.0.1
apache-airflow-providers-google==4.0.0
apache-airflow-providers-http==2.0.1
apache-airflow-providers-imap==2.0.1
apache-airflow-providers-jira==2.0.1
apache-airflow-providers-postgres==2.4.0
apache-airflow-providers-redis==2.0.1
apache-airflow-providers-sftp==2.4.0
apache-airflow-providers-slack==4.1.0
apache-airflow-providers-sqlite==2.0.1
apache-airflow-providers-ssh==2.3.0
Apache Airflow version
2.2.3 (latest released)
Operating System
Ubuntu 20.04
Deployment
Other Docker-based deployment
Deployment details
No response
What happened
I believe this was introduced in commit f35ad27, in file airflow/providers/sftp/hooks/sftp.py, lines 74-79:
def __init__(
    self,
    ssh_conn_id: Optional[str] = 'sftp_default',
    ftp_conn_id: Optional[str] = 'sftp_default',
    *args,
    **kwargs,
) -> None:
    if ftp_conn_id:
        warnings.warn(
            'Parameter `ftp_conn_id` is deprecated.' 'Please use `ssh_conn_id` instead.',
            DeprecationWarning,
            stacklevel=2,
        )
        kwargs['ssh_conn_id'] = ftp_conn_id
    self.ssh_conn_id = ssh_conn_id
    super().__init__(*args, **kwargs)
Since ftp_conn_id has a default value of sftp_default, it will always override ssh_conn_id unless explicitly set to None during init.
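The override can be demonstrated without Airflow installed. Below is a minimal standalone sketch of the same pattern: Base is a hypothetical stand-in for SSHHook (which sets self.ssh_conn_id from its own ssh_conn_id parameter), and Hook mirrors the quoted SFTPHook constructor.

```python
import warnings


class Base:
    """Stand-in for SSHHook: stores whatever ssh_conn_id it receives."""

    def __init__(self, ssh_conn_id=None):
        self.ssh_conn_id = ssh_conn_id


class Hook(Base):
    """Mirrors the quoted SFTPHook.__init__ logic."""

    def __init__(self, ssh_conn_id='sftp_default', ftp_conn_id='sftp_default', *args, **kwargs):
        if ftp_conn_id:  # always truthy because of the default value
            warnings.warn(
                'Parameter `ftp_conn_id` is deprecated.' 'Please use `ssh_conn_id` instead.',
                DeprecationWarning,
                stacklevel=2,
            )
            kwargs['ssh_conn_id'] = ftp_conn_id  # clobbers the caller's ssh_conn_id
        self.ssh_conn_id = ssh_conn_id
        super().__init__(*args, **kwargs)  # Base overwrites self.ssh_conn_id again


hook = Hook(ssh_conn_id='some_custom_sftp_conn_id')
print(hook.ssh_conn_id)  # prints 'sftp_default' -- the custom id was silently discarded
```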
What you expected to happen
If you initialise the hook with the ssh_conn_id parameter, it should use that connection instead of ignoring the parameter and falling back to the default value.
The deprecation warning is also triggered despite only ssh_conn_id being passed:
<stdin>:1 DeprecationWarning: Parameter ftp_conn_id is deprecated.Please use ssh_conn_id instead.
How to reproduce
from airflow.providers.sftp.hooks.sftp import SFTPHook
hook = SFTPHook(ssh_conn_id='some_custom_sftp_conn_id')
There will be a log message stating
[2022-01-07 01:12:05,234] {base.py:70} INFO - Using connection to: id: sftp_default.
Anything else
This also breaks SFTPSensor and SFTPOperator, since both construct an SFTPHook internally.
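One possible fix (a sketch only, not an official patch, and using the same hypothetical Base stand-in for SSHHook) is to default ftp_conn_id to None, so the deprecation branch only runs when a caller actually passes the legacy parameter:

```python
import warnings


class Base:
    """Stand-in for SSHHook: stores whatever ssh_conn_id it receives."""

    def __init__(self, ssh_conn_id=None):
        self.ssh_conn_id = ssh_conn_id


class FixedHook(Base):
    """Sketch of a corrected constructor: warn only on explicit ftp_conn_id."""

    def __init__(self, ssh_conn_id='sftp_default', ftp_conn_id=None, *args, **kwargs):
        if ftp_conn_id is not None:
            warnings.warn(
                'Parameter `ftp_conn_id` is deprecated. Please use `ssh_conn_id` instead.',
                DeprecationWarning,
                stacklevel=2,
            )
            ssh_conn_id = ftp_conn_id  # honour the legacy parameter when given
        kwargs['ssh_conn_id'] = ssh_conn_id
        super().__init__(*args, **kwargs)


print(FixedHook(ssh_conn_id='some_custom_sftp_conn_id').ssh_conn_id)  # prints 'some_custom_sftp_conn_id'
```

With this shape, a custom ssh_conn_id is respected, the default still applies when nothing is passed, and the warning fires only for explicit ftp_conn_id usage.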
Are you willing to submit PR?
Code of Conduct