Fix postgres part of pipeline example of tutorial #21586
Conversation
This looks really cool!
The PR is likely ready to be merged. No tests are needed as no important environment files, nor python files were modified by it. However, committers might decide that full test matrix is needed and add the 'full tests needed' label. Then you should rebase it to the latest main or amend the last commit of the PR, and push it with --force-with-lease.
I think it's even better to leave ``postgres_default``. It's a good practice for anyone who starts their adventure with Airflow. I will make a suggestion for it.
docs/apache-airflow/tutorial.rst
Outdated
We need to add a connection to Postgres. Go to the UI and click "Admin" >> "Connections". Specify the following for each field:

- Conn id: LOCAL
- Conn Type: postgres
- Host: postgres
- Schema: <DATABASE_NAME>
- Login: airflow
- Password: airflow
- Port: 5432

Alternatively, you can use the postgres_default connection. If one uses this, one should change the value of ``postgres_conn_id`` in the DAG definition:
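As an aside for anyone following along: besides the UI, Airflow can pick up a connection from an environment variable named ``AIRFLOW_CONN_<CONN_ID>`` holding a URI. A minimal sketch of how the fields listed above map onto that URI form (the database name ``mydb`` is a placeholder, just like ``<DATABASE_NAME>`` above; the helper function is hypothetical, not part of Airflow):

```python
from urllib.parse import quote

def postgres_conn_uri(login: str, password: str, host: str, port: int, schema: str) -> str:
    """Build the URI form Airflow accepts for connections defined via
    AIRFLOW_CONN_<CONN_ID> environment variables (hypothetical helper)."""
    return f"postgres://{quote(login)}:{quote(password)}@{host}:{port}/{schema}"

# The field values from the tutorial's connection form; "mydb" stands in
# for <DATABASE_NAME>.
uri = postgres_conn_uri("airflow", "airflow", "postgres", 5432, "mydb")
print(uri)  # postgres://airflow:airflow@postgres:5432/mydb
```

Exporting this as ``AIRFLOW_CONN_LOCAL`` (for the ``LOCAL`` conn id above) would be the env-var equivalent of filling in the form.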
Suggested change (replace the paragraph and field list above with):

You can use the postgres_default connection. You should change the value of ``postgres_conn_id`` in the DAG definition:
It's OK to use "postgres_default" only for it.
Thanks for the review!
One small comment: in this case, maybe ``postgres_conn_id=LOCAL`` in the code should be changed to ``postgres_conn_id=postgres_default`` for complete correctness :)
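For readers following the thread, this is the one-line change being discussed. A sketch of what the operator in the tutorial DAG might look like after the rename; the task id and SQL below are illustrative placeholders, not necessarily the tutorial's exact values, and ``PostgresOperator`` comes from the postgres provider package:

```python
from airflow.providers.postgres.operators.postgres import PostgresOperator

# Illustrative DAG fragment: only the postgres_conn_id value changes.
create_employees_table = PostgresOperator(
    task_id="create_employees_table",  # placeholder task id
    postgres_conn_id="postgres_default",  # was "LOCAL" in the earlier draft
    sql="CREATE TABLE IF NOT EXISTS employees (name TEXT);",  # placeholder SQL
)
```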
Yeah. Feel free to fixup :)
Now, it should be fine!
(cherry picked from commit 40028f3)
related: #21457
After a discussion with @potiuk, we figured out how to include the explicit creation of the postgres tables into the tutorial.
One remark that I need to write here is that I couldn't manage to make it work with a LOCAL connection, so I used postgres_default instead. Maybe it is bad practice and should be corrected; I just put it as an alternative because it worked for me.