🎉 Source Amazon Seller Partner: Add new streams #13604
Conversation
@vladimir-remar thanks for the contribution. Could you help with these things?
/test connector=connectors/source-amazon-seller-partner
Build Passed. Test summary info:
Hello @harshithmullapudi, here is my format output:

vladimir@vremar:airbyte$ ./gradlew :airbyte-integrations:connectors:source-amazon-seller-partner:airbytePythonFormat
Starting a Gradle Daemon, 1 incompatible and 1 stopped Daemons could not be reused, use --status for details
Building all of Airbyte.
/home/vladimir/Work/Airbyte_projects/Fork/airbyte/airbyte-integrations/connectors
Type-safe dependency accessors is an incubating feature.
> Task :airbyte-integrations:connectors:source-amazon-seller-partner:checkPython
Using python 3.9.13 from /home/vladimir/Work/Airbyte_projects/Fork/airbyte/airbyte-integrations/connectors/source-amazon-seller-partner/.venv (.venv/bin/python)
Using pip 21.3.1 from /home/vladimir/Work/Airbyte_projects/Fork/airbyte/airbyte-integrations/connectors/source-amazon-seller-partner/.venv/lib/python3.9/site-packages/pip (python 3.9)
> Task :airbyte-integrations:connectors:source-amazon-seller-partner:pipInstall
[python] .venv/bin/python -m pip install pip==21.3.1
Requirement already satisfied: pip==21.3.1 in ./.venv/lib/python3.9/site-packages (21.3.1)
WARNING: You are using pip version 21.3.1; however, version 22.1.2 is available.
You should consider upgrading via the '/home/vladimir/Work/Airbyte_projects/Fork/airbyte/airbyte-integrations/connectors/source-amazon-seller-partner/.venv/bin/python -m pip install --upgrade pip' command.
[python] .venv/bin/python -m pip list --format=columns
Package Version
----------------- -------
attrs 21.4.0
black 22.3.0
click 8.1.3
coverage 6.3.1
flake8 4.0.1
iniconfig 1.1.1
isort 5.6.4
mccabe 0.6.1
mypy 0.930
mypy-extensions 0.4.3
packaging 21.3
pathspec 0.9.0
pip 21.3.1
platformdirs 2.5.2
pluggy 0.13.1
py 1.11.0
pycodestyle 2.8.0
pyflakes 2.4.0
pyparsing 3.0.9
pyproject-flake8 0.0.1a2
pytest 6.1.2
setuptools 62.1.0
toml 0.10.2
tomli 2.0.1
typing_extensions 4.2.0
wheel 0.37.1
WARNING: You are using pip version 21.3.1; however, version 22.1.2 is available.
You should consider upgrading via the '/home/vladimir/Work/Airbyte_projects/Fork/airbyte/airbyte-integrations/connectors/source-amazon-seller-partner/.venv/bin/python -m pip install --upgrade pip' command.
> Task :airbyte-integrations:connectors:source-amazon-seller-partner:isortFormat
[python] .venv/bin/python -m isort --settings-file=/home/vladimir/Work/Airbyte_projects/Fork/airbyte/pyproject.toml ./
Skipped 6 files
> Task :airbyte-integrations:connectors:source-amazon-seller-partner:blackFormat
[python] .venv/bin/python -m black --config /home/vladimir/Work/Airbyte_projects/Fork/airbyte/pyproject.toml ./
All done! ✨ 🍰 ✨
14 files left unchanged.
> Task :airbyte-integrations:connectors:source-amazon-seller-partner:flakeCheck
[python] .venv/bin/python -m pflake8 --config /home/vladimir/Work/Airbyte_projects/Fork/airbyte/pyproject.toml ./
Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.
You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.
See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings
BUILD SUCCESSFUL in 44s
16 actionable tasks: 9 executed, 7 up-to-date
Review comment on:

```python
return {"reportId": stream_slice.get("report_id")}

def stream_slices(
```
@vladimir-remar could you help me to understand this part
Sure. Since those reports are not created or scheduled by the connector, instead of using the normal `_report_data` behavior I modified it to read the `report_id` from the slices, which contain the report IDs generated by Amazon. I did that in order not to modify the actual behavior of the `read_records` method and to keep it compatible with the other streams.
Line 339 in a82c01b:

```python
report_id = self._create_report(sync_mode, cursor_field, stream_slice, stream_state)["reportId"]
```
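The approach described above can be sketched as follows. This is a hypothetical simplification, not the connector's actual class: names like `AnalyticsStream` and `_list_existing_reports` are illustrative, and the canned report IDs stand in for a real API response.

```python
from typing import Any, Iterable, Mapping, Optional


class AnalyticsStream:
    """Hypothetical sketch: report IDs already generated by Amazon are
    fetched in stream_slices, and _report_data reads the reportId from
    the slice instead of creating a new report."""

    def stream_slices(self, **kwargs) -> Iterable[Optional[Mapping[str, Any]]]:
        # One slice per report that Amazon has already generated.
        for report in self._list_existing_reports():
            yield {"report_id": report["reportId"]}

    def _report_data(
        self, sync_mode, cursor_field=None, stream_slice=None, stream_state=None
    ) -> Mapping[str, Any]:
        # Reuse the ID carried by the slice rather than creating a new
        # report, so read_records keeps its normal flow.
        return {"reportId": stream_slice.get("report_id")}

    def _list_existing_reports(self):
        # Placeholder for the real reports-listing API call.
        return [{"reportId": "ID-1"}, {"reportId": "ID-2"}]
```

With this shape, `read_records` can stay unchanged: it receives a slice, asks `_report_data` for the request payload, and gets a pre-generated `reportId` instead of triggering `_create_report`.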
@vladimir-remar I don't think this is the right way to use `stream_slices` here. @marcosmarxm do you have any suggestions?
This is not the main use case for stream slices, but if it solves the problem I don't see why not to use it.
Also, it would probably be good to implement a unit test that mocks the request and makes clear what this function is doing.
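A unit test along the lines suggested above could look like this. The stream class is a hypothetical stand-in for the connector's stream, and the listing call is mocked with `unittest.mock` so the test runs offline and documents the behavior:

```python
from unittest import mock


class AnalyticsStream:
    """Hypothetical stand-in for the connector's analytics stream."""

    def list_reports(self):
        # In the real connector this would call the reports API.
        raise NotImplementedError

    def stream_slices(self, **kwargs):
        return [{"report_id": r["reportId"]} for r in self.list_reports()]


def test_stream_slices_yields_existing_report_ids():
    stream = AnalyticsStream()
    with mock.patch.object(
        AnalyticsStream, "list_reports", return_value=[{"reportId": "ABC123"}]
    ):
        # The mocked request makes explicit that stream_slices only
        # repackages report IDs that Amazon already generated.
        assert stream.stream_slices() == [{"report_id": "ABC123"}]


test_stream_slices_yields_existing_report_ids()
```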
Review comment on:

```python
strict_start_date = pendulum.now("utc").subtract(days=90)
```
Can you explain why the strict start date? Also, please update the connector documentation to explain that this report won't follow the start date in the spec.
Sure, that strict limit comes from the documentation:

createdSince
The earliest report creation date and time for reports to include in the response, in ISO 8601 date time format. The default is 90 days ago. Reports are retained for a maximum of 90 days.

During testing, if a date earlier than 90 days ago is used, the response status is 400.
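The connector expresses this limit with pendulum (`strict_start_date = pendulum.now("utc").subtract(days=90)`). An equivalent stdlib sketch of the clamp, with an illustrative function name:

```python
from datetime import datetime, timedelta, timezone


def clamped_start_date(configured_start: datetime) -> datetime:
    """Clamp the configured start date to the last 90 days: the API
    retains reports for at most 90 days and returns HTTP 400 for an
    earlier createdSince value."""
    strict_start_date = datetime.now(timezone.utc) - timedelta(days=90)
    return max(configured_start, strict_start_date)
```

Any start date older than 90 days gets silently replaced by the strict limit, which is why the spec's start date is not honored for this report.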
Hey @vladimir-remar, could you help with the changes Marcos requested?
Thanks for your patience, I will check it ASAP.
/test connector=connectors/source-amazon-seller-partner
Build Passed. Test summary info:
/publish connector=connectors/source-amazon-seller-partner
If you have connectors that successfully published but failed definition generation, follow step 4 here.
…rbytehq-master * 'master' of https://github.com/airbytehq/airbyte: (1141 commits)
- pass USE_STREAM_CAPABLE_STATE env var to containers/deployments (airbytehq#14737)
- Bump mqtt connector (airbytehq#14648)
- Add error code to ManualOperationResult (airbytehq#14657)
- Bump elasticsearch version (airbytehq#14640)
- Ryan/sync oracle version number (airbytehq#14736)
- Fixed linter issue with add_fields.py comments (airbytehq#14742)
- 🎉 Redshift, Databricks, Snowflake, S3 Destinations: Make S3 output filename configurable (airbytehq#14494)
- 🐛 Source-mssql: aligned regular and cdc syncs and its datatype tests (airbytehq#14379)
- 🎉 Source Amazon Seller Partner: Add new streams (airbytehq#13604)
- bump source-file-secure (airbytehq#14704)
- 🎉 New source: Timely airbytehq#13292 (airbytehq#14335)
- 🪟🔧 Refactor feature service (airbytehq#14559)
- [low code cdk] add a transformation for adding fields into an outgoing record (airbytehq#14638)
- Bump destination-postgres to 0.3.21 (airbytehq#14479)
- Remove `additionalProperties: false` from JDBC destination connectors (airbytehq#14618)
- 🎉 Source Notion: add OAuth authorization for source-notion connector (airbytehq#14706)
- Use the configuration diff calculation in the update endpoint (airbytehq#14626)
- 🪟 🐛 Fix input validation on blur and cleanup signup error handling (airbytehq#14724)
- lower sleep after wait for successful job (airbytehq#14725)
- Add configuration diff (airbytehq#14603)
- ...
What
Add two new report types:
How
Describe the solution
Recommended reading order
x.java
y.python
🚨 User Impact 🚨
Are there any breaking changes? What is the end result perceived by the user? If yes, please merge this PR with the 🚨🚨 emoji so changelog authors can further highlight this if needed.
Pre-merge Checklist
Expand the relevant checklist and delete the others.
Updating a connector

Community member or Airbyter
- Secrets in the connector's spec are annotated with `airbyte_secret`
- Unit & integration tests added and passing (`./gradlew :airbyte-integrations:connectors:<name>:integrationTest`)
- Connector documentation updated:
  - README.md
  - bootstrap.md. See description and examples
  - docs/integrations/<source or destination>/<name>.md, including changelog. See changelog example

Airbyter
If this is a community PR, the Airbyte engineer reviewing this PR is responsible for the below items.
- `/test connector=connectors/<name>` command is passing
- New connector version published by running the `/publish` command described here

Tests
Unit
Integration
Put your integration tests output here.
Acceptance
Put your acceptance tests output here.