
Conversation

tswast
Collaborator

@tswast tswast commented Sep 23, 2024

Thank you for opening a Pull Request! Before submitting your PR, there are a few things you can do to make sure it goes smoothly:

  • Make sure to open an issue as a bug/issue before writing your code! That way we can discuss the change, evaluate designs, and agree on the general idea
  • Ensure the tests and linter pass
  • Code coverage does not decrease (if any source code was changed)
  • Appropriate docs were updated (if necessary)

Fixes #<issue_number_goes_here> 🦕

@tswast tswast requested review from a team as code owners September 23, 2024 21:50
@tswast tswast requested a review from GarrettWu September 23, 2024 21:50

@product-auto-label product-auto-label bot added the size: l Pull request size is large. label Sep 23, 2024
@product-auto-label product-auto-label bot added the api: bigquery Issues related to the googleapis/python-bigquery-dataframes API. label Sep 23, 2024
@tswast
Collaborator Author

tswast commented Sep 24, 2024

The notebook failure is "FAILED notebooks/remote_functions/remote_function_usecases.ipynb:: - ResourceExhausted: 429 Too many operations are currently being executed, try again later.", which is not the notebook changed in this PR.

@@ -40,6 +40,16 @@
"execution_count": 2,
Contributor

@GarrettWu GarrettWu Sep 24, 2024


Line #5.    last_30_days = now - datetime.timedelta(days=30)

This is not needed then.



@tswast tswast (Collaborator Author)


Thanks. I just noticed that the max() call is doing a 4+ TB query, so I will try to fix that. The trouble with using last_7_days for both is that I was getting empty results for the deps DataFrame. I will use the 30-day comparison for that instead, which avoids the expensive max() call.
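
To make the fix concrete, here is a minimal sketch of the kind of change described above: dropping the max()-anchored cutoff in favor of a fixed 30-day window. The table and column names (deps_dev_v1.PackageVersions, SnapshotAt) are assumptions for illustration only; the actual notebook cell is not shown in this thread.

```python
import datetime

import bigframes.pandas as bpd

# Hypothetical source table; the real notebook may read a different one.
df = bpd.read_gbq("bigquery-public-data.deps_dev_v1.PackageVersions")

now = datetime.datetime.now(datetime.timezone.utc)
last_30_days = now - datetime.timedelta(days=30)

# Rather than anchoring the filter on df["SnapshotAt"].max(), which requires
# an aggregation over the whole multi-TB column, compare against a constant
# 30-day cutoff so the expensive max() query is never issued.
deps = df[df["SnapshotAt"] >= last_30_days]
```

A constant 30-day window also keeps the deps DataFrame from coming back empty, which is the problem the 7-day cutoff ran into.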

@tswast tswast merged commit 3c54399 into main Sep 24, 2024
22 of 23 checks passed
@tswast tswast deleted the tswast-partial-ordering branch September 24, 2024 20:53