
Fix FastAI crash edge-case #3416

Merged
merged 1 commit into autogluon:master on Jul 24, 2023

Conversation

@Innixma (Contributor) commented on Jul 24, 2023

Issue #, if available:

Description of changes:

  • Fixed an extremely rare crash that occurred when a batch completed so quickly that batch_time_tracker_callback.batch_measured_time returned None, causing downstream numerical operations to crash (see the sketch after this description).
  • Added type hints.

Bug found when fitting on the visualizing_soil OpenML dataset.
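Only `batch_time_tracker_callback.batch_measured_time` is named in this PR; the helper name, fallback constant, and epoch math below are hypothetical. This is a minimal sketch of the defensive pattern described above (fall back to a small positive time when the measurement is missing), not AutoGluon's actual code.

```python
from typing import Optional


def estimate_batch_time(batch_measured_time: Optional[float],
                        min_batch_time: float = 1e-4) -> float:
    """Return a usable per-batch time estimate.

    If the measured time is None (the batch finished too quickly to be
    timed) or non-positive, fall back to a small positive constant so
    that downstream divisions and epoch-count estimates cannot crash.
    """
    if batch_measured_time is None or batch_measured_time <= 0:
        return min_batch_time
    return batch_measured_time


# Hypothetical downstream use: estimate how many epochs fit in the time budget.
batch_time = estimate_batch_time(batch_measured_time=None)  # previously crashed on None
time_left = 30.0            # seconds remaining in the fit time limit
batches_per_epoch = 100
est_epoch_time = batch_time * batches_per_epoch
max_epochs = int(time_left // est_epoch_time)
```

Clamping to a small positive constant keeps the epoch estimate finite while remaining conservative: an unmeasurably fast batch simply yields a large (but bounded) epoch budget instead of an exception.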

By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.

@Innixma Innixma requested review from tonyhoo and yinweisu July 24, 2023 16:56
@github-actions

Job PR-3416-e74a332 is done.
Docs are uploaded to http://autogluon-staging.s3-website-us-west-2.amazonaws.com/PR-3416/e74a332/index.html

@Innixma Innixma merged commit 4137140 into autogluon:master Jul 24, 2023
29 checks passed
ddelange added a commit to ddelange/autogluon that referenced this pull request Jul 26, 2023
* 'master' of https://github.com/awslabs/autogluon:
  Fix crash when `time_limit` is None and overwritten via `max_time_limit` (autogluon#3418)
  Upgrade torch 2.0 (autogluon#3404)
  Fix FastAI crash edge-case (autogluon#3416)
  Add log for fit time adjustments (autogluon#3408)
  [fix][eda] Anomaly Detection - disable bps_flag for SUOD (autogluon#3406)
  [timeseries] Remove all references to MXNet (autogluon#3396)
  Remove ray lightning (autogluon#3398)
  [AutoMM] Add unit tests for trimming sequence lengths (autogluon#3399)
  [FIX] fix detection tutorial index (autogluon#3397)
  Refactor path (autogluon#3355)
  [Doc] Add AutoMM FAQs (autogluon#3388)
  [AutoMM] Fix the input keys of categorical MLP (autogluon#3384)
  [AutoMM] Clean up text processor (autogluon#3383)
  [AutoMM] Support customizing use_fast for AutoTokenizer (autogluon#3379)
  vw version bump (autogluon#3373)
  0.8.2 post release (autogluon#3377)