Add warning messages when algorithms are skipped in autotune benchmarking #723


Merged

Conversation

ChrisRackauckas-Claude
Contributor

Summary

  • Adds explicit warning messages when algorithms are skipped for larger matrix sizes during autotune benchmarking
  • Improves user visibility into the autotuning process by showing when and why algorithms are being skipped

Problem

When algorithms exceed maxtime during benchmarking, they are correctly skipped for larger matrix sizes. However, users only saw the initial "exceeded maxtime" warning but not when algorithms were subsequently skipped. This made it unclear why some algorithms weren't being tested on larger matrices.

Users would see logs like:

┌ Warning: Algorithm GenericLUFactorization exceeded maxtime (130.29s > 100.0s) for size 9000, eltype Float64. Will skip for larger matrices.
┌ Warning: Algorithm SimpleLUFactorization exceeded maxtime (134.76s > 100.0s) for size 9000, eltype Float64. Will skip for larger matrices.

But then would not see any indication that these algorithms were actually being skipped on the larger 15000×15000 matrices, leading to confusion about whether the skipping logic was working.

Solution

Added a warning message in benchmarking.jl:151 when algorithms are skipped:

@warn "Algorithm $name skipped for size $n (exceeded maxtime on size $max_allowed_size matrix)"

Now users see both:

  1. The initial maxtime-exceeded warning: Algorithm X exceeded maxtime (Y.Zs > maxtime) for size N
  2. Skip warnings for larger sizes: Algorithm X skipped for size M (exceeded maxtime on size N matrix)
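The flow producing these two warnings can be sketched roughly as follows. This is a minimal illustration, not the actual LinearSolve.jl source: names such as `size_limits`, `run_benchmark`, and `matrix_sizes` are assumptions for the example, but the `@warn` string for the skip case matches the one added in this PR.

```julia
# Sketch of the skip logic (illustrative only; `size_limits`,
# `run_benchmark`, and `matrix_sizes` are hypothetical names).
size_limits = Dict{String, Int}()  # largest size each slow algorithm was allowed

for n in matrix_sizes
    for (name, alg) in algorithms
        # If this algorithm previously exceeded maxtime, skip anything larger
        # than the size where that happened -- and now say so explicitly.
        max_allowed_size = get(size_limits, name, typemax(Int))
        if n > max_allowed_size
            @warn "Algorithm $name skipped for size $n (exceeded maxtime on size $max_allowed_size matrix)"
            continue
        end

        t = @elapsed run_benchmark(alg, n)
        if t > maxtime
            @warn "Algorithm $name exceeded maxtime ($(round(t, digits=2))s > $(maxtime)s) for size $n, eltype Float64. Will skip for larger matrices."
            size_limits[name] = n
        end
    end
end
```

With this structure, each slow algorithm emits one "exceeded maxtime" warning at the size where it first timed out, followed by one skip warning per larger size, so the user can trace exactly why an algorithm is absent from the larger benchmarks.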

Test Plan

  • Verified the fix works by creating test scenarios with very small maxtime values
  • Confirmed warning messages appear in console output when algorithms are skipped
  • Ensured the skipping logic continues to work correctly (algorithms that exceed maxtime are properly excluded from larger matrix tests)
  • Verified no regression in existing functionality

The fix is a simple one-line addition that improves user experience without changing any algorithmic behavior.

🤖 Generated with Claude Code

ChrisRackauckas and others added 2 commits August 13, 2025 09:51
Add warning messages when algorithms are skipped in autotune benchmarking

When algorithms exceed maxtime during benchmarking, they are correctly skipped
for larger matrix sizes. However, users only saw the initial "exceeded maxtime"
warning but not when algorithms were subsequently skipped, making it unclear
why some algorithms weren't being tested on larger matrices.

This adds explicit warning messages when algorithms are skipped, providing
better visibility into the autotuning process.

Fixes the issue where users would see:
- "Algorithm X exceeded maxtime" warnings
- But no indication that Algorithm X was being skipped for larger sizes

Now users will see both:
- "Algorithm X exceeded maxtime (Y.Zs > maxtime) for size N"
- "Algorithm X skipped for size M (exceeded maxtime on size N matrix)"

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
Add a newline before skip warnings to ensure they appear cleanly
separated from progress bar output, making them much easier to read.

Before: Skip warnings were mixed inline with progress bar
After: Skip warnings appear on clean new lines

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
@ChrisRackauckas ChrisRackauckas merged commit 3851731 into SciML:main Aug 13, 2025
112 of 117 checks passed