
GH-39663: [C++] Ensure top-level benchmarks present informative metrics #40091

Merged 1 commit into apache:main on Feb 19, 2024

Conversation

@pitrou (Member) commented Feb 15, 2024

Rationale for this change

Some benchmarks may present only an iteration time, or not present sufficiently informative metrics.

What changes are included in this PR?

Add bytes/second and/or items/second metrics to top-level benchmarks where applicable.

This PR only tackles miscellaneous benchmarks from the top-level Arrow directory, as well as IO, IPC and utilities.
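
For context, here is a minimal sketch (not taken from this PR) of how such throughput metrics are typically reported with Google Benchmark, the framework used by Arrow's C++ benchmarks. The benchmark name, buffer type, and sizes below are illustrative assumptions, not code from this change:

```cpp
// Hypothetical Google Benchmark example: reporting per-run item and byte
// counts so the framework emits items/second and bytes/second in addition
// to the raw iteration time.
#include <algorithm>
#include <cstdint>
#include <vector>

#include <benchmark/benchmark.h>

static void BM_CopyBuffer(benchmark::State& state) {
  const int64_t num_items = state.range(0);
  std::vector<int32_t> src(num_items, 42);
  std::vector<int32_t> dst(num_items);
  for (auto _ : state) {
    std::copy(src.begin(), src.end(), dst.begin());
    benchmark::DoNotOptimize(dst.data());  // keep the copy from being elided
  }
  // Totals cover the whole run, hence the multiplication by iterations().
  state.SetItemsProcessed(state.iterations() * num_items);
  state.SetBytesProcessed(state.iterations() * num_items *
                          static_cast<int64_t>(sizeof(int32_t)));
}
BENCHMARK(BM_CopyBuffer)->Arg(1 << 16);

BENCHMARK_MAIN();
```

With such calls in place, the benchmark output includes bytes_per_second and items_per_second columns rather than only the time per iteration.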

Are these changes tested?

Manually.

Are there any user-facing changes?

No.

… metrics

Add bytes/second and/or items/second metrics where applicable.

⚠️ GitHub issue #39663 has been automatically assigned in GitHub to PR creator.

@pitrou (Member, Author) commented Feb 15, 2024

@ursabot please benchmark lang=C++

@ursabot commented Feb 15, 2024

Benchmark runs are scheduled for commit c10547a. Watch https://buildkite.com/apache-arrow and https://conbench.ursa.dev for updates. A comment will be posted here when the runs are complete.


Thanks for your patience. Conbench analyzed the 1 benchmarking run that has been run so far on PR commit c10547a.

There were 2 benchmark results indicating a performance regression.

The full Conbench report has more details.

@kou (Member) left a comment

+1

The github-actions bot added the "awaiting merge" label and removed the "awaiting review" label on Feb 15, 2024
pitrou merged commit b224c58 into apache:main on Feb 19, 2024
34 of 37 checks passed
pitrou removed the "awaiting merge" label on Feb 19, 2024
pitrou deleted the gh39663-benchmark-metrics branch on February 19, 2024 at 15:10

After merging your PR, Conbench analyzed the 6 benchmarking runs that have been run so far on merge-commit b224c58.

There were no benchmark performance regressions. 🎉

The full Conbench report has more details. It also includes information about 2 possible false positives for unstable benchmarks that are known to sometimes produce them.

zanmato1984 pushed a commit to zanmato1984/arrow that referenced this pull request Feb 28, 2024
… metrics (apache#40091)

* Closes: apache#39663

Authored-by: Antoine Pitrou <antoine@python.org>
Signed-off-by: Antoine Pitrou <antoine@python.org>
thisisnic pushed a commit to thisisnic/arrow that referenced this pull request Mar 8, 2024
… metrics (apache#40091)

Successfully merging this pull request may close this issue:

[C++] Top-level Arrow benchmarks should present a items/s or bytes/s metric (apache#39663)
4 participants