
benchmarks.yml: Run on addition of label instead of comment #2002

Merged
merged 1 commit into from
Jan 26, 2024

Conversation

@EwoutH (Contributor) commented Jan 25, 2024

Instead of triggering on a specific comment, which caused issues with running in a different scope (issue_comment instead of pull_request_target), the workflow can now instead be triggered by adding a label "trigger-benchmarks" to the PR.

The label can be repeatedly added and removed to rerun benchmarks.
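The actual benchmarks.yml diff isn't shown in this thread, but a label-based trigger along these lines is a minimal sketch (the job name and steps are illustrative, not the PR's real contents):

```yaml
# Illustrative sketch of a label-triggered workflow;
# not the actual benchmarks.yml diff from this PR.
on:
  pull_request_target:
    types: [labeled]

jobs:
  benchmark:
    # Guard on the label name, so adding unrelated labels doesn't start a run.
    if: github.event.label.name == 'trigger-benchmarks'
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # ... check out the PR head, run the benchmarks, post a comment ...
```

Because the `labeled` activity type fires each time the label is added, removing and re-adding it reruns the workflow.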

@EwoutH added the ci label Jan 25, 2024
@EwoutH EwoutH requested review from rht and Corvince January 25, 2024 11:30

Performance benchmarks:

| Model | Size | Init time [95% CI] | Run time [95% CI] |
| --- | --- | --- | --- |
| Schelling | small | 🔵 +1.3% [+1.0%, +1.5%] | 🔵 +0.2% [+0.0%, +0.3%] |
| Schelling | large | 🔵 +0.7% [+0.4%, +1.0%] | 🔵 -1.2% [-2.4%, +0.0%] |
| WolfSheep | small | 🔵 +0.5% [+0.1%, +0.8%] | 🔵 +0.5% [+0.4%, +0.6%] |
| WolfSheep | large | 🔵 +0.1% [-1.2%, +1.5%] | 🔵 +0.1% [-1.8%, +2.0%] |
| BoidFlockers | small | 🔵 +1.1% [+0.8%, +1.4%] | 🔵 +1.9% [+1.3%, +2.5%] |
| BoidFlockers | large | 🔵 +0.9% [+0.5%, +1.3%] | 🔵 +1.1% [+0.6%, +1.6%] |

@Corvince (Contributor)

If this works, it's probably a good solution. While you're changing the trigger, maybe also deactivate the trigger on opened pull requests? A lot of PRs are not related to performance, and the performance benchmark is a bit distracting there. Or maybe run on every PR, but don't comment without the label?

@rht (Contributor) commented Jan 25, 2024

What about reacting to the existing benchmark comment with a rocket emoji to retrigger the benchmark? I'm trying to figure out a more convenient way to retrigger than removing and re-adding the label.

@EwoutH (Contributor, Author) commented Jan 25, 2024

@rht Good idea, but unfortunately that only works for events that are specifically in the PR scope, and everything related to comments isn't. It's just a limitation of events in GitHub Actions.

A label is part of the PR scope, so it works.
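For context on why the comment route is clumsy: an `issue_comment` event runs in the repository's default-branch scope, and its payload doesn't carry the PR head directly. A hedged sketch of the extra plumbing it would need (the trigger phrase and steps are illustrative):

```yaml
# Illustrative: issue_comment fires on the default branch, so the PR
# head must be resolved by number before the code can be checked out.
on:
  issue_comment:
    types: [created]

jobs:
  benchmark:
    # issue.pull_request is only set when the comment is on a PR.
    if: github.event.issue.pull_request && contains(github.event.comment.body, 'trigger benchmarks')
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
        with:
          ref: refs/pull/${{ github.event.issue.number }}/head
```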

@Corvince I like the explicit confirmation that a PR doesn't change performance. But I also recognize it can add noise. Let me think about it a bit.

@quaquel (Contributor) commented Jan 25, 2024

I also like the explicit performance check. Another hidden benefit is that it is a form of integration testing because we run three full models through the benchmark. Some of the weird backward compatibility issues we have seen would have been caught if this machinery had been in place.

@EwoutH (Contributor, Author) commented Jan 26, 2024

I can’t think of a more elegant solution than using the label, considering the restrictions of GitHub Actions.

In #1999 I ensured the benchmarks don’t change if there were no Python code changes. That should help a little against the noise.

I would be in favor of merging this.

@quaquel (Contributor) commented Jan 26, 2024

looks good to me

@rht rht merged commit 2cdc9f0 into projectmesa:main Jan 26, 2024
14 checks passed