
Create benchmark and GitHub workflow to run it on PRs #50

Merged (1 commit) on Aug 21, 2023

Conversation

abelsiqueira
Member

@abelsiqueira abelsiqueira commented Aug 18, 2023

Pull request details

Describe the changes made in this pull request

  • Creates a benchmark suite in the file benchmark/benchmarks.jl.
  • Creates a workflow that runs the benchmark when there is a relevant new pull request.
  • The workflow compares the current pull request against origin/main.
  • The comparison is posted as a comment on the pull request. An example can be seen here: abelsiqueira#2 (comment)
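For context, a benchmark/benchmarks.jl file conventionally defines a top-level SUITE made of nested BenchmarkGroups from BenchmarkTools.jl. The sketch below is hypothetical, not the file from this PR; the actual Tulipa workload and entry points are not shown in this thread, so a stand-in function is used purely to illustrate the structure:

```julia
# Hypothetical sketch of a benchmark/benchmarks.jl file using the
# BenchmarkTools.jl SUITE convention. The workload below is a stand-in;
# the real suite would build and solve the Tulipa model instead.
using BenchmarkTools

const SUITE = BenchmarkGroup()
SUITE["model"] = BenchmarkGroup()

# Stand-in workload (assumption: the package's real entry points are not
# shown in this conversation, so we benchmark a toy computation).
workload() = sum(abs2, rand(1000))

SUITE["model"]["workload"] = @benchmarkable workload()
```

A benchmark runner (or a call such as `run(SUITE)`) then executes every `@benchmarkable` in the group and reports timings, which is what makes the PR-versus-main comparison possible.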

One alternative would be to run the benchmark only when a specific comment is written on a pull request; for instance, writing "/benchmark" would start it. This adds an extra step but, in return, avoids running benchmarks for every new PR. Should we leave this for later?
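The comment-triggered alternative described above could be sketched with a GitHub Actions `issue_comment` event. This is a hypothetical fragment, not the workflow from this PR; the job body is elided:

```yaml
# Hypothetical sketch: run benchmarks only when someone comments
# "/benchmark" on a pull request.
name: Benchmark on demand
on:
  issue_comment:
    types: [created]
jobs:
  benchmark:
    # issue_comment also fires on plain issues; restrict to PR comments
    # that contain the trigger phrase.
    if: github.event.issue.pull_request && contains(github.event.comment.body, '/benchmark')
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      # ... check out the PR branch, run the suite, post the comparison
```

The trade-off is as the author notes: contributors must remember the extra step, but CI time is only spent when a human asks for it.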

List of related issues or pull requests

Closes #13

Collaboration confirmation

As a contributor I confirm

  • I read and followed the instructions in README.dev.md
  • The documentation is up to date with the changes introduced in this Pull Request (or NA)
  • Tests are passing
  • Lint is passing

@codecov

codecov bot commented Aug 18, 2023

Codecov Report

Patch coverage: 100.00% and no project coverage change.

Comparison is base (86f58b5) 100.00% compared to head (066d113) 100.00%.
Report is 1 commit behind head on main.

Additional details and impacted files
@@            Coverage Diff            @@
##              main       #50   +/-   ##
=========================================
  Coverage   100.00%   100.00%           
=========================================
  Files            3         3           
  Lines           67        67           
=========================================
  Hits            67        67           
| Files Changed | Coverage           | Δ   |
| ------------- | ------------------ | --- |
| src/model.jl  | 100.00% <100.00%>  | (ø) |


@abelsiqueira
Member Author

The new benchmark workflow is failing because there is no benchmark/benchmarks.jl file on the current main. It should work properly after that.

@abelsiqueira abelsiqueira marked this pull request as ready for review August 21, 2023 07:40
@datejada datejada self-requested a review August 21, 2023 07:52
Member

@datejada datejada left a comment


I approve the PR to avoid the benchmark failing, but we need to discuss later how often we want to run the benchmark, depending on how much time it takes.

@datejada datejada merged commit 4efb592 into TulipaEnergy:main Aug 21, 2023
13 of 14 checks passed
@abelsiqueira abelsiqueira deleted the 13-benchmark branch August 21, 2023 08:23
@suvayu
Member

suvayu commented Sep 5, 2023

IMO this should be a bot and a cron job: a bot to manually trigger it in a PR, and a cron job that checks for performance regressions, say, once a week.
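The weekly cron half of that suggestion could be sketched as below. This is a hypothetical GitHub Actions fragment (the schedule, action versions, and job body are assumptions, not part of this PR):

```yaml
# Hypothetical sketch: run the benchmark suite on a weekly schedule,
# in addition to any on-demand PR trigger.
name: Weekly benchmark
on:
  schedule:
    - cron: "0 6 * * 1"  # every Monday at 06:00 UTC
  workflow_dispatch: {}  # also allow manual runs from the Actions tab
jobs:
  benchmark:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: julia-actions/setup-julia@v1
      # ... run the suite and compare against a stored baseline
```

A scheduled run compares against a fixed baseline rather than a PR branch, which is what makes it useful for catching gradual performance regressions.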

Development

Successfully merging this pull request may close these issues.

Create script to benchmark model using BenchmarkTools.jl
3 participants