Comparing Enso benchmarks results with Enso #5165
We should look to do this as part of the case study work.
Pavel Marek reports a new STANDUP for today (2023-03-10): Progress: Extending the Python script to fill in a Jinja HTML template so that it displays all the benchmarks as line charts on a single static page. There will be an option to just create an intermediate CSV from the script, without generating the HTML page, so that someone can analyse it in the Enso project. Enso is not fit for this task itself, though, as this is only about visualization, not analysis (no data transformation is done; the data are only downloaded and plotted). It should be finished by 2023-03-13.
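The standup above describes two outputs: an intermediate CSV of downloaded results, and a static HTML page filled in from a template. As a minimal stdlib sketch of that flow (the real script uses Jinja; `string.Template` stands in for it here, and the row fields and commit IDs are hypothetical):

```python
import csv
import io
from string import Template

# Hypothetical benchmark rows as they might appear in the intermediate CSV:
# one row per (benchmark label, commit, score).
rows = [
    {"label": "org.enso.interpreter.bench.Sum", "commit": "abc1234", "score": "12.5"},
    {"label": "org.enso.interpreter.bench.Sum", "commit": "def5678", "score": "11.9"},
]

def write_csv(rows):
    """Serialize the downloaded results to CSV so they can be analysed elsewhere."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["label", "commit", "score"])
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

# Stand-in for the Jinja template: a single placeholder for the table body.
PAGE = Template("<html><body><h1>Benchmarks</h1><pre>$table</pre></body></html>")

def render_page(rows):
    """Fill the page template with the benchmark data."""
    return PAGE.substitute(table=write_csv(rows))

print(render_page(rows))
```

Running the script with the CSV-only option would stop after `write_csv`; the full run would write `render_page`'s output to the static page.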
Pavel Marek reports a new STANDUP for today (2023-03-11): Progress: Playing around with Google Charts so that every point in the chart has a URL to the commit, for convenience. In the end, the charts should be able to display the difference and percentage difference as a tooltip, and also show the commit ID, date, author, etc. as additional information below the chart. It should be finished by 2023-03-13.
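The per-point data described above (score, difference and percentage difference from the previous point, commit link) can be prepared on the Python side before handing it to the charting library. A small sketch, assuming a list of `(commit_sha, score)` pairs in chronological order (the `chart_rows` helper and row shape are illustrative, not the actual script's API):

```python
import json

def chart_rows(scores):
    """Build one row per benchmark run for a line chart.

    Each row carries the point label, the score, and a tooltip string
    showing the absolute and percentage difference from the previous
    point plus a link to the commit (repository URL is an assumption).
    """
    rows = []
    prev = None
    for sha, score in scores:
        if prev is None:
            diff, pct = 0.0, 0.0
        else:
            diff = score - prev
            pct = 100.0 * diff / prev
        url = f"https://github.com/enso-org/enso/commit/{sha}"
        tooltip = f"{score:.2f} ({diff:+.2f}, {pct:+.1f}%)\n{url}"
        rows.append([sha[:7], score, tooltip])
        prev = score
    return rows

# Hypothetical input; the JSON would be embedded in the generated page.
print(json.dumps(chart_rows([("a" * 40, 10.0), ("b" * 40, 12.0)])))
```

In Google Charts terms, the third column would be declared with a `tooltip` role so the prepared string is shown on hover.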
Pavel Marek reports a new STANDUP for today (2023-03-13): Progress: Almost finished the script to generate the single web page. We have a working selection handler that displays information once you select a certain point, and also fixed cases where the script tries to download an artifact that has already expired. We should keep artifacts from the engine benchmark jobs for a longer period than the default 90 days. We can push
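The expired-artifact fix mentioned above amounts to filtering the artifact list before downloading. GitHub's "List artifacts" API marks each record with an `expired` boolean, so a sketch of the guard (the function name is illustrative) could be:

```python
def downloadable_artifacts(artifacts):
    """Keep only artifacts that can still be downloaded.

    `artifacts` are records as returned by GitHub's artifacts API; each
    carries an `expired` boolean. Benchmark-job artifacts expire after
    the retention period (90 days by default), so the script skips
    expired ones instead of failing mid-download.
    """
    return [a for a in artifacts if not a.get("expired", False)]
```

With this filter in place, raising the repository's artifact retention period only widens the window of downloadable results; it does not change the script's logic.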
Pavel Marek reports a new STANDUP for the provided date (2023-03-08): Progress: Created PR. The Python script for downloading benchmarks is refreshed and tested, along with an Enso project for analysis. Will probably use Python Jinja + Google Charts to display all benchmarks on a generated static page. Benchmark jobs are successful again. It should be finished by 2023-03-13.
Pavel Marek reports a new 🔴 DELAY for today (2023-03-15): Summary: There is a 3-day delay in the implementation of the Comparing Enso benchmarks results with Enso (#5165) task. Delay Cause: Need to add some more docs, and probably also functionality to compare two different branches, rather than hard-coding benchmarks only for the develop branch.
Pavel Marek reports a new STANDUP for today (2023-03-15): Progress: Trying to add functionality to compare results from two different branches. This would allow checking for benchmark regressions on a PR before it is merged to develop: just run the benchmarks for that PR's branch and check them via this script. It should be finished by 2023-03-16.
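The branch-comparison idea above boils down to diffing two sets of scores and flagging benchmarks that got worse beyond some tolerance. A minimal sketch, assuming scores keyed by benchmark name where higher is better (the function, its threshold, and the score orientation are assumptions, not the script's actual interface):

```python
def regressions(develop, branch, threshold_pct=5.0):
    """Report benchmarks that regressed on `branch` relative to `develop`.

    Both arguments map benchmark name -> score (higher is better, an
    assumption). Returns name -> percentage change for every benchmark
    that dropped by more than `threshold_pct` percent; benchmarks
    missing from either side are ignored.
    """
    out = {}
    for name, base in develop.items():
        if name in branch and base > 0:
            pct = 100.0 * (branch[name] - base) / base
            if pct < -threshold_pct:
                out[name] = pct
    return out
```

For example, a branch scoring 90.0 where develop scores 100.0 would be reported as a -10% regression, while a 2% dip would pass under the default threshold.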
This task is automatically imported from the old Task Issue Board and it was originally created by jaroslavtulach.
Original issue is here.
Based on following discord discussion.
We are able to execute benchmarks for the engine on a dedicated CI machine. However, we don't have a way to be notified of regressions or to easily compare the results.
As Enso is a data-processing-oriented system, it would be a great test case of Enso's capabilities and stability to process the available data in Enso and render it in a chart, rather than manually comparing two XML files and searching for unacceptable differences.
As an initial step, let's use Enso and its libraries for headless processing:
Future Ideas
As a primary tool, let's use the Enso IDE for displaying the charts:
Tasks: