integrate simulationCompare script #492

Closed
mwetter opened this issue Nov 4, 2022 · 2 comments · Fixed by #493
mwetter commented Nov 4, 2022

This issue is to integrate the script that compares translation and simulation performance across branches or tools.
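As a rough sketch of what such a comparison could look like (the function, field names, and log layout below are assumptions for illustration, not the actual buildingspy API): given two lists of per-model log entries from different branches or tools, compute the simulation-time ratio for each model that succeeded in both.

```python
def compare_results(log_a, log_b):
    """Return per-model simulation-time ratios (log_b time / log_a time).

    Each log is a list of dicts such as
    {"model": "IBPSA...", "success": True, "simulation_time": 1.2}.
    Models that failed in either log are skipped.
    """
    times_a = {e["model"]: e["simulation_time"]
               for e in log_a if e["success"]}
    ratios = {}
    for e in log_b:
        if e["success"] and e["model"] in times_a:
            ratios[e["model"]] = e["simulation_time"] / times_a[e["model"]]
    return ratios
```

A ratio below 1 would indicate that the model simulates faster on the second branch or tool.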

@mwetter mwetter self-assigned this Nov 4, 2022

mwetter commented Nov 7, 2022

Development branch is issue492_simulationCompare

@JayHuLBL : It looks like the failed models are not reported. They do not seem to be part of the list that is processed in _generateHtmlTable.
To reproduce, uncomment in buildingspy/tests/test_development_Comparison.py the line

#s.run()

and run

 make unittest_development_Comparison 

Open results/html/tools_compare_master.html. Then comment out the above line again, and in comparison-openmodelica.log change the entry "success": true to "success": false for IBPSA.Utilities.Psychrometrics.Examples.Density_pTX.
This model is then no longer part of the html file.

Can you please reproduce this and correct it? Please also comment out the LaTeX output, as it is not maintained.


JayHuLBL commented Nov 8, 2022

@mwetter
With commit 1ae12a0, the failed models are now listed at the top of the html output. These models are then no longer listed in the tables below.
For the LaTeX output, do you mean commenting it out so that there is no LaTeX output? Currently, LaTeX output is only produced if a model is flagged (big difference); otherwise the LaTeX output folder is empty.
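The fix described above can be sketched as a partitioning step before the tables are built (a hypothetical simplification, not the actual commit): failed models are split off and listed separately, and only the successful ones feed the comparison tables.

```python
def partition_models(entries):
    """Split log entries into failed model names and successful entries.

    entries: list of dicts such as {"model": "...", "success": True}.
    Returns (failed_names, successful_entries).
    """
    failed = [e["model"] for e in entries if not e["success"]]
    succeeded = [e for e in entries if e["success"]]
    return failed, succeeded
```

The report would then print the failed names at the top and pass only the successful entries to the table generator, so no model is silently dropped.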

mwetter added a commit that referenced this issue Nov 14, 2022
Added module to compare simulation performance.
For #492

Refactored regression tests for Dymola to allow specifying a timeout for each test, and set the default timeout to 300 seconds
For #495

Co-authored-by: JayHuLBL <jianjunhu@lbl.gov>