Assert from within algorithm section during model check #10828
Comments
The reason the assert triggers is that the compiler tries to evaluate the function while evaluating the binding. So the question is whether the assert triggering is correct or not, which I'm not sure about. Can you use the model in a simulation if you make any of the changes you listed? I tried doing it myself, but I'm not sure how to actually use the model correctly in a simulation. If the assert triggers during a simulation too, it's probably correct; otherwise there might be some issue with the function evaluation.
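For reference, one way to try both checking and simulating the model is the OpenModelica scripting environment (omc). The file name below is an assumption, since the report's attachments are .txt files; a minimal sketch:

```modelica
// Hypothetical omc script (.mos); the file name "AssertDuringCompilation.mo"
// is an assumption based on the attachment name in the report.
loadFile("AssertDuringCompilation.mo");

// checkModel is where the assert reportedly triggers at translation time
checkModel(AssertDuringCompilation.ModelWithFunctions);

// simulate would show whether the assert also fires at run time
simulate(AssertDuringCompilation.ModelWithFunctions, stopTime=1.0);
```

Comparing the output of the two calls would answer whether the assert is translation-time only or also a run-time failure.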
I added a testcase to two of the minimal examples above. Here is the minimal example where I made the change.
- Remove variability prefixes from function parameters since they have no semantic meaning, to avoid e.g. function inputs being constant evaluated.
- Add some debug output from EvalFunction using the existing evalFuncDump debug flag.

Fixes OpenModelica#10828
Fixed in #10836. The issue was that the parameter inputs of the function confused the compiler and caused it to constant evaluate the function. However, variability prefixes on function inputs/outputs don't actually have any semantic meaning in Modelica, so declaring an input as parameter shouldn't have any effect. So the fix I implemented is to simply remove such prefixes and treat them like normal inputs/outputs.
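As a sketch of what this means in practice (the function below is hypothetical, not taken from the report): after the fix, a variability prefix on a function input is stripped, so the two declarations behave identically:

```modelica
// Before: a "parameter" prefix on a function input, which previously
// nudged the compiler into constant evaluating the call.
function f1
  parameter input Boolean twoSided;  // prefix has no semantic meaning here
  output Real y;
algorithm
  y := if twoSided then 2.0 else 1.0;
end f1;

// After the fix, this is treated the same as a plain input:
function f2
  input Boolean twoSided;
  output Real y;
algorithm
  y := if twoSided then 2.0 else 1.0;
end f2;
```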
Thanks for the fix and the side lesson regarding the variability of function inputs.
- Remove variability prefixes from function parameters since they have no semantic meaning, to avoid e.g. function inputs being constant evaluated.
- Add annotation '__OpenModelica_functionVariability' to keep the variability regardless, which is useful for defining some of the builtin functions.
- Add some debug output from EvalFunction using the existing evalFuncDump debug flag.

Fixes OpenModelica#10828
Description
During work on our BRSL I found another peculiar thing which I cannot wrap my head around. An assert from within an algorithm section is triggered when checking the model, but I assumed that such asserts are only checked when the model is simulated. I am not sure how to describe this problem other than that this model does not produce the same problems in other Modelica tools (namely SimulationX and Dymola). If you do not have access to a BRSL version, where there is only one main package named Rexroth_BRSL, please let me know.
Steps to Reproduce
AssertDuringCompilation.txt

AssertDuringCompilation.ModelWithFunctions throws the following error when being checked:

3a. The extends statement in AssertDuringCompilation.ModelWithFunctions is removed, where flowTablePassed is used: NoAssertDuringCompilation1.txt

3b. The evaluate=true annotation of flowTablePassed in AssertDuringCompilation.ModelWithFunctions is removed: NoAssertDuringCompilation2.txt

3c. twoSided in NoAssertDuringCompilation3.Function1 is changed from a parameter input to an input. As a result, NoAssertDuringCompilation3.Function1 cannot be checked anymore, but AssertDuringCompilation.ModelWithFunctions, where the function is used, checks successfully: NoAssertDuringCompilation3.txt
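Since the attached models are not reproduced in this thread, the following is only a guess at the minimal shape of the problem: a function with an assert in its algorithm section, called with an argument that is a parameter input and is forced to translation-time evaluation via Evaluate=true. All names and values below are assumptions:

```modelica
// Hypothetical reconstruction of the reported pattern; not the actual
// attached models. The assert in the function's algorithm section
// reportedly fires already during checkModel, because the Evaluate=true
// parameter causes the call to be evaluated at translation time.
function Function1
  parameter input Boolean twoSided;  // "parameter" prefix as in step 3c
  output Real y;
algorithm
  assert(twoSided, "twoSided must be true");
  y := if twoSided then 2.0 else 1.0;
end Function1;

model ModelWithFunctions
  parameter Boolean flowTablePassed = false annotation(Evaluate=true);
  Real y = Function1(flowTablePassed);
end ModelWithFunctions;
```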
Expected Behavior
The error message does not appear when checking the model and the model can be used in simulations.
Version and OS
Thanks for looking into this
Aaron Buntrock - Bosch Rexroth